In Search of a Data Transformation That Accelerates Neural Field Training
- Authors
- Junwon Seo (서준원)
- Date Issued
- 2024
- Publisher
- Pohang University of Science and Technology (POSTECH)
- Abstract
- Neural fields are an emerging paradigm in data representation, wherein a neural network is trained to closely approximate a given signal. Their broader application is impeded by encoding speed: fitting a neural field requires extensively overfitting the network to the signal, demanding numerous stochastic gradient descent (SGD) steps to reach a target fidelity. In this paper, I explore the effects of data transformations on neural field training efficiency, focusing on the impact of pixel location permutations on SGD convergence. Counterintuitively, random pixel permutation is observed to significantly accelerate training. To elucidate this phenomenon, I study neural field training through PSNR curves, loss landscapes, and loss distribution patterns. The findings indicate that random pixel permutations remove easy-to-fit patterns: while these patterns ease early optimization, they hinder the capture of the signal's finer details later in training.
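The transformation studied in the abstract can be sketched as follows. A neural field regresses pixel value as a function of pixel coordinates, so randomly permuting pixel locations amounts to shuffling which value is attached to which coordinate before training. This is a minimal illustrative sketch (the function name and NumPy-based setup are my assumptions, not the thesis's actual code):

```python
import numpy as np

def permute_pixels(image, seed=0):
    """Randomly permute pixel locations of an H x W x C image.

    The neural field still fits value = f(x, y) on (coordinate, value)
    pairs; only the pairing of coordinates to values changes, which
    destroys spatially smooth, easy-to-fit patterns while keeping the
    multiset of pixel values identical.
    """
    h, w = image.shape[:2]
    rng = np.random.default_rng(seed)
    perm = rng.permutation(h * w)            # one permutation over all pixel indices
    flat = image.reshape(h * w, -1)          # flatten spatial dimensions
    return flat[perm].reshape(image.shape)   # same pixels, shuffled locations
```

Training then proceeds as usual on the shuffled image; since the permutation is a fixed bijection, the learned field can be "unshuffled" by applying the inverse permutation to reconstruct the original signal.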
- URI
- http://postech.dcollection.net/common/orgView/200000736044
- https://oasis.postech.ac.kr/handle/2014.oak/123343
- Article Type
- Thesis
- Files in This Item:
- There are no files associated with this item.