seminars

Shubhan Patni presents Fourier Features Let Networks Learn High Frequency Functions in Low Dimensional Domains

On 2022-01-13 11:00 at https://feectu.zoom.us/j/98555944426
"Fourier Features Let Networks Learn High Frequency Functions in Low
Dimensional
Domains", Matthew Tancik, Pratul P. Srinivasan, Ben Mildenhall, Sara
Fridovich-Keil, Nithin Raghavan, Utkarsh Singhal, Ravi Ramamoorthi, Jonathan T.
Barron, Ren Ng, NeurIPS 2020

Paper URL:
https://proceedings.neurips.cc/paper/2020/hash/55053683268957697aa39fba6f231c68-Abstract.html

Paper abstract:
We show that passing input points through a simple Fourier feature mapping
enables a multilayer perceptron (MLP) to learn high-frequency functions in
low-dimensional problem domains. These results shed light on recent advances
in computer vision and graphics that achieve state-of-the-art results by using
MLPs to represent complex 3D objects and scenes. Using tools from the neural
tangent kernel (NTK) literature, we show that a standard MLP has impractically
slow convergence to high frequency signal components. To overcome this
spectral bias, we use a Fourier feature mapping to transform the effective NTK
into a stationary kernel with a tunable bandwidth. We suggest an approach for
selecting problem-specific Fourier features that greatly improves the
performance of MLPs for low-dimensional regression tasks relevant to the
computer vision and graphics communities.
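
A minimal sketch of the Gaussian random Fourier feature mapping discussed in the
abstract, gamma(v) = [cos(2*pi*B v), sin(2*pi*B v)] with the entries of B drawn
from N(0, sigma^2); the NumPy usage, array dimensions, and the value of sigma
below are illustrative assumptions rather than the authors' exact code:

    import numpy as np

    def fourier_feature_mapping(v, B):
        # Map low-dimensional inputs v of shape (N, d) to features of shape (N, 2m)
        # using a fixed random projection matrix B of shape (m, d):
        # gamma(v) = [cos(2*pi*B v), sin(2*pi*B v)]
        proj = 2.0 * np.pi * v @ B.T                                  # (N, m)
        return np.concatenate([np.cos(proj), np.sin(proj)], axis=-1)  # (N, 2m)

    # Hypothetical usage: 2-D pixel coordinates mapped to 256 Fourier features,
    # which would then be fed to the MLP instead of the raw coordinates.
    rng = np.random.default_rng(0)
    sigma = 10.0                              # bandwidth scale, tuned per problem
    B = sigma * rng.normal(size=(128, 2))     # Gaussian random frequencies
    coords = rng.uniform(size=(1024, 2))      # normalized (x, y) inputs
    features = fourier_feature_mapping(coords, B)
    print(features.shape)                     # (1024, 256)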

See the reading group page:
http://cmp.felk.cvut.cz/~toliageo/rg/index.html

Responsible person: Petr Pošík