Meta-Learning Sparse Implicit Neural Representations
Information
March 4, 2022 (Fri) | Presenter: 박준현
Overview
This paper proposes a novel training framework for implicit neural representations (INRs), called Meta-SparseINR.
It is the first approach in INR research to combine magnitude-based pruning with meta-learning (MAML).
By alternating meta-learning and pruning steps, the method efficiently learns a sparse INR initialization shared across multiple signals, and this initialization can fit a new image in only a few gradient steps.
The authors also show that it outperforms the baseline methods on 2D image regression.
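The alternation of meta-learning and magnitude-based pruning can be illustrated with a toy sketch. This is not the paper's code: it uses a tiny linear "INR" over Fourier sine features instead of a SIREN-style MLP, and a first-order Reptile-style meta-update instead of full MAML, purely to keep the idea visible in a few lines. All names and hyperparameters here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def features(x):
    # toy "INR" input encoding: Fourier sine features of a 1-D coordinate grid
    return np.stack([np.sin(k * x) for k in range(1, 9)], axis=1)  # (n, 8)

def adapt(W, mask, x, y, lr=1.0, steps=5):
    """A few inner gradient steps on one signal, respecting the sparsity mask."""
    Phi = features(x)
    for _ in range(steps):
        grad = Phi.T @ (Phi @ (W * mask) - y) / len(x)
        W = W - lr * grad * mask  # pruned weights stay at zero
    return W

def magnitude_prune(W, mask, frac=0.25):
    """Zero out the smallest-magnitude fraction of the surviving weights."""
    alive = np.flatnonzero(mask)
    k = int(len(alive) * frac)
    drop = alive[np.argsort(np.abs(W[alive]))[:k]]
    mask = mask.copy()
    mask[drop] = 0.0
    return mask

x = np.linspace(0, np.pi, 64)
signals = [np.sin(2 * x), np.sin(3 * x), np.sin(x) + 0.5 * np.sin(2 * x)]

W = rng.normal(scale=0.1, size=8)
mask = np.ones(8)

for _ in range(3):                       # alternate: meta-train, then prune
    for _ in range(50):                  # meta-training over the signal set
        y = signals[rng.integers(len(signals))]
        W_task = adapt(W, mask, x, y)
        W = W + 0.1 * (W_task - W)       # Reptile-style first-order meta-update
    mask = magnitude_prune(W, mask)

# the sparse meta-learned initialization fits a new signal in a few steps
y_new = np.sin(2 * x) + 0.3 * np.sin(3 * x)
W_new = adapt(W, mask, x, y_new, steps=10)
err = np.mean((features(x) @ (W_new * mask) - y_new) ** 2)
print(f"surviving weights: {int(mask.sum())}/8, adaptation MSE: {err:.4f}")
```

The key point mirrored from the paper is the schedule: prune only between meta-training phases, so the surviving weights are re-meta-learned before the next pruning round, rather than pruning a fully trained model once at the end.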
Replay on YouTube