This paper introduces a novel paradigm termed Neural Metamorphosis (NeuMeta), which aims to build self-morphable neural networks.
Unlike traditional methods that rely on separate models for different network tasks or sizes, NeuMeta directly learns the continuous weight manifold of neural networks. Once trained, we can sample weights for any-sized network directly from the manifold, even for previously unseen configurations, without retraining. To achieve this ambitious goal, NeuMeta trains neural implicit functions as hypernetworks: they accept coordinates within the model space as input and generate the corresponding weight values on the manifold. In other words, the implicit function is learned such that the predicted weights perform well across various model sizes. In training these models, we observe that the final performance closely depends on the smoothness of the learned manifold. To enhance this smoothness, we employ two strategies. First, we permute weight matrices to achieve intra-model smoothness by solving the Shortest Hamiltonian Path problem. Second, we add noise to the input coordinates when training the implicit function, ensuring that models of various sizes produce consistent outputs. As such, NeuMeta shows promising results in synthesizing parameters for various network configurations. Our extensive tests on image classification, semantic segmentation, and image generation reveal that NeuMeta sustains full-size performance even at a 75% compression rate.
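The core mechanism can be illustrated with a minimal sketch. The snippet below is our own illustration, not the authors' released code: an implicit function, implemented as a small MLP, maps a normalized weight coordinate to a scalar weight value. The 4-D coordinate layout (layer, row, column, width scale), the class name WeightINR, and the helper sample_layer_weights are assumptions for exposition. The coordinate noise added during training corresponds to the second smoothness strategy described above.

```python
# Minimal sketch of an implicit weight function (hypernetwork).
# Assumed, illustrative names: WeightINR, sample_layer_weights.
import torch
import torch.nn as nn

class WeightINR(nn.Module):
    def __init__(self, coord_dim: int = 4, hidden: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(coord_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),  # one scalar weight per coordinate
        )

    def forward(self, coords: torch.Tensor) -> torch.Tensor:
        # coords: (N, coord_dim) -> (N,) predicted weight values
        return self.net(coords).squeeze(-1)

def sample_layer_weights(inr: WeightINR, layer_id: float,
                         rows: int, cols: int, scale: float,
                         noise_std: float = 0.0) -> torch.Tensor:
    """Query the manifold for a (rows x cols) weight matrix at a given
    width scale. Coordinates are normalized to [0, 1]; small Gaussian
    noise on the coordinates (training only) encourages the smoothness
    that keeps differently sized networks consistent."""
    r = torch.linspace(0, 1, rows)
    c = torch.linspace(0, 1, cols)
    rr, cc = torch.meshgrid(r, c, indexing="ij")
    coords = torch.stack([
        torch.full_like(rr, layer_id),
        rr, cc,
        torch.full_like(rr, scale),
    ], dim=-1).reshape(-1, 4)
    if noise_std > 0:  # coordinate perturbation, used only at training time
        coords = coords + noise_std * torch.randn_like(coords)
    return inr(coords).reshape(rows, cols)
```

The first strategy, intra-model smoothness via weight permutation, can likewise be sketched with a greedy nearest-neighbor pass (reusing the imports above). The paper formulates this as a Shortest Hamiltonian Path problem; the greedy heuristic below is only an approximate stand-in, not the authors' solver.

```python
def smooth_permutation(W: torch.Tensor) -> torch.Tensor:
    """Greedily order the rows of W so adjacent rows are similar,
    approximating a Shortest Hamiltonian Path over rows."""
    n = W.shape[0]
    remaining = set(range(1, n))
    order = [0]
    while remaining:
        last = W[order[-1]]
        # pick the unvisited row closest (L2 distance) to the last one
        nxt = min(remaining, key=lambda i: torch.dist(last, W[i]).item())
        order.append(nxt)
        remaining.remove(nxt)
    # apply as W[order]; the matching dimension of adjacent layers must
    # be permuted with the same order to preserve network function
    return torch.tensor(order)
```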
This paper also draws inspiration from a series of works on continuous neural networks and on fitting INRs to neural network weights:
Integral Neural Networks
Kirill Solodskikh, Azim Kurbanov, Ruslan Aydarkhanov, Irina Zhelavskaya, Yury Parfenov, Dehua Song, Stamatios Lefkimmiatis
CVPR 2023
Continuous Neural Networks
Nicolas Le Roux, Yoshua Bengio
AISTATS 2007
NeRN: Learning Neural Representations for Neural Networks
Maor Ashkenazi, Zohar Rimon, Ron Vainshtein, Shir Levi, Elad Richardson, Pinchas Mintz, Eran Treister
ICLR 2023
@inproceedings{yang2024neumeta,
  author    = {Xingyi Yang and Xinchao Wang},
  title     = {Neural Metamorphosis},
  booktitle = {ECCV},
  year      = {2024},
}