
Metamorphosis is a biological process that allows organisms to change their form. We use the term to describe the process of transforming one neural network into another.

Neural Metamorphosis

ECCV 2024

National University of Singapore
NeuMeta pipeline

NeuMeta encapsulates a range of neural networks into a single Implicit Neural Representation (INR). Once trained, it can produce weights for diverse networks, enabling them to adapt flexibly to various requirements.

Abstract

This paper introduces a novel paradigm termed Neural Metamorphosis (NeuMeta), which aims to build self-morphable neural networks.

Unlike traditional methods that rely on separate models for different network tasks or sizes, NeuMeta directly learns the continuous weight manifold of neural networks. Once trained, we can sample weights for any-sized network directly from the manifold, even for previously unseen configurations, without retraining. To achieve this ambitious goal, NeuMeta trains neural implicit functions as hypernetworks. They accept coordinates within the model space as input and generate the corresponding weight values on the manifold. In other words, the implicit function is learned such that the predicted weights perform well across various model sizes. In training these models, we observe that the final performance closely relates to the smoothness of the learned manifold. To enhance this smoothness, we employ two strategies. First, we permute weight matrices to achieve intra-model smoothness by solving the Shortest Hamiltonian Path problem. Second, we add noise to the input coordinates when training the implicit function, ensuring that models of various sizes produce consistent outputs. As such, NeuMeta shows promising results in synthesizing parameters for various network configurations. Our extensive tests in image classification, semantic segmentation, and image generation reveal that NeuMeta sustains full-size performance even at a 75% compression rate.
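
To make the hypernetwork idea concrete, below is a minimal PyTorch sketch of an implicit weight function: an MLP that maps a model-space coordinate (layer index, row, column, width scale) to a single weight value, with optional Gaussian noise on the coordinates standing in for the cross-size consistency strategy described above. The names (WeightINR, sample_weights) and the 4-dimensional coordinate layout are illustrative assumptions, not the paper's actual implementation.

# Minimal sketch of an implicit weight function (hypernetwork).
# Hypothetical names and coordinate layout; not the authors' code.
import torch
import torch.nn as nn

class WeightINR(nn.Module):
    """MLP that predicts one weight value per model-space coordinate."""
    def __init__(self, coord_dim: int = 4, hidden: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(coord_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, coords: torch.Tensor) -> torch.Tensor:
        # coords: (N, coord_dim), roughly normalized to [-1, 1]
        return self.net(coords).squeeze(-1)

def sample_weights(inr: WeightINR, layer: float, rows: int, cols: int,
                   scale: float, noise_std: float = 0.0) -> torch.Tensor:
    """Query the INR on a (layer, row, col, scale) grid.

    During training, Gaussian noise on the coordinates encourages a
    smooth manifold, so nearby widths yield consistent weights.
    """
    r = torch.linspace(-1, 1, rows)
    c = torch.linspace(-1, 1, cols)
    rr, cc = torch.meshgrid(r, c, indexing="ij")
    coords = torch.stack([
        torch.full_like(rr, layer),   # normalized layer index
        rr, cc,                       # position inside the weight matrix
        torch.full_like(rr, scale),   # width-scale factor of this config
    ], dim=-1).reshape(-1, 4)
    if noise_std > 0:
        coords = coords + noise_std * torch.randn_like(coords)
    return inr(coords).reshape(rows, cols)

At training time, one would sample a random width per step, query the INR for that configuration's weights, load them into the target network, compute the task loss, and backpropagate through the generated weights into the INR's parameters.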
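
The intra-model smoothness step can be sketched similarly. Exact Shortest Hamiltonian Path is NP-hard, so the toy solver below uses a greedy nearest-neighbour heuristic to order the rows (output channels) of a weight matrix so that adjacent rows are similar; the paper's actual solver may differ, and smooth_permutation is a hypothetical name.

# Greedy nearest-neighbour heuristic for the Shortest Hamiltonian Path
# over weight-matrix rows; an approximation, not the paper's solver.
import torch

def smooth_permutation(w: torch.Tensor) -> torch.Tensor:
    """Order rows of w so that consecutive rows are close in L2."""
    n = w.shape[0]
    dist = torch.cdist(w, w)          # pairwise L2 distances between rows
    visited = [0]                     # start the path at row 0
    remaining = set(range(1, n))
    while remaining:
        last = visited[-1]
        # extend the path with the closest unvisited row
        nxt = min(remaining, key=lambda j: dist[last, j].item())
        visited.append(nxt)
        remaining.remove(nxt)
    return torch.tensor(visited)

# Permuting a layer's output channels must be mirrored on the next
# layer's input channels so the network's function is unchanged:
#   w1 = w1[perm];  w2 = w2[:, perm]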

Related Resource

This paper also draws inspiration from a series of works on continuous neural networks and on fitting INRs to neural network weights.

BibTeX

@inproceedings{yang2023neumeta,
  author    = {Xingyi Yang and Xinchao Wang},
  title     = {Neural Metamorphosis},
  booktitle = {European Conference on Computer Vision (ECCV)},
  year      = {2024},
}