Abstract
Many concepts have been proposed for meta learning with neural networks (NNs), e.g., NNs that learn to reprogram fast weights, Hebbian plasticity, learned learning rules, and meta recurrent NNs. Our Variable Shared Meta Learning (VSML) unifies the above and demonstrates that simple weight-sharing and sparsity in an NN are sufficient to express powerful learning algorithms (LAs) in a reusable fashion. A simple implementation of VSML where the weights of a neural network are replaced by tiny LSTMs allows for implementing the backpropagation LA solely by running the network in forward-mode. It can even meta learn new LAs that differ from online backpropagation and generalize to datasets outside of the meta training distribution without explicit gradient calculation. Introspection reveals that our meta learned LAs learn through fast association in a way that is qualitatively different from gradient descent.
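To make the weight-sharing idea in the abstract concrete, here is a minimal sketch, assuming a PyTorch-style implementation. The class name `VSMLLayer`, the message width `msg`, and the column-sum aggregation are illustrative assumptions derived from the abstract's description (one tiny LSTM per weight, all sharing parameters), not the authors' released code.

```python
import torch
import torch.nn as nn

class VSMLLayer(nn.Module):
    """One dense layer where every scalar weight is replaced by a tiny
    LSTM. All positions share one set of LSTM parameters (the slow,
    meta-learned variables); only the per-position hidden states differ,
    and updating those states in forward-mode is what implements learning.
    Hypothetical sketch based on the abstract, not the authors' code."""

    def __init__(self, n_in: int, n_out: int, hidden: int = 8, msg: int = 2):
        super().__init__()
        self.n_in, self.n_out, self.msg = n_in, n_out, msg
        self.cell = nn.LSTMCell(input_size=msg, hidden_size=hidden)  # shared
        self.read = nn.Linear(hidden, msg)  # hidden state -> outgoing message
        self.h = torch.zeros(n_in * n_out, hidden)  # fast state, one per weight
        self.c = torch.zeros(n_in * n_out, hidden)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Broadcast input coordinate x_i to every LSTM in row i; the extra
        # message channels (e.g. for error signals) are left at zero here.
        inp = torch.zeros(self.n_in, self.n_out, self.msg)
        inp[:, :, 0] = x.view(self.n_in, 1)
        self.h, self.c = self.cell(inp.reshape(-1, self.msg), (self.h, self.c))
        # Unit j sums the messages emitted by all LSTMs in column j.
        out = self.read(self.h).view(self.n_in, self.n_out, self.msg).sum(0)
        return out[:, 0]  # channel 0 of the summed message is the activation

layer = VSMLLayer(n_in=4, n_out=3)
y = layer(torch.randn(4))  # y.shape == (3,)
```

Note the design choice this sketch highlights: there is no weight matrix and no backward pass; repeated forward calls update the LSTM states, which is where any learning behavior (including backpropagation-like behavior) would have to be expressed.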
Original language | English (US) |
---|---|
Title of host publication | Advances in Neural Information Processing Systems 34 - 35th Conference on Neural Information Processing Systems, NeurIPS 2021 |
Editors | Marc'Aurelio Ranzato, Alina Beygelzimer, Yann Dauphin, Percy S. Liang, Jenn Wortman Vaughan |
Publisher | Neural Information Processing Systems Foundation |
Pages | 14122-14134 |
Number of pages | 13 |
ISBN (Electronic) | 9781713845393 |
State | Published - 2021 |
Event | 35th Conference on Neural Information Processing Systems, NeurIPS 2021 - Virtual, Online. Duration: Dec 6 2021 → Dec 14 2021 |
Publication series
Name | Advances in Neural Information Processing Systems |
---|---|
Volume | 17 |
ISSN (Print) | 1049-5258 |
Conference
Conference | 35th Conference on Neural Information Processing Systems, NeurIPS 2021 |
---|---|
City | Virtual, Online |
Period | 12/6/21 → 12/14/21 |
Bibliographical note
Publisher Copyright: © 2021 Neural Information Processing Systems Foundation. All rights reserved.
ASJC Scopus subject areas
- Computer Networks and Communications
- Information Systems
- Signal Processing