Neural Stored-program Memory

Hung Le, Truyen Tran, Svetha Venkatesh

Keywords: few-shot learning, memory, memory-augmented neural networks

Tues Session 1 (05:00-07:00 GMT)
Tues Session 2 (08:00-10:00 GMT)

Abstract: Neural networks augmented with external memory can simulate computer behaviors. These models, which use the memory to store data for a neural controller, can learn algorithms and other complex tasks. In this paper, we introduce a new memory that stores weights for the controller, analogous to the stored-program memory in modern computer architectures. The proposed model, dubbed Neural Stored-program Memory, augments current memory-augmented neural networks, creating differentiable machines that can switch programs through time, adapt to variable contexts, and thus fully resemble the Universal Turing Machine. A wide range of experiments demonstrates that the resulting machines not only excel in classical algorithmic problems but also show potential in compositional learning, continual learning, few-shot learning, and question answering.
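The core idea of a stored-program memory can be sketched as soft attention over a bank of flattened weight matrices: a query selects a blend of stored "programs", and the blended weights parameterize the controller at that timestep. The sketch below is a minimal illustration, not the paper's implementation; all names and dimensions are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions, chosen only for illustration.
d_in, d_out = 8, 4    # controller input/output sizes
n_prog, d_key = 3, 8  # number of stored "programs", key size

# Program memory: each slot holds a key and a flattened weight matrix.
prog_keys = rng.normal(size=(n_prog, d_key))
prog_vals = rng.normal(size=(n_prog, d_in * d_out))  # flattened weights

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def read_program(query):
    """Soft attention over program memory returns a blended weight vector."""
    scores = prog_keys @ query  # similarity of the query to each key
    attn = softmax(scores)      # differentiable program selection
    return attn @ prog_vals     # convex mix of stored programs

def controller_step(x, query):
    """Use the retrieved weights as the controller's parameters this step."""
    W = read_program(query).reshape(d_out, d_in)
    return np.tanh(W @ x)

x = rng.normal(size=d_in)
q = rng.normal(size=d_key)
y = controller_step(x, q)
print(y.shape)  # (4,)
```

Because the attention weights are differentiable, the whole selection process can be trained end to end, and changing the query over time lets the machine switch programs from step to step.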

Similar Papers

MEMO: A Deep Network for Flexible Combination of Episodic Memories
Andrea Banino, Adrià Puigdomènech Badia, Raphael Köster, Martin J. Chadwick, Vinicius Zambaldi, Demis Hassabis, Caswell Barry, Matthew Botvinick, Dharshan Kumaran, Charles Blundell

Meta-Learning Deep Energy-Based Memory Models
Sergey Bartunov, Jack Rae, Simon Osindero, Timothy Lillicrap

Extreme Tensoring for Low-Memory Preconditioning
Xinyi Chen, Naman Agarwal, Elad Hazan, Cyril Zhang, Yi Zhang

Memory-Based Graph Networks
Amir Hosein Khasahmadi, Kaveh Hassani, Parsa Moradi, Leo Lee, Quaid Morris