Sliced Cramér Synaptic Consolidation for Preserving Deeply Learned Representations

Soheil Kolouri, Nicholas A. Ketz, Andrea Soltoggio, Praveen K. Pilly

Keywords: capacity, catastrophic forgetting, incremental learning, memory, unsupervised

Thursday Session 4 (17:00-19:00 GMT)
Thursday Session 5 (20:00-22:00 GMT)
Thursday: Continual Learning and Few Shot Learning

Abstract: Deep neural networks fail to preserve learned data representations (i.e., they suffer catastrophic forgetting) in domains where the input data distribution is non-stationary and changes during training. Various selective synaptic plasticity approaches have recently been proposed to preserve the network parameters that are crucial for previously learned tasks while new tasks are learned. We examine such selective synaptic plasticity approaches through a unifying lens of memory replay and show the close relationship between methods like Elastic Weight Consolidation (EWC) and Memory Aware Synapses (MAS). We then propose a fundamentally different class of preservation methods that aims to preserve the distribution of internal neural representations for previous tasks while a new task is learned. We propose the sliced Cramér distance as a suitable choice for such preservation and evaluate our Sliced Cramér Preservation (SCP) algorithm through extensive empirical investigations on various network architectures in both supervised and unsupervised learning settings. We show that SCP consistently utilizes the learning capacity of the network better than online EWC and MAS on various incremental learning tasks.
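The abstract's central ingredient is the sliced Cramér distance: project high-dimensional representations onto random one-dimensional directions and compare the resulting empirical CDFs. The NumPy sketch below is a minimal illustration of that distance between two sets of samples (e.g., a layer's activations for old-task inputs before and after training on a new task). The function names, the Monte-Carlo slicing scheme, and the squared Cramér-2 form are assumptions made here for illustration; this is not the authors' full SCP algorithm.

```python
import numpy as np

def cramer2_1d(x, y):
    """Squared 1-D Cramer-2 distance between the empirical CDFs of samples x and y,
    i.e. the integral of (F_x(t) - F_y(t))**2 over t, computed exactly on the pooled grid."""
    x, y = np.sort(x), np.sort(y)
    pooled = np.sort(np.concatenate([x, y]))
    # Empirical CDF values of each sample set evaluated at the pooled sample points.
    F = np.searchsorted(x, pooled, side="right") / x.size
    G = np.searchsorted(y, pooled, side="right") / y.size
    dt = np.diff(pooled)  # widths of the intervals on which both CDFs are constant
    return float(np.sum((F[:-1] - G[:-1]) ** 2 * dt))

def sliced_cramer2(X, Y, num_slices=50, seed=0):
    """Monte-Carlo estimate of the sliced Cramer-2 distance between two sets of
    d-dimensional samples (rows of X and Y): average the 1-D distance over
    random unit-norm projection directions ("slices")."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    total = 0.0
    for _ in range(num_slices):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)  # uniform random direction on the unit sphere
        total += cramer2_1d(X @ theta, Y @ theta)
    return total / num_slices

# Example (random data as a stand-in for activations): compare a layer's responses
# to old-task inputs before and after the network is updated on a new task.
old_acts = np.random.randn(256, 64)
new_acts = np.random.randn(256, 64) + 0.1
print(sliced_cramer2(old_acts, new_acts, num_slices=100))
```

Slicing reduces the comparison of high-dimensional distributions to many cheap 1-D CDF comparisons, which is what makes the distance practical as a preservation penalty during training.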

Similar Papers

Uncertainty-guided Continual Learning with Bayesian Neural Networks
Sayna Ebrahimi, Mohamed Elhoseiny, Trevor Darrell, Marcus Rohrbach
Efficient and Information-Preserving Future Frame Prediction and Beyond
Wei Yu, Yichao Lu, Steve Easterbrook, Sanja Fidler