BlockSwap: Fisher-guided Block Substitution for Network Compression on a Budget

Jack Turner, Elliot J. Crowley, Michael O'Boyle, Amos Storkey, Gavin Gray

Keywords: capacity, cnn, compression, imagenet, model compression, network compression, neural architecture search

Wed Session 2 (08:00-10:00 GMT)
Wed Session 3 (12:00-14:00 GMT)

Abstract: The desire to map neural networks to varying-capacity devices has led to the development of a wealth of compression techniques, many of which involve replacing standard convolutional blocks in a large network with cheap alternative blocks. However, not all blocks are created equal; for a required compute budget there may exist a potent combination of many different cheap blocks, though exhaustively searching for such a combination is prohibitively expensive. In this work, we develop BlockSwap: a fast algorithm for choosing networks with interleaved block types by passing a single minibatch of training data through randomly initialised networks and gauging their Fisher potential. These networks can then be used as students and distilled with the original large network as a teacher. We demonstrate the effectiveness of the chosen networks across CIFAR-10 and ImageNet for classification, and COCO for detection, and provide a comprehensive ablation study of our approach. BlockSwap quickly explores possible block configurations using a simple architecture ranking system, yielding highly competitive networks in orders of magnitude less time than most architecture search techniques (e.g. under 5 minutes on a single GPU for CIFAR-10).
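To make the scoring step concrete, below is a minimal PyTorch sketch of estimating a candidate network's Fisher potential from a single minibatch at random initialisation. It is an illustration under our own assumptions, not the authors' reference implementation: the function name fisher_potential, the use of cross-entropy loss, and the restriction to nn.ReLU outputs over 4-D conv feature maps are choices we make here; the per-channel quantity follows the usual activation-Fisher estimate, the batch mean of the squared spatially-summed activation-gradient product.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def fisher_potential(model: nn.Module, inputs: torch.Tensor,
                     targets: torch.Tensor) -> float:
    """Score a randomly initialised candidate on a single minibatch.

    For each ReLU output a with gradient g = dL/da, the per-channel
    Fisher information is estimated as E_n[(sum_{h,w} a * g)^2]; the
    network's Fisher potential is this quantity summed over channels
    and layers. (Sketch under our assumptions, not the paper's code.)
    """
    acts, grads = [], []

    def save(module, inputs_, output):
        acts.append(output)
        output.register_hook(grads.append)  # capture dL/d(output)

    handles = [m.register_forward_hook(save)
               for m in model.modules() if isinstance(m, nn.ReLU)]

    model.zero_grad()
    F.cross_entropy(model(inputs), targets).backward()

    total = 0.0
    with torch.no_grad():
        # Backward hooks fire in reverse order for a feedforward chain,
        # so reversing the gradient list re-pairs it with the activations.
        for a, g in zip(acts, reversed(grads)):
            # Assumes 4-D conv feature maps of shape (N, C, H, W).
            fisher = (a * g).sum(dim=(2, 3)).pow(2).mean(dim=0)
            total += fisher.sum().item()

    for h in handles:
        h.remove()
    return total
```

In the search described by the abstract, one would sample many block configurations that satisfy the compute budget, score each freshly initialised candidate with a function like the one above, and keep the top-ranked network as the student for distillation from the original large teacher.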

Similar Papers

Adversarial AutoAugment
Xinyu Zhang, Qiang Wang, Jian Zhang, Zhao Zhong
Batch-shaping for learning conditional channel gated networks
Babak Ehteshami Bejnordi, Tijmen Blankevoort, Max Welling
AtomNAS: Fine-Grained End-to-End Neural Architecture Search
Jieru Mei, Yingwei Li, Xiaochen Lian, Xiaojie Jin, Linjie Yang, Alan Yuille, Jianchao Yang