Probabilistic Connection Importance Inference and Lossless Compression of Deep Neural Networks

Xin Xing, Long Sha, Pengyu Hong, Zuofeng Shang, Jun S. Liu

Keywords: compression, pruning

Mon Session 1 (05:00-07:00 GMT)
Mon Session 3 (12:00-14:00 GMT)

Abstract: Deep neural networks (DNNs) can be huge in size, requiring a considerable amount of energy and computational resources to operate, which limits their applications in numerous scenarios. It is thus of interest to compress DNNs while maintaining their performance. We propose a probabilistic importance inference approach for pruning DNNs. Specifically, we test the significance of each connection's relevance to the DNN's outputs using a nonparametric scoring test and keep only the significant connections. Experimental results show that the proposed approach achieves better lossless compression rates than existing techniques.
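The abstract does not spell out the scoring test itself, so the sketch below is only one plausible reading: it scores each connection of a dense layer with a Taylor-style saliency proxy (|input activation × output gradient|, batch-averaged) and assesses significance with a permutation test that breaks the input-output association. The score, the null construction, and all function names here are assumptions for illustration, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

def relevance_scores(x, g):
    """Per-connection relevance for one layer: |x_i * g_j| averaged over the
    batch (a Taylor-expansion saliency proxy; the paper's exact score is not
    reproduced here).
    x: (batch, n_in) input activations; g: (batch, n_out) output gradients."""
    return np.abs(np.einsum('bi,bj->bij', x, g)).mean(axis=0)  # (n_in, n_out)

def significance_mask(x, g, n_perm=200, alpha=0.05):
    """Nonparametric significance test (assumed permutation scheme): compare
    each connection's observed score to a null built by permuting the gradient
    rows, which destroys any real input-output dependence."""
    observed = relevance_scores(x, g)
    exceed = np.zeros_like(observed)
    for _ in range(n_perm):
        g_perm = g[rng.permutation(len(g))]
        exceed += relevance_scores(x, g_perm) >= observed
    pvals = (exceed + 1) / (n_perm + 1)   # add-one correction for finite perms
    return pvals <= alpha                 # True = connection deemed significant

# Toy usage: prune a dense layer's weights with the significance mask.
batch, n_in, n_out = 256, 32, 16
x = rng.standard_normal((batch, n_in))
W = rng.standard_normal((n_in, n_out)) * (rng.random((n_in, n_out)) < 0.2)
g = x @ W + 0.1 * rng.standard_normal((batch, n_out))  # stand-in "gradients"
keep = significance_mask(x, g)
W_pruned = W * keep
print(f"kept {keep.mean():.1%} of connections")
```

Keeping only connections whose p-values clear the threshold (rather than thresholding raw scores) is what would make such a procedure "probabilistic": the retained set comes with an explicit false-discovery interpretation under the permutation null.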

Similar Papers

Skip Connections Matter: On the Transferability of Adversarial Examples Generated with ResNets
Dongxian Wu, Yisen Wang, Shu-Tao Xia, James Bailey, Xingjun Ma
Shifted and Squeezed 8-bit Floating Point format for Low-Precision Training of Deep Neural Networks
Leopold Cambier, Anahita Bhiwandiwalla, Ting Gong, Oguz H. Elibol, Mehran Nekuii, Hanlin Tang