Non-Parametric Adaptive Network Pruning

Lin Mingbao, Ji Rongrong, Li Shaojie, Wang Yan, Wu Yongjian, Huang Feiyue, Ye Qixiang
Media Analytics and Computing Laboratory, Department of Artificial Intelligence, School of Informatics, Xiamen University, Xiamen 361005, China
arXiv:2101.07985 [cs.CV], 20 Jan 2021

@misc{mingbao2021nonparametric,
   title={Non-Parametric Adaptive Network Pruning},
   author={Lin Mingbao and Ji Rongrong and Li Shaojie and Wang Yan and Wu Yongjian and Huang Feiyue and Ye Qixiang},
   year={2021},
   eprint={2101.07985},
   archivePrefix={arXiv},
   primaryClass={cs.CV}
}

Popular network pruning algorithms reduce redundant information by optimizing hand-crafted parametric models, which can lead to suboptimal performance and long filter-selection times. We introduce non-parametric modeling to simplify the algorithm design, resulting in an automatic and efficient pruning approach called EPruner. Inspired by the face recognition community, we apply the message-passing algorithm Affinity Propagation to the weight matrices to obtain an adaptive number of exemplars, which then act as the preserved filters. EPruner breaks the dependency on training data for determining the "important" filters and admits a CPU implementation that runs in seconds, an order of magnitude faster than GPU-based state-of-the-art methods. Moreover, we show that the weights of the exemplars provide a better initialization for fine-tuning. On VGGNet-16, EPruner achieves a 76.34% FLOPs reduction by removing 88.80% of the parameters, with a 0.06% accuracy improvement on CIFAR-10. On ResNet-152, EPruner achieves a 65.12% FLOPs reduction by removing 64.18% of the parameters, with only a 0.71% top-5 accuracy loss on ILSVRC-2012.
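To make the core idea concrete, below is a minimal illustrative sketch (not the authors' released code) of selecting exemplar filters with Affinity Propagation, assuming scikit-learn's AffinityPropagation and a PyTorch Conv2d layer. The function name select_exemplar_filters, the layer shapes, and the default Affinity Propagation preference are assumptions for illustration; the paper's actual per-layer configuration may differ.

    # Illustrative sketch of exemplar-based filter selection in the spirit
    # of EPruner; names and settings here are assumptions, not the authors'
    # exact implementation.
    import torch
    import torch.nn as nn
    from sklearn.cluster import AffinityPropagation

    def select_exemplar_filters(conv: nn.Conv2d):
        # Flatten each output filter into a row vector:
        # shape (out_channels, in_channels * kh * kw).
        w = conv.weight.detach().cpu().reshape(conv.out_channels, -1).numpy()
        # Affinity Propagation picks an adaptive number of exemplars via
        # message passing; no target filter count is hand-crafted, and it
        # needs only the weights, not the training data.
        ap = AffinityPropagation(random_state=0).fit(w)
        return ap.cluster_centers_indices_  # indices of preserved filters

    conv = nn.Conv2d(64, 128, kernel_size=3)
    keep = select_exemplar_filters(conv)
    print(f"kept {len(keep)} of {conv.out_channels} filters:", keep[:10])

Because Affinity Propagation determines the number of exemplars itself (steered by its preference parameter), the pruning rate can adapt per layer rather than being fixed in advance, and the whole selection runs on a CPU since it touches only the weight matrices.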