Composability-Centered Convolutional Neural Network Pruning
- North Carolina State Univ., Raleigh, NC (United States)
- Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
This work studies the composability of the building blocks of structural CNN models (e.g., GoogLeNet and Residual Networks) in the context of network pruning. We empirically validate that a network composed of pre-trained building blocks (e.g., residual blocks and Inception modules) not only gives a better initial setting for training, but also allows the training process to converge at a significantly higher accuracy in much less time. Based on that insight, we propose a composability-centered design for CNN network pruning. Experiments show that this new scheme shortens the configuration process in CNN network pruning by up to 186.8X for ResNet-50 and up to 30.2X for Inception-V3, while the models it finds that meet the accuracy requirement are significantly more compact than those found by the default schemes.
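The core idea above, pre-training pruned block variants once and composing them into candidate networks, can be illustrated with a minimal sketch. All names here (block labels, prune rates, the caching functions) are hypothetical stand-ins, not the report's actual implementation; the point is only the combinatorial saving: block-level trainings grow linearly in blocks times rates, while naive full-network training grows exponentially in the number of blocks.

```python
from itertools import product

# Hypothetical sketch of composability-centered pruning configuration search.
# A naive scheme trains every full pruned network from scratch; the
# composability-centered scheme pre-trains each (block, rate) variant once
# and reuses the cached blocks across candidate configurations.

BLOCKS = ["block1", "block2", "block3"]   # e.g., residual blocks (illustrative)
PRUNE_RATES = [0.3, 0.5, 0.7]             # fraction of filters removed (illustrative)

def pretrain_block_variants(blocks, rates):
    """Train each pruned block variant once and cache it (training simulated)."""
    cache = {}
    for b, r in product(blocks, rates):
        cache[(b, r)] = f"pretrained({b}@{r})"  # stands in for trained weights
    return cache

def assemble_candidate(cache, config):
    """Compose a candidate network from cached pre-trained blocks."""
    return [cache[(b, r)] for b, r in zip(BLOCKS, config)]

cache = pretrain_block_variants(BLOCKS, PRUNE_RATES)
# 3 blocks x 3 rates = 9 block-level trainings, versus 3^3 = 27 full-network
# trainings for an exhaustive search over per-block prune-rate configurations.
candidate = assemble_candidate(cache, (0.3, 0.5, 0.7))
print(len(cache), candidate)
```

With deeper networks the gap widens quickly (e.g., 10 blocks at 3 rates: 30 block trainings versus 3^10 = 59049 full-network trainings), which is consistent in spirit with the large configuration-time speedups the abstract reports.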
- Research Organization: Oak Ridge National Laboratory (ORNL), Oak Ridge, TN (United States)
- Sponsoring Organization: USDOE
- DOE Contract Number: AC05-00OR22725
- OSTI ID: 1427608
- Report Number(s): ORNL/TM-2018/777
- Country of Publication: United States
- Language: English
Similar Records
- Wootz: a compiler-based framework for fast CNN pruning via composability (Conference, June 2019, OSTI ID: 1543204)
- A Novel Pruning Method for Convolutional Neural Networks Based off Identifying Critical Filters (Conference, July 2019, OSTI ID: 1557493)
- Filter pruning of Convolutional Neural Networks for text classification: A case study of cancer pathology report comprehension (Conference, February 2018, OSTI ID: 1468241)