OSTI.GOV | U.S. Department of Energy
Office of Scientific and Technical Information

Title: Composability-Centered Convolutional Neural Network Pruning

Abstract

This work studies the composability of the building blocks of structural CNN models (e.g., GoogLeNet and residual networks) in the context of network pruning. We empirically validate that a network composed of pre-trained building blocks (e.g., residual blocks and Inception modules) not only gives a better initial setting for training, but also allows the training process to converge at a significantly higher accuracy in much less time. Based on that insight, we propose a composability-centered design for CNN network pruning. Experiments show that this new scheme shortens the configuration process in CNN network pruning by up to 186.8X for ResNet-50 and up to 30.2X for Inception-V3; meanwhile, the models it finds that meet the accuracy requirement are significantly more compact than those found by default schemes.
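The core idea in the abstract (pre-training pruned building blocks once, then assembling full-network pruning configurations from them instead of training every configuration from scratch) can be illustrated with a small sketch. This is not the report's actual code; all names (`pretrain_pruned_block`, `PRUNE_RATIOS`, etc.) and the toy block/ratio counts are hypothetical, chosen only to show where the configuration-time savings come from.

```python
# Toy sketch of composability-centered pruning (hypothetical names):
# each building block is pruned and locally fine-tuned ONCE per pruning
# ratio; every full-network configuration then reuses the cached blocks,
# giving the configuration search a pre-trained starting point.
from itertools import product

PRUNE_RATIOS = (0.3, 0.5, 0.7)   # candidate pruning ratios per block
NUM_BLOCKS = 4                   # blocks in the toy network

train_calls = 0                  # counts expensive (local) training runs

def pretrain_pruned_block(block_id, ratio):
    """Stand-in for pruning + local fine-tuning of one block."""
    global train_calls
    train_calls += 1
    return (block_id, ratio)     # a "trained" block artifact

# Phase 1: train each (block, ratio) variant exactly once: 4 * 3 = 12 runs.
cache = {(b, r): pretrain_pruned_block(b, r)
         for b in range(NUM_BLOCKS) for r in PRUNE_RATIOS}

# Phase 2: assemble every full-network configuration from cached blocks;
# exploring all 3^4 = 81 configurations triggers no further block training.
configs = [
    [cache[(b, r)] for b, r in enumerate(ratios)]
    for ratios in product(PRUNE_RATIOS, repeat=NUM_BLOCKS)
]

print(len(configs), train_calls)  # 81 configurations from 12 block trainings
```

In a default scheme, each of the 81 configurations would pay its own full training cost; here the expensive step scales with blocks times ratios rather than with the number of configurations, which is the source of the speedups the abstract reports.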

Authors:
 Shen, Xipeng [1]; Guan, Hui [1]; Lim, Seung-Hwan [2]; Patton, Robert M. [2]
  1. North Carolina State University
  2. ORNL
Publication Date:
February 1, 2018
Research Org.:
Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
Sponsoring Org.:
USDOE
OSTI Identifier:
1427608
Report Number(s):
ORNL/TM-2018/777
DOE Contract Number:
AC05-00OR22725
Resource Type:
Technical Report
Country of Publication:
United States
Language:
English

Citation Formats

Shen, Xipeng, Guan, Hui, Lim, Seung-Hwan, and Patton, Robert M. Composability-Centered Convolutional Neural Network Pruning. United States: N. p., 2018. Web. doi:10.2172/1427608.
Shen, Xipeng, Guan, Hui, Lim, Seung-Hwan, & Patton, Robert M. Composability-Centered Convolutional Neural Network Pruning. United States. doi:10.2172/1427608.
Shen, Xipeng, Guan, Hui, Lim, Seung-Hwan, and Patton, Robert M. 2018. "Composability-Centered Convolutional Neural Network Pruning". United States. doi:10.2172/1427608. https://www.osti.gov/servlets/purl/1427608.
@article{osti_1427608,
title = {Composability-Centered Convolutional Neural Network Pruning},
author = {Shen, Xipeng and Guan, Hui and Lim, Seung-Hwan and Patton, Robert M.},
abstractNote = {This work studies the composability of the building blocks of structural CNN models (e.g., GoogLeNet and residual networks) in the context of network pruning. We empirically validate that a network composed of pre-trained building blocks (e.g., residual blocks and Inception modules) not only gives a better initial setting for training, but also allows the training process to converge at a significantly higher accuracy in much less time. Based on that insight, we propose a {\em composability-centered} design for CNN network pruning. Experiments show that this new scheme shortens the configuration process in CNN network pruning by up to 186.8X for ResNet-50 and up to 30.2X for Inception-V3; meanwhile, the models it finds that meet the accuracy requirement are significantly more compact than those found by default schemes.},
doi = {10.2172/1427608},
place = {United States},
year = {2018},
month = {feb}
}
