OSTI.GOV, U.S. Department of Energy
Office of Scientific and Technical Information

Title: Towards Efficient Convolutional Neural Networks Through Low-Error Filter Saliency Estimation

Abstract

Filter-saliency-based channel pruning is a state-of-the-art method for deep convolutional neural network compression and acceleration. This channel pruning method ranks the importance of individual filters by estimating the impact of each filter’s removal on the training loss, then removes the least important filters and fine-tunes the remaining network. In this work, we propose a systematic channel pruning method that significantly reduces the estimation error of filter saliency. Unlike existing approaches, our method substantially reduces the magnitude of the parameters in a network by introducing the alternating direction method of multipliers (ADMM) into the pre-training procedure. As a result, the Taylor-expansion-based estimation of filter saliency is significantly improved. Extensive experiments on various benchmark network architectures and datasets demonstrate that the proposed method selects unimportant filters much more accurately and outperforms state-of-the-art channel pruning methods.
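The saliency criterion the abstract refers to is, in this family of methods, a first-order Taylor approximation of the loss change caused by removing a filter. The sketch below is a minimal PyTorch illustration of that idea, not the authors' implementation (which additionally relies on ADMM-regularized pre-training to shrink parameter magnitudes); the helper name filter_saliencies is hypothetical.

import torch
import torch.nn as nn

def filter_saliencies(conv: nn.Conv2d) -> torch.Tensor:
    """Score each output filter of a conv layer by a first-order Taylor estimate.

    Assumes loss.backward() has already run, so conv.weight.grad holds the
    gradient of the training loss for the current mini-batch.
    """
    w = conv.weight           # shape: (out_channels, in_channels, kH, kW)
    g = conv.weight.grad      # same shape as the weights
    # |dL/dw * w| summed per output filter approximates the change in loss
    # incurred by zeroing out (removing) that filter.
    return (g * w).abs().sum(dim=(1, 2, 3))

# Typical use: rank filters, prune the lowest-scoring ones, then fine-tune.
# loss = criterion(model(x), y); loss.backward()
# scores = filter_saliencies(some_conv_layer)
# prune_idx = torch.argsort(scores)[:num_to_prune]

Under this criterion, smaller parameter magnitudes make the neglected higher-order terms smaller, so the first-order estimate tracks the true loss change more closely; this is the connection to the ADMM-based pre-training described in the abstract.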

Authors:
 Wang, Zi [1]; Li, Chengcheng [2]; Wang, Xiangyang [3]; Wang, Dali [4]
  1. University of Tennessee, Knoxville (UTK)
  2. The University of Tennessee, Knoxville
  3. Sun Yat-Sen University, Guangzhou, China
  4. ORNL
Publication Date:
August 2019
Research Org.:
Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
Sponsoring Org.:
USDOE
OSTI Identifier:
1561628
DOE Contract Number:  
AC05-00OR22725
Resource Type:
Conference
Resource Relation:
Journal Volume: 11671; Conference: 16th Pacific Rim International Conference on Artificial Intelligence (PRICAI 2019), Cuvu, Fiji, 29-31 August 2019
Country of Publication:
United States
Language:
English

Citation Formats

Wang, Zi, Li, Chengcheng, Wang, Xiangyang, and Wang, Dali. Towards Efficient Convolutional Neural Networks Through Low-Error Filter Saliency Estimation. United States: N. p., 2019. Web. doi:10.1007/978-3-030-29911-8_20.
Wang, Zi, Li, Chengcheng, Wang, Xiangyang, & Wang, Dali. Towards Efficient Convolutional Neural Networks Through Low-Error Filter Saliency Estimation. United States. doi:10.1007/978-3-030-29911-8_20.
Wang, Zi, Li, Chengcheng, Wang, Xiangyang, and Wang, Dali. 2019. "Towards Efficient Convolutional Neural Networks Through Low-Error Filter Saliency Estimation". United States. doi:10.1007/978-3-030-29911-8_20. https://www.osti.gov/servlets/purl/1561628.
@article{osti_1561628,
title = {Towards Efficient Convolutional Neural Networks Through Low-Error Filter Saliency Estimation},
author = {Wang, Zi and Li, Chengcheng and Wang, Xiangyang and Wang, Dali},
abstractNote = {Filter-saliency-based channel pruning is a state-of-the-art method for deep convolutional neural network compression and acceleration. This channel pruning method ranks the importance of individual filters by estimating the impact of each filter’s removal on the training loss, then removes the least important filters and fine-tunes the remaining network. In this work, we propose a systematic channel pruning method that significantly reduces the estimation error of filter saliency. Unlike existing approaches, our method substantially reduces the magnitude of the parameters in a network by introducing the alternating direction method of multipliers (ADMM) into the pre-training procedure. As a result, the Taylor-expansion-based estimation of filter saliency is significantly improved. Extensive experiments on various benchmark network architectures and datasets demonstrate that the proposed method selects unimportant filters much more accurately and outperforms state-of-the-art channel pruning methods.},
doi = {10.1007/978-3-030-29911-8_20},
issn = {0302-9743},
volume = {11671},
place = {United States},
year = {2019},
month = {8}
}

Other availability
Please see Document Availability for additional information on obtaining the full-text document. Library patrons may search WorldCat to identify libraries that hold this conference proceeding.
