Published on Tue Feb 09 2021

RANP: Resource Aware Neuron Pruning at Initialization for 3D CNNs

Zhiwei Xu, Thalaiyasingam Ajanthan, Vibhav Vineet, Richard Hartley
Abstract

Although 3D Convolutional Neural Networks (CNNs) are essential for most learning-based applications involving dense 3D data, their applicability is limited by excessive memory and computational requirements. Compressing such networks by pruning is therefore highly desirable. However, pruning 3D CNNs is largely unexplored, possibly because of the complex nature of typical pruning algorithms, which embed pruning into an iterative optimization paradigm. In this work, we introduce a Resource Aware Neuron Pruning (RANP) algorithm that prunes 3D CNNs at initialization to high sparsity levels. Specifically, the core idea is to obtain an importance score for each neuron based on its sensitivity to the loss function. This neuron importance is then reweighted according to the neuron's resource consumption in terms of FLOPs or memory. We demonstrate the effectiveness of our pruning method on 3D semantic segmentation with widely used 3D-UNets on the ShapeNet and BraTS'18 datasets, video classification with MobileNetV2 and I3D on the UCF101 dataset, and two-view stereo matching with the Pyramid Stereo Matching (PSM) network on the SceneFlow dataset. In these experiments, our RANP leads to roughly 50%-95% reduction in FLOPs and 35%-80% reduction in memory with negligible loss in accuracy compared to the unpruned networks. This significantly reduces the computational resources required to train 3D CNNs. The pruned network obtained by our algorithm can also be easily scaled up and transferred to another dataset for training.
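The two-step idea in the abstract — score each neuron by its sensitivity to the loss, then reweight that score by the neuron's resource cost — can be illustrated with a minimal sketch. This is not the paper's exact formulation: the sensitivity proxy (|gradient · weight| summed per neuron, in the style of pruning-at-initialization saliency) and the FLOP-based reweighting scheme below are illustrative assumptions, as are all function and parameter names.

```python
import numpy as np

def ranp_scores(grads, weights, neuron_flops):
    """Sketch of resource-aware neuron scoring (illustrative, not the paper's formula).

    grads, weights: arrays of shape (n_neurons, n_weights_per_neuron),
        the loss gradients and weight values at initialization.
    neuron_flops: array of shape (n_neurons,), per-neuron FLOP cost.
    """
    # Sensitivity of each neuron to the loss: |sum of g * w| over its weights.
    sensitivity = np.abs(np.sum(grads * weights, axis=1))
    # Reweight by resource cost relative to the mean, so an expensive neuron
    # needs proportionally higher sensitivity to survive pruning.
    return sensitivity * (neuron_flops.mean() / neuron_flops)

def prune_mask(scores, sparsity):
    """Boolean keep-mask retaining the top (1 - sparsity) fraction of neurons."""
    k = max(1, int(round(len(scores) * (1.0 - sparsity))))
    keep = np.argsort(scores)[-k:]
    mask = np.zeros(len(scores), dtype=bool)
    mask[keep] = True
    return mask
```

With equal sensitivities, the reweighting favors cheaper neurons, which is the qualitative behavior the abstract describes: pruning decisions are driven jointly by loss sensitivity and FLOP or memory cost rather than sensitivity alone.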

Fri Jun 11 2021
Computer Vision
HR-NAS: Searching Efficient High-Resolution Neural Architectures with Lightweight Transformers
High-resolution (HR) representations are essential for dense prediction tasks such as segmentation, detection, and pose estimation. Learning HR representations was typically ignored in previous Neural Architecture Search (NAS) methods. This work proposes a novel NAS method, called HR-NAS, which is able to find efficient and accurate networks for different tasks.
Tue Feb 12 2019
Computer Vision
Fast-SCNN: Fast Semantic Segmentation Network
The encoder-decoder framework is state-of-the-art for offline semantic image segmentation. Our network combines spatial detail at high resolution with deep features extracted at lower resolution, yielding an accuracy of 68.0% mean intersection over union at 123.5 frames per second.
Mon Oct 05 2020
Computer Vision
Joint Pruning & Quantization for Extremely Sparse Neural Networks
We investigate pruning and quantization for deep neural networks. Our goal is to achieve extremely high sparsity for quantized networks to enable implementation on low cost and low power accelerator hardware. Almost 99% of memory demand can be cut while hardware costs can be reduced.
Tue Mar 31 2020
Computer Vision
Real-Time Semantic Segmentation via Auto Depth, Downsampling Joint Decision and Feature Aggregation
In existing networks, the depth, downsampling strategy, and feature aggregation scheme are set in advance by trial and error. AutoRTNet is a framework that searches for the optimal building blocks of networks automatically, achieving 73.9% mIoU on the Cityscapes test set.
Sun Jan 17 2021
Machine Learning
KCP: Kernel Cluster Pruning for Dense Labeling Neural Networks
Pruning is a promising technique for compressing and accelerating neural networks. The prevailing filter channel pruning method removes the entire filter channel. In this study, we propose kernel cluster pruning (KCP) to prune dense labeling networks.