SkipConv: Skip Convolution for Computationally Efficient Deep CNNs

Pravendra Singh, Vinay P. Namboodiri

Research output: Chapter in a published conference proceeding

Abstract

The convolution operation is the most computationally expensive operation in deep convolutional neural networks: most of a deep architecture's computation (FLOPs) belongs to convolutions. In this paper, we propose a novel skip convolution operation that requires significantly less computation than the traditional one without sacrificing model accuracy. Skip convolution produces structured sparsity in the output feature maps and reduces computation without requiring sparsity in the model parameters. Whereas the existing convolution operation performs redundant computation for object feature representation, the proposed convolution skips that redundant computation. Our empirical evaluation of various deep models (VGG, ResNet, MobileNet, and Faster R-CNN) on benchmark datasets (CIFAR-10, CIFAR-100, ImageNet, and MS-COCO) shows that skip convolution reduces computation significantly while preserving feature representational capacity. The proposed approach is model-agnostic and can be applied to any architecture. It does not require a pretrained model and trains from scratch, so we achieve significant computation reduction at both training and test time. We are also able to reduce computation in an already compact model such as MobileNet using skip convolution. We further show empirically that the proposed convolution works well for other tasks such as object detection. SkipConv can therefore be a widely usable and efficient way of reducing computation in deep CNN models.
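The abstract describes the mechanism only at a high level, so the following is a minimal PyTorch-style sketch of the general idea rather than the authors' method: a cheap gate scores each spatial position of the input, and the resulting hard mask zeros entire positions of the output feature map (structured sparsity) while the convolution weights themselves stay dense. The SkipConv2d name, the 1x1 sigmoid gate, the 0.5 threshold, and the straight-through estimator are all illustrative assumptions.

import torch
import torch.nn as nn

class SkipConv2d(nn.Module):
    def __init__(self, in_ch, out_ch, kernel_size=3, padding=1):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size, padding=padding)
        # Hypothetical 1x1 gate that scores each spatial position of the input.
        self.gate = nn.Conv2d(in_ch, 1, kernel_size=1)

    def forward(self, x):
        soft = torch.sigmoid(self.gate(x))   # soft gate in [0, 1]
        hard = (soft > 0.5).float()          # binary skip decision per position
        # Straight-through estimator: hard mask in the forward pass,
        # gradient of the soft gate in the backward pass.
        mask = hard + soft - soft.detach()
        # An optimized kernel would evaluate the convolution only where
        # mask == 1; masking a dense conv here just illustrates the output.
        return self.conv(x) * mask           # mask broadcasts over channels

layer = SkipConv2d(16, 32)
y = layer(torch.randn(1, 16, 8, 8))
print(y.shape)  # torch.Size([1, 32, 8, 8]); masked positions are zero across all channels

In this dense sketch the skipped positions are still computed and then zeroed; the FLOP savings the paper reports would come from a kernel that only evaluates unmasked positions. For a k x k convolution, each skipped output position saves roughly C_in * C_out * k^2 multiply-accumulates.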

Original language: English
Title of host publication: 2020 International Joint Conference on Neural Networks, IJCNN 2020 - Proceedings
Publisher: IEEE
ISBN (Electronic): 9781728169262
Publication status: Published - 28 Sept 2020
Event: 2020 International Joint Conference on Neural Networks, IJCNN 2020 - Virtual, Glasgow, United Kingdom
Duration: 19 Jul 2020 - 24 Jul 2020

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks
Volume: 2020
ISSN (Print): 2161-4393
ISSN (Electronic): 2161-4407

Conference

Conference: 2020 International Joint Conference on Neural Networks, IJCNN 2020
Country/Territory: United Kingdom
City: Virtual, Glasgow
Period: 19/07/20 - 24/07/20

Keywords

  • Computation (FLOPS) compression
  • Computationally efficient CNN
  • Convolutional neural network
  • Deep learning
  • Image recognition

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence
