Passive Batch Injection Training Technique: Boosting Network Performance by Injecting Mini-Batches from a different Data Distribution

Pravendra Singh, Pratik Mazumder, Vinay P. Namboodiri

Research output: Chapter in a published conference proceeding

Abstract

This work presents a novel training technique for deep neural networks that makes use of additional data from a distribution different from that of the original input data. The technique aims to reduce overfitting and improve the generalization performance of the network. Our proposed technique, namely the Passive Batch Injection Training Technique (PBITT), reduces overfitting even in networks that already use standard overfitting-reduction techniques such as L2 regularization and batch normalization, resulting in significant accuracy improvements. PBITT introduces a few passive mini-batches into the training process that contain data from a distribution different from the input data distribution. The technique does not increase the number of parameters in the final model and does not increase the inference (test) time, yet still improves the performance of deep CNNs. To the best of our knowledge, this is the first work that makes use of a different data distribution to aid the training of convolutional neural networks (CNNs). We thoroughly evaluate the proposed approach on standard architectures: VGG, ResNet, and WideResNet, and on several popular datasets: CIFAR-10, CIFAR-100, SVHN, and ImageNet. We observe consistent accuracy improvements from the proposed technique. We also show experimentally that a model trained with our technique generalizes well to other tasks, such as object detection on the MS-COCO dataset using Faster R-CNN. We present extensive ablations to validate the proposed approach. Our approach improves the accuracy of VGG-16 by a significant margin of 2.1% on the CIFAR-100 dataset.
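The abstract states only that a few passive mini-batches drawn from a different data distribution are interleaved into the training stream; the exact injection schedule is not given here. The following is a minimal illustrative sketch, assuming a simple fixed interval (the `inject_every` parameter and the batch bookkeeping are our assumptions, not the paper's):

```python
def pbitt_batches(main_data, passive_data, batch_size=4, inject_every=5):
    """Yield (batch, is_passive) pairs for one epoch of training.

    Mini-batches come from the original training data, except that a
    "passive" mini-batch from a different data distribution is injected
    every `inject_every` steps. The fixed-interval schedule here is an
    illustrative assumption; the paper's abstract only says that a few
    passive mini-batches are introduced during training.
    """
    step = 0
    main_idx, passive_idx = 0, 0
    while main_idx + batch_size <= len(main_data):
        step += 1
        if inject_every and step % inject_every == 0:
            # Passive batch: drawn from the different data distribution.
            batch = passive_data[passive_idx:passive_idx + batch_size]
            passive_idx += batch_size
            if passive_idx + batch_size > len(passive_data):
                passive_idx = 0  # recycle the passive pool if exhausted
            yield batch, True
        else:
            # Regular batch: drawn from the original training data.
            batch = main_data[main_idx:main_idx + batch_size]
            main_idx += batch_size
            yield batch, False
```

Because the final model's parameters and inference path are untouched, only the training loop changes: a trainer would forward each yielded batch as usual, using the `is_passive` flag to decide how the passive batch contributes to the loss.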

Original language: English
Title of host publication: 2020 International Joint Conference on Neural Networks, IJCNN 2020 - Proceedings
Publisher: IEEE
ISBN (Electronic): 9781728169262
DOIs
Publication status: Published - 28 Sept 2020
Event: 2020 International Joint Conference on Neural Networks, IJCNN 2020 - Virtual, Glasgow, United Kingdom
Duration: 19 Jul 2020 - 24 Jul 2020

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks
Volume: 2020
ISSN (Print): 2161-4393
ISSN (Electronic): 2161-4407

Conference

Conference: 2020 International Joint Conference on Neural Networks, IJCNN 2020
Country/Territory: United Kingdom
City: Virtual, Glasgow
Period: 19/07/20 - 24/07/20

Bibliographical note

Publisher Copyright:
© 2020 IEEE.


Keywords

  • Convolutional neural network training
  • Deep CNN training
  • Deep learning
  • Object recognition

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence
