GRIG: Data-efficient generative residual image inpainting

Wanglong Lu, Xiaogang Jin, Xianta Jiang, Yongliang Yang, Minglun Gong, Kaijie Shi, Tao Wang, Hanli Zhao

Research output: Contribution to journal › Article › peer-review

Abstract

Image inpainting is the task of filling in missing or masked regions of an image with semantically meaningful content. Recent methods have shown significant improvement in dealing with large missing regions. However, these methods usually require large training datasets to achieve satisfactory results, and there has been limited research into training such models on a small number of samples. To address this, we present a novel data-efficient generative residual image inpainting method that produces high-quality inpainting results. The core idea is to use an iterative residual reasoning method that incorporates convolutional neural networks (CNNs) for feature extraction and transformers for global reasoning within generative adversarial networks, along with image-level and patch-level discriminators. We also propose a novel forged patch adversarial training strategy to create faithful textures and detailed appearances. Extensive evaluation shows that our method outperforms previous methods on the data-efficient image inpainting task, both quantitatively and qualitatively.
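The iterative residual reasoning described in the abstract can be illustrated with a minimal sketch: at each step a generator predicts a residual correction that is added to the current estimate, refining only the masked region. The `predict_residual` callback and the toy mean-fill generator below are hypothetical illustrations, not the paper's actual CNN/transformer architecture.

```python
import numpy as np

def iterative_residual_inpaint(image, mask, predict_residual, num_iters=3):
    """Sketch of iterative residual refinement (assumed simplification).

    image: array with valid pixels; mask: 1.0 where pixels are missing.
    predict_residual: stand-in for the generator; returns a correction
    that is added to the current estimate, applied only inside the mask.
    """
    current = image * (1 - mask)  # zero out the missing region
    for _ in range(num_iters):
        residual = predict_residual(current, mask)
        current = current + residual * mask  # refine masked pixels only
    return current

def toy_residual(current, mask):
    # Toy "generator": nudge masked pixels toward the mean of known pixels.
    known_mean = current[mask == 0].mean()
    return (known_mean - current) * 0.5

img = np.array([[1.0, 1.0], [1.0, 0.0]])
m = np.array([[0.0, 0.0], [0.0, 1.0]])
out = iterative_residual_inpaint(img, m, toy_residual, num_iters=4)
# The masked pixel converges toward the known-pixel mean over iterations.
```

In the actual method, the residual predictor is a learned generator (CNN features plus transformer-based global reasoning) trained adversarially against image-level and patch-level discriminators; the loop structure above only conveys the residual-refinement idea.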
Original language: English
Pages (from-to): 1329-1361
Number of pages: 33
Journal: Computational Visual Media
Volume: 11
Issue number: 6
Early online date: 13 Nov 2025
Publication status: Published - 31 Dec 2025

Data Availability Statement

All data used in this study are publicly available and can be downloaded through public channels.

Keywords

  • generative adversarial networks
  • image inpainting
  • iterative reasoning
  • residual learning

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition
  • Computer Graphics and Computer-Aided Design
  • Artificial Intelligence
