MageAdd: Real-Time Interaction Simulation for Scene Synthesis

Shao Kui Zhang, Yi-Xiao Li, Yu He, Yongliang Yang, Song-Hai Zhang

Research output: Chapter in a published conference proceeding

9 Citations (SciVal)
173 Downloads (Pure)


While recent research on computational 3D scene synthesis has achieved impressive results, automatically synthesized scenes do not guarantee the satisfaction of end users. On the other hand, manual scene modelling can always ensure high quality, but requires a cumbersome trial-and-error process. In this paper, we bridge the above gap by presenting a data-driven 3D scene synthesis framework that intelligently infers objects for the scene by incorporating and simulating user preferences with minimal input. As the cursor is moved and clicked in the scene, our framework automatically selects suitable objects and transforms them into the scene in real time. This is based on priors learnt from the dataset for placing different types of objects, which are updated according to the current scene context. Through extensive experiments we demonstrate that our framework outperforms the state of the art in result aesthetics, and enables effective and efficient user interaction.
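The abstract describes selecting an object category from data-driven placement priors given the cursor position. A minimal illustrative sketch of that idea follows; the category names, the single distance-to-wall feature, and the Gaussian prior form are all assumptions for illustration, not the paper's actual model or API.

```python
import math

# Hypothetical learnt priors (assumed values, not from the paper): each
# category stores the mean and variance of its distance to the nearest wall,
# as observed in a training dataset of indoor scenes.
PRIORS = {
    "nightstand": {"mean_dist": 0.1, "var": 0.05},
    "coffee_table": {"mean_dist": 1.5, "var": 0.4},
}

def gaussian_score(dist, mean, var):
    """1-D Gaussian density, used as a placement likelihood."""
    return math.exp(-((dist - mean) ** 2) / (2 * var)) / math.sqrt(2 * math.pi * var)

def suggest_object(cursor_dist_to_wall):
    """Return the category whose prior best explains the cursor position."""
    return max(
        PRIORS,
        key=lambda c: gaussian_score(
            cursor_dist_to_wall, PRIORS[c]["mean_dist"], PRIORS[c]["var"]
        ),
    )

# A cursor close to a wall favours the nightstand prior; a cursor in open
# floor space favours the coffee table prior.
```

In the actual framework the priors are richer (object type, position, and transform, conditioned on the current scene context) and are evaluated in real time as the cursor moves; this sketch only shows the scoring-and-argmax pattern.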
Original language: English
Title of host publication: MM 2021 - Proceedings of the 29th ACM International Conference on Multimedia
Place of Publication: U.S.A.
Publisher: Association for Computing Machinery
Number of pages: 9
ISBN (Electronic): 9781450386517
Publication status: Published - 17 Oct 2021
Event: 29th ACM International Conference on Multimedia, MM 2021
Duration: 20 Oct 2021 – 24 Oct 2021

Publication series

Name: MM 2021 - Proceedings of the 29th ACM International Conference on Multimedia


Conference: 29th ACM International Conference on Multimedia, MM 2021

Bibliographical note

Funding Information:
This work was supported by the National Key Technology R&D Program (Project Number 2017YFB1002604), the National Natural Science Foundation of China (Project Numbers 61772298, 61832016), Research Grant of Beijing Higher Institution Engineering Research Center, and Tsinghua–Tencent Joint Laboratory for Internet Innovation Technology. Yong-Liang Yang was partly supported by RCUK grant CAMERA (EP/M023281/1, EP/T014865/1), and a gift from Adobe.

Publisher Copyright:
© 2021 ACM.


Keywords

  • 3D indoor scene synthesis
  • spatial inference
  • user interaction

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Software
  • Computer Graphics and Computer-Aided Design


