Real-time content-aware texturing for deformable surfaces

C. Koniaris, D. Cosker, X. Yang, K. Mitchell, I. Matthews

Research output: Chapter in Book/Report/Conference proceeding › Chapter

1 Citation (Scopus)

Abstract

Animation of models often introduces distortions to their parameterisation, as these are typically optimised for a single frame. The net effect is that under deformation, the mapped features, e.g. UV texture maps, bump maps or displacement maps, may appear to stretch or scale in an undesirable way. Ideally, we would like the appearance of such features to remain plausible under any underlying deformation. In this paper we introduce a real-time technique that reduces such distortions based on a distortion control (rigidity) map. In the two versions of our proposed technique, the parameter space is warped in either an axis-aligned or a non-axis-aligned manner based on the minimisation of a non-linear distortion metric. This minimisation is solved using a highly optimised hybrid CPU-GPU strategy. The result is real-time dynamic content-aware texturing that reduces distortions in a controlled way. The technique can be applied to reduce distortions in a variety of scenarios, including reusing a low-geometric-complexity animated sequence with a multitude of detail maps, mapping dynamic procedurally defined features onto deformable geometry, and animation-authoring previews on texture-mapped models.
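To make the axis-aligned idea concrete, the following is a minimal, hypothetical sketch: a 1D parameter axis is re-warped so that cells marked rigid keep their apparent texture scale under deformation, while flexible cells absorb the stretch. The closed-form blend below is an assumption for illustration only; the paper's actual non-linear distortion metric and hybrid CPU-GPU solver are more involved.

```python
# Hypothetical illustration of an axis-aligned parameter-space warp driven by
# a rigidity map. Not the paper's actual metric or solver; a simplified blend.

def warp_axis(cell_lengths, rigidity):
    """cell_lengths: deformed geometric length of each parameter cell.
    rigidity: per-cell weight in [0, 1]; 1 means the feature must not stretch.
    Returns new parameter-cell widths summing to 1."""
    n = len(cell_lengths)
    total = sum(cell_lengths)
    uniform = 1.0 / n  # undistorted width per cell in the rest parameterisation
    widths = [
        # Rigid cells track the deformed geometry (constant texel density);
        # flexible cells keep their rest-pose width and absorb the distortion.
        r * (length / total) + (1.0 - r) * uniform
        for length, r in zip(cell_lengths, rigidity)
    ]
    s = sum(widths)
    return [w / s for w in widths]  # renormalise to the unit parameter interval

# A surface whose middle cell is stretched 3x; the texture feature there is
# marked rigid, so it receives a wider share of parameter space, reducing its
# apparent stretch at the expense of the flexible neighbouring cells.
print(warp_axis([1.0, 3.0, 1.0], [0.0, 1.0, 0.0]))
```

Apparent stretch in a cell is its geometric length divided by its parameter width; with uniform widths the rigid middle cell would be stretched 9x per unit parameter, while the warped widths pull it back toward the global average.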
Original language: English
Title of host publication: ACM International Conference Proceeding Series
Number of pages: 10
DOI: https://doi.org/10.1145/2534008.2534016
Publication status: Published - 6 Nov 2013


Cite this

Koniaris, C., Cosker, D., Yang, X., Mitchell, K., & Matthews, I. (2013). Real-time content-aware texturing for deformable surfaces. In ACM International Conference Proceeding Series [11]. https://doi.org/10.1145/2534008.2534016