Multi-Character Motion Retargeting for Large Scale Changes

  • Maryam Naghizadeh

Student thesis: Doctoral Thesis (PhD)


Multi-character motion retargeting (MCMR) aims to generate motion for multiple target characters given the motion data of their corresponding source subjects. Unlike single-character motion retargeting, MCMR algorithms must retarget each character’s motion correctly while maintaining the interaction between the characters. Existing solutions focus on small-scale changes between interacting characters; however, many retargeting applications require large-scale transformations. For example, movies like Avatar (2009) use motion retargeting to drive characters that are much taller or shorter than the human actors controlling them. Current industry solutions require a significant amount of clean-up, which increases costs and post-processing time considerably.

In this research, we propose a new algorithm for large-scale MCMR using space-time constraint-based optimisation. We build on the idea of interaction meshes, which are structures representing the spatial relationship among characters. We introduce a new distance-based interaction mesh that embodies the relationship between characters more accurately by prioritising local connections over global ones. We also introduce a stiffness weight for each skeletal joint in our optimisation function, which defines how undesirable it is for the interaction mesh to deform around that joint. This parameter increases the adaptability of our algorithm to large-scale transformations and reduces optimisation time considerably. Our optimisation function also incorporates: a) a pose prior model, which ensures that the output poses are valid; b) a balance term, which aims to preserve balance in the output motion; and c) a distance adjustment element, which adapts the distance between characters according to their scale change.
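To make the idea of a distance-based interaction mesh concrete, the sketch below connects each joint to its k nearest neighbours across all characters, weighting each edge inversely by distance so that local connections dominate global ones. This is an illustrative sketch only, not the thesis implementation: the joint representation, the value of k, and the inverse-distance weighting are assumptions made for demonstration.

```python
import math

def knn_interaction_mesh(joints, k=3):
    """Build a distance-based interaction mesh over pooled joints.

    joints: list of (x, y, z) joint positions pooled from all characters.
    k: number of nearest neighbours each joint connects to (assumed value).

    Returns a dict mapping undirected edges (i, j), i < j, to a weight
    that decays with distance, so nearby (local) joints are tied more
    strongly than distant (global) ones.
    """
    edges = {}
    for i, p in enumerate(joints):
        # Sort all other joints by Euclidean distance from joint i.
        dists = sorted(
            (math.dist(p, q), j) for j, q in enumerate(joints) if j != i
        )
        # Keep only the k closest: this is what prioritises local links.
        for d, j in dists[:k]:
            key = (min(i, j), max(i, j))
            # Inverse-distance weight (illustrative choice): closer
            # joints receive stronger ties in the mesh.
            edges[key] = 1.0 / (d + 1e-9)
    return edges
```

With two characters of two joints each placed at x = 0, 1, 5, 6 and k = 1, each joint links only to its immediate neighbour, so the mesh contains exactly the two local edges within each character, illustrating how locality is favoured.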

We compare the performance of our algorithm with the current state-of-the-art MCMR solution (baseline) on several motion sequences using runtime, bone-length error, balance and pose validity metrics. Furthermore, we run two additional experiments to evaluate our method against the baseline. The first experiment converts the retargeting results to an angular representation and measures inverse kinematics (IK) error. For the second experiment, we conduct a user study and ask participants to rank the output of our method and the baseline according to their retargeting quality for various test sequences. Our results show that our method outperforms the baseline on the runtime, balance, pose validity, IK error and retargeting quality score measures. The two methods display similar performance with regard to bone-length error.
Date of Award: 29 Mar 2023
Original language: English
Awarding Institution
  • University of Bath
Supervisors: Darren Cosker (Supervisor) & Neill Campbell (Supervisor)
