ReMoS: 3D Motion-Conditioned Reaction Synthesis for Two-Person Interactions
Abstract
Current approaches for 3D human motion synthesis generate high-quality animations of digital humans performing a wide variety of actions and gestures. However, a notable technological gap exists in addressing the complex dynamics of multi-human interactions within this paradigm. In this work, we present ReMoS, a denoising diffusion-based model that synthesizes full-body reactive motion of a person in a two-person interaction scenario. Given the motion of one person, we employ a combined spatio-temporal cross-attention mechanism to synthesize the reactive body and hand motion of the second person, thereby completing the interactions between the two. We demonstrate ReMoS across challenging two-person scenarios such as pair-dancing, Ninjutsu, kickboxing, and acrobatics, where one person’s movements have complex and diverse influences on the other. We also contribute the ReMoCap dataset for two-person interactions containing full-body and finger motions. We evaluate ReMoS through multiple quantitative metrics, qualitative visualizations, and a user study, and demonstrate its usability in interactive motion-editing applications.
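The core conditioning idea described above can be illustrated with a minimal sketch: the reactor's motion features attend to the actor's motion both across time and across joints, and the two attention outputs are combined. This is an illustrative simplification, not the paper's implementation; the function names, the additive combination, and the single-head, projection-free attention are all assumptions made here for brevity.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(q, k, v):
    # Scaled dot-product cross-attention: q (Nq, d), k/v (Nk, d).
    scores = q @ k.T / np.sqrt(q.shape[-1])
    return softmax(scores, axis=-1) @ v

def spatio_temporal_cross_attention(reactor, actor):
    # reactor, actor: (T, J, d) -- frames, joints, feature dimension.
    T, J, d = reactor.shape
    # Temporal attention: each reactor joint attends over the actor's frames.
    temporal = np.stack(
        [cross_attention(reactor[:, j], actor[:, j], actor[:, j]) for j in range(J)],
        axis=1,
    )
    # Spatial attention: each reactor frame attends over the actor's joints.
    spatial = np.stack(
        [cross_attention(reactor[t], actor[t], actor[t]) for t in range(T)],
        axis=0,
    )
    # Combine both views into one conditioning signal (additive here, by assumption).
    return temporal + spatial
```

In a diffusion setting, such a conditioning signal would be injected at each denoising step so that the synthesized reaction stays aligned with the driving person's motion.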
Approach
ReMoCap Dataset
We propose the ReMoCap dataset for two-person interactions, consisting of full-body and hand motions.
The dataset captures challenging, interactive two-person motions in two scenarios: fast-paced Lindy Hop swing dancing and the martial art of Ninjutsu.
Dataset download link: Remocap.zip
Results
Generalizability of ReMoS in various two-person scenarios.
Applications of ReMoS in character animation and motion editing.
Quantitative Evaluation
Citation
@InProceedings{ghosh2024remos,
  title={ReMoS: 3D Motion-Conditioned Reaction Synthesis for Two-Person Interactions},
  author={Ghosh, Anindita and Dabral, Rishabh and Golyanik, Vladislav and Theobalt, Christian and Slusallek, Philipp},
  booktitle={European Conference on Computer Vision (ECCV)},
  year={2024}
}
Contact
For questions or clarifications, please get in touch with: Anindita Ghosh
anghosh@mpi-inf.mpg.de