[Research] We trained a self-balancing physics-based character to follow interactive motion capture.
Here’s a Twitter thread including a video of a ragdoll getting lots of cubes in the face: https://twitter.com/profbof/status/1194734191843454976
Here’s a blog post: https://montreal.ubisoft.com/en/drecon-data-driven-responsive-control-of-physics-based-characters
Here’s the paper, which will be presented at SIGGRAPH Asia next week: https://static-wordpress.akamaized.net/montreal.ubisoft.com/wp-content/uploads/2019/11/13214229/DReCon.pdf
And here’s a high-level explanation of what this is all about:
Physics-based animation holds the promise of unlocking unprecedented levels of interaction, fidelity, and variety in games. The intricate interactions between a character and its environment can only be faithfully synthesized by respecting real physical principles. On the other hand, data-driven animation systems built on large amounts of motion capture data have already shown that artistic style and motion variety can be preserved even under the tight responsiveness and control constraints a game’s design imposes.
To combine the strengths of both approaches, we developed DReCon, a character controller trained with deep reinforcement learning. Essentially, simulated human characters learn to move around and balance from precisely controllable motion capture examples. Once trained, gamepad-controlled characters can be fully physically simulated and simultaneously directed with a high level of responsiveness, at a surprisingly low runtime cost on today’s hardware.
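To make that structure concrete, here’s a toy Python sketch of the core idea: a simulated character is rewarded for tracking a gamepad-driven kinematic reference. Everything in it is a placeholder I made up for illustration — `kinematic_reference`, `simulate_step`, the linear policy, and random-search training stand in for the real components (a motion-matching kinematic controller, a full physics engine, and a deep policy trained with reinforcement learning, per the paper). It runs, but it is not the paper’s implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
POSE_DIM = 4  # toy pose dimension; the real controller tracks full-body state

def kinematic_reference(stick, t):
    """Stub kinematic controller: a target pose driven by gamepad input.
    (DReCon uses a motion-matching controller over mocap data here.)"""
    return stick * np.sin(0.1 * t + np.arange(POSE_DIM))

def simulate_step(pose, action):
    """Stub physics step: the action nudges the simulated pose.
    (A real implementation drives joints inside a physics engine.)"""
    return pose + 0.1 * np.tanh(action)

def reward(sim_pose, ref_pose):
    """Imitation-style reward: higher when the simulated pose tracks the
    kinematic reference. (The paper combines several tracking terms.)"""
    return float(np.exp(-np.sum((sim_pose - ref_pose) ** 2)))

def rollout(weights, episode_len=100):
    """Run one episode with a linear policy; return total episode reward."""
    pose, total = np.zeros(POSE_DIM), 0.0
    stick = rng.uniform(-1, 1)  # random gamepad direction for this episode
    for t in range(episode_len):
        ref = kinematic_reference(stick, t)
        obs = np.concatenate([pose, ref])  # policy sees sim state + reference
        action = weights @ obs
        pose = simulate_step(pose, action)
        total += reward(pose, ref)
    return total

# Simple random-search policy improvement (the paper trains a deep network
# with reinforcement learning; random search keeps this sketch short).
weights = np.zeros((POSE_DIM, 2 * POSE_DIM))
best = rollout(weights)
for _ in range(200):
    candidate = weights + 0.05 * rng.standard_normal(weights.shape)
    score = rollout(candidate)
    if score > best:
        weights, best = candidate, score
print(f"best episode reward: {best:.2f}")
```

The key design point this sketch preserves is the split of responsibilities: a kinematic system decides *what* motion the character should perform in response to player input, and the learned controller’s only job is to make the physically simulated body follow it.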
submitted by /u/profbof