[SIGGRAPH 2024] Categorical Codebook Matching for Embodied Character Controllers
- Published Jul 3, 2024
- Translating motions from a real user onto a virtual embodied avatar is a key challenge for character animation in the metaverse. In this work, we present a novel generative framework that maps a set of sparse sensor signals to full-body avatar motion in real time while faithfully preserving the motion context of the user. In contrast to existing techniques that require training a motion prior and its mapping from control to motion separately, our framework learns the motion manifold as well as how to sample from it at the same time, in an end-to-end manner. To achieve this, we introduce a technique called codebook matching, which matches the probability distributions between two categorical codebooks, one for the inputs and one for the outputs, to synthesize the character motions. We demonstrate that this technique can successfully handle ambiguity in motion generation and produce high-quality character controllers from unstructured motion capture data. Our method is especially useful for interactive applications like virtual reality or video games, where high accuracy and responsiveness are needed.
Project Page: github.com/sebastianstarke/AI... - Science & Technology
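To make the "codebook matching" idea in the abstract more concrete, here is a rough NumPy sketch of matching two categorical distributions induced by separate input and output codebooks. This is only an illustration under stated assumptions, not the paper's implementation: the distance-based softmax assignment, the KL-divergence loss, and all names (`categorical_probs`, `codebook_matching_loss`, the toy dimensions) are hypothetical choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def categorical_probs(features, codebook, temperature=1.0):
    # Hypothetical soft assignment: softmax over negative squared distances
    # to each code vector, so nearer codes get higher probability.
    d2 = ((features[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
    logits = -d2 / temperature
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    p = np.exp(logits)
    return p / p.sum(axis=1, keepdims=True)

def codebook_matching_loss(p_in, p_out, eps=1e-8):
    # KL divergence between the two categorical distributions; driving it
    # toward zero aligns how inputs and outputs use their discrete codes.
    kl = (p_out * (np.log(p_out + eps) - np.log(p_in + eps))).sum(axis=1)
    return float(kl.mean())

# Toy setup: sparse sensor features (input) and full-body motion features
# (output), each quantized against its own codebook of K discrete codes.
K, d_in, d_out, batch = 8, 6, 12, 4
codebook_in = rng.normal(size=(K, d_in))
codebook_out = rng.normal(size=(K, d_out))
x = rng.normal(size=(batch, d_in))    # stand-ins for sensor embeddings
y = rng.normal(size=(batch, d_out))   # stand-ins for motion embeddings

p_in = categorical_probs(x, codebook_in)
p_out = categorical_probs(y, codebook_out)
loss = codebook_matching_loss(p_in, p_out)
print(f"codebook matching loss: {loss:.4f}")
```

In a real training loop the encoders and codebooks would be learned jointly so that the input-side distribution can predict the output-side codes, which is presumably what lets the framework learn the motion manifold and the control mapping end to end.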
I'm literally speechless. No joke. I don't know what to say. This is amazing. I'll look into the repo for more info, but I have a use case in mind where this would be so valuable when trained on a custom dataset. Thanks Starke and everyone involved for bringing us one step closer to a thriving future.
if you implement this technology as a plugin for ue5 vr, I will definitely buy it!
This is amazing Sebastian, great job as always 👏
Been such a joy to watch how this work has evolved and been applied to so many areas
Absolutely fantastic as always!
This is what I want for my own UE VR system that I'm developing, instead of having to mix animation & procedural foot placement, which can look robotic. The bending of the knees in the correct directions here is very impressive, because dynamic crouching can be a challenge
Absolutely amazing.
Looking forward to the code/demo/dataset! Great work!
We are polishing them and will upload them soon, thanks!
You were featured in 3 minutes papers. Congrats!
Good interesting paper + code implementation = pure happiness.
Sweet!
Impressive.
This is what's lacking in VRChat! Ugly movement really kills the experience, so implementing this tech in-game would be such a welcome upgrade!
Very nice! How can we try your "mirror demo" in our headsets? Is there a demo available?
How come we don't see these implemented in video games?
Does anyone else also like the dance at the end of the video? lol
Is this just for VR?
You could maybe customize the code to use it as mocap for animations