[SIGGRAPH 2024] Categorical Codebook Matching for Embodied Character Controllers

  • Published on Jul 3, 2024
  • Translating motions from a real user onto a virtual embodied avatar is a key challenge for character animation in the metaverse. In this work, we present a novel generative framework that maps a set of sparse sensor signals to full-body avatar motion in real time while faithfully preserving the motion context of the user. In contrast to existing techniques that require training a motion prior and its mapping from control to motion separately, our framework learns the motion manifold as well as how to sample from it at the same time, in an end-to-end manner. To achieve that, we introduce a technique called codebook matching, which matches the probability distributions of two categorical codebooks, one for the inputs and one for the outputs, when synthesizing character motions (a rough sketch of this idea follows the description below). We demonstrate that this technique can successfully handle ambiguity in motion generation and produce high-quality character controllers from unstructured motion capture data. Our method is especially useful for interactive applications like virtual reality or video games where high accuracy and responsiveness are needed.
    Project Page: github.com/sebastianstarke/AI...
  • Science & Technology
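
    For an intuition of the codebook matching idea described above, here is a minimal, hypothetical PyTorch sketch. It is not the authors' implementation (see the project page above for the official code): the module name, layer sizes, the soft codebook lookup, and the KL-based distribution-matching loss are all illustrative assumptions about one way the described idea could be realized.

    ```python
    # Hedged sketch, NOT the official code: a motion encoder and a sparse-sensor
    # encoder each produce a categorical distribution over a shared codebook;
    # training aligns the two distributions while a decoder reconstructs
    # full-body motion from the selected code embeddings.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class CodebookMatchingSketch(nn.Module):
        # Dimensions below are placeholder values, not the paper's settings.
        def __init__(self, sensor_dim=54, motion_dim=276, num_codes=128, code_dim=64):
            super().__init__()
            # Shared learnable codebook of categorical entries.
            self.codebook = nn.Parameter(torch.randn(num_codes, code_dim))
            # "Output" path: encodes full-body motion into codebook logits.
            self.motion_encoder = nn.Sequential(
                nn.Linear(motion_dim, 256), nn.ELU(), nn.Linear(256, num_codes))
            # "Input" path: encodes sparse sensor signals into codebook logits.
            self.sensor_encoder = nn.Sequential(
                nn.Linear(sensor_dim, 256), nn.ELU(), nn.Linear(256, num_codes))
            # Decoder maps a codebook mixture back to full-body motion.
            self.decoder = nn.Sequential(
                nn.Linear(code_dim, 256), nn.ELU(), nn.Linear(256, motion_dim))

        def decode(self, logits):
            # Soft selection: probability-weighted sum of codebook entries.
            probs = F.softmax(logits, dim=-1)
            code = probs @ self.codebook
            return self.decoder(code)

        def training_losses(self, sensors, motion):
            motion_logits = self.motion_encoder(motion)   # target distribution
            sensor_logits = self.sensor_encoder(sensors)  # predicted distribution
            # Reconstruction through the motion path shapes the motion manifold.
            recon_loss = F.mse_loss(self.decode(motion_logits), motion)
            # Distribution matching between the two categorical codebooks.
            match_loss = F.kl_div(
                F.log_softmax(sensor_logits, dim=-1),
                F.softmax(motion_logits, dim=-1).detach(),
                reduction="batchmean")
            return recon_loss + match_loss

        @torch.no_grad()
        def control(self, sensors):
            # Inference: drive full-body motion from the sparse sensors alone.
            return self.decode(self.sensor_encoder(sensors))
    ```

    In such a setup, the reconstruction loss would organize the codebook into a motion manifold, while the matching loss teaches the sparse-sensor path to reproduce the motion path's categorical distributions, so at runtime the controller can be driven by the sensor signals alone.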

Comments • 19

  • @ForexLearnerAI
    @ForexLearnerAI 19 days ago +3

    I'm literally speechless. No joke. I don't know what to say. This is amazing. I'll look into the repo for more info, but I have a use case in mind where this would be so valuable when trained on a custom dataset. Thanks Starke and everyone involved for bringing us one step closer to a thriving future.

  • @mbit5818
    @mbit5818 28 days ago +4

    If you implement this technology as a plugin for UE5 VR, I will definitely buy it!

  • @TerraUnityCo
    @TerraUnityCo 18 days ago +2

    This is amazing Sebastian, great job as always 👏

  • @mattanimation
    @mattanimation 29 days ago +2

    It's been such a joy to watch how this work has evolved and been applied to so many areas.

  • @tomhalpin8
    @tomhalpin8 1 month ago +4

    Absolutely fantastic as always!

  • @BaseRealityVR
    @BaseRealityVR 18 days ago +1

    This is what I want for my own UE VR system that I'm developing, instead of having to use a mix of animation and procedural foot placement, which can look robotic. The bending of the knees in the correct directions here is very impressive, because dynamic crouching can be a challenge.

  • @r.m8146
    @r.m8146 18 days ago +1

    Absolutely amazing.

  • @fanshengmeng6102
    @fanshengmeng6102 1 month ago +2

    Looking forward to the code/demo/dataset! Great work!

    • @paul-starke
      @paul-starke 1 month ago +4

      We are working on polishing them and will upload them soon, thanks!

  • @r.m8146
    @r.m8146 18 days ago +1

    You were featured in 3 minutes papers. Congrats!

  • @bause6182
    @bause6182 17 days ago

    Good, interesting paper + code implementation = pure happiness.

  • @ZadakLeader
    @ZadakLeader 1 month ago +1

    Sweet!

  • @skavenqblight
    @skavenqblight 1 month ago +1

    Impressive.

  • @leoako7775
    @leoako7775 18 days ago

    This is what's lacking in VRChat! Ugly movement really kills the experience, so implementing this tech in-game would be such a welcome upgrade!

  • @importon
    @importon 28 days ago +2

    Very nice! How can we try your "mirror demo" in our headsets? Is there a demo available?

  • @bahshas
    @bahshas 1 day ago +1

    How come we don't see these implemented in video games?

  • @honglinchen4681
    @honglinchen4681 18 days ago

    Does anyone else also like the dance at the end of the video? lol

  • @ThaLiveKing
    @ThaLiveKing 29 days ago +2

    Is this just for VR?

    • @bause6182
      @bause6182 17 days ago

      You could maybe customize the code to use it for mocap animations.