How I Use Move.AI and Metahumans to Achieve AAA Character Animation in Unreal Engine 5

  • Published Dec 12, 2024

Comments • 104

  • @jbenoit1962 · 3 months ago +6

    There's tolerance in voice sync, forward and backward, because of the way the brain is used to compensating (for someone speaking IRL across the table vs. someone across the room). In general, if you experiment with moving your audio earlier, it'll feel "tighter" to the performances and elevate the end product (says the dude who's worked in audio specifically for longer than I care to admit!)

  • @viktortoleski · 3 months ago +6

    Looks great! Processes that used to take days, even weeks or months, can now be done in a day.
    Oh, how times have changed.

  • @iloveeveryone8611 · 3 months ago +5

    literally EXACTLY what I was looking for!!! Please make another!!!!
    EDIT: Also pls go into depth about prices and alternatives

  • @hitmangamesyndicate5970 · 3 months ago +2

    Looks great... Using a similar workflow in my productions. Looking forward to seeing more...

  • @JsAnimation24 · 4 months ago +4

    Amazing high-quality work, thanks for sharing your process! I first saw your work on Reddit and in the UE forums, and your videos have now started popping up in my YouTube feed 🙂 Looking forward to seeing your future videos!

  • @ElPibeMagic · 4 months ago +2

    Thanks for sharing this pipeline

  • @Ronin2079 · 4 months ago +4

    Also, yes, a much more in-depth breakdown would be extremely beneficial!

  • @SilexProductions7 · 4 months ago

    Impressively dope!!! So inspiring for future projects!

  • @emanfinding2603 · 2 months ago

    Lovely and inspiring - thank you!

  • @advvymar · 4 months ago

    Great workflow. I would love to see some more breakdown. 🙏🏾

  • @freenomon2466 · 3 months ago

    In my experience, making the lip sync a bit earlier than the audio helps it look more in sync.

  • @docmn6602 · 4 months ago +1

    Though it's a mocap video, I really like the fascinating cinematic lighting falling across the faces.

  • @aaagamingtelugu3458 · 3 months ago +1

    Could you please make an in-depth tutorial on this short-film-making process? That would be really helpful.

    • @NorthwoodsInteractive · 3 months ago +3

      Yes, I definitely will, and soon. I need to find a mocap solution that is more accessible, since I am using Move.AI, which is a pro software license and not the best solution for everybody. I am thinking of doing a short film using entirely the Rokoko free dual-camera setup, if I can get it to look good enough.

  • @chrisleefilm · 3 months ago

    Wow, great sharing!

  • @IliaCinema · 6 days ago

    Thanks for sharing! Isn't a Rokoko suit a better offer in terms of pricing and quality?

    • @NorthwoodsInteractive · 6 days ago

      It's a great question. It depends on whether you are getting gloves with your suit, and a few other things. Since this video, I have figured out how to get a much better calibration out of Move, so I don't have such horrible jitters that need to be smoothed. That makes a big difference. With Move, I can do two people at the same time, unlike with a suit. Also, suits are a pain to put on and take off, especially if you are putting one on someone who has never worn one, or who has a much bigger or smaller body. I have not used a Rokoko suit, although I have seen a lot of videos of them in use, and it requires a similar amount of cleanup as Move, albeit maybe in different ways. Ultimately, they are different tools that can be used to achieve similar outcomes, both with their tradeoffs. I am preferring Move and other AI-based solutions. This is based on my experience owning an Xsens Awinda, and one of the very first Perception Neuron suits (which was godawful).

    • @IliaCinema · 6 days ago

      @@NorthwoodsInteractive thank you for your answer! Good luck with your projects!

  • @The_MegaClan · 4 months ago +3

    I predict that 7000 USD price will be coming down soon.

  • @sylvaindelaine · 4 months ago

    Great! Thanks for sharing.

  • @pabloqsanchez · 4 months ago

    Interesting and very cool!

  • @sylvaindelaine · 4 months ago +6

    The pipeline is intriguing and appears to be functioning, though the final output is not yet ready for the final show. It requires post-production work to make the animations believable and lifelike, which is where the final touch from an animator becomes essential. Nonetheless, it is commendable to have reached this stage! I believe an additional layer is needed to refine some animations, such as hand gestures and facial expressions, to bring everything to life. However, I understand that this would compromise real-time performance, which is not the current objective. I am interested in learning more about the tools you are using, as I am working on a similar workflow and still experimenting. Keep up the great work, and thank you very much!

  • @robertdouble559 · 3 months ago +1

    Low-pass Butterworth. Great tip. Cheers, gents.

  • @nu-beings · 2 months ago

    Can you make a video on how you linked animations together so the root and mesh moved correctly in Sequencer?

    • @NorthwoodsInteractive · 2 months ago

      All the animations come in as root motion. If I have to link animations, I just cut around it rather than blending.

  • @nu-beings · 4 months ago +9

    Sucks that Move AI raised their Indie prices after the Beta. Business is business though! Great job.

    • @hellomistershifty · 3 months ago +9

      If anyone sees this and is wondering, it starts at $15/month with 3 minutes of tracking included, then $7.50 per minute of tracking after that. Each person tracked counts against the time separately, so 90 seconds included and $15 a minute after that for tracking two people.
      I was more interested in their multicam option, until I saw that the price was 'contact us'.
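
      (For anyone budgeting a shoot: a quick back-of-the-envelope sketch of the figures quoted above, i.e. $15/month with 3 included minutes and $7.50 per extra minute, billed per tracked person. These numbers come from the comment and may have changed; the function is purely illustrative.)

        # Rough Move.AI indie cost sketch based on the pricing quoted in the comment above.
        # Assumptions: $15/month base, 3 minutes of tracking included, $7.50 per extra
        # minute, and each tracked person billed against the allowance separately.
        def move_monthly_cost(minutes_captured, people=1,
                              base=15.0, included_minutes=3.0, per_minute=7.50):
            billed = minutes_captured * people              # each performer counts separately
            overage = max(0.0, billed - included_minutes)
            return base + overage * per_minute

        print(move_monthly_cost(3, people=1))   # 15.0  -> fits inside the included allowance
        print(move_monthly_cost(3, people=2))   # 37.5  -> two performers burn it twice as fast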

    • @NorthwoodsInteractive · 3 months ago

      @@hellomistershifty I have a tutorial coming using Move One and Metahuman Animator.

  • @JamieDunbar · 3 months ago

    It sounds like you’ve tried MoveAI and Xsens. Have you also tried Rokoko?
    I currently have a Rokoko suit and I'm curious which you think gives the best results.
    Rokoko has some clear issues, but the overall quality doesn’t appear too different to what you’ve achieved here. That said, I’ve seen some results with MoveAI that blew me away. So I’m wondering if those examples had a lot of cleanup, or if you’re still figuring out how to get the best results?

    • @NorthwoodsInteractive · 3 months ago +1

      Move.AI is almost amazing for me. Because it's vision-based, as opposed to inertia-based, the accuracy of the movements seems to be better than Xsens. However, I get a bunch of micro jitters in my animations that need to be filtered out, and I am not sure why. Also, I think the hands really throw off the rest of the animation. They don't get captured really well unless you have your hands close to one of the cameras, and then the retargeter seems to make them a bit claw-like. I can probably fix that with some pre-made hand poses that I can just drop on an additive layer over the hands in Sequencer.

    • @JamieDunbar · 3 months ago

      @@NorthwoodsInteractive Interesting! So you think MoveAI might be better than Xsens?
      I've wondered whether paying the extra for an Xsens suit might give better results than Rokoko, but that suggests the improvement probably isn't worth the price.
      I think using those pre-made hand poses will work well for you. I basically do the same thing with the Rokoko gloves. The actual movement is really good, but the key poses are often off. Fix the poses and all the in-betweens usually look pretty good 👍

  • @Megasteakman · 4 months ago +2

    That's great: amazing volume size!

    • @NorthwoodsInteractive · 4 months ago +1

      Haha thank you, empty garages make great mocap volumes!

  • @veith3dclub · 4 months ago

    Really love it! Could you also show your script or your approach to recording?

    • @NorthwoodsInteractive · 4 months ago +1

      This was a super rough script we wrote in about 20 minutes, and memorized in about the same time, with a little bit of freestyle. We need some proper Monty Python writing up in here!

  • @IAmVo · 4 months ago +1

    This is awesome! Where are you based? I’d love to work with you! - Vo Williams

  • @inhaf00 · 4 months ago

    needdd moreeeeee!!!!!!!!!!!!!!!

  • @aknittel1 · 3 months ago +1

    The facial mocap does not seem to be capturing the movements around the eyes or forehead on the MetaHuman, so the expressions are more wooden than they should be. What in the workflow could be modified? Does the actor need more extreme expressions? Did you attempt to tweak and exaggerate those controllers at the end of the workflow?

    • @NorthwoodsInteractive · 3 months ago

      I didn't tweak the face animations at all. In my other video with the talking pig I used the transform tool to elongate the curves for certain parts of the face, like the mouth control, lips, and brow

  • @kilansshetty5079 · 4 months ago

    Super workflow - this is what we need, a real time saver. This workflow has so many applications. Is there any possibility of it working with Blender? I understand UE is great for this workflow, but just wondering: how would it work with Blender, like create an AI character, convert it to 3D, and then rig it in Blender? Cheers

    • @NorthwoodsInteractive · 4 months ago

      I think Metahumans are only supposed to be used in Unreal Engine, as per their license stipulations. I have seen people pull metahumans into Blender, to do custom topology work. Metahuman Animator is all inside Unreal Engine, and is one of the main reasons I have made Unreal Engine and Metahumans the center of my 3D animation.

    • @skiez7430 · 3 months ago

      Using MetaHumans outside Unreal Engine breaks their TOS. I wouldn't recommend it.

  • @moejahi3d3 · 4 months ago +1

    Could you guys list which guy on Upwork got the job done? :> Also looking into having some custom characters turned into MetaHumans using Metapipe.

    • @NorthwoodsInteractive · 3 months ago +1

      Arman Avetisov

    • @moejahi3d3 · 3 months ago

      @@NorthwoodsInteractive you the mvp!

  • @madisepler7059 · 4 months ago

    Very nice. I am just developing a similar workflow for myself, though I have two Xsens suits. What is your opinion on that Rokoko headrig? Is it sturdy enough? What about wobbling and fit? And is the camera location far enough from the face that there are no clipping/focus problems? I am torn between buying this Rokoko headrig and the second option I'm considering, the Facegood D4 helmet, which is much pricier but has two custom cameras. What are your thoughts on that? Regards, and keep up the inspiring work ;)

    • @NorthwoodsInteractive · 4 months ago +1

      I like the Rokoko headrig. If you are using an iPhone bigger than a mini, you will have to unscrew the phone mount and mount it backwards, if that makes sense, in order for the phone to be far enough from your face. Other than that, it is pretty good. It will move around a little bit if you are really whipping your head around, but mostly it is snug, and small movements don't seem to affect the animation quality too much. Overall it is super good since it will fit any head size.

    • @madisepler7059 · 4 months ago

      @@NorthwoodsInteractive Thank you for the answer. Did you also use the iPhone for recording the voice, or did you have separate dedicated mics on set?

    • @NorthwoodsInteractive · 3 months ago +1

      @madisepler7059 I just used the phone audio.

  • @gn0015 · 10 days ago

    Have you tried anything like doing a facepalm or scratching your face with the facial capture thingy? I assume it won't work and will screw with the capture, but I'm still curious about it.

    • @NorthwoodsInteractive · 9 days ago +1

      I have, and it does mess up the face capture. Check out my latest video, I did some smoking shots where I brought a cigarette to my face.

  • @themightyflog · a month ago

    Who did you use on Upwork for the conversion? Having a hard time finding one.

    • @NorthwoodsInteractive · a month ago +1

      Arman Avetisov, he's great! Does amazing character work from scratch, too

  • @stevenbaker153 · 4 months ago

    Great video. I tried this workflow two months ago for a short and all it did was detach the head. Have you encountered this?

    • @NorthwoodsInteractive · 4 months ago +1

      Oh yes, metahumans like to do that. It was a huge PITA for a long time. Try making your metahuman the child of a basic actor, then adding the actor to your timeline instead of just adding the metahuman on its own. Then, whenever you have to move the metahuman, move the actor instead. Sometimes the head will still get detached even when you do this, so just restart UE; that should fix your sequence.
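
      (A minimal editor-scripting sketch of that parenting fix, assuming the Python Editor Script Plugin is enabled; the actor labels "MetaHumanBP" and "MH_Handle" are hypothetical. It only automates the manual "child of a basic actor" step described above and is not taken from the video.)

        # Attach a MetaHuman blueprint actor to a plain "handle" actor so that the
        # handle, not the MetaHuman itself, is what gets moved and animated in Sequencer.
        import unreal

        actor_subsystem = unreal.get_editor_subsystem(unreal.EditorActorSubsystem)
        actors = {a.get_actor_label(): a for a in actor_subsystem.get_all_level_actors()}

        metahuman = actors["MetaHumanBP"]   # hypothetical label of the MetaHuman actor
        handle = actors["MH_Handle"]        # hypothetical empty actor used as the parent

        metahuman.attach_to_actor(handle, "",
                                  unreal.AttachmentRule.KEEP_WORLD,   # location rule
                                  unreal.AttachmentRule.KEEP_WORLD,   # rotation rule
                                  unreal.AttachmentRule.KEEP_WORLD,   # scale rule
                                  False)                              # weld_simulated_bodies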

  • @pifpafich · 3 months ago

    full please

  • @sam14986 · 4 months ago +2

    Sir, how much does Move AI cost? Is it expensive? I saw their plans, but I wonder: if you do a lot of such animations, would a suit be cheaper?

  • @ZZWWYZ · 4 months ago

    4:35 I don't have experience in 3D animation, but that graph seems nightmarish. There's a channel called Rotted that seems to be making a show in UE4; idk if they use the same tools.

    • @NorthwoodsInteractive · 4 months ago

      The graph is rough, which is why the only thing I really do with it is select all the keyframes and apply a filter. I'm definitely not getting too granular with it.
      Also, I checked out that show, Rotted. Pretty hilarious, and yeah, they seem to be using the same tools: Metahumans, and Metahuman Animator for the face, but then just keyframing the body animations, so they do not seem to have a mocap solution. Still pretty funny.
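
      (For anyone curious what that filtering step boils down to: a minimal offline sketch of the same idea, a low-pass Butterworth filter run over exported curve samples with SciPy rather than the in-editor Sequencer filter. The 60 fps rate, 6 Hz cutoff, and CSV file names are assumptions for illustration only.)

        # Smooth a jittery mocap curve with a zero-phase low-pass Butterworth filter.
        # Assumes one float sample per frame at 60 fps exported to a (hypothetical) CSV.
        import numpy as np
        from scipy.signal import butter, filtfilt

        fps = 60.0
        cutoff_hz = 6.0                                  # keep broad motion, drop jitter
        order = 2

        values = np.loadtxt("hand_rotation_x.csv")
        b, a = butter(order, cutoff_hz / (fps / 2.0))    # cutoff normalized by Nyquist (fps/2)
        smoothed = filtfilt(b, a, values)                # forward-backward pass: no phase lag
        np.savetxt("hand_rotation_x_smoothed.csv", smoothed)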

  • @alexhoverby · 3 months ago

    Have you had any luck getting a hold of the team at Move? I've been trying to get a Move multicam plan for a few weeks now, and no one has responded through multiple channels of communication.

    • @NorthwoodsInteractive · 3 months ago

      I think I initially reached out through their website? Not sure, but it was something like that. They have been fairly responsive for me

  • @AlexandreRossLive · 3 months ago

    Do you have any advice for attaching a head to a MetaHuman body? I am struggling to get my animated body and head to stay attached.

    • @NorthwoodsInteractive · 3 months ago

      So, I usually make my metahuman a child of a basic actor, and use the actor to move and position the metahuman, since once it has animations applied to both the face and body in sequencer, it can get a little buggy and the head can come off and just float. If you recorded your face animation separately, and did not use a headrig, then make sure you disable the neck and head movement when processing the animation in Metahuman Animator.

  • @michaellaviola6540 · 3 months ago +24

    "AAA" is a bit of a stretch here, this is nowhere near the level of quality that it claims to be. The issues is that none of these animations were actually polished after capturing.

    • @NorthwoodsInteractive · 3 months ago +4

      Yes you are correct, and the hands are pretty stiff too. I was mostly referring to the facial animation, which is pretty much there. When combined with even rudimentary mocap, the results are decent, especially if you frame out as much of the jank in the animations as possible

    • @michaellaviola6540 · 3 months ago +14

      @@NorthwoodsInteractive No, sorry mate, but even the facial animation is nowhere near "there". Firstly, the audio and video aren't synced properly (there's a noticeable delay); secondly, it clearly needs an animation pass. It's quite uncanny and the poses don't really hold together well. If this were stuff for background characters, sure, but in a cinematic with closeups, etc.? Not really. The shaders also need a lot of work: you can really tell the guard is a Metahuman character, and it just looks like a bunch of assets that don't really belong together, all bundled into a scene. You could somewhat fake it with different lighting, to bring the non-Metahuman assets up to the same level of fidelity as the Metahuman face, or you could modify the MH shader to be lower in fidelity and match the rest. I've done my fair share of facial animation at my previous studio, so I kind of have that ingrained in me at this stage.

    • @LANGIMATION · 3 months ago

      Unfortunately, even the facial animation is just facial capture; there's a lot that goes into facial animation to get it looking natural, and you haven't done any of it except for the motion capture. Blend shapes are a good start, but building onward from those shapes is how you get to the level you claim to be on. @@NorthwoodsInteractive

    • @itsdw2323 · 2 months ago

      “Nowhere near the level of quality it claims to be”? Have you seen AAA games recently?
      It's a pretty decent effort. Not sure what you've made yet or when you're gonna show us.

    • @michaellaviola6540 · 2 months ago

      @@itsdw2323 The keyword there being "recently". Don't take Ubisoft as an example; their games are only expensive to make due to mismanagement lol. AAA quality is stuff like Black Myth: Wukong, or at least Space Marine 2; that is what AAA quality is meant to be. That said, you can easily find one of my old reels if you google my name.

  • @Maxsez · 2 months ago

    How did you link up the animation of the face to the body?

  • @RedninjaMultimediaProductions · 4 months ago

    Very interesting! Another awesome tech I didn't know about; totally looking forward to more!

  • @sadekinborno4079 · 3 months ago

    What are the mounts called that he used to hold the camera to record his face?

  • @guillaumevieille8034 · 2 months ago

    How do you have access to the Move AI cloud processing platform? I only have the Move One app on my phone, and I can download a .fbx, but that's it.

    • @NorthwoodsInteractive · 2 months ago

      Yep, I have Move Pro; I can use as many GoPros as I want and can capture up to two people.

  • @Silentiumfilms007 · 4 months ago

    Bro:
    1. Does Move AI have any free stuff?
    2. Can I do face animation via Android?
    3. Any free alternative for motion capture?
    4. Can I do face animation on any character, or only MetaHuman characters?

    • @NorthwoodsInteractive · 4 months ago

      Move.AI has the Move One app, which isn't free but has some free credits. It uses just one phone camera and has decent results for what it is.
      I do not believe Metahuman Animator works with Android, since it uses the TrueDepth camera on the iPhone to get such accurate face animations.
      You can do face animation on characters if they have blendshapes or a face rig, and I believe there is a way to use face animations captured with Metahuman Animator on non-metahuman characters. It involves baking the animation to a different face rig, I think. Look around; I know there are some recent tutorials showing this.

  • @shameelsha · 4 months ago +6

    How much did you pay for the MetaHuman conversion?

  • @MulleDK19 · 2 months ago

    They need to add LiDAR to other phones... all this crApple exclusivity is pissing me off...

  • @리저드 · 3 months ago

    The dialogue sync doesn't match up well.

  • @MatiasPiens · 3 months ago +1

    $7,000. wtf !!!!

  • @AlanMafalda87 · 3 months ago

    You need to improve the photography of your footage; it's not just about the technology, movie and cinema technique is also important.

  • @xaby996 · a month ago

    7k annual lmao

  • @MikAlexander · 3 months ago

    As someone who grew up on stage and on movie sets, I find this tech impressive but completely devoid of any fun. Sets, costumes, attention, preparation - those parts were extremely tough but rewarding and fun. Digital is the complete opposite, and a bore.

    • @NorthwoodsInteractive · 3 months ago

      It's too bad you feel that way, I find this kind of filmmaking to be extremely fun. As someone who has been on many shoots of all sizes, both for work and for passion projects, I also love the energy and creative spontaneity of a set. But sometimes you just want to make something that is in your head and it is just not practical because of budget, logistics, etc. The feeling of finding some cool assets that spark an idea for a scene and playing with them in UE is awesome. It's a different kind of creative reward than the synergy of a good film crew on set, but it's rewarding all the same.

    • @MikAlexander · 3 months ago

      @@NorthwoodsInteractive Sure, it's cool for tinkering with it. I do experiment myself as well. I'm thinking more broadly though.

  • @blendragon28 · 4 months ago +2

    Wow $7,000 for some jittery AI mocap? I'll stick with Xsens

    • @NorthwoodsInteractive · 4 months ago +3

      Yeah, idk why my animation turns out so jittery; I am trying to work it out with Move. The reason I have kept the service is that I am able to get the animation smooth enough without too much work. Xsens is great, but their raw capture is still a little jittery. You need the cloud processing to get the full value of that suit, and that is $500 per month minimum, on top of your $3.5k minimum suit, and that only works for one person at a time. Like I said, tradeoffs. If you don't need to do multiple people at the same time, a suit might make more sense. I am going to do a video about the tradeoffs and why I use Move over Xsens, which I do have experience with.

  • @Jungleroo · 3 months ago

    Metapipe's a bag of crap with poor support, unfortunately.