Record UE5 Metahuman Facial Animations Using the Live Link Face iOS App

  • Published on 1 Oct 2024
  • This episode is not an official part of the Let's Build the RPG! video series, but with the other Metahuman content we've been making, I felt this was a good time to do a quick tutorial on how to record facial animations for Metahuman using the free Live Link Face app.
    Special thanks to my friend, Ricardo Chaves, who is the artist who created the character for this demonstration. He's available for hire if you are looking for an artist with mastery of ZBrush, Substance Painter, and Maya.
    Ricardo's Behance:
    www.behance.ne...
    Artstation:
    www.artstation...
    YouTube:
    / @rchavz

Comments • 59

  • @NumenBrothers
    @NumenBrothers  1 year ago +2

    Special thanks to my friend Ricardo Chaves for creating the character used in this demonstration. You can check out his content here:
    Behance:
    www.behance.net/ricardochaves9
    Artstation:
    www.artstation.com/ricardochaves
    YouTube:
    www.youtube.com/@rchavz

    • @rchavz
      @rchavz 1 year ago

      My pleasure bro. It's always a pleasure to work with you ❤️

  • @ledvase
    @ledvase 1 month ago

    Hello. This is a really great tutorial. Thank you for making it. I have one question. In the first scene of the tutorial, you applied animations to both the body and the head simultaneously, and I’m curious about how you did that. The issue I’m facing is that after bringing in the MetaHuman actor in the Level Sequencer and deleting the Control Rig, when I try to apply separate animation sequences to the body and the face, the head movement does not apply. If you know how to solve this issue, could you please let me know?

  • @Paulie1232
    @Paulie1232 1 year ago +1

    I love your work. I need a huge favor, can you contact me... please?

  • @srivamsiNavi
    @srivamsiNavi 3 months ago

    Hi, I'm having a problem finding the source device in Unreal.
    I checked the following:
    both the iPhone and the device are on the same Wi-Fi,
    enabled the required plugins,
    disabled the firewall (followed another video; after doing that it worked for him, but not for me),
    still the iPhone is not getting listed as an ARKit source subject.
    Please help.

  • @shotundred
    @shotundred 1 month ago

    Under "Allow apps through Firewall",
    it needs to include the Unreal Launcher to connect.
    Thank you~!

  • @Oceana_3D
    @Oceana_3D 2 months ago

    I use Live Face inside of iClone and use Live Link to transfer the data into Unreal Engine 5. When I hit my usual record, the body animations save to the Sequencer but her face animations will not, even though it shows everything animating inside the viewport. She's a Metahuman. Any suggestions? No one seems to have the solution lol...

  • @Stonefieldmedia
    @Stonefieldmedia 7 days ago

    Very clear and well presented. Very helpful, thank you.

  • @GatlingHawk
    @GatlingHawk 1 year ago +1

    This is the wildest video I've seen in a while. I'm dying, I gotta try this 💀

  • @SayonBiz
    @SayonBiz 1 year ago +2

    This is a really helpful video!
    I was wondering how I can involve the hair in the animation sequence as well?
    Following the video, I've only been able to bake the animation for the face. But I was wondering how I do the same for my model's facial hair and actual hair too.

    • @NumenBrothers
      @NumenBrothers  1 year ago +1

      All the hair (groom) assets have to be attached to the Metahuman's face accordingly. The groom assets then use Chaos physics for natural movement, not anim sequences themselves, because groom assets don't have bones (see the sketch after this thread). Hope this makes sense.

    • @SayonBiz
      @SayonBiz 1 year ago +1

      @@NumenBrothers Alright. I'm still quite new to Unreal Engine. I managed to get an animation in with Live Link on my Metahuman model. My main motive was to export that out as an FBX model (along with textures) to be used in Blender for another project. Is there any way to do that?

    • @NumenBrothers
      @NumenBrothers  1 year ago +1

      tbh I use everything directly within Unreal Engine, so I haven't explored it myself, but I believe it is possible. This is where I would start: docs.unrealengine.com/4.26/en-US/AnimatingObjects/Sequencer/Workflow/ImportExport/
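
For readers who prefer to see the hair setup as code rather than in the character Blueprint, here is a minimal C++ sketch of the idea described above: groom components (hair, beard) are attached to the face mesh so they follow the head, while their motion comes from the groom's own physics rather than the baked anim sequence. The class name and component names are hypothetical, the groom/binding assets are assumed to be assigned in the editor, and this is not the exact setup from the video.

```cpp
// Minimal sketch, assuming the Groom (Hair Strands) plugin is enabled and a
// hypothetical ANaviCharacter class modeled loosely on the Metahuman Blueprint.
#include "CoreMinimal.h"
#include "GameFramework/Character.h"
#include "Components/SkeletalMeshComponent.h"
#include "GroomComponent.h"                 // HairStrandsCore module
#include "NaviCharacter.generated.h"

UCLASS()
class ANaviCharacter : public ACharacter
{
    GENERATED_BODY()

public:
    ANaviCharacter()
    {
        // Face mesh driven by the face animation blueprint / Live Link data.
        FaceMesh = CreateDefaultSubobject<USkeletalMeshComponent>(TEXT("Face"));
        FaceMesh->SetupAttachment(GetMesh());

        // Grooms are attached to the face so they move with the head. Their
        // secondary motion comes from the groom's physics settings, not from
        // keyframes, because grooms have no bones to bake into an anim sequence.
        HairGroom = CreateDefaultSubobject<UGroomComponent>(TEXT("Hair"));
        HairGroom->SetupAttachment(FaceMesh);

        BeardGroom = CreateDefaultSubobject<UGroomComponent>(TEXT("Beard"));
        BeardGroom->SetupAttachment(FaceMesh);

        // GroomAsset / BindingAsset are assigned on each component in the editor.
    }

    UPROPERTY(VisibleAnywhere) USkeletalMeshComponent* FaceMesh;
    UPROPERTY(VisibleAnywhere) UGroomComponent* HairGroom;
    UPROPERTY(VisibleAnywhere) UGroomComponent* BeardGroom;
};
```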

  • @poe1554
    @poe1554 1 year ago +1

    You're a badass. Clarity personified here.

  • @VirtualRealityStudio
    @VirtualRealityStudio 3 months ago +1

    Thank you for fast str8 to the point tuts!

  • @FPChris
    @FPChris 6 months ago

    4:41 when you bake does it keep the audio?

  • @RoryMcC42
    @RoryMcC42 1 year ago +1

    Nice, thanks for the info guys. Keep it coming :)

  • @street2stage235
    @street2stage235 2 months ago

    Is this available for Android?

  • @ignarunrealengine8163
    @ignarunrealengine8163 1 year ago +1

    You remove all the bone controllers from the sequence, which works if you only need to animate the face, but how do you manually animate the body later? Why do all the lessons do the same thing? Tell me please.

    • @NumenBrothers
      @NumenBrothers  1 year ago

      You would keep the controllers on the sequence to the extent that you want to manually animate the sequence, i.e., use keyframes to specify what each control is doing at each point in time. That's only necessary if you're manually animating vs. recording an animation. The body actually uses a different animation blueprint (at least for Metahuman), so you can record the animations separately (or use separate pre-recorded body animations).

    • @ignarunrealengine8163
      @ignarunrealengine8163 1 year ago

      @@NumenBrothers And how do you do it at the same time? How do you return the body controllers to the sequence? For example, I first want to record the animation of the face, and then separately do the animation of the whole body except the head. Is that possible?

    • @NumenBrothers
      @NumenBrothers  1 year ago

      @@ignarunrealengine8163 Yes, that's exactly what I did to make the 5-second intro, except I didn't record the body animation myself. That is from Mixamo. You can map the Metahuman onto the ThirdPersonCharacter, and any Mixamo animation onto the ThirdPersonCharacter. The process of mapping animations to the third person character is in the series on this channel. Mixamo to ThirdPersonCharacter is episode 25, and then Metahuman to ThirdPersonCharacter is episode 45. I hope they help get you started.

  • @TolisPiperas82
    @TolisPiperas82 1 year ago +1

    Is there an app for Android?

    • @NumenBrothers
      @NumenBrothers  1 year ago

      Unfortunately not, because it works off of Apple's ARKit. There are other apps for Android, but I don't believe they work as well (or integrate natively with Metahuman).

  • @3ATPAXEP
    @3ATPAXEP 10 months ago

    Bro, do you know how to make the head move more smoothly? When I'm capturing it's too shaky.

  • @noceanstudioGC
    @noceanstudioGC 1 year ago +1

    Helped a lot! Thx man

  • @serhiipes
    @serhiipes 6 months ago

    Is it possible to do this without Wi-Fi? I only have 5G internet on my iPhone, and I share it via cable with my computer.

    • @NumenBrothers
      @NumenBrothers  6 months ago

      For this, your phone needs to communicate with your computer in real time. Could you also plug your phone into a router? I've never done that, but I think it's possible.

    • @serhiipes
      @serhiipes 6 months ago +1

      @@NumenBrothers Thanks for the reply. The problem is that I don't have any router. I live in a rented room and only have my iPhone with cellular internet, and I use the iPhone's hotspot to share internet to the PC. Anyway, I have sorted out the issue. Fortunately I have an iPad Pro. I installed the Live Link Face app on my iPad and connected the iPad to my iPhone's hotspot as well as the PC. It works now. Thank you for the tutorial. It is very useful.

  • @wolfwirestudios
    @wolfwirestudios 7 months ago

    Nice tutorial mate, but it's not working for me. I tried PlayMontage by making a montage of my animation, and it's just not playing. Debug shows that it reaches the "PlayMontage" node but never actually plays the montage for some reason. Any solutions? I literally followed your tutorial step by step.

    • @NumenBrothers
      @NumenBrothers  7 months ago

      It's most likely an issue with your character. Is your character using the animation blueprint for the head/face?

    • @wolfwirestudios
      @wolfwirestudios 7 months ago

      Yes it is. Basically I copied all the Skeletal Meshes straight from the Metahuman BP to my NPC BP, hence every variable is the same. It's literally an animation and it should play, right? But nothing happens. Do I need to enable something in the montage? Or is there any other setting on the face mesh? @@NumenBrothers

    • @NumenBrothers
      @NumenBrothers  7 months ago +1

      I don't know enough to say. I would try to do it straight from a Metahuman, and if it works, then you know the problem is in the difference between your Metahuman and your NPC. Is your NPC's face skeleton rigged the same way (same bone/morph target structure, etc.)? @@wolfwirestudios

    • @wolfwirestudios
      @wolfwirestudios 7 months ago

      Yes, like I said, the face mesh was copied as-is from the respective Metahumans. They use the same base skeleton and the same AnimBP. I'll try to run the same code from the Metahumans to see what the difference is.
      @@NumenBrothers

    • @wolfwirestudios
      @wolfwirestudios 7 months ago

      Okay, so I sort of fixed it. For continuity, I was using the SetLeaderPoseComponent node and hooking all parts to the body as the leader. I detached the face mesh and now it's working. But one problem that's prevalent in both the Metahuman BP and my NPC BP is that the head movement is not working. I mean, the animation asset shows the head movement, but not while it's playing on the character as a montage. Any suggestions mate? @@NumenBrothers
      Edit - For the record, like I said, the NPC copy-pastes the face that's inside the Metahuman BP, so yeah, same skeleton, same mesh of course, and same AnimBP. I feel that the SetLeaderPose node was locking the face mesh from animating independently, but I don't know why the head rotation isn't working. It's not working in the Metahuman BP either.
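
For anyone hitting the same issue, here is a rough C++ sketch of the pattern this thread ends up with on the Blueprint side: body parts can follow the body mesh through SetLeaderPoseComponent, but the face mesh is left off the leader pose so its own anim instance can evaluate, and the recorded facial animation is then played on the face as a montage. The function, component, and montage names are hypothetical; this is a sketch of the general idea, not the exact Blueprints from the video.

```cpp
// Sketch only: hypothetical components on a Metahuman-style character.
#include "Components/SkeletalMeshComponent.h"
#include "Animation/AnimInstance.h"
#include "Animation/AnimMontage.h"

void SetupAndPlayFaceMontage(USkeletalMeshComponent* BodyMesh,
                             USkeletalMeshComponent* TorsoMesh,
                             USkeletalMeshComponent* FaceMesh,
                             UAnimMontage* FaceMontage)
{
    // Clothing/torso parts can simply follow the body's pose...
    TorsoMesh->SetLeaderPoseComponent(BodyMesh);

    // ...but the face must NOT be leader-posed to the body, otherwise its own
    // animation blueprint (and any montage played on it) never evaluates.
    FaceMesh->SetLeaderPoseComponent(nullptr);

    // Play the recorded Live Link facial animation as a montage on the face's
    // anim instance (the montage's slot must exist in the face AnimBP).
    if (UAnimInstance* FaceAnim = FaceMesh->GetAnimInstance())
    {
        FaceAnim->Montage_Play(FaceMontage, 1.0f /*play rate*/);
    }
}
```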

  • @loganou9746
    @loganou9746 1 year ago +1

    Thanks a lot !

  • @romanbruni
    @romanbruni 5 months ago

    Nice and clean... what about the spoken sound?

  • @theshizon
    @theshizon 1 year ago

    I feel like you missed something. You said to go to the other episode to check out the Metahuman tutorial. How did you get a totally custom Avatar Na'vi model to be rigged for Live Link facial animation, with an animated body attached? That's the part that confuses me.

    • @NumenBrothers
      @NumenBrothers  1 year ago

      Ricardo Chaves www.youtube.com/@rchavz

  • @aaronbircher9981
    @aaronbircher9981 1 year ago

    Great video, thanks! I'm trying to figure out how to combine the body and face animation. What video of yours should I watch? It's really just about making the montage, not how to animate the Metahuman though...

    • @NumenBrothers
      @NumenBrothers  1 year ago

      Check out the fireball episode, episode 30 in the series. It's long, but you could fast forward to the parts on integrating your montage with the character.

  • @godofvideoss
    @godofvideoss 10 months ago

    Nice! When using the live link, is speech audio also recorded?

    • @NumenBrothers
      @NumenBrothers  10 months ago

      nope, has to be done separately and spliced after the fact

  • @rifat.ahammed
    @rifat.ahammed 4 months ago

    Thanks

  • @alan112223
    @alan112223 1 year ago

    Thanks! Subscribed

  • @ElliotPollaro
    @ElliotPollaro 1 year ago

    Does this record audio as well? Or do you have to dub over your voice?

    • @NumenBrothers
      @NumenBrothers  1 year ago

      No, it does not record audio. I recorded it separately and simultaneously. But I recorded it several times, and just matched up the best two.

  • @joeanrachelmiller6529
    @joeanrachelmiller6529 1 year ago

    Is it only an Apple app, or is it on Android?

    • @NumenBrothers
      @NumenBrothers  1 year ago

      Apple only, unfortunately. There are other means of animating the face; episode 47 is an example. We're doing another one in a few episodes.

  • @redwolf831
    @redwolf831 1 year ago

    I followed your steps and it’s not working

    • @NumenBrothers
      @NumenBrothers  1 year ago

      Couple things to check: First, are you able to see your device on the Metahuman? If not, then you know the problem is the connection between the phone and Unreal Engine. The first thing then is to make sure your phone's wifi is on, and that it's connected to the same network as your computer.

    • @redwolf831
      @redwolf831 1 year ago

      @@NumenBrothers I can see my device, so it must be something else.

    • @NumenBrothers
      @NumenBrothers  1 year ago

      @@redwolf831 Try going into the animation blueprint of the face itself (linked from the Metahuman character) and set that same variable, the Live Link face variable, to 'true', but do it directly on the animation blueprint and save the default on the blueprint. Then delete the character from the world and re-add them to the world.

    • @redwolf831
      @redwolf831 1 year ago

      @@NumenBrothers I can see it moving around in the face blueprint, but not on the MetaHuman in my map.

    • @NumenBrothers
      @NumenBrothers  1 year ago

      @@redwolf831 make sure the checkbox is checked in both places. Try removing the character and placing again in the map.

  • @benblaumentalism6245
    @benblaumentalism6245 1 year ago

    Metanumen. 😁

  • @suneilangel
    @suneilangel 1 year ago +1

    Dude, you are an absolute lifesaver.

  • @FPChris
    @FPChris 6 months ago

    Honestly I find it quite average. I was playing with it and it always looks way artificial. I guess it’s better than nothing.