Industry-Leading Face Capture for your Stylized 3D Characters - FREE & Easy with MetaHuman Animator

  • Published Dec 31, 2024

Comments •

  • @tiedtkeio
    @tiedtkeio  3 months ago +5

    As always, let me know if you have any questions about the video. 🙂

    • @Miditator
      @Miditator 1 month ago

      Hello, thank you for the tutorial, but I have one question: I already have blendshapes on my stylized character and imported them into CC4. So I have my character in CC4 with my blendshapes, but CC4 won't create a face rig for him. :(
      Help me please

  • @axiomvicarious
    @axiomvicarious 3 months ago +2

    Bro, this gives me both faith and confidence.

    • @tiedtkeio
      @tiedtkeio  2 months ago

      Amazing! Hope you create something cool!!

  • @ompansare4516
    @ompansare4516 3 months ago +2

    woooow just what i needed wtf ... thank you man ❤❤❤❤

  • @cukiris_
    @cukiris_ 1 month ago +1

    One more time, thanks so much for the great tips

    • @tiedtkeio
      @tiedtkeio  1 month ago

      Happy to help! Thanks for watching! :)

  • @ompansare4516
    @ompansare4516 3 months ago +3

    bro no shit, started learning ue5 recently and ur videos were a big help
    ;p-;

    • @tiedtkeio
      @tiedtkeio  3 months ago +1

      That's great! Thanks for being here and happy to help! 🥰

  • @josesanjuan1513
    @josesanjuan1513 3 days ago +1

    GREAT! Does the Amber character have CC4 Extended or Standard expressions? Can this lead to better performance with the MetaHuman control rig? Thank you!

    • @tiedtkeio
      @tiedtkeio  2 days ago +1

      Amber has the Extended expressions. I highly recommend tweaking these CC4 blendshapes to visually match a MetaHuman's blendshapes, to get better results from this workflow. Thanks for watching! :)

  • @albertusbodenstein1976
    @albertusbodenstein1976 2 months ago +1

    Thanks so much.. Never knew you could do this.. I'd like to add a Mocap capture workflow to this.. Do you know how to do this?

    • @tiedtkeio
      @tiedtkeio  2 months ago

      Sure, I have another video on my channel which goes over how to add animation to your character - that could be a start. Then I would say it depends on what type of mocap solution you have; if you stream it to Unreal or if you import FBX animation sequences etc. But overall it's just a matter of combining two sequences in UE5.

  • @itestthings5337
    @itestthings5337 3 months ago +1

    Great stuff! been looking for precisely this for a while. I have a question for you, in your experience is MH animator worth all the extra steps compared to, for example, using live face/accuface straight in iclone? (my experience with accuface is bad, but live face in iclone has decent results). It seems like MH animator really shines only with metahumans because of the topology and rig, but using it with a CC character seems a bit underwhelming, you don't get those amazing deformations. I know from a previous video that you are working with accuface on a film, so you are uniquely positioned to share good insight.
    Amazing channel, please continue to share!

    • @tiedtkeio
      @tiedtkeio  3 months ago +3

      Thanks for your comprehensive comment! Yes and no is the short answer haha.
      Accuface is great and really the only alternative when you have a pre-recorded video of an actor - maybe they're not in the same part of the world as you, etc. MetaHuman Animator really takes the facial capture to the next level because it uses a LiDAR depth sensor, achieving much higher fidelity. The CC control rig is in its early stages and a version 1 right now; it will be updated in the future to achieve a much closer resemblance to the MetaHuman performances.
      Both Accuface and Metahuman workflows will continue to evolve with new features, so what it really comes down to is where your end project will end up. If it's Unreal Engine, then MHA is worth the hassle. If it's Blender, iClone, Unity, Maya etc, then Accuface is what I recommend.

  • @CamiloDuarteFranco
    @CamiloDuarteFranco 1 month ago +1

    hey thanks for the tutorial, I'm getting that effect where my capture and identity look cool in engine, but when I pass the curves to my character it looks very off. Does it need any retargeting? thanks

    • @tiedtkeio
      @tiedtkeio  27 days ago

      Not retargeting, but tweaks to the blendshapes. I would suggest you go over your custom character's blendshapes and compare them to a MetaHuman! The easiest way to do that is to bring a MetaHuman into the UE5 level, drag in one controller on the control board and see how the mesh blendshape looks. If you can make that blendshape more similar on your CC4 character, then the animation will look nicer. It's a matter of tweaking your morphs/blendshapes in CC4 to better match a MetaHuman - not to make it look like a human, but to match how the MetaHuman mesh deforms with the blendshape. Hope this helps!!

    • @CamiloDuarteFranco
      @CamiloDuarteFranco 27 days ago

      @@tiedtkeio Legend thanks for the reply

  • @HussinKhan
    @HussinKhan 3 months ago +1

    Nicely explained, thank you!

    • @tiedtkeio
      @tiedtkeio  3 months ago

      Great to hear! Thanks for watching! 🤗

  • @poet8236
    @poet8236 2 months ago +1

    Thanks a lot, this is truly helpful as getting lost in these processes is so damn easy. :) I got my facial animation now on my sequence and it works. The big but is that the quality is still quite mediocre. On the Metahuman performance the matching is actually very astonishing but in the process to get it onto the character (bought Amber from La Famila too to full match the tutorial :) ) most detail is lost. I suppose this all has to do with the blendshapes and the CC control rig being an early version. Still already quite powerful!
    The only thing I am struggling with is the CC_Rig_BP. It is being removed whenever I restart the Unreal Engine although I save the level and "Save All". Which causes my "LS_Amber" to always have to be set up again. Outliner shows the error "The world contains invalid actor files" and the log says "LogWorldPartition: Warning: Unknown actor base class `/Game/_Characters/Amber_001/Rigs/CC_Rig_BP.CC_Rig_BP_C`: Actor: 'CC_Rig_BP_C_UAID_C87F5400CDAF7A2802_1089411207' (guid 'B5DD4D1F49FB59339413B49B6978E51B') from package '/Game/__ExternalActors__/FirstPerson/Maps/FirstPersonMap/A/FS/3VHWPU13SMVILXSI0DHHN5'". I have done nothing but export from CC as you did, import into UE5 and drag the CC_Rig_BP to the scene. I tried it now with 3 fresh projects in 5.4.4, going step by step as you did. EDIT: Found my mistake, you HAVE to rename "CC_Rig_BP" to something else upon creating the control rig, it's not a nice-to-have but a must.
    Anyways, what I really wanted to say: THANK YOU! :)

    • @tiedtkeio
      @tiedtkeio  2 months ago

      Thank you so much for watching and for your comprehensive comment! Yes, I should make a follow-up video to this one to show the difference you can get by adjusting the blendshapes. My quick tip is: the morphs for the CC4 character don't quite match the MetaHuman's morphs. I'd recommend going through all your blendshapes in CC4 and matching them more closely to the MH equivalent shapes. I also know Reallusion is working on better support for this specific workflow, I've talked with them. 🙂

  • @nu-beings
    @nu-beings 3 months ago +1

    Ok, very nice, but how would you add an updated animation for her face once you're done with this one? Do you have to start all the way back over?

    • @tiedtkeio
      @tiedtkeio  3 months ago

      Sorry for the late reply! You simply remove the control rig from Sequencer, change the animation in Sequencer (import the new one into your Unreal project first, as shown in this video), and then add the control rig back. It's difficult to describe with words, but very simple in reality.

  • @sergiopaz3263
    @sergiopaz3263 2 months ago +1

    "Hi @tiedtkeio, does this mean I can apply this to any character I want? For instance, if I download an Orc from the Marketplace without blend shapes, could I still add facial expressions to it? I recently learned about Metapipe, but it seems quite complicated... 😔 I have a small story in mind that features an Orc, and I'm unsure how to proceed. Does this only work with Metahumans?"

    • @tiedtkeio
      @tiedtkeio  2 months ago +1

      Yes it does. If you download an Orc from the Marketplace (whether that's Unreal or Reallusion marketplace) you can still use it with this workflow. However let's assume the worst and that the Orc isn't rigged and doesn't have blendshapes; then you'll have to first rig the Orc using either Blender, Maya, Mixamo or my personal suggestion AccuRig and CC4. Then you'll have to manually create the blendshapes for the face yourself. This is a somewhat tedious task, but not hard at all. I have another tutorial on my channel which shows how to create blendshapes in Blender for your CC4 character. In summary, the steps/tutorials you'll need in order to get the Orc to work as shown in this video are:
      1. Rigging a character (you choose software, I suggest AccuRig/CC4).
      2. Creating blendshapes for a character (Also suggest CC4 here).

    • @sergiopaz3263
      @sergiopaz3263 2 months ago

      @@tiedtkeio Thank you for your response! I'm excited to move forward with this and have been learning so much from your content. I truly appreciate your help!

  • @archichampin
    @archichampin 3 months ago +1

    Really nice video mate, thanks. I started watching it just out of curiosity, but the background music trapped me in and I kept watching it til the end. I think it's a really nice workaround, but I can't really understand why someone would do all of this, plus fixing all the issues that seem to happen, instead of setting the character up properly from the beginning from CC to Unreal so they can use the real-time facial performance. I'm working on a short at the moment and I'm looking for ways to speed up character animation in Unreal without having to buy expensive tools, and I can't really figure out how to benefit from this workflow - can you tell me what I'm missing here? Cheers pal

    • @tiedtkeio
      @tiedtkeio  3 months ago

      Thank you for your kind words and for watching the video! The main takeaway here, which I probably glanced over, is the fact that MHA uses 3D LiDAR depth information with machine learning to achieve a much higher fidelity facial capture than regular 2D video capture using tracking and AI can (like Live Link Face or iClone's AccuFace tools).

  • @shivangipriya4153
    @shivangipriya4153 3 months ago

    Thank you. What about the body? If we have animation from Mixamo, how do we mix this?

    • @ardagenc4674
      @ardagenc4674 3 months ago

      For using it in an anim blueprint you can benefit from the Layered Blend Per Bone node:
      use the neck bone as the transition point and connect the body animation to pose 1 and the facial animation to pose 2.
      To create a combined animation sequence, I'm guessing Sequencer can work - dragging the skeleton into Sequencer, enabling its control rig, making it layered and adding both animations - but I'm not really sure whether this works smoothly.
      I would normally use the facial animation on a layered control rig and hand-animate the body.
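      The Layered Blend Per Bone idea in the reply above can be sketched conceptually in plain Python. This is only an illustration of what the node computes, not Unreal API code; the bone names and hierarchy below are hypothetical.

```python
# Conceptual sketch of Unreal's "Layered Blend per Bone" node:
# bones in the subtree rooted at the split bone (neck -> head -> jaw)
# take the facial-capture pose, everything else keeps the body pose.
# Bone names and parenting are hypothetical.

PARENT = {
    "pelvis": None, "spine": "pelvis", "arm_l": "spine",
    "neck": "spine", "head": "neck", "jaw": "head",
}

def in_subtree(bone, split):
    # True if `bone` is the split bone or one of its descendants
    # (walk up the hierarchy towards the root).
    while bone is not None:
        if bone == split:
            return True
        bone = PARENT[bone]
    return False

def layered_blend_per_bone(body_pose, face_pose, split_bone="neck"):
    # body_pose / face_pose map bone name -> animation value.
    return {
        bone: face_pose[bone] if in_subtree(bone, split_bone) else body_pose[bone]
        for bone in body_pose
    }

# Toy poses: 0.0 marks the body animation, 1.0 the facial animation.
body = {bone: 0.0 for bone in PARENT}
face = {bone: 1.0 for bone in PARENT}
blended = layered_blend_per_bone(body, face)
# blended["head"] comes from the facial capture; blended["arm_l"] from the body.
```

      In the actual node you would set the split via the node's branch filter on the neck bone; the sketch just shows the per-bone selection that results.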

    • @shivangipriya4153
      @shivangipriya4153 3 months ago

      Ok thank you so much

  • @danodesigndanomotion2068
    @danodesigndanomotion2068 3 months ago +1

    thanks for this video bro

    • @tiedtkeio
      @tiedtkeio  3 months ago

      That's fantastic! Thanks for watching it! 😄🙏

  • @rakeshmani8787
    @rakeshmani8787 3 months ago +1

    can you make a video on how to make cinematic dark clouds in Unreal Engine?

    • @tiedtkeio
      @tiedtkeio  3 months ago

      Oh, that's a great idea, will do! 🙂🙏

  • @kashifhussain7388
    @kashifhussain7388 1 month ago

    thanks man

  • @Ariymore
    @Ariymore 2 months ago

    So cool man, can we use a CC character with Live Link?

  • @Faneva_Jorah
    @Faneva_Jorah 1 month ago

    If anybody has a solution, tell me. I have a fully rigged character from Blender using the AccuRig-to-CC pipeline, and Faceit for the face. It's already animated and I want to import it into UE5, but it failed with a bunch of errors that I don't even understand. I started learning UE5 recently; I like the control that I have in UE - rendering in Blender takes way too long and I can't freely preview my work.

  • @michaellemon9183
    @michaellemon9183 1 month ago

    Found out the hard way today that Animator now only works with iPhone 12 or newer. Still using an 11 with no plans to upgrade... until now.

  • @amigoface
    @amigoface 3 months ago +1

    does it work with an Android phone camera?

    • @tiedtkeio
      @tiedtkeio  3 months ago +1

      Unfortunately not! It uses the iPhone's LiDAR sensor with 3D depth. The alternative to the iPhone is a stereo HMC, i.e. a dual-camera setup on a head camera, to achieve 3D depth, but that is more advanced than the iPhone.

  • @t_claassen
    @t_claassen 3 months ago +2

    Sub +1. Is it somehow possible to send this facial animation back to iclone?

    • @tiedtkeio
      @tiedtkeio  3 months ago +4

      That's great, thanks! 🤗 It is! You can simply right-click the body and face components, select Bake Animation, then right-click the animation in the Content Browser and export as .fbx. I could do a video on this too later on if you'd like. 🙏

    • @t_claassen
      @t_claassen 3 months ago

      @@tiedtkeio Thanks for that íf you do and thanks for the swift reply. 😊 For some of my projects, this iclone > ue5 > iclone > blender (live-link) is my go-to and would save me a lót of time. Although UE5 is not exactly my cup of tea to be honest atm. And while Acculips, in iClone8, dóes get the job done eventually it's very time-consuming. That's why I'm watching these series. Thanks again. *#sweet* 😉

    • @t_claassen
      @t_claassen 3 months ago

      @@tiedtkeio 🙏

    • @davidvideostuff
      @davidvideostuff 3 months ago +1

      @@tiedtkeio Yes !! Do please make a video on the workflow with the animation back to Iclone !!!

  • @leizervieira1166
    @leizervieira1166 3 months ago +1

    ❤❤❤

    • @tiedtkeio
      @tiedtkeio  3 months ago

      Hope it helped! Thanks for watching! 🫶

  • @ArthurBaum
    @ArthurBaum 3 months ago +1

    That's cool and all. But don't we already have like hundreds of MHA tutorials? What is this about?

    • @tiedtkeio
      @tiedtkeio  3 months ago +2

      This is mostly for those looking to record facial performance capture for their stylized or anime character from CC4 or Blender, with something other than a MetaHuman as the end target (see the end of the video). ☺️🙏