How to use Faceware with MetaHumans in Maya & Unreal Engine 5

  • Published Oct 14, 2024

Comments • 61

  • @pixelpriority
    @pixelpriority 3 years ago +2

    Very well spoken, perfectly balanced between thorough tech and conceptual reasons behind the choices. Thanks for sharing!!

    • @echoperformances
      @echoperformances  3 years ago

      Thanks for the feedback! I really appreciate it.

  • @MiloDelMalwgb
    @MiloDelMalwgb 1 year ago +1

    Great video. I was having a really hard time with facial animations.
    I am using Faceware Studio, but it seems to be geared more towards real time, and there is no way to export those facial animations to Maya (that I know of), so now I am considering using Analyzer and Retargeter.
    The only comment I have is on body animation exports: you only need to select DHI_Body:root, export selected, and keep the same settings. The root_drv is a rig that connects the head and body rigs and keeps them in one place.
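The body-export tip above can be sketched in Maya Python. This is a minimal sketch, not a script from the video: the DHI_Body:root node name comes from the comment, while the FBX flag values and the fbxmaya plug-in load are common Maya conventions I'm assuming.

```python
# Sketch of "select DHI_Body:root, then Export Selected" from the comment above.

def body_export_kwargs():
    """Keyword arguments for cmds.file() to export the current selection as FBX."""
    return dict(
        force=True,               # overwrite the file if it already exists
        typ="FBX export",         # requires the fbxmaya plug-in
        preserveReferences=True,
        exportSelected=True,      # export only the selected hierarchy
    )

def export_body(fbx_path, root="DHI_Body:root"):
    # maya.cmds is imported here so body_export_kwargs() stays usable outside Maya.
    from maya import cmds
    cmds.loadPlugin("fbxmaya", quiet=True)
    cmds.select(root, replace=True)
    cmds.file(fbx_path, **body_export_kwargs())
```

Run from the Script Editor with an output path of your choosing, this mirrors the manual Export Selected step the commenter describes.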

  • @joelberg8576
    @joelberg8576 3 years ago

    Yes, very well done, every step, thank you! Almost every other video misses something here and there.

    • @joelberg8576
      @joelberg8576 3 years ago

      Oh. But then I get stuck on the character setup file (which I found in the comments). Can you go over how to create the character setup file? Thanks!

    • @echoperformances
      @echoperformances  3 years ago

      Thanks for the feedback! I’m glad you found the tutorial helpful. And thanks for the suggestion to make a tutorial specific to the Character Setup. I’ll try to find some time to record one.

  • @davidmonteiro6822
    @davidmonteiro6822 1 year ago +1

    Fantastic stuff! Does the Maya Retargeter plugin also work if I don't have a joint-driven setup but only a character with 52 ARKit blendshapes, for example?

    • @echoperformances
      @echoperformances  1 year ago +1

      Hi David! I’ve always used a control rig that drives my blendshapes and/or joints, but in theory, Retargeter’s character setup can drive any animatable parameters (transforms or otherwise). In the character setup window, you can try selecting the mesh that has the blendshapes and hitting the “Update” button to display its blendshape parameters.

  • @TheWolverinepower
    @TheWolverinepower 3 years ago +1

    Very informative tutorial. Thanks!

  • @kaitlynstaveley
    @kaitlynstaveley 3 years ago

    Really helpful tutorial video, thank you for making this!

    • @echoperformances
      @echoperformances  3 years ago

      That’s great to hear! Your feedback is really appreciated.

  • @POKATILO_3D
    @POKATILO_3D 2 years ago

    Amazing, thank you for the tutorial. Maybe I didn't understand something, but the important part is setting up a MetaHuman in Maya to transfer information from Analyzer. Could you tell me more about this setup? In the video, you just select a file with this setting.

    • @echoperformances
      @echoperformances  2 years ago +1

      Hi POKATILO, apologies for the late reply. The step you're describing is the Character Setup. This is the process in which we register our facial rig's controllers and assign them to their corresponding face groups. I've been working on a course that goes into a lot more detail about the entire Faceware workflow, including the Character Setup. I expect it should go live in a few weeks, but in the meantime, you can check out this video that should help you get started: th-cam.com/video/_VrxWvIb0Rc/w-d-xo.html

    • @POKATILO_3D
      @POKATILO_3D 2 years ago

      @@echoperformances Thank you! Now I have studied Faceware more, and I can say that I asked the question incorrectly. I am more interested in how you set up the MetaHuman poses in Retargeter to better match the video reference. Do I understand correctly that you moved the MetaHuman's controls manually in Maya, or did you use ready-made MetaHuman poses from Unreal?

  • @konnerbonsen8893
    @konnerbonsen8893 3 years ago +1

    So cool and detailed! Thank you a lot! I would love to work on that too, but I am working on a Mac. Do you think there is a chance to do so? It seems that Faceware does not exist for Mac...

    • @echoperformances
      @echoperformances  3 years ago +1

      To be honest, I haven't met any Faceware users working on a Mac. Perhaps their support team could recommend an alternate solution to get their products running on one. You could enquire by contacting them through this page:
      facewaretech.com/contact-us/
      Thank you for your comment! I hope that helps.

    • @konnerbonsen8893
      @konnerbonsen8893 3 years ago +1

      @@echoperformances Thanks a lot for your recommendation! It gives me the next steps! :)

  • @mohannadsalim9541
    @mohannadsalim9541 2 years ago

    Your face animations are lovely! More and more I'm finding that Faceware is the king of facial mocap. For your body animation, wouldn't it have been easier to use iClone and get an ActorCore animation?

    • @echoperformances
      @echoperformances  2 years ago +2

      Those are great suggestions. Facial animation is really my expertise, but I’ve started exploring different options for body animations. I’m hoping to incorporate some of them in my next projects rather than keyframing.

  • @pointandshootvideo
    @pointandshootvideo 3 years ago +1

    AWESOME tutorial! Well done! Thank you! I thought Tough Guy's performance was the most believable, and I'm trying to figure out why. Does it have anything to do with camera height/angle and distance from the actor? I'm wondering if Nervous Guy's performance would look more believable if Jon did it. I'm also wondering what it would look like if you swapped MetaHumans. If you had to redo all of this, what would you have done differently or told the actors? Thanks!

    • @pointandshootvideo
      @pointandshootvideo 3 years ago +1

      In Over 40, I think putting a microphone in front of her face and giving her headphones like in the video would have made the performance more believable. In your video it looks like she's in a live interview setting and her eyes don't seem to be looking at the interviewer. As a result, the eye movements just seem random. What do you think?

    • @echoperformances
      @echoperformances  3 years ago +1

      @@pointandshootvideo Thanks for the feedback! I really appreciate your notes. It's funny how either character seems to appeal more or work better for different people. I can't put my finger on it either. Maybe it's a matter of relatability or familiarity of the model?
      I'm currently working on another short film with the same 2 actors. I'll probably give them new models to better fit the story. The biggest thing I learned from working on "Old Scores" is that animating the skeleton directly in FK was a bad idea. A proper character rig will go a long way. There are a few autorig scripts built for MetaHumans available for pretty cheap. I'll also try to get actual body animators to lend a hand this time around.
      As for the mic idea and the wandering eyes in "Turning 40", I completely agree. Trying to replicate the performer's original environment will definitely help to sell the final result.

  • @Sanhanat_Pop
    @Sanhanat_Pop 1 year ago +1

    Hi, your video is amazing, but I have a problem with Faceware Retargeter. When I open a performance and select the .fwr file and the .xml, I don't get any poses like you do. Can you help me with this? Thank you

    • @echoperformances
      @echoperformances  1 year ago +1

      Hello, thanks for your message. The first time you introduce Retargeter to a new character rig, you can either create poses using the Character Setup’s Expression Set (50 default/generic poses), or you can create custom poses specific to your actor’s performance. You can start with 5-10 poses per face group, retarget to fill your timeline, evaluate your animation, then add corrective poses where needed, and repeat the process. Don’t forget to hit the “Update” button to register your controller values in Retargeter before hitting the “Retarget” button.
      For your initial poses, I recommend adding poses on the same frames where you created Training Frames in Analyzer. It’s not required, but that’s generally where you’ll find your most extreme or most defined expressions.
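The pose-budget advice above can be illustrated with a small, Maya-independent Python helper that spreads an initial set of Retargeter poses evenly across the frames where Analyzer Training Frames were created. The function name and the even-spacing strategy are my own illustration, not part of Retargeter's API.

```python
def initial_pose_frames(training_frames, max_poses=10):
    """Pick up to max_poses frames, spread evenly across the training frames.

    training_frames: frame numbers where Training Frames were set in Analyzer.
    These tend to hold the most extreme expressions, so they make good
    starting points for Retargeter poses.
    """
    frames = sorted(set(training_frames))
    if len(frames) <= max_poses:
        return frames
    # Keep the first and last frames, sampling the rest at an even stride.
    step = (len(frames) - 1) / (max_poses - 1)
    return [frames[round(i * step)] for i in range(max_poses)]
```

You would then create a pose on each returned frame, hit "Update", retarget, evaluate, and add corrective poses where needed, following the iteration loop described in the reply.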

    • @Sanhanat_Pop
      @Sanhanat_Pop 1 year ago +1

      @@echoperformances Thank you so much

    • @echoperformances
      @echoperformances  1 year ago

      My pleasure! Good luck and have fun!

  • @facemotion3d971
    @facemotion3d971 3 years ago +1

    Awesome, thanks for sharing. Is it possible to bake a first pass, then use Maya's animation layers to layer in extra tweaks on top of the baked animation, then merge the layers before exporting the control rig as FBX to UE?

    • @echoperformances
      @echoperformances  3 years ago +1

      Precisely. In fact, with Retargeter you can continue to add in-between poses to improve your transitions and iterate as many times as you want to fill in the blanks.
      After you feel your animation is satisfactory, you can edit the curves directly to iron out some jitters or fine tune some subtle details. You could also add an animation layer on top for your changes, just keep in mind that Retargeter only evaluates keys on the base layer.
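To make the layer advice concrete, here is a hedged Maya Python sketch of creating a tweak layer on top of the Retargeter keys. The layer name and the example control attributes are placeholders of my own, not names from the video.

```python
# Sketch of the layered polish workflow: Retargeter keys stay on the base
# layer, and manual tweaks go on a separate animation layer on top.

def polish_layer_name(character="MetaHuman"):
    """Name for the tweak layer; kept in one place so export scripts can find it."""
    return f"{character}_facePolish"

def add_polish_layer(ctrl_attrs, character="MetaHuman"):
    """Create an animation layer holding the given animated attributes.

    ctrl_attrs: e.g. ["CTRL_L_brow_raiseIn.translateY"] (placeholder names).
    """
    from maya import cmds  # imported here so polish_layer_name() works anywhere
    layer = cmds.animLayer(polish_layer_name(character))
    # Add the face controls' attributes to the new layer; Retargeter keeps
    # reading and writing keys on the base layer only.
    cmds.animLayer(layer, edit=True, attribute=ctrl_attrs)
    return layer
```

Before exporting the FBX to Unreal, merge the layers down in Maya's Layer Editor so a single baked curve per control goes out; as the reply notes, keep the Retargeter poses on the base layer.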

    • @facemotion3d971
      @facemotion3d971 3 years ago

      @@echoperformances That's awesome. I am working on a pipeline/solution that allows me to get a base first-pass animation from our Facemotion3d iOS app onto the MetaHuman rig inside Maya. After the first pass, I want to apply more detail to visemes and other subtle nuances, plus tongue animation, via layers, then merge everything before exporting. Your Faceware animation is one of the best I have seen. :) Would love to show you my progress and also integrate the Faceware pipeline into my workflow. Are you subscribed to the pro version of Analyzer and Retargeter? I would really not want to go without the storable Training Frames function, even if I have to pay close to four times the amount compared to the base version!

    • @echoperformances
      @echoperformances  3 years ago

      @@facemotion3d971 Thanks for the praise! What you describe in your first lines is "spot on". Regardless of the solver (Faceware, ARKit, ...), that approach will give you a solid foundation, but if you're looking for high quality facial animations, any automated method won't get you 100% of the way there. A polish pass is always recommended, which includes the tongue if it's not part of your solve.
      To answer your question, I am using the pro-version which offers batch support for Analyzer and Retargeter. This includes the ability to export Tracking Models from Analyzer and Shared Poses for Retargeter. If you're looking to experiment with these features, I would recommend you look into Faceware's PLE license (Personal Learning Edition).

    • @facemotion3d971
      @facemotion3d971 3 years ago

      @@echoperformances The Faceware Studio PLE, which I already tested, is not the same as Analyzer and Retargeter, is it? I didn't like the quality of Studio. I think there is a trial version of Analyzer and Retargeter, so I'll try them. :)

    • @echoperformances
      @echoperformances  3 years ago +1

      @@facemotion3d971 You're correct. Faceware Studio is their real-time solution, whereas Analyzer and Retargeter offer a lot more customization. The batch feature I mentioned in my previous reply is mostly beneficial for large volumes or for repeat use of the same actor/character. If your project justifies it, you might want to weigh the cost of the higher license with batch support against the cost of an animator setting up each job "from scratch".

  • @arvindpalep6372
    @arvindpalep6372 3 years ago

    Great tutorial. Curious if MetaHumans export to 3ds Max yet? Last time I checked, it was still not working in Bridge. Or, if that option isn’t working yet, is it possible to export a MetaHuman rig from Maya to 3ds Max via FBX with facial animation?

    • @echoperformances
      @echoperformances  3 years ago

      Hi Arvind. That’s an interesting proposition. I haven’t thought of experimenting since I work primarily in Maya. If you do get around to testing it, let me know.

  • @virtualfilmer
    @virtualfilmer 2 years ago

    This is a great overview - I’ve been working on my own, with my own tips and tricks, but it takes so long and I keep scrapping it and starting again 😂 There are so many bugs in UE and Faceware and Maya, it’s a frustrating and CRASH-PRONE process for me. :)

    • @echoperformances
      @echoperformances  2 years ago +1

      Hi Toby! Thanks for the comment. It can definitely be frustrating to encounter bugs and crashes along the way. My best advice would be to save your scenes often and regularly hit the "Update" button in Retargeter to store your changes independently of your scene.

  • @B.BAKDASH
    @B.BAKDASH 2 years ago +1

    Can you make another tutorial on using Faceware and MetaHumans in Maya, and on how to animate the body using Mixamo?

    • @echoperformances
      @echoperformances  2 years ago

      Hi Belal, thanks for stopping by. I work primarily with creating facial animations, including the pipelines and tools that support it, so I would need to explore Mixamo before I could create a tutorial for it. Have you found any particularly interesting videos that you would recommend?

    • @B.BAKDASH
      @B.BAKDASH 2 years ago +1

      @@echoperformances I would be happy if you made a detailed video on using Faceware, because I'm having some problems with it. As for animating the body, you can watch these videos; some of them are related to the topic:
      th-cam.com/play/PLlJ0LuZkvf-XueBlSDGPhpMmX7pV3SczH.html

    • @echoperformances
      @echoperformances  2 years ago

      @@B.BAKDASH Thanks for the playlist! I'm currently working on a more detailed and more "official" Faceware course that should be available in March. Keep an eye on my Twitter page for more details as the date approaches.

  • @arvindpalep6372
    @arvindpalep6372 3 years ago

    Not sure if you ever use iClone. But the only issue I have with MetaHumans is that it only allows you to render (or publish) in Unreal, according to the Unreal disclaimer. I know a lot of people, including myself, would be curious about a tutorial using Retargeter with an iClone- or Daz-generated human.

    • @echoperformances
      @echoperformances  3 years ago

      I’ve never tried iClone or Daz, but I could add those to my list of future projects. Thanks for the suggestion!
      As for publishing your work with MetaHumans outside of Unreal, I believe Epic offers the option to license MetaHumans for use outside of Unreal. As long as it’s used within Unreal, it comes included for free.

    • @arvindpalep6372
      @arvindpalep6372 3 years ago +1

      Ah, good to know! Thanks for the reply!

  • @davidmonteiro6822
    @davidmonteiro6822 1 year ago

    Also, what's the price tag for these tools from Faceware? As far as I understand, I only need Analyzer + Retargeter, correct?

    • @echoperformances
      @echoperformances  1 year ago

      To get prices for Analyzer and Retargeter, I recommend you browse the options here and contact a Sales Rep to obtain a quote:
      facewaretech.com/pricing/

    • @davidmonteiro6822
      @davidmonteiro6822 1 year ago

      @@echoperformances I will do that and see if it's affordable as a freelancer. Just wanted to make sure that Analyzer + Retargeter are all I need to follow this exact workflow!

    • @echoperformances
      @echoperformances  1 year ago +1

      Correct! This tutorial covers Analyzer and Retargeter. Faceware has recently introduced Portal, their neural-net, cloud-based tracker. It removes the need to manually track your footage; however, it’s still relevant to understand how Analyzer works.
      Other than that, you’ll need a license of Autodesk Maya. You can check out their Indie pricing here:
      makeanything.autodesk.com/maya-indie

  • @barulicksama3838
    @barulicksama3838 2 years ago

    Do a tutorial on using Faceware Live Link with UE5.

  • @matthewisikhuemen8907
    @matthewisikhuemen8907 3 years ago +1

    Wow, this is a very amazing tutorial, thanks for this. I want to repeat the same process on a MetaHuman, but I don't seem to understand the character setup. Could you share the .xml file you created for the MetaHuman, so I could import it, test with it, and better understand how it works, since all MetaHumans use the same rig? I would really appreciate that. Hope to hear from you shortly.

    • @echoperformances
      @echoperformances  3 years ago

      Thanks for the comment! Feel free to share this video and the channel with others.
      Perhaps I’ll create a more in-depth tutorial specific to creating a Character Setup file. For now, let me know if this one works for you:
      bit.ly/MH_FWR_CharSetup_01

    • @matthewisikhuemen8907
      @matthewisikhuemen8907 3 years ago

      @@echoperformances Wow, thanks so much, bro, I really appreciate this. I will try it out. If you don't mind, how may I reach out to you? I would love to connect and discuss some things with you. Thanks, and God bless, bro.

    • @echoperformances
      @echoperformances  3 years ago +1

      @@matthewisikhuemen8907 In the "About" section of our channel, you can find an e-mail address to contact us with your questions.

    • @matthewisikhuemen8907
      @matthewisikhuemen8907 3 years ago +1

      @@echoperformances Oh, thanks so much, I will send an email. Thanks once again, I really appreciate it.

  • @virtualfilmer
    @virtualfilmer 2 years ago

    Btw, have you noticed that Maya 2022 screws up the color of MetaHuman skin when you bring them in? No biggie, 'cause they aren't used, but it's weird. :) Some other things I've found: Faceware Retargeter crashes a lot in Maya 2022 with MetaHumans, so I've gone back to Maya 2020. And I'm not using the very latest FW Retargeter, as that also crashes lots. Faceware Retargeter conflicts with Facegood; they can't both be installed. Facegood does a better job than Faceware but is super buggy, primarily in Chinese, and poorly translated. (It's also very difficult to install because Windows Defender and Norton hate it, and it possibly contains viruses/malware, who knows.)

    • @echoperformances
      @echoperformances  2 years ago +1

      Very useful insight. Thanks! To be honest, the reasons you've provided are why I've hesitated to try out FG thus far. I've been a long-time user of Retargeter and I suppose I've been fortunate to not encounter crashes that often, granted I'm still working in Maya 2020.
      For what it's worth, I generally work on facial animations in a standalone scene with only one character, no body animation, and no environments, to keep things as light as possible. When adding, copying, or updating poses in Retargeter, it's best to avoid undoing the previous operation with CTRL-Z; I would say that's probably my most consistent crash. I hope that helps!