Animate MetaHumans Using an Audio File in Unreal Engine 5.5

  • Published Dec 31, 2024

Comments • 68

  • @commontimeproductions 2 months ago +9

    This is amazing! Thanks so much for sharing that this is now available. After the last "free" audio lip sync plugin stopped letting us use it for free and started charging obscene prices aimed only at large studios, many of us indie devs were completely derailed from using the MetaHuman lip sync functionality. Epic is amazing for finally making this part of the engine. No longer will our projects be stopped by a paywall. Hooray! I will be sharing your video :)

  • @tbowren 2 months ago +1

    For sure, this was my favorite thing I saw and demoed at Unreal Fest. Thanks for making a video so quickly!

    • @PixelProf 2 months ago

      Hah! Thanks Tony!

  • @jatinderarora2261 4 days ago

    Thanks for sharing the awesome tutorial. Very helpful. Is it possible to add expressions from a pose library or something similar on top of the lip sync?

  • @olavandreasterrasource8034 2 months ago +1

    Thank you so much, I have been looking for this tutorial for weeks.

  • @pondeify 2 months ago +1

    oh man this is awesome - thanks for sharing!

  • @Monoville 2 months ago +2

    Great simple tutorial, so glad they've implemented this. Gave it a couple of tests and the results are very impressive, particularly when combined with the facial animations from the City Sample to replace the eye movements/blinking. I will still use the previous Animator method for accurate facial work, but this is a great quick method.

    • @PixelProf 2 months ago

      Glad it helped!

    • @cwchris_ 20 days ago

      Question for you: when I do this it kinda works, but the animations seem to overlap. Is there a fix so that you can use specific parts from both animations?

    • @Monoville 20 days ago +1

      @cwchris_ As long as you only copy and paste the specific parts (e.g. eye movement) from one and delete those same parts from the other, there shouldn't be any overlap.
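
A minimal editor-Python sketch of the same idea, for anyone who prefers scripting the cleanup over copy-pasting keys in Sequencer: it strips the eye-related float curves out of one baked face AnimSequence so a second animation can supply them without overlap. The unreal.AnimationLibrary calls are standard editor API, but the asset path and the "eye"/"blink" name filter are placeholder assumptions to adapt to your project's curve naming.

```python
import unreal

# Hypothetical path to the face AnimSequence baked from audio.
ANIM_PATH = "/Game/MetaHumans/Face_AudioLipSync_Anim"
anim = unreal.load_asset(ANIM_PATH)

# List all float curves on the sequence.
curve_names = unreal.AnimationLibrary.get_animation_curve_names(
    anim, unreal.RawCurveTrackTypes.RCT_FLOAT)

for name in curve_names:
    lower = str(name).lower()
    # Placeholder filter: match the curves you want the *other*
    # animation (e.g. a City Sample face capture) to drive instead.
    if "eye" in lower or "blink" in lower:
        # Remove the curve from this sequence only; keep the name
        # registered on the skeleton for other animations to use.
        unreal.AnimationLibrary.remove_curve(anim, name, False)

unreal.EditorAssetLibrary.save_loaded_asset(anim)
```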

  • @dddharmesh 2 months ago +1

    Amazing. So quick and easy to follow. Thanks!

  • @JayceeeFilmVfx 27 days ago

    Thanks for this Nick! :)

  • @nsyed3d 2 months ago +1

    This is great, thank you.

  • @duchmais7120 2 months ago +1

    Awesome. I recall trying something like this way back in Blender using phonemes/visemes (mouth shapes)... A, O, E, W/R, T/S, L/N, U/Q, M/B/P, F/V... to aid in lip sync, where each phoneme was associated with specific lip and facial movements. Haven't tried it in years. Waiting to try this out once they release Unreal Engine 5.5. The eye movements (blinking) are what puzzle me.

  • @ValicsLehel 2 months ago +6

    If this could be Blueprinted and fed a WAV to play back as a stream, it would be perfect.

    • @mazreds 2 months ago

      I agree

    • @xaby996 2 months ago

      Yep, but it's hard-wired inside the MetaHuman Performance asset. Chances of this going live are low.

  • @jonaltschuler2024 2 months ago +2

    Hoping to try this for real time streamed audio, but looks like it’s not there yet. If anyone has suggestions for that, please let me know 🙏

    • @PixelProf 2 months ago +2

      Yup. Fingers crossed that's a near-future thing, but this function (for now) is recorded audio/post-process only.

  • @tmaintv 2 months ago +1

    Very interesting, thanks. What can you do with the depth input?

    • @PixelProf 2 months ago +1

      Depth input is to process data captured with the LiveLink Face iOS app when it’s set to “MetaHuman Animator” mode.

  • @VolkanKucukemre 1 month ago +1

    Thanks for the tutorial. So, at the moment this can't be used with streamed audio and we basically need to bake the animation?

    • @PixelProf 1 month ago +1

      Yup. Currently you bake the animation from a recorded WAV file.

  • @dimension3plus 1 month ago

    I can't open the 5.5 (preview) project after updating to the official 5.5. :(

  • @SpinxSage 2 months ago

    When I export it and bring it into my level sequence, my MetaHuman does not move. But when checking the animation sequence, the head does move.

  • @metternich05 2 months ago +1

    Could you do a more comprehensive tutorial on MetaHuman lip sync? There are almost none out there.

    • @PixelProf 2 months ago +1

      Sure... are you looking for something on editing animation that already came from performance or audio capture (like this video shows), or on "starting from scratch" using only keys and curves on the control board for animating?

    • @metternich05 2 months ago

      @PixelProf Starting from scratch is more accurate :) I'm somewhat familiar with UE, I do environments but haven't even tried MetaHumans. What I actually have in mind is creating an avatar or virtual character that would be the face of a YouTube channel and do all the talking. I'm thinking of something more realistic than the one above, with more complex facial expressions, head movement and even gestures. I'm not sure how much effort this is. Though with all the AI rage out there, avatars will be on the rise, if they aren't already.

  • @lordnaps 2 months ago +4

    Pretty cool, but the lack of any emotion on the rest of the face (since it's only tracking audio, not a facial performance) makes me wonder what this could be used for practically.

    • @PixelProf 2 months ago +2

      Since Unreal has an extremely capable (and continuously improving) layered animation system, the synchronized viseme animation from this process can be readily combined with other motion capture, hand-keyed or procedural animation sources.
      For example, a LiveLink Face performance capture can be quite effective for overall face expressions, but lacks fidelity around the mouth, so something I'm hoping to experiment with soon is using this to process the audio from a LiveLink capture, and then art-direct the layering of results from both performance capture methods.
      Also, I imagine this is a step in the direction of realtime audio-to-animation, which could facilitate performance based on realtime voice generation systems (just speculation on my part for now).

    • @lordnaps 2 months ago +1

      @PixelProf Would love to see that idea of mixing the LiveLink and facial capture, and I definitely see it going in that direction.

    • @xavierhatten9011 2 months ago

      @PixelProf I've done this with the LiveLink Face app. You can move the head in real time to make it more human-like, though the eyes need to be able to blink more. You can run the animation and do the head movement at the same time.

    • @Ronaldograxa 2 months ago +1

      I wanted to use MetaHumans to create digital avatars, but I see tools like HeyGen coming up and I really wonder if MetaHuman is actually worth the time... AI is moving too quickly.

    • @ETT-b6q 2 months ago

      Great tutorial. I'm trying to get additional animation on top of the audio-to-animation result. Normally I would bake mocap to the rig and add an additive layer, but with this facial mocap it breaks or reduces the lip sync to very small movements. How would you go about it?

  • @nandini8904 1 month ago

    This is a great video! I have a question: I keep getting an error while trying to install the MetaHuman plugin, Error Code: MD-0011-0. I can't find any help online and was wondering if you know how to fix this. I heard that it's because of the Fab migration.

  • @aerospacenews 2 months ago +1

    Appreciate the effort that went into this video, and that Epic is rolling out the capability, @PixelProf.

  • @abdullahalsaadi5991 1 month ago

    Hi. Great tutorial. Do you know if it is possible to stream audio data into this instead of using a pre-existing audio file?

    • @PixelProf 1 month ago

      So far it just processes audio recording files.

  • @RichardRiegel 2 months ago +3

    Yep... then export it via FBX, import into 5.4, and smoke it. Works perfectly for now :)

    • @SpinxSage 2 months ago

      I haven't had any luck with this, any advice?

    • @RichardRiegel 2 months ago +1

      @SpinxSage It's hard to help if you don't specifically describe where the problem is 😊

    • @7ribeh 1 month ago

      Would you be able to help me with this process? :)

  • @hanasprod 1 month ago +1

    Do you know how to animate the eyes?

    • @PixelProf 1 month ago +1

      Yup... add the face board control rig in layered mode, then use that to animate the eyes to taste. Will try to make time to record a tutorial video on this.

    • @GustavoTommaso-xk8vk 25 days ago +1

      In Sequencer, click the + on the Face track, then Control Rig > select the [Layered] option!
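
The same step can be scripted with editor Python. The sketch below assumes the ControlRigSequencerLibrary exposed by recent UE5 releases (including the is_layered_control_rig flag), plus placeholder sequence and rig paths; the "Face" binding name and the MetaHuman control board path vary per project, so verify them before running.

```python
import unreal

# Placeholder paths: your level sequence and the MetaHuman face board rig.
seq = unreal.load_asset("/Game/Sequences/MyLipSyncSeq")
rig_bp = unreal.load_asset(
    "/Game/MetaHumans/Common/Face/Face_ControlBoard_CtrlRig")
rig_class = rig_bp.get_control_rig_class()

world = unreal.EditorLevelLibrary.get_editor_world()

# Find the face component binding in the sequence ("Face" is an assumption).
face_binding = next(
    b for b in seq.get_bindings() if b.get_name() == "Face")

# Add the rig as a *layered* Control Rig track, matching the UI steps above.
track = unreal.ControlRigSequencerLibrary.find_or_create_control_rig_track(
    world, seq, rig_class, face_binding, is_layered_control_rig=True)
```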

  • @Felix-iv2ns 2 months ago +3

    Runtime?

    • @PixelProf 2 months ago +2

      Not yet.

  • @haydenbushfield6351 1 month ago

    For some reason, when I try this the animation does not play in the sequencer. Could this just be a bug for me?

    • @PixelProf 1 month ago

      Not sure... I didn't do any "behind the scenes" tricks to get this to work, just what you see in this video. (I think the only edits were to skip waiting times.)

  • @manojkennedy26 2 months ago

    The data type selection was not visible. What would be the solution?

    • @PixelProf 2 months ago

      It should be there if you installed the 5.5 version of the plugin into Unreal 5.5; it's not available in earlier versions.

  • @olavandreasterrasource8034 2 months ago +1

    Now how can I animate the eyes? I can't figure it out.

    • @PixelProf 2 months ago

      I'll work on a follow-up video that shows how to apply adjustments and otherwise animate the face and eyes in conjunction with the viseme results from this tool, but the gist is to use Sequencer to bake this result onto a face control board and layer on additional animation inputs.

    • @olavandreasterrasource8034 2 months ago

      I am looking forward to seeing your next videos, thanks so much.

    • @GustavoTommaso-xk8vk 25 days ago +1

      In Sequencer, click the + on the Face track, then Control Rig > select the [Layered] option!

  • @lunabeige 1 month ago +1

    Does it work for languages other than English?

    • @PixelProf 1 month ago

      I haven't tried other languages, but my understanding is that yes, it generally works in other languages as well.

  • @ke_sahn 2 months ago +1

    Is it possible to combine this with a performance capture?

    • @PixelProf 2 months ago

      Yup. Body motion can be applied independently and other facial performance can be layered on with Unreal’s Sequencer.

    • @andrewstrapp 1 month ago

      @PixelProf Would LOVE a tutorial on this if you ever have a chance. Thank you! You're the best.

  • @massinissa8697 2 months ago

    It is already in the NVIDIA Omniverse app, with some facial expressions!

  • @terezaancheva1 1 month ago

    I've installed 5.5 and the MetaHuman plugin is still missing from the content browser. Do you have any advice?

    • @PixelProf 1 month ago

      The MetaHuman plugin is installed from the "Fab Library" section of your Epic Games Launcher (on the Unreal Engine > Library page). Be sure that you have added it to your Fab library (it should be free or already added), then search for "MetaHuman" in the Launcher and it should show up with an "Install to Engine" button. Click that button and you can select a compatible installed engine to add it to.
      Hope this helps.

  • @archcast5550 12 days ago

    This will make MetaHuman SDK obsolete.

  • @virtualworldsbyloff 2 months ago

    Animate = Lipsync = CLICKBAIT