Alright, who else is using this? What do you think?
You just finished reviewing a game and now this? Wow 😅
@@chBd01i I don't sleep
This is what I've been waiting for! I usually create head movement by myself for my film. So this is exactly what I want. I add facial expressions using additive. I don't have to use Audio2face anymore. UE5 finally did it! I am so happy, J!
Goodbye NVIDIA Audio2Face, welcome to this new thing. Thanks!
@@diegogomez6634 It's probably using the same LLM as Audio2Face, tbh hahaha
In a 5 minute video J has helped me find solutions to 2 issues I was having. Thanks again bro.
This is great, it does the same thing the SDK does, but for free! And the results seem to be a bit more accurate. For those that don't know, to use the face animations that you create in UE 5.5 in previous UE versions, just export the anim as FBX and import it into the older UE version as FBX, using the Face Archetype or face mesh as the mesh (migrating won't work, so it has to be done this way). Useful for older projects that you don't want to update to 5.5. Also, to add eye animations, just copy and paste all the eye curves from a City Sample face animation (or similar) onto the lipsync face animation.
@@Monoville Yeah, but sometimes Epic changes bones etc. without sayin' sh*t hahahaha. In theory, yes, it should work, unless they changed something.
@@Jsfilmz for sure! Definitely works with 5.5 to 5.4 anyway, I tested it earlier. Saved me having to create a 5.5 copy of a 350GB 5.4 project 😅
Can you please make a tutorial on it, Monoville?
@@phantomabid will try to if I get time in the next day or two
@@Monoville Very kind, thank you pal 😊
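For the FBX round-trip Monoville describes above, the export half can also be scripted from the 5.5 editor. Below is a minimal sketch using Unreal's editor Python API, assuming the Python Editor Script Plugin is enabled; the asset and file paths are placeholders. The import into the older engine version, with the Face Archetype or face mesh as the target, still happens in that editor's FBX import dialog.

```python
# Minimal sketch: export a baked face AnimSequence to FBX from the UE 5.5
# editor so it can be re-imported into an older engine version.
# Assumes the Python Editor Script Plugin is enabled; paths are placeholders.
import unreal

anim = unreal.load_asset('/Game/FaceAnims/MyLipsyncAnim')  # placeholder asset path

task = unreal.AssetExportTask()
task.object = anim
task.filename = 'C:/Export/MyLipsyncAnim.fbx'  # placeholder output path
task.exporter = unreal.AnimSequenceExporterFBX()
task.automated = True          # suppress dialogs for batch use
task.replace_identical = True
task.prompt = False

if unreal.Exporter.run_asset_export_task(task):
    unreal.log('Exported: ' + task.filename)
else:
    unreal.log_error('Export failed: ' + str(task.errors))
```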
Thank you 99 times! That was AMAZING!!!!
Thank you - excellent presentation, not wasting viewers' time, and very clear instruction!
@@Risto_Linturi That's what I do, man.
It's a surprise how adept this looks. I'll most likely look into this down the road. Seems simple to set up as well.
I'll definitely be using this! Very cool!
Thanks!!! May I ask a question about MetaHuman facial animations? After this audio-to-face setup, can I blend this lipsync with a facial anim from Live Link Face? I want it to look more natural!
As always, great video :)
Awesome tutorial! Thanks a lot!
How would you go about pairing eyebrow/full-face animations with this? It looks great. Just curious how to make the rest of the face ‘real’ alongside these animations.
Has anyone done this at runtime, maybe using some custom C++ code?
That's awesome, gonna try this today. Thank you!
This is amazing. It would be good to see if they could add different expressions to the face animations, like sad, angry, and happy presets.
Thanks J! Do you know of any way to do this procedurally from Blueprint? I'm trying to figure out how to turn incoming TTS audio into facial animation at runtime.
This is awesome!
Lipsync is pretty good, but the rest of the face is dead. It would be good to (maybe) additively blend the lipsync over face mocap to give it all a bit of life. My blend Blueprint 100% reliably crashes Unreal :)
This is good for creating a facial animation very quickly, but the iPhone facial animation is still better because you can move the head and close the eyes whenever you want. I'm not saying this isn't good, but the capture method works better, or at least is more realistic. 😊
I'm assuming it works with facial expressions too... would be nice to see a video with more natural head movement, facial expressions, and lip sync.
I am still playing with the AWS Polly example, which does this in real time, but that example is old. I haven't tried NVIDIA Audio2Face, but the bigger picture, I think, is voice-to-voice with audio-to-face baked in; in other words, the whole STT->LLM->TTS process. HeyGen is working on real-time voice-to-voice avatars, which looks interesting, but it uses semi-fixed talking-head tech. This new UE 5.5 gives us smart NPCs someday.
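Since the comment above sketches a whole pipeline in words, here is the same STT->LLM->TTS loop as a minimal Python sketch. Every function is a hypothetical placeholder rather than a real API, since providers vary; the returned audio is what you would hand to an audio-to-face step (offline in UE 5.5, real time in systems like Audio2Face).

```python
# Hypothetical voice-to-voice NPC turn: STT -> LLM -> TTS.
# All three functions are placeholders, not real library calls;
# swap in whichever STT/LLM/TTS providers you actually use.

def speech_to_text(audio_chunk: bytes) -> str:
    """Placeholder STT call (e.g. a cloud transcription service)."""
    return "hello there"  # dummy transcript

def llm_reply(prompt: str) -> str:
    """Placeholder LLM call generating the NPC's response text."""
    return "Hi! How can I help?"  # dummy response

def text_to_speech(text: str) -> bytes:
    """Placeholder TTS call returning audio for the lipsync step."""
    return text.encode("utf-8")  # dummy audio bytes

def npc_turn(mic_audio: bytes) -> bytes:
    # The full loop the comment describes; the returned audio would
    # then drive the face animation.
    transcript = speech_to_text(mic_audio)
    reply = llm_reply(transcript)
    return text_to_speech(reply)

if __name__ == "__main__":
    print(npc_turn(b"raw microphone bytes"))
```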
Looks great!!! Is it also available in real time, for gaming? Thank you.
this is so good.
amazing!
I wonder if this would be possible: let’s say I’m using a basic idle animation for the entire face, like from the crowd sample. Bake that to a control rig, delete the mouth components, and then do the same with this, using its mouth controls as the replacement. Does that make sense 🤣
Thank you very much for this, JSFILMZ. I'm a total newbie, so I have a (probably silly) question. I exported my first ever talking face in UE; what's next? 🤣 How can I add this face to my MetaHuman?
Thanks, Jae. Just wondering - do you think this audio to lips will ultimately work with other rigs such as Character Creator? I haven't researched this any further than your demo here, but plan to dig into this. This means of animating lips is much faster than iClone's lipsync!
Would love it if this worked for live audio capture...
Hello! I can't find the MetaHuman plugin (it's just not on the list) or MetaHuman Animator. Help, please!
NVIDIA Audio2Face lets you add a bit of head movement, blinking, and eye movement. Can this do that?
NVIDIA ACE Audio2Face is also real time. You can't do this in real time, AFAIK.
wow
Holy fak, this will save me DAYS of work! Deuces, Live Link.
How do I put this animation on the entire character? Do you have any tutorial?
Great stuff!
So do you know how to use this anim in the Sequencer? When I apply the anim, the track appears, but no face anim or audio plays.
Hey Jay, I am currently working on a software solution that uses any video, and processes the necessary depth information, to drive a MetaHuman Animator performance. The videos can be captured with any HMC, webcam, or other camera. I was wondering if you would take a crack at it. I am planning to release the software in the near future. Let me know :)
Some people actually have hair, not bald heads. Is there a way to keep her hair? And for beginners, how do I export this to an MP4 or MKV or something to upload to YouTube?
Almost not uncanny, cool 😅
Is it possible to make the eyes blink more often, or to combine face animations?
Yeah, it's definitely missing that. Omniverse has audio-to-emotions; I'm sure something similar will come to Unreal.
Just layer anims in the anim graph, ez.
How can we add more emotion?
Hey man, sorry for being off topic right now, but I saw you once posted a video about packaging. I was wondering: how can I be sure it runs on other PCs? Like, in UE5, will everyone get the same visuals on their PC? For me it runs very smoothly, but say I need to package for everyone: should I disable hardware ray tracing? I've added the ability to play at low res (without the sun), but will it work on an old GTX 1070? Sorry, noob here; I mostly package for good PCs, but now I'm not sure about older ones. It's an archviz scene, nothing big, just a living room, kitchen, and bedroom.
@@zatlanibrahim5438 Gotta learn optimization, man; that's a whole new art form. I'm currently learning it in my Fortnite maps.
@@Jsfilmz Yeah, I optimized geometry but I'm not so sure about the rest; I'll see. But your Fortnite games are on another level, man. Wish I had time between work to test them; they look so good.
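On the packaging question above: a GTX 1070 has no hardware ray-tracing support, so one common approach is to turn hardware RT off in the project config before packaging. A sketch, with settings that should be verified against your engine version:

```ini
; Config/DefaultEngine.ini (sketch only; verify these cvars for your version)
[/Script/Engine.RendererSettings]
; Strip hardware ray tracing so the build runs on non-RTX cards.
r.RayTracing=False
; Keep Lumen on its software path so lighting still works without RT cores.
r.Lumen.HardwareRayTracing=False
```

Pair that with scalability settings for a low-res mode and, ideally, a test run on the weakest target GPU you can get hold of.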
It works for singing. I tested it with a song created by SunoAI and uploaded the video to my channel.
Nice... it's a shame that it takes a long time for a version to get stable... I was using 5.4 to test motion matching and it crashed ALL the time...
It depends on what you're doing. I'm learning now that you gotta optimize your scenes.
You could use this to bake animations and migrate them to older engine versions, I think. I'm using 5.3 for my project and am gonna try that.
@@JJHazy Yes, but sometimes they change bones without saying anything when they update, lol.
Is it possible to make it run at runtime? Meaning, through Blueprint I pass an audio clip to a character and it lipsyncs to it without offline processing?
I'm interested in the same topic, but I haven't found an answer so far...
Nice! Is it only compatible with English?
Is this possible on 5.4? My whole project is on there.
Isn't Live Link Face better??
Mhm, yes, but this is eventually for live services, like AI customer service / NPCs. Think of the possibilities; I already demoed an AI MetaHuman NPC here that uses audio-to-face anim.
Back to the X & O’s, eh? Welcome.
MetaHuman has been boring for a while, tbh. When they release cool stuff, I'll make more content.
Now if we hook this up to OpenAI realtime chat, we are DONE!!!!! Lol
I hope we can link audio at runtime and play face animations
@@stevecoxiscool You know that's already in the works, man. I've already seen some companies do 'em.
Is RTX necessary to try this?
Benefits of this over A2F?
what christiandebney said above
@@Jsfilmz Thank you Jsfilmz! ♥️
Does it have sticky lips?
@@timmygilbert4102 Keep it PG-13, man, it's YT.
@@Jsfilmz It's a technical term for a specific effect where the upper and lower lips stick together in animation, which gives more realism to mouth animation. Google it; Uncharted 4 implemented it. It makes the mouth shape more natural.
While the mouth is animated beautifully, the rest of the face looks dead, like in iClone. So please, Epic Games, bring us a complete solution for filmmakers with UE 6.0 at the latest (or even earlier, maybe at Christmas ;-)... and combine the lipsync with the rest of the Live Link face data... somehow.
Unreal Engine is too difficult and too complicated to learn. It would be great if some Unity tutorials were available.
If I, a high school grad, could teach myself Unreal Engine during the pandemic while being in the military full time, you can too.
The upper half of the face looks static; it doesn't convey emotion.
@@xcalaktrader Yeah, they've got more work to do, but like I said, NVIDIA started something similar two years ago, and now they have audio-to-emotions and body animations. Give it time.
00:00 when I'm inside her hustling but she asks if I've even started 🤗
Every day, one step closer to making an OnlyFans =P
@@renanmonteirobarbosa8129 Y'all don't have one yet?
99 times and a glitch ain’t one
@@precociouscurmudgeon937 hit me
Not too useful, as you can't do it at runtime. Only a matter of time!
@@xaby996 gotta crawl before u can run