MetaHuman Animator helps developers unlock the storytelling power of high-fidelity facial animation. From performance to animation in minutes with just an iPhone and a PC!
Support the channel with a LIKE ❤SUBSCRIBE ☑Turn on NOTIFICATIONS 🔔 and JOIN to get access to perks 👑
This is where we need 60fps to make the showcase look realistic: 24fps isn't realistic, it's cinematic, so this upscaled to 60fps would look creepily good. Good thing my OLED TV has TruMotion to do that with all video content.
What iPhone were you using? I'm looking to buy an iPhone 12, but I'm not completely sure it would work. Thanks.
This is the most convincing CGI human I have seen, and that includes movies.
Yeah, even better than the Luke Skywalker deepfake, which is up there too.
Have you seen the Unity demo?
@@corentinguillo5577 I have not… I’ll need to check it out!
@@haydn.oneill it's better than the Unity one too
I'll still say that Avatar 2's Na'vi look more real than this CGI human.
This is the first time ever that a CG face doesn't fall into the uncanny valley for me - this is better than movies
for real
for real
Logan movie (evil clone)
Umm... that first clip is insane. Materials, lighting, surfacing details are pretty much bang on. Eyes are incredibly convincing. I think the facial muscles and mouth movements still need work to cross the uncanny valley, but daaamn, that is so so so close. I think even just running some real-time AI from deepfake training on the face would push this beyond any movie quality, and in real time... that's just crazy.
Something is wrong with your eyes lol. Smh there's always one
@@DGreen951 Oh shush child. Is that all you got?
It truly was remarkable, but I think they should have put just a little more work into the clothing detail. As he walked, the hood and shirt collar should have moved, and the materials they used were far from realistic. But regarding the face, it was pretty damn close to reality.
@@3D_Blender_C4D Honestly, it would've fooled me as a movie scene if I hadn't been told this was CGI and hadn't been actively looking for the flaws/micro details. If it's good enough to fool a few of us Unreal nerds, you really think the average consumer is gonna be able to tell the difference?
Agree, amazing, but the face muscles above the lips / middle of the face are too frozen. Still, very close.
I'm not sure what impressed me more, the visuals or the audio!! Listening to this on my phone in landscape, the voices in the second bit were so well panned.
That bit is from the sequel to the game Hellblade: Senua's Sacrifice. It does this really well too.
At certain moments, I completely lost track of the fact that I was watching a digital creation and instead felt completely immersed in a captivating movie-like experience.
I love and hate how much I got invested in this guy, the story, and the setting. And it's just a cinematic showcase. I need more of that. A whole game.
Hell yeah!
I'm beyond ready for a miniseries to be made with this technology!
Imagine the kinds of worlds we'll get to see!
3:17 I love how his footsteps match perfectly, in time, with that "THUD" in the music. I've seen a lot of Unreal Engine presentations, but this one hit me the most and made me want to see more. Such a brilliant, emotional, short presentation...
I literally kept saying to myself "THAT'S A REAL PERSON, THAT'S A REAL PERSON". This is the craziest technology I have EVER seen when it comes to realism on humans.
There were moments where I actually thought it was a composite of real footage on a CGI background. This is absolutely insane.
Wow, everything is beautiful! Writing, performance, graphics, music... It looks like a game I'd definitely play, just amazing!
This MetaHuman standard will become the norm for all AAA games in the next 5 years imo.
I would not say for ALL AAA games, because there are games with artistic freedom that don't need the same human realism (like Japanese games or any other game that isn't tied to reality), but I agree, this will probably be the standard for the AAA games which WILL go for graphical realism.
@@hicarodestrui The actual look of the game may differ, like a more 2D/3D world such as the "open world" Naruto, but even those models will incorporate this. This, coupled with AI NPCs, will be amazing.
I still believe Epic is gonna be on top for quite a few years IMO. Have you seen the render engines of other big companies? Can't say I've seen anything similar, e.g. the new Assassin's Creed.
Wow, this is astonishing. I'm constantly watching 3D footage, and this is the best game engine footage I've seen. Guessing it was ray traced, but hopefully one day soon it will be real time.
Can’t wait to watch this ten years from now and laugh about how real we thought this looked!… but yeah it looks amazing.
It's crazy how gaming tech is pushing movies and TV shows forward as well.
It's beautiful; we are so close to the end of one reality and moving on to another.
The 'Unreal Engine' sure is unreal; it's amazing how far tech has come in the last few years.
Each year things improve faster as PCs get faster and faster.
The tech level is rising fast thanks to the PCs themselves, and the pace of growth ramps up further each year.
Cheers.
This is the level of realistic facial animation we need for UE5 games; nobody should be able to tell reality from animation. Then we can make "live action" movies and shows without any limitations at all. Put all the money into this and space research, the only interesting things to develop, while saving our planet by improving efficiency in general, of course.
I wish it were that easy, but sadly our society doesn't work like that. We've made this system of ours so complex and inefficient that it will get us killed in the end.
nerd
@@sladepruitt564 hater
@@PolarBearFromNY I mean by definition he is a nerd. I agree with him though haha.
@@sladepruitt564 Nerd is always a positive word imo. It just means somebody who's really enthusiastic about a subject, spends a lot of time on it, and knows a lot about it; or, in this case, a realistic, selfless person who states the objective truths, and a perfectionist who's never satisfied with anything and always pushes harder, because he knows it's possible.
Facial capture has truly gotten jaw-dropping. But even though this Unreal MetaHuman Animator system is superior in nearly every way, there is still ONE area where L.A. Noire's tech did something this and other modern face-capture systems don't:
Small wrinkles forming and disappearing with subtle facial expression changes.
You can see it best at the transition from live to CG at 4:47.
The forehead wrinkles and the smile lines that connect the corners of his mouth to his nose don't become more defined in the CG model the way they do in the live footage. That's because those wrinkles are too small to be rendered via polygons and can only be shown via texture.
L.A. Noire's system, being much older, couldn't render them via polygons either, but it DID make wrinkles like that become more defined via the texturing. And that's something I'm surprised modern face-capture systems don't take advantage of.
The skin texture on this model is static; only the actual model of the face moves. But until we get 3D models with upwards of several hundred _trillion_ polygons, facial models won't capture small wrinkles forming/becoming more defined without altering the textures on the face.
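To make the texture-driven idea above concrete, here is a minimal, hypothetical Python/NumPy sketch of expression-driven wrinkle-map blending: detail normal maps that fade in as the matching blend-shape weight rises. The function names, weights, and stand-in textures are invented for illustration; real engines do this per facial region inside a material/shader rather than in NumPy.

import numpy as np

def blend_wrinkle_maps(base_normals, wrinkle_maps, weights):
    """Fade expression-specific wrinkle detail in and out over the neutral normal map.

    base_normals : neutral-pose normal map, shape (H, W, 3), unit vectors
    wrinkle_maps : one detail normal map per expression, same shape
    weights      : current blend-shape weights in [0, 1] (e.g. brow raise, smile)
    """
    result = base_normals.copy()
    for detail, w in zip(wrinkle_maps, weights):
        # Linearly blend toward the wrinkled detail map as the expression engages.
        result = (1.0 - w) * result + w * detail
    # Re-normalize so the blended vectors stay unit-length normals.
    norms = np.linalg.norm(result, axis=-1, keepdims=True)
    return result / np.clip(norms, 1e-6, None)

if __name__ == "__main__":
    # Tiny stand-in textures: a flat neutral map plus two invented detail maps.
    h, w = 4, 4
    base = np.tile(np.array([0.0, 0.0, 1.0], dtype=np.float32), (h, w, 1))
    brow_detail = np.tile(np.array([0.3, 0.0, 0.95], dtype=np.float32), (h, w, 1))
    smile_detail = np.tile(np.array([0.0, 0.3, 0.95], dtype=np.float32), (h, w, 1))
    # A raised brow (0.8) makes forehead wrinkles show; a faint smile (0.1) barely does.
    blended = blend_wrinkle_maps(base, [brow_detail, smile_detail], [0.8, 0.1])
    print(blended[0, 0])  # one texel's normal, now tilted by the brow detail

The point is simply that the same weight that drives the mesh's blend shape can drive the wrinkle detail, which is the coupling the comment says is missing from a static skin texture.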
Man, I wish someone remade that “The Casting” tech demo from Heavy Rain (PS3) on UE5 to really see how far we’ve come in just 15 years.
Soon, there will be a day when I don't have to see Tom friggin Hanks in a movie ever again. Celebrate.
All of that from an iPhone?! That is crazy!! Unreal Engine is definitely going to revolutionize gaming forever!
Thanks man, you taught me what MetaHuman is in a single video.
Can't wait to see this realism in video games. We're almost there, brother.
Truly amazing. I'm baffled by how far we've come; I can't even imagine how far we still have to go!
I think the most important thing about art is that we can still express ourselves. I'm happy, and I hope we continue to value the human work involved in acting for games, just making it more realistic in the digital world, but still true to us. I was recently watching The Matrix, and now it looks closer to a documentary than ever. I just hope technology makes us greater and not robots.
The only uncanny part for me is the eyes. It’s gonna be so hard to get them perfect. But this is so beautiful
INSANELY GOOD ("bolesno dobro" in Serbian)!!! :) Great job!!! :)
OMG the subtle EXPRESSIVENESS! WOW WOW WOW !!!
Anyone else remember that PS9 commercial? This is like one step away.
The problem I ran into was putting custom clothing onto the MetaHumans. Besides that, the technology is wild and I'm looking forward to seeing how it develops. Fingers crossed they add some more customization options for the MetaHumans!
This is by far one of the best I have seen, but no matter how good they get at mimicking reality, I can't help but instantly notice the difference between a real person and one of these meta faces.
The giveaways are usually the eyes or slight head motions that don't seem normal, which could be the result of the excess gear they have to carry around while acting.
I think it's definitely the micromovements, specifically around the eyes and the mouth, that give it away. But this is probably around 90% of the way to being almost impossible to distinguish.
@@ttanizawa901 Exactly.
I don’t have a clue what he was talking about, I could barely even focus on what he was saying. Yet I couldn’t look away for a second.
Imagine if this was used for the new Indiana Jones movie but it was young Indiana instead of old Indiana. Would have been epic.
This is pretty amazing. The uncanny valley is about to be transcended.
Ok, I'm 😮😮, it's amazing and scary at the same time
I really hope they release it very soon!
Wtf! It cant be that easy. That's insane! I need this now🤯😂
Incredible talented actor and actresses
This is stunning.
10 years ago we had similar tech, but the lighting and performance improvements pretty much fix all the issues we had going from mocap to a 3D representation.
We have the tech now to build something amazing. And I bet a lot more movie directors are going to jump on the gaming industry bandwagon!
Though making a good game with amazing graphics AND storytelling will take a lot of time and effort (also money). See you in 2033!
Theoretically they could recreate dead celebrities in CGI and map them onto another actor's body for films, because these graphics are insane.
For example, Paul Walker, Marilyn Monroe, Robin Williams etc
It's a slippery slope.
Now I understand it: she was going to send this letter to him and then got in an accident and probably died.
Want to know when this is gonna be used to... cof... you know... po...p corn.
Very, very good, but I see he doesn't blink enough and the lips are a little stiff; overall, nearly perfect. The rest is very interesting.
It's recording his blinks and his lips, so you think the actor should blink and move his lips more?
I think it's down to the lips and teeth, and then we will have no uncanny valley at all.
Actors would never age, we can just let AI and CGI do the work.
Yeah, humans have limits, AI and computers in theory do not.
Who wants to keep seeing the same faces?? Isn't it enough with nepobabies??
@JishinimaTidehoshi With this technology, they can also 3D scan anyone and put their faces on actors, as well as use AI to change voices. It's really limitless what they could do.
If someone had shown me this video in the 2000s, I'd have believed it was real footage from a real cinema camera.
The next step is to automate body rigging as well, from video, without trackers and expensive tools.
music is very emotional
I'm amazed 😮, this is the most realistic of all the 3D animations I've seen. (Originally in Uzbek.)
For almost the whole first clip I thought it was real!
Jesus f---ing Christ. This is insane!
The beginning, when he turns to the camera and makes facial expressions, felt a bit artificial in parts of his face. I went back and watched it a few times. It's close, but not perfected yet. The rest of the scene was very good and convincing to me.
The uncanny valley slowly disappears
this is impressive
Very good stuff
Damn it. Tell me this first clip was an actor in a rather unconvincing CGI background. That guy looked like Berlín from "La Casa de Papel" (I have no idea what it's called in the US and I don't know Spanish, so please don't judge the spelling). That was movie-like quality of CGI. Just the part where he picks up the helmet doesn't really look real. But he does, all the time. This is going to be nuts… Brace for playable movies!
The best yet
Damn it's so good!
Impressive, but I'm curious how well face tracking done with a wide-angle lens translates to longer focal lengths, since the amount of lens distortion is different.
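To make the lens question concrete, here is a minimal, hypothetical Python sketch of why it matters: the same facial feature projects to a different pixel position under a wide-angle lens (short focal length, strong radial distortion) than under a longer lens, so 2D tracking data has to be undistorted or re-solved before it transfers. All coefficients and focal lengths below are invented for illustration.

import numpy as np

def distort_radial(points_norm, k1, k2):
    """Apply simple Brown-Conrady radial distortion to normalized image points.

    points_norm : (N, 2) points in normalized camera coordinates (x/z, y/z)
    k1, k2      : radial distortion coefficients (typically larger on wide-angle lenses)
    """
    r2 = np.sum(points_norm ** 2, axis=1, keepdims=True)
    return points_norm * (1.0 + k1 * r2 + k2 * r2 ** 2)

def project(points_norm, focal_px, cx, cy):
    """Pinhole projection of normalized points to pixel coordinates."""
    return points_norm * focal_px + np.array([cx, cy])

if __name__ == "__main__":
    # A face landmark away from the image center, in normalized coordinates.
    landmark = np.array([[0.4, 0.3]])

    # Hypothetical wide-angle phone camera: short focal length, noticeable distortion.
    wide = project(distort_radial(landmark, k1=-0.12, k2=0.02), focal_px=600, cx=960, cy=540)

    # Hypothetical longer lens: longer focal length, nearly no distortion.
    tele = project(distort_radial(landmark, k1=-0.01, k2=0.0), focal_px=1800, cx=960, cy=540)

    print("wide-angle pixel position:", wide[0])
    print("longer-lens pixel position:", tele[0])
    # The same feature lands in very different places, so tracking solved against one
    # lens model has to be undistorted/re-solved before it transfers to another.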
wow amazing 👌
Better than The Flash CGI, ngl.
6:12 felt like ASMR ngl.
JAW DROPPING!!!
Almost perfect. The eyes are quite real. The mouth needs to be improved (the lips sometimes need to stick together), and the cheek muscles need to be softer and bounce slightly due to gravity.
To conclude, even the neck muscles need to move along with the mouth as the person speaks.
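The cheek-bounce point above is essentially a request for secondary motion on soft tissue. Here is a minimal, hypothetical Python sketch of the usual approach: a damped spring ("jiggle") that makes a cheek point lag, overshoot, and settle behind the bone/blend-shape-driven pose instead of snapping with it. The constants and the toy example are invented for illustration.

import numpy as np

def simulate_jiggle(target_positions, stiffness=120.0, damping=12.0, gravity=-0.5, dt=1.0 / 60.0):
    """Make a soft-tissue point lag and bounce behind its rig-driven target.

    target_positions : (T, 3) per-frame position the cheek point would have
                       if it followed the bones/blend shapes rigidly.
    Returns the simulated (T, 3) positions with a damped spring plus gravity applied.
    """
    pos = target_positions[0].astype(np.float64).copy()
    vel = np.zeros(3)
    out = np.empty_like(target_positions, dtype=np.float64)
    for t, target in enumerate(target_positions):
        # Spring pulls toward the rig-driven target, damping bleeds off velocity,
        # and a small gravity term gives the "droop" the comment mentions.
        accel = stiffness * (target - pos) - damping * vel + np.array([0.0, gravity, 0.0])
        vel = vel + accel * dt
        pos = pos + vel * dt
        out[t] = pos
    return out

if __name__ == "__main__":
    # Rig-driven cheek point snapping 1 cm upward at frame 10 of a half-second clip.
    frames = 30
    targets = np.zeros((frames, 3))
    targets[10:, 1] = 0.01
    jiggled = simulate_jiggle(targets)
    print(jiggled[9:16, 1])  # the point lags, overshoots, and settles instead of snapping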
Now add in a database of human-based movements, link it to a language-driven, GPT-style AI, and you've got the next-gen uncanny valley conquered.
This is impressive, but it will take many years to see this running in real time in actual gameplay.
lol I used to think like this too. That was rendered with path tracing, and I'm telling you, from what I've tested, it's close to becoming real time. Maybe a year or so, but not years, for sure. Check the new NVIDIA path-tracing methods. If we combine Epic's, NVIDIA's, and other companies' AI technologies, we will see it's actually right around the corner... I've been in the industry for a few years now and damn, it's moving really fast now, but I might be wrong.
Thank you.
Game changing!
Where can I get the music from this video? Amazing music
We'll see this kind of graphics probably on the PS6.
The current-gen Series X and PS5 will be showing these kinds of graphics once they're pushed to their limits in the next few years.
@@-1nterruption-960 Sorry friend, but I highly doubt it; the Series X can't even handle mediocre facial expressions (Starfield) at 60 FPS. Idk if they'll announce a mid-gen refresh next year, but so far both consoles have only shown their "true" power by the 3rd year.
@@worlds_greatest_detective6667 Starfield doesn't apply here, since the game is massive, with tons of AI calculations, galaxy orbital simulation, GI, etc. It's incredible that the character models and micro details in the game look as good as they do given the sheer scope and scale of the game. That's probably far more complex and straining on the GPU, CPU, and memory than one simple character model, bud. So your comparison is moot. These consoles also run UE5, and there will be plenty of games using this engine that push the consoles to their limits; we may very well see this once they do. And I find it funny how you only mention the Series X when it's more powerful than the PS5. Are you a Sony fanboy, by any chance?
The new Apple Silicon ARM chips are the way to go for this; they are so far ahead of anything NVIDIA or x86 in terms of performance per watt, so if the M3 chips get ray tracing and DLSS-style upscaling, they could boost the consoles to astronomical levels.
More like PS10.
Close, but still missing iris movement - the eyes!
Looks very impressive! But still missing many micro expressions.
Love it, except YouTube doesn't let you scroll through a video if it has chapters.
Non-exaggerated animations do make a difference.
Wow, that is pretty close to being out of the uncanny valley that has plagued cinema CGI. Almost there.
I couldn't convince myself that it was CG, even though I knew it. Wow. Zero uncanny.
Imagine if Starfield was on Unreal Engine 5.2.
Insanely good... Imagine an AI controlling context-driven conversations: you simply talk to your phone, you're acting in a game or movie, and you get intelligent feedback through conversations with the NPCs... A whole new interface could be possible, with interaction driven through dialogue and emotional response. Of course this is the beginning of your future home... the MATRIX mwhuahahahahahahaha!
What the hell, it's like a Quantic Dream trailer.
And it's only going to take another 10-15 years before a system will be powerful enough to actually display this level of detail in a full blown open world game.
Wow!
What is the soundtrack title of the "Cinematic showcase" chapter please ?
But how do you reconcile digital and real humans in situations like what we see here? Meaning, when the digital is meant to replicate the human exactly, why not just use the human as is? There won't be anything more real than a human, will there?
Jheeeeez!!!! 🤯🤯🤯
🔥 🔥🔥
11:54 James Cameron is pleased with her
wow they invented FMV 30 years later
Can MetaHumans be used to create Virtual Reality films/experiences and played on headsets like the Quest 2, without a PC?
Which iPhones work with the Live Link Face app? I'm looking to buy an iPhone 12, but I'm not completely sure it would work. Thanks.
This is unbelievable lol
This is Avatar level quality.
It certainly looks good, but I still say it doesn't look "real"; much of this has to do with motion blur and lens optics, of which there is very little here. Typically, gamers will see something like this and swear to its realism, but their visual sensitivity is not attuned to live photography. Very few games even come close to crossing that uncanny valley. It's about simulated optics and refinement in post; Unreal still has a ways to go toward a faithful interpretation of real-world camera lenses and their optical results.
Yes, but what's the size of a game? A whole hard drive?
We stopped using DVDs for games, but will we start buying entire hard drives?
Imagine 2015 Lara Croft but with this realism 😍
My birthday is November 21st and I ride a motorcycle 😳. This was shocking.
next step is the stickiness of the lips.
I think if you build an AI, take actors' body and face data, and use a 3D animation program, you can automatically animate movies and so on. It is not so far off. The Beowulf movie was so good. And Al Pacino's movie Simone is going to become real :)
Serbian in an Unreal Engine demo 😮
They need to retire the term "Uncanny Valley"; the gap that historically defined believability is rapidly disappearing. CGI still excites me as much today as it did decades ago. I think the advancements are more attention-grabbing than the material that stems from them.