Metahumans are getting TOO REALISTIC in Unreal Engine 5.4
- Premiered Jun 13, 2024
- During its keynote session at GDC 2024, Epic Games shared a first look at Unreal Engine 5.4 MetaHumans, showcasing a new Marvel 1943 tech demo and the new MetaHuman features for Unreal Editor for Fortnite. This video features all the craziest, most ultra-realistic uses of MetaHumans we've seen so far. Enjoy!
00:00 New MetaHuman Animator in UEFN
03:46 Marvel 1943 Rise of Hydra MetaHuman in Unreal Engine 5.4
07:18 Hellblade 2 MetaHuman Tech Demo
12:02 OD MetaHuman
13:16 Muscle Animator MetaHuman
20:12 Death Stranding 2 MetaHuman
#ue5 #unrealengine #unrealengine5
***
☑️ Please support the channel with a LIKE... and turn on notifications! 🔔
❤️ Subscribe to the channel: / @enfant-terrible
👑 Join this channel to get access to perks: / @enfant-terrible
*Disclaimer:
If you're a publisher or developer and would like to get your game or asset featured in the channel, please contact us. We would love to feature your work and help you promote your game!
Contact: contact [at] enfant-terrible [dot] media
“And just like that…”
Naw dude, I’ve been using Unreal for years now. Nothing in that engine is “just like that”
Same here, 12 years working with UE4/5; not a single step is easy.
And that MetaHuman thing is, in my opinion, just pure marketing; nobody wants a 350K-polygon character in a video game.
@@arnaudkaho7137 Agreed. Metahumans look amazing, after you use Mesh to Metahuman to customize the look of course. But yeah, I would rather spend more engine resources on other things. The draw calls always get ya.
They need to make them more efficient. Even in cinematic scenes they drag along, and there are issues with the neck and body animations. I still can't figure out how to remove the gap between the shoulders and head while animating.
@@arnaudkaho7137 People use Unreal for movies and GFX as well. It might be used in cutscenes too, and I'm sure it can be optimized for use in a game. The tracking features can 100% be used, plus muscle anim.
@@arnaudkaho7137 If I'm not mistaken, the recent Hellblade 2 uses MetaHuman for the characters, and that game is pretty well optimized.
There's not really a TOO REALISTIC threshold, as you can always dial it down if that's not the style you're aiming for. The exciting objective of making realistic gameplay achievable is a few steps closer and something we'll likely see over the next few years as this sort of progress continues. Hats off to the UE devs
Agreed. I'd rather get the ultra realism to be able to dial it down from there, any day over having to make something look more real with upscalers. Yes. Hats off to UE team.
Human production from recent decades is now a fossil fuel. The AI now has enough raw material to make actors, writers, viewers, and even your own mother useless.
Pre-rendered is still far ahead of anything real-time when it comes to realism.
look at brothers: a tale of two sons💀
Realistic, but there's something that happens when they smile or do any exaggerated expression. My mind just rejects it.
Final Fantasy all over again. Applying video-captured "rotoscoping" will NEVER give the results that a professional animator can.
Mouths are still uncanny valley. They need to realize that less is more.
They will improve that as soon as it is in their roadmap
False you're just looking for something to complain about
@@NextWorldVR nah they'll figure it out sooner or later, but i'd say sooner
The Muscle Animator highlights a major reason for the uncanny valley: initiation of movement. When a human at rest starts to move, their muscles tighten *BEFORE* any movement occurs. Here, all muscle movement is a *RESULT* of the movement, as opposed to the cause of it.
Also, on other, close-up samples, the fixational eye motion is missing, making the 100% stationary pupils look dead.
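The pre-activation idea described in this comment can be sketched in code: instead of driving muscle activation from the current joint velocity (so it lags the motion), sample the motion curve slightly ahead of the current time, so the muscle tightens before the limb visibly moves. Everything here is a minimal illustration with made-up names and numbers, not how any engine's muscle system actually works.

```python
LEAD_TIME = 0.08  # seconds of anticipatory tightening (assumed value)

def joint_speed(motion_curve, t, dt=1e-3):
    """Finite-difference speed of a keyframed motion curve at time t."""
    return abs(motion_curve(t + dt) - motion_curve(t - dt)) / (2 * dt)

def muscle_activation(motion_curve, t):
    """Sample the speed slightly ahead of 'now', so the muscle
    activates BEFORE the visible motion begins."""
    return min(1.0, joint_speed(motion_curve, t + LEAD_TIME))

# Toy example: an arm that starts moving at t = 0.5 s.
def arm_angle(t):
    return 0.0 if t < 0.5 else (t - 0.5) * 2.0

# At t = 0.45 s the arm is still at rest, but activation is already non-zero:
print(muscle_activation(arm_angle, 0.45) > 0)   # True
print(joint_speed(arm_angle, 0.45) == 0.0)      # True
```

The single `LEAD_TIME` constant is a stand-in for the per-muscle anticipation real biomechanics would require; the point is only the sign of the causality.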
Death Stranding 2 is decima engine.. Not Unreal Engine.
And for open worlds it looks amazing, like UE.
They are using MetaHuman from Unreal Engine in Death Stranding 2. It's the first third-party game engine to use it.
@@DimonDeveloper No. Japanese devs have their own engines.
@@benkraze3103 This is satire I hope
@@benkraze3103 They are using Guerrilla's Decima engine; do some research or watch the trailer properly.
Now I would like to be 20 years younger.
This new technology in games comes too late for me. We are past our point of fun :/ .
With MetaHuman a couple of gens further down the line + high quality VR you could be as young or old as you want to be.
The mesh and lighting are really good. But the animation and skin sliding / flesh deformation still have a loooong way to go.
yea go provide us with your own technology
@@theflame45 I don’t have my own and why should I? And why would you want new tech from me a single individual? Odd request.
The Starfield developers have a lot to learn - if you take into account the beginning of this video (the character's face).
Yes, the smile was stupid-looking; it started too early and lasted too long... it was awful! Worse than Final Fantasy... We've had photoreal characters for decades; the ANIMATION is where it always falls apart.
@@NextWorldVR Yeah, but MetaHuman characters are faster to make than before. Animation comes later down the line; it's better to improve animation after you get the graphics you want.
@@NextWorldVR This is objectively incorrect; we haven't had photoreal characters for decades, just believable ones (Half-Life 2's people are believable, if not photoreal).
Wow, people are still taking pot shots at Starfield? Give it a rest already.
@@sub-jec-tiv same except people defending it
They're at the stage where with the right lighting, animation, and camera work they look realistic but you can still tell the difference when you put the model next to the actor
Amazing tech, though I am frustrated how the speakers are like "just like that" and "untouched metahuman animator solves".....I've seen enough tech demos to know that NOTHING that gets aired at these presentations is "untouched". It's HEAVILY curated. I guarantee that there are all kinds of extra tricks happening here to make this look as easy and palatable as possible in this demo.
I really hate that they now have UEFN-only features.
God the compression is destroying it.
Maybe the compression plays a role in here, but the initial product is not that great either.
Just compare the actress on stage and her digital avatar.
The actress has thousands of micro-expressions, skin-movements on her face when she's talking and interacting with the other persons, which disappear completely on the avatar's face.
Starfield says “hello!” 😂
Its not too realistic, this is what we've been waiting for!
Still not realistic.
If the characters looked like they're breathing, with subtle chest and shoulder movement, they'd probably look more alive and real.
Simulation of muscle stretching, FINALLY.
I am so tired of those plastic muscles in all games that don't even change, no stretching. It made me sad. But FINALLY!
Fight Night Round 6 must come!
@@turrican4d599 Tekken 8. Not ideal, but it's something at least. Like THE beninging...
in de beninging...in de beningin....IN DE BENINGIN.
@@Gesensormeningeal layer 😂
Cool, but I'm still waiting for full facial muscle rigging before I'm going to be truly impressed.
Full facial musculature + randomised minor muscle action is how we get over the uncanny valley IMHO.
@@mnomadvfx There are SO many variables that go into the uncanny valley. A lot of it is rendering, getting the exact reflection and refraction in the eyes, changes in blood flow under the skin, micro-wrinkles in the skin and how they anisotropically alter the specularity of the skin. The issue with trying to simulate the face from the inside out via muscle activation is that it becomes extremely difficult to get the exact expression you want, especially when you're trying to maintain the likeness of a real actor. It's easier to build a set of blend shapes (from scan data when you have a real person), blend between them, and add extra ones when you need something very specific, rather than trying to specify the physical properties of every muscle, lump of fat and skin wrinkle to get them to fold in just the right way. I know Weta built a system that allows the skin to kind of slide over the blend shapes, maintaining an appearance of physical elasticity in the motion of wrinkle and pore detail, while conforming to the scanned/sculpted shape and the animated interpolation of it, which isn't physically based.
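The blend-shape approach this comment describes can be sketched in a few lines: each scanned expression is stored as a per-vertex delta from the neutral mesh, and the final face is the neutral mesh plus a weighted sum of those deltas. The meshes and weights below are toy values, not real scan data, and this deliberately ignores the skin-sliding layer the comment mentions.

```python
# Minimal blend-shape sketch (plain Python, toy data).
neutral = [(0.0, 0.0, 0.0)] * 4                 # 4 vertices, xyz
smile   = [(0.0, 0.2, 0.0)] * 4                 # "smile" deltas (made up)
brow    = [(0.0, 0.0, 0.1)] * 4                 # "brow raise" deltas (made up)

def blend(neutral, shapes, weights):
    """final_vertex = neutral_vertex + sum(weight_i * delta_i)."""
    out = []
    for i, (x, y, z) in enumerate(neutral):
        for deltas, w in zip(shapes, weights):
            dx, dy, dz = deltas[i]
            x, y, z = x + w * dx, y + w * dy, z + w * dz
        out.append((x, y, z))
    return out

# Half a smile plus a full brow raise:
face = blend(neutral, [smile, brow], [0.5, 1.0])
print(face[0])  # first vertex moves 0.1 up and 0.1 forward
```

Production rigs layer hundreds of corrective shapes on top of this linear model precisely because the pure weighted sum breaks down for combined extreme expressions.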
The irony of showing Kojima stuff in an Unreal demo when they used Unity's Ziva RT to make the body deformations... :)
Metahuman is a content creation tool. It is used to create the character art of Death Stranding 2, but Decima Engine renders their actual in-game appearance in real time, not Unreal.
I want to know how to make a MetaHuman speak with her body animated by rigging. I can do them separately with Audio2Face, but I can't get them together; the face and body won't stick together in the sequencer.
DS2 will run on Decima, what does that have to do with UE5?
The initial video is still deep inside the uncanny valley
Brilliant stuff... the one thing that always gets me, though, is the completely overboard mouth animations... I get it for intense emotion anims... but just talking? ... Nobody goes into full-blown facial conniptions ordering a meal, showing all their teeth whenever possible, really overly mouthing every little word...
I suppose it's just for showing it off clearly.... but man, subtlety is a thing..
Nice. But why does everyone still have the same 3 body types? Women's breasts vary; some men are built and cut. Why are y'all avoiding the finer body controls?
And I'm also missing the ability to create MetaHumans of different ages, especially younger versions like baby, kid and teenager
Exactly. I hate this about MetaHuman. An incredibly easy-to-use tool, but it leaves out one thing that is super important and harder to do.
23:08 nice god of war reference 🤣
Wow, Elle Fanning in Death Stranding 2!
Now that's what I call real progress. As an architect I've had experience with 3D programs since the '90s, but Unreal has truly taken it to the highest level. Great to see!
I loved Khary Payton in The Walking dead, glad to see him in the gaming scene as well.
I love how the thumbnail, which is meant to showcase the realism of UE5.4, is actually the Decima Engine. Just shows how great that engine is and how comparable they are.
Do people not look at real people now😂 these are cartoonish
Let me share an anatomical secret: for robots not to look like robots, the pupils must make very fast micro-movements (micro-jitter). The human eye doesn't notice it, but focusing on objects makes the eye jitter at high speed, and that's what makes the other person's eyes look alive to us. Our own eyes also read the eye movements of whoever we're talking to, and if all of that is missing, your subcortex screams loudly: in front of you is a cyborg or a corpse.
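The fixational jitter described here (often called microsaccades) can be faked with a tiny per-frame gaze offset: occasionally jump to a new random point near the fixation center, otherwise drift slowly back toward it. The amplitude and rate constants below are illustrative assumptions, not measured physiological values.

```python
import math
import random

JITTER_DEG = 0.03   # microsaccade amplitude in degrees (assumed)
RATE_HZ = 2.0       # average microsaccades per second (assumed)

def eye_offset(rng, dt, state):
    """Advance a tiny (x, y) gaze offset by one frame: occasionally
    jump to a random nearby point (a microsaccade), otherwise drift
    slowly back toward the fixation center."""
    if rng.random() < RATE_HZ * dt:
        angle = rng.uniform(0.0, 2 * math.pi)
        state[0] = JITTER_DEG * math.cos(angle)
        state[1] = JITTER_DEG * math.sin(angle)
    else:
        state[0] *= 0.98
        state[1] *= 0.98
    return tuple(state)

# Simulate 2 seconds at 60 fps; the pupil should never sit perfectly still.
rng = random.Random(42)
state = [0.01, 0.0]
offsets = [eye_offset(rng, 1 / 60, state) for _ in range(120)]
print(len(set(offsets)) > 1)  # True: the gaze keeps changing every frame
```

In a real character rig this offset would be added on top of the animated gaze direction, so the deliberate eye animation stays intact while the pupils never freeze.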
Just imagine the potential: the micromanaging and total customization a director could have over an actor's core performance without needing multiple takes...
8:05 "convincing tongue animation" 😳😳
Face demo: Feels like the lips are not interacting with the teeth
Death Stranding 2: the hair moves in slow motion compared to reality (like the hair is underwater).
Muscle demo has some proportion issues (seems to be worst around the lats) and it looks like the movements are triggering the muscles, not the muscles triggering the movements.
Getting closer to real though for sure.
The cloth stuff seemed pretty flawless!
That is massively impressive... I must admit.
I don't know if the kids working today realize how much work went into early animation... The tools now are unbelievable. Remember when the rudimentary cloth-modeling plugin was first introduced in Max?
I love Unreal Engine. I've been following its progress for a long time. I am excited for FSR3 and FSR upscaling with AI improvements, and for Unreal 5 games with FSR3 frame generation :)
I think we are just 10 years away from seeing animation that looks 100% real
can we have stylized imports like iclone has?
The sync between audio and lip movement is noticeably out most of the time, unless they are speaking slowly, then it looks fine.
Which is odd, that shouldn't be a hard thing to get right when you have complete control of both.
Maybe this issue got introduced during the editing of the youtube video rather than by UE
honestly the most exciting thing is cloth sim and muscle system...
muscle animation system looks great, but i think either there's a small bug or the character model is weird, cos he has virtually NO LAT DEVELOPMENT lmao
When I look at the game "RYSE: Son of Rome" (Xbox One), I don't even think ray tracing is needed. Very good lighting + MetaHuman graphics and photogrammetry should be enough in some cases.
OMG THAT WAS AWESOME
The question is, can they handle it??? No.
Was hoping they would make game development easy.
People are quite wrong about this! You enter uncanny-valley territory NOT when characters in games and cartoons look real, BUT when they look almost real yet something you can't put your finger on is slightly wrong, so they are neither a depiction of a human nor a human. A good example of graphics gone wrong is the new Dune game; a great example of characters that are enough of a caricature to look "good", or at least not weird, is World of Warcraft. Look at Ralph Bakshi's Lord of the Rings, where they recorded actual actors and then drew over them: it looks incredibly weird and uncanny, but it's not until you actually know the reason that you can put a finger on it.
The most impressive thing about all this is that it's still not fully capable of fooling you into thinking it's real despite looking as realistic as it does
Hypothetically I would even go so far as to say the only way this and even future tech will be able to convince someone that it's real is to make them first forget what reality is
Dude, your last sentence digs deep, and I have to admit that I agree with you.
Reading some comments here who find Unreal 5 to be "too realistic" in its visual reproduction of humans, I wonder if their writers remember what a real human looks and acts like.
Bioware used to make games with characters that seemed so realistic and cinema quality. I miss those days.
Still, Death Stranding 2, or The Beach or whatever they call it, used a different engine: the Decima Engine,
and it showcases an insane amount of detail as well.
They used UE's Metahuman for the characters
@@xvil9x932 Thanks, I read about it. It appears MetaHuman is shaping the future of the 3D world; the technology is fast and the results are accurate. Kojima Productions has a unique way of developing new technologies as they use them, like they did with the Fox Engine, the Decima Engine, and now the Unreal Engine.
Yes. By the way, Kojima also used MetaHuman in his new project, titled "OD" @@aladdin5275
There is still a lot of uncanny valley in the expression and emotion... part of the face being totally dead and expressionless while other areas overcompensate, making the expressions look like some kind of psychopathic Bell's palsy sufferer. It will get there, but it's not close yet.
Do something with hair movement; it's like they're underwater. In many games, no developer has paid any attention to this for years.
Ari Folman's Congress turns real.
This engine is amazing I just wish I had the talent and assets to push it to the limit.
It's too perfect; a real human has imperfections.
impressive !
It'd be nice if even a handful of modern games looked remotely like the UE demos from 4 years ago, let alone this stuff.
Somewhere, Mass Effect: Andromeda was like, "Couldn't you have done this earlier????"
At this point I am convinced that the uncanny valley effect is 100% due to the mouth and not the eyes.
Muscle animation for a next-gen Fight Night?
The future is here. Thanks Unreal Engine people!
I'm glad Metahumans helps with better gameplay
Damn impressive!
And for some reason still well within the uncanny valley
That muscle system demonstration was insanely mindblowing. Unreal is way too real for me now. 😍
To be or not to be is going to be the real question now.
Basic rules for creating realistic looking humans.
Break the facial symmetry.
Make the teeth a little crooked and yellowed.
Add more skin imperfections and micro details to the skin.
It's none of that. It's the movement that's the giveaway. The mouth and turns.
@@bengsynthmusic I agree, partly. I've noticed they use techniques where some models disable the animation in the crowd during battle to cut down on draw calls, and trim out some keys in the animation during movement.
@@tiaanbasson9092 Fast-paced scenes like battle animations hide the uncanny valley we feel while observing a digital face. As @bengsynthmusic wrote, it is the micro-movement of human facial muscles that awakens a sense of wrongness. You can find tons of articles and studies about this. The points you mention were already done years ago, but it is more intrinsic than some wrinkle maps or imperfections.
Death Stranding (left side of the poster) is not on UE5, it's on Decima.
I noticed no one wants to make the perfect-looking female. Can't wait till I can do it. Also we need elves and fairies, cat people, fox people, monsters, undead. Fantasy of all types, plus sci-fi fantasy. This is going in the right direction; it looks amazing.
Well, who volunteers to connect one of these metahumans to OpenAI's gpt-4o ?
the low bitrate of this video really fucks up the impact of the presentation
It's not the low bitrate. The initial results aren't that great. Just compare the thousands of micro-expressions and facial movements of the real actress and the bland result on her avatar.
When can we get this detail in VR?
nVidia 6000 series graphics cards, next gen OLED headsets. Maybe.
How many videos like this did they already show us before the PS5 came out.... And we are still playing PS4-quality games on PS5....
Demon's Souls and Horizon Forbidden West show graphics beyond PS4. But yes, Unreal 5 is too demanding for the PS5.
Be patient mate, your time will come 👽
@@xavierf2229 waiting for that👹
Naaah, you need to update your information source, cowboy @@Spn308
You can't ask for cutting edge graphics if you use a game console 😜
We aren't there yet. Inching closer and closer though.
I love you guys you are the best
Is it not meant to match? The finished one at 9:48-9:52 does not match the original eye movements, for example here at 7:25-7:29. Why is that?
I'm ready for Meta to hurry up and release their Codec Avatars they've been developing with this very same technology.
The real gem in this keynote was not what it means for future flatscreen games and consoles... but what it means for what's on the horizon in VR.
The muscle animator would look great if the muscles would flex much more at the beginning of a movement. Right now it looks like a puppet on strings.
Nah, the previous versions were prettier and safer; these ones are kind of creepy, like robots made to look as human as possible...
Anything is possible now. If you get the windows to the soul right, the world is basically limitless.
Son: Mom, can i get Metahumans?
Mom: No, we got Metahumans at home.
Metahumans at Home: Starfield
we need this in combo with true AI
Not to be nitpicky, but the main problem with the muscle animator is the lack of lat-muscle activation when the arms are outstretched or straight overhead; it's a dead giveaway for a digital model.
How would they have imitated an anamorphic lens?
I feel like gaming is getting to a point where it’s evolving into a new form of entertainment. Hideo Kojima has the idea and he is going to execute
It looks impressive but they look like CG imo. Not even 'uncanny valley' like. Which is fine. It's just more data.
they need to make more powerful cpus for this
Is The Witcher 4 going to use 5.4 or not? Will the faces look like this?? My god.
Please work on your encoding skills; even the 4K quality is like 720p with an extremely bad color bitrate.
My pessimistic outlook sees MetaHumans as a way to pay actors for only 1 hour of their time for capture data instead of a normal production schedule, so they can be even more destitute than they already are.
So they only do an hour of "work" and get paid accordingly.
Maybe it's time to admit to ourselves that it can be a hobby or side gig and doesn't need to be a full-time employment.
This is what they said Starfield characters were supposed to look like lol
Good! But the teeth, throat, and eyes need improvement. When the mouth opens, there isn't a black hole in real humans. Also, the pupils need to be more realistic in their movements.
I think I want to use whatever character creator the Korean studios are using.
Almost, and very impressive! But most of the time there's still that uncanny valley. Maybe in another 5-10yrs they'll get there. But not quite yet.
if this is next gen, imagine next next gen 👀
That Hellblade tech demo wasn't in real time. "Whether I'm acting scared..." (the actress looks left, then to the right), but in the render Senua only looks to the left. Why lie bruh?
16:14 why did they give dude such a big package ?
What is the name of the program to make clothes?
Marvelous Designer.
21:44 Beautiful girl from "I Follow Rivers - Lykke Li"
THIS GAME NEEDS UNREAL ENGINE 5
kojima is something beyond this world
When I picked up Pong and played it for the first time I envisioned a day where I could be body scanned into a game and be the main protagonist. Get on it I'm getting old.