my 60s ARkit workflow here: th-cam.com/users/shortsXCImJLD9FA4?feature=share
iClone AccuLips, with adjustments, layered in UE with ARKit passes, with adjustments, has gotten me the best results by far.
Do you have a workflow tutorial on that?
@@SaguinMedia I will probably do it after my next video release!
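For anyone curious about the mechanics of that layering workflow, here is a toy illustration only: curve names and the additive-blend-with-clamp approach are assumptions, and the real layering would happen inside iClone or UE's Sequencer, not in Python.

```python
# Toy additive layering of two animation passes per blendshape curve.
# base: a lip-sync pass (e.g. AccuLips), layer: an expression pass (e.g. ARKit).
def layer_curves(base: dict, layer: dict, weight: float = 0.5) -> dict:
    """Blend per-curve keyframe values additively; clamp to the 0..1
    blendshape range so stacked passes can't over-drive a shape."""
    out = dict(base)
    for curve, values in layer.items():
        base_vals = out.get(curve, [0.0] * len(values))
        out[curve] = [min(1.0, max(0.0, b + weight * v))
                      for b, v in zip(base_vals, values)]
    return out

base = {"jawOpen": [0.2, 0.6, 0.4]}                      # mouth from lip-sync pass
layer = {"jawOpen": [0.1, 0.1, 0.1],                     # extra jaw from capture
         "browInnerUp": [0.3, 0.5, 0.2]}                 # brows only in capture
print(layer_curves(base, layer))
```

The clamp is the important part: when two passes both drive the same curve (like jawOpen here), a plain sum can push weights past 1.0 and break the rig.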
Ahahah
Excellent, comprehensive, brilliant, thank you
4:17 Super creepy 😎 Great effect to show a robot freaking out and starting to lose control.
Learn something new from master Han today✌🏻
may i ask, what did you learn from this video ?
Thanks for this comprehensive video, Han.
Glad I discovered this channel
🔥exciting series, looking forward to the next one!
Aye Cory!!!
superb video man
Great stuff Unreal Han! Thank you for your suggestions!
Awesome content Han! subscribed and liked!
I feel like with this ai matching system, the animations look a lot more emotional and believable!
Awesome man thanks you so much, am waiting for more. 💕💕💕💕
And now MetaHuman Animator is dropping soon. Exciting times!
Thank you for sharing your insights, looking forward to learning more from this site! Thank you!
Thank you SO SO SO much for this research. I know it was a mountain. I definitely will stick with ARKit and additional post animation where needed.
Very interesting. Thanks for sharing. Great video ❤😮😊
What an awesome video!
Really interesting!!! Thanks. Sticking to ARKit for now. 👍
Really great content. Thanks for sharing.
Awesome stuff! Very useful sum up for facial production.
Excellent breakdown! Thanks for putting it together! I’m glad to see Faceware Portal get a quick mention at the end.
As a long-time user of Faceware, I know the barrier to entry is a bit steep for indie devs and beginners, but I believe Portal eliminates a huge portion of that front end, especially when it comes to building a tracking model. Since Portal is a cloud-based neural net solver, I expect it to continue to evolve and improve over time, and hopefully become more accessible to a wider audience.
Totally agree. It's not easy to get Faceware for sure. Love to hear your thoughts on it in terms of quality tho. Especially the new portal.
@@Unrealhan I only briefly touched on this subject in my last webinar, but I hope I’ll have some more substantial feedback to share in the near future. Here’s a clip to preview those tracking results on static cam footage:
th-cam.com/users/clipUgkxZbu8aJEZfmg7oBM01iaEBaBayFtcPRNO
Love your content! Keep it up!
Fascinating Han thank you so much for sharing! Brilliant deep dive 🙏
Thanks for sharing, this is the most valuable kind of no-fluff content 👍
Youre the man Han. Show us the ways!
you deserve your name ! +1 sub
Amazing explanation and sharing of knowledge! Insta sub!
Hey, have you seen Unreal Engine's 5.2 Realtime Facial Tracking Animation Demo? Are you using the same tech/approach in the video?
Great stuff keep going
What do you think about the new animation in UE 5.2? They have greatly improved it. Please record a video!
Is there any Android alternative to ARKit?
Great video!
Link to FaceGood?
Would love a tutorial on editing a base MetaHuman (like heavily editing it, turning the human into an orc or dwarf) in Blender or ZBrush, then importing it back to Unreal with all the facial rigging working, or even on how to edit the blend shapes of the newly edited character in Unreal.
It's coming for sure. Still needs a bit of time but it's def on the list!
@Dahoony thank you!
What about combining and retargeting 3D-scanned per-frame facial animation with morph animation? Maybe it's not as good and flexible for Unreal, but it should give a better result.
11:09 Respect.
Sir can you make a full tutorial about that
IMO I will be sticking to the Live Link Face app for a while and then manually correcting in Unreal.
Same here.
Hi, good stuff! Which program do you use to do the machine learning from tracking data to retarget data? And how did you put it back for auto retargeting?
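Not speaking for Han's actual pipeline, but the general idea behind learning a tracking-to-retarget mapping can be sketched as a regression from tracked face features to rig control values. Everything below (array shapes, the linear model, the synthetic data) is illustrative; production solvers use nonlinear neural networks, but the data flow is the same.

```python
import numpy as np

# Toy retargeting model: learn a map W from tracked face features
# (e.g. flattened landmark offsets) to rig/blendshape control values,
# given paired training frames of (tracking data, hand-tuned controls).
rng = np.random.default_rng(0)
n_frames, n_features, n_controls = 200, 12, 4

X = rng.normal(size=(n_frames, n_features))          # tracking data per frame
W_true = rng.normal(size=(n_features, n_controls))   # hidden "artist" mapping
Y = X @ W_true                                       # rig controls per frame

# Fit W by least squares, then retarget a brand-new tracked frame.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)
new_frame = rng.normal(size=(1, n_features))
controls = new_frame @ W
print(controls.shape)  # one frame of rig control values: (1, 4)
```

Once trained, every new tracked frame is pushed through the learned mapping automatically, which is what makes the retargeting "auto" after the initial supervised setup.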
For me, a big drawback of ARKit is that certain facial expressions aren't captured in the blendshapes at all. For example, you can hardly do a non-symmetric brow-up movement (🤨), a worried face (😟), or a sad face with a hanging lower lip. I assume this can be solved in software in the future, but so far I haven't seen much progress in ARKit facial tracking in the last few years, unfortunately.
Good point. The hardware capability is there; someone just needs to design the app specifically for us animators. I hope something is coming tho.
@@Unrealhan yeah I hope so too. Although it is currently limited, it is very accessible both technically and financially compared to alternatives.
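One concrete reason for that limitation: ARKit exposes 52 blendshape coefficients, and while the brow-down and outer-brow-up shapes are split into left/right coefficients, the inner-brow raise is a single symmetric coefficient (browInnerUp), so a one-sided 🤨 can't be represented directly no matter how well the tracker performs. A toy Python sketch of checking a capture for brow asymmetry follows; the frame dictionary format here is an assumption for illustration, not any app's actual export format.

```python
# Brow-related ARKit blendshape coefficient names (these names are real).
# Note: the inner-brow raise is ONE symmetric coefficient, so an
# asymmetric inner-brow raise cannot be encoded in the capture data.
SPLIT_BROWS = {"browDownLeft", "browDownRight",
               "browOuterUpLeft", "browOuterUpRight"}
SYMMETRIC_BROWS = {"browInnerUp"}

def asymmetry(frame: dict) -> float:
    """Rough left/right brow asymmetry score for one frame of capture
    data, where frame maps coefficient name -> weight in 0..1."""
    left = frame.get("browOuterUpLeft", 0.0) + frame.get("browDownLeft", 0.0)
    right = frame.get("browOuterUpRight", 0.0) + frame.get("browDownRight", 0.0)
    return abs(left - right)

# A raised-left-outer-brow frame reads as asymmetric, but any
# browInnerUp motion is inherently two-sided by construction.
frame = {"browOuterUpLeft": 0.8, "browOuterUpRight": 0.1, "browInnerUp": 0.5}
print(round(asymmetry(frame), 3))
```

So a software fix would need Apple to split browInnerUp (and similar shapes) into left/right coefficients, not just improve tracking quality.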
Hi thanks for this! Can I get your opinion on upcoming image generators? Specifically Bluewillow as they are still in beta testing phase? Are they too late?
What is your opinion of the new Faceware Portal solution?
I have an Android, not an iPhone. But I'm working on getting an iPhone (not to use as a phone, just for facial motion capture). Could you please put together a tutorial at some point on facial motion capture in Unreal Engine 5?
Does it run on an Apple M1 chip? I heard that it's complicated.
I think having the Unreal Engine options work completely on an iOS or Android device would be truly accessible.
Facial? What about body tracking to convert the data onto an avatar?
Hello, Teacher Yang. I'm a student learning UE. Could you tell me how you did the part in your lip sync where the finger smears the lips? Could you briefly point me in the right direction? Thank you very much.
What software or tool do you use to track AI-generated facial capture?
New sub here looking for good content on unreal for cinematic purposes.
Can I ask you where you got the assets you used to spice up your scene for your last video? I thought they were assets scattered around the city already, but I can't find the movie theater entrance and storefront entrance anywhere...
kitbash3d.com/collections
@@Unrealhan awesome, thank you!
this just got a huge update
Ya crazy stuff right? They cracked the code
Do those metahumans' skins naturally deform based on collisions like a real human? I'm really wondering if this is happening.
Really interesting breakdown, and it matches my experience exactly. Although, the more you feed the FG AI, the better it becomes, so the places where it misses become fewer and fewer.
hey i know you!
@@Jsfilmz me too
I want to see AI-interpolated in-betweens for hand-drawn animation that use a timing chart you give it, so it understands which frames to favor, because right now the results just look mushy since they're straight in-betweens.
How about looking at iClone 2 Unreal?
Now Unreal 5.2 is out. What do you think about MetaHuman Animator?
Game changer, great for creators and sucks for other companies. Can’t wait for it to come out.
@Unreal Han I really hope they start enabling the LiDAR in the rear camera soon. The demo appears to be using the front cam, which is structured-light (SL) based and low resolution.
Does an ARCore-supported Android phone work here?
Me crying in 3ds Max, RIP me. Honestly I might as well start some sort of R&D for facial tracking for 3ds Max. I need courage!!!!
Does it work on a Mac with M1 Max?
What about faceware?
Should be a similar idea. I tested their lite version before; it was similar to ARKit. The new Portal looks like an AI-driven solution as well.
The ghetto camera rig was like $4k lol!!
I'm also looking forward to that day, when the full capability of ARKit or an ordinary camera is unleashed. The era of AIGC is coming!!
We really need an AI that analyzes ~10 minutes of video of someone's face and can use that data to make a near-perfect copy of all of that person's emotions in MetaHumans.
top
Give an alternative, we don't have iPhones.
If anyone is looking for lip-sync animation for virtual humans' speech, here are some alternative AI animation solutions:
Speech Graphics is amazing
OVR is free
the MetaHuman SDK plugin is free
Man... MetaHuman Animator is coming... it uses the iPhone depth sensor. Wow, that was fast.
🦒
That video of the robot girl and the girl kissing looks inappropriate honestly
I’d rather it didn’t it’s bad enough without getting it to do that
Dude, AI is useless. It's guessing plus using more power. And people growling isn't a sellable product in entertainment. Entertainment is a break from that.