THIS IS HUGE! Everyone can do High Quality Face Mocap now!
- Published Jun 22, 2023
- (advertising @hollyland) Check out the Hollyland Eco-system: www.hollyland.com - Learn how to use the new MetaHuman Animator in Unreal Engine 5.2. Capture highly detailed face motion data for your custom MetaHuman in this tutorial video.
Hollyland Products that we use
Solidcom C1 Pro ► www.hollyland.com/product/sol...
LARK M1 ► www.hollyland.com/product/lar...
Cosmo C1 ► www.hollyland.com/product/cos...
Learn Unreal Engine 5 for Beginners
► www.cinecom.net/courses/unrea...
More Unreal Engine Tutorials
► • 5 Tricks you (probably...
Read More
► www.cinecom.net/unreal-engine...
👕 MERCH
► cinecom.net/merch
🎬 Check our Award Winning Courses
► cinecom.net/courses
💙 LETS CONNECT!
Instagram ► cinecom.info/Instagram
Discord ► / discord
💥 Download Unlimited Video Assets
► storyblocks.com/Cinecom
#Cinecom
Unreal Engine 5.2 brings some awesome new features! I know what to do this weekend! 😁 What are your weekend plans?
Making Blender animations! 😂
glad to see you got it working
you don't know an effing thing about it lol
Okay, sure @peter486
"We don't need expensive gear"
Also him: "I'm using my iPhone right now"
🗿
Yeah... saying "expensive" depends on a person's perspective...
iPhones in general are expensive... And not very durable 😅
+100 for the neat folder structure and detailed naming of assets! :D Great tutorial!
You guys are great, you show from start to finish for a tutorial at a followable pace.
How do I fix the floating head?
YESS CINECOM UPLOADS ALWAYS MAKE MY DAY!
This is a game-changer for face mocap! 🙌
Don't forget, this is just the first step... in a few months or years this is gonna be even better... amazing!
Just a note - your metahuman has a bowling-ball-heavy lower face because you scanned yourself with a big beard. The solver doesn't know the difference between hair and skin, so it adds your beard as geometry, covers it in skin, and then you add your beard on top of that. You'd have a much better match if you went clean-shaven for the scan-2-metahuman/animator creation, and then did your captures after that. From that point on you should be able to use Animator while having your beard and then apply the animation data to your non-bulbous metahuman, just as you would apply it to anyone other than you.
I have a massive mustache and MetaHuman thinks I have an overbite. Will have to shave it, but it should grow back in one to two days.
My NBA 2K player always looked ridiculous because of that
@44punk yeah, that's how Freddie Mercury got those big front chompers. He had that mustache when his face was created.
Hey Jordy, AKA MASTER ARTIST 😁 your Skillshare courses are awesome man. Funny and straight to the point 😃 Thank you so much.
2 minutes of tutorial and 7 mins of sponsorship 😇
Thanks for the quick funny workflow. ❤😂🎉
Would like to see a detailed tutorial from start to finish, from a blank project to final render. Not just face mocap but also body animations. It is often quite complex and confusing as to the number of steps and different interfaces required. Many of the "tutorials" do not explain properly and often skip steps or start with "I have already set up the …". Also, the accuracy of the lip sync is often off, and letters like B and P are not done with closed lips, etc. Also how to animate the eyes to be more focused on the camera when needed. I would ideally like a single tutorial instead of having to watch dozens of videos, each doing something different and not always working as expected.
That's the whole point. Don't you think it's strange that THEY ALL suffer from the same total arrogance and stupid jokes and half-info, plus sponsoring their mates' crap for you to buy and get stuck with, whilst they make dollars?
Have you found one yet? Looking for the same.
@LiterallyJord Me too, did you guys get anything?
So cool!
Thank you. This was really helpful
So good, thanks!
Thanks for uploading the video 😊
Great tutorial man! This does skim over a lot of important details though. For example, what you showed here is only enough to animate the face and play it back, nothing else. Would've been nice if you talked about baking to a control rig to do some tweaks, or even how to animate the face along with a performance (your final export showed body and face). Also, exporting the animation the way you did will only work on your specific metahuman, since you specified that mesh. If you exported against the metahuman skeleton instead, it could work on any metahuman.
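To make that skeleton-vs-mesh point concrete: here's a minimal Unreal Editor Python sketch (needs the Python Editor Script Plugin enabled) for checking which skeleton an exported face AnimSequence targets. The asset path below is hypothetical; the idea is just that a clip bound to the shared MetaHuman face archetype skeleton can be reused across MetaHumans, while one exported against one character's specific face mesh cannot.

```python
import unreal

# Hypothetical asset path -- adjust to wherever your exported
# performance animation lives in the Content Browser.
ANIM_PATH = "/Game/MetaHumans/Performances/AS_FacePerformance"

anim = unreal.EditorAssetLibrary.load_asset(ANIM_PATH)
if anim:
    # Every AnimSequence is bound to a skeleton asset. If this prints
    # the shared face archetype skeleton rather than a skeleton tied to
    # one character's face mesh, the clip should replay on any MetaHuman.
    skeleton = anim.get_editor_property("skeleton")
    print(f"{anim.get_name()} targets skeleton: {skeleton.get_name()}")
else:
    print(f"No asset found at {ANIM_PATH}")
```

Run it from the Output Log's Python console with your MetaHuman project open.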
This is correct, I am trying to figure out how to connect the body and the head together without the head floating away
Exactly ... instead of all the promoting of their friends' add-ons and soft/hardware, give us info that's realistic and complete ... spice girl with a beard
Love the videos!
This is mind-blowing. Does it work with Android as well, as long as it has LiDAR?
I wonder if this would work with any other 3D model with enough blend shapes. This would be a really nice and easy way to animate not just the MetaHumans. BUTTTTT Jordie... I'm surprised you guys are only noticing this now?!?!?!? Live Link has been a thing since UE 4.24 or 4.26, and MetaHumans since like the 5.0 beta lol. BUT you still made it simple and fast to ingest and start creating, thank you :)
Hi, can I use Blender or Unreal Engine on an iPad? If yes, then which iPad is the best to use for rendering and editing in Blender / Unreal Engine?
It was so unusual to see your neutral face since you are mostly very animated! Love your videos!!
Are there significant improvements with the later phone models?
So fun!
I never thought I needed to see Jordy with shredded abs and chest 😂
Thank you, but how do I do shape keys in a fast way? Any program for that??
Nice ✨✨
you guys are the best
Does this only work on MetaHumans, or can I use my own 3D model with this face mocap?
Does this work on an iPad Pro with a TrueDepth sensor too?
I love the section where you explain how movie special effects were created.
And I have one quick question.
Is a VFX certificate required for employment at a company like MPC, Sunrise, Weta, etc.?
Absolutely not required. You "just" need to be good at what you're doing. They don't care about a degree. You can learn everything you need through YouTube, Google, online courses and a lot of trial and error on your own. The one advantage a school might bring you is maybe contacts and connections in the industry. But generally I wouldn't recommend studying VFX, unless it's at one of the top schools... and they are pretty expensive.
It would have been cool if you had also covered the floating-neck problem at the end, etc.
What do you think about the possibility of using Apple's upcoming Vision Pro for virtual production? Could it make LED walls obsolete? Could actors and crew use this to work remotely?
Thanks for showing... I just wish it was even easier... so many menus to open, so many options to choose wrong, so many steps to remember
I thought i was alone in this lost adventure 🤣🤣🤣🤣
Unreal looks almost real now!
Happy Friday!
What about live lips and expressions with the Quest Pro in Unreal / MetaHuman etc.?
Sorry, might be a noob question... Does this app's mocap data work only with MetaHuman characters? Or can we use it on our own characters created using Maya/ZBrush? Because everyone is just mentioning MetaHuman characters only.
Crazy! Could this be used for lip sync and just an image of a face?
Hi, can this method work on an Android phone?
Thank you
Seems like Unreal Engine's native support for mocap is only available using an iPhone (besides the professional equipment), am I right?
I saw some webcam-based face mocap software but it doesn't seem smooth.
Buying a used iPhone is the only cheap way to use face mocap, I guess.
*Is there any way to do this live with your phone?*
Is it possible only with an iPhone?
That was great, nice bruh. Love from Indonesia.
Thanks for the great tutorial, but could anybody help: why doesn't the animation work when I track it to the face mesh in Sequencer, after all the steps have been done? Can't find any info on how to solve it.
You dont need LiDAR, just true depth face Cam on iPhone
😁 Oh yeah, a depth cam can't map objects, it just gives the depth of any object. You can't get movement without LiDAR. Otherwise VFX films and games would just use a depth cam to capture full-body movement, instead of a full-body suit with tracking points for LiDAR.
@SggQpwpqpq-vq3ds Knew that, I was just referring to MetaHuman Animator and the mistake included in this particular video.
The TrueDepth camera is LiDAR, bruh
That's just LiDAR
What app do you use with an Android? I have a Samsung Galaxy S23 Ultra.
Is there a way we can replicate this using the Pixel 4 XL IR camera?
Here's me hoping that one day there will be a way to utilise these types of technologies in a much simpler way.
My only question is:
can you do full body capture for the process of creating a character's movement?
Nice tut. Why did you mention that it requires the LiDAR sensor when you haven't used the LiDAR sensor in the video?
It's used automatically by the iPhone when he records with the MetaHuman app... it gives all the data and not just a normal video.
ACTUAL GOLD
Hey, I would actually greatly appreciate the help: can we create a MetaHuman with the Live Link video data, like when we don't already have a MetaHuman of the person in the video but want to create one with the help of the Live Link video?
Any chance of getting this to work on Mac? The MetaHuman plugin is Windows-only 😢
Is it possible to do on an Android phone?
I can't import the Live Link files into UE5. When I try to drag them into the content folder, it says error. Any idea what's up?
Do you have one for combining body and facial capture like you did here?
How do I fix my separated head?
While playing my character, I'm unable to hear the sound I recorded with Live Link Face on my iPhone... see the timeline at 8:30..?
I have got a problem at 8:28: I don't see it in the drop-down tab, I cannot choose AS Metahuman Performance. And I do have it in the folder.
Please, I have a problem! After installing the MetaHuman plugin from the Marketplace, when I start UE 5.3 I can't find the plugin in the list!!! Any solution? Thank you
Nightmare fuel!
Can I use an HMC?
Where can I test video-to-face?
Haven't you skipped the FIT TEETH part? Or is it alright to bypass that button?
So many steps. I think this can still be made more efficient.
Ok, but how exactly do I do hand tracking?
Wow! I wanna know how to make a MetaHuman that looks so much like yourself!!! When I use MetaHuman Animator or MetaHuman Creator, the face doesn't look much like me.
I wouldn't call this high quality, but it's better than what we used to have.
is this "live link" is on pc? using a phone is a little bit unprofessional and cheap.
I would go with open lips when picking an open frame.
Any iPhone 11 or up works?
Still waiting for the Body Motion Capture 😅
Did you ever make games?
Like this)
Wait a sec, the iPhone 11 doesn't have a LiDAR sensor 🤔
I clicked on the face and then clicked on the asset in animation mode. And I clicked on the animation sequence (performance), but the video does not play. What's the problem?
Same problem mate
Is there a way to animate a non-MetaHuman face?
LiDAR sensor on iPhone 11???
How did you fix the floating head?
I cannot figure it out for the life of me.
How to fix the floating head?
Wait, I thought an iPhone X or greater was all that was necessary for using Live Link?
Can anyone tell me why my UE can't find my device? They are on the same network!
*yup this world moves so fast*
And just like that……. I need to watch again. Lol
I've got the problem that the "MetaHuman Plugin" is unavailable for me. Anyone else with this problem?
We don’t need expensive gear, just an iPhone 📲
😅😅 nice tutorial.. Thanks
Author:
We don't need expensive gear.
Author also:
I'm using my iPhone
That's how you talk, you show a lot of great things, but I haven't seen a single video of yours with a normal explanation of how to get the desired effects.
So cool to see you make more UE5 stuff 🤩
Where's the link on how to create a custom MetaHuman?
So, explain to me why someone would even need an iPhone if Unreal Engine is doing the actual facial capture with the green tracking marks? I would also LOVE a video from you on the workflow between Unreal Engine and Reallusion's Character Creator 4 for character animation: full body, facial, etc. It might actually be superior to Unreal Engine, more universal, and easier to set up 🤔 but your opinion and a video would be awesome
My guess is that the iPhone app is bundling the RGB video data with the depth from the iPhone's camera (and possibly its ARKit 3D model as well). If that's happening, then Unreal is likely matching its analysis of the RGB video with the extra information the app gave, using both together to give a better result.
I'd love to see how this compares with just using RGB video, since I don't have an iPhone.
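If it helps make that guess concrete, here's a tiny speculative sketch of what such a bundle might look like. This is pure illustration of the hypothesis above, not Apple's or Epic's actual format; the class and field names are invented.

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class CaptureFrame:
    """One captured frame (speculative layout, not the real format)."""
    rgb: np.ndarray      # H x W x 3 color image from the front camera
    depth: np.ndarray    # H x W depth map from the TrueDepth sensor
    timestamp: float     # capture time in seconds


def depth_at_landmark(frame: CaptureFrame, x: int, y: int) -> float:
    """Look up the sensor depth under a 2D landmark found in the RGB image.

    A solver could use this to sanity-check RGB-only tracking: a landmark
    whose measured depth disagrees with the face model's predicted depth
    is probably a mistrack.
    """
    return float(frame.depth[y, x])
```

Without the depth channel, the solver would have to infer 3D shape from the 2D video alone, which is presumably why RGB-only capture tends to come out less accurate.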
I want to know if you can create my avatar so that it also works on Zoom and in social media videos, and lets me easily modify my wardrobe. If so, I'd like to know your rates and how to contact you. Thank you
👍🏻👍🏻
There was a company called Faceshift that Apple bought up. That software is embedded in all iPhones. That's why this works. It's not Unreal doing the heavy lifting.
Prior to that, Faceshift worked with a $50 Kinect.
Is this possible with Android? 😅
Thank you for doing this video… as we usually have to search every UE YouTube video for this info. The problem is… in one year, UE will probably change the interface and procedure, making the above video obsolete!
Why can't I save this video to my playlist?
Sir, I have a sci-fi story. I want to make it into a short film. But I have no money and no software. Can you tell me how to do it? Please, any suggestions, sir.
Can be used on iPhone X and XR. That's how I do it.
I wish they would update the body.
Can you do it on Android?