Unreal Engine 5.2 brings some awesome new features! I know what to do this weekend! 😁 What are your weekend plans?
Making blender animations! 😂
glad to see you got it working
you don't know a ffing thing about it lol
Okay, sure @@peter486
2 minutes of tutorial and 7 mins of sponsorship 😇
"We don't need expensive gear"
Also him: "I'm using my iPhone right now"
🗿
Yeah... saying "expensive" depends on a person's perspective...
iPhones in general are expensive... And not very durable 😅
If you think an iPhone for facial mocap is expensive, it's best you don't go into facial mocap.
iPhones are fucking cheap nowadays, we aren't in the 2010 era anymore, just buy an old one lol...
@@Zinzinnovich You'd need one capable of facial recognition, so don't buy one that's too old. Pretty much any model from the X onward should work.
Just a note - your MetaHuman has a bowling-ball-heavy lower face because you scanned yourself with a big beard. The solver doesn't know the difference between hair and skin, so it adds your beard as geometry, covers it in skin, and then you add your beard on top of that. You'd get a much better match if you went clean-shaven for the scan-to-MetaHuman/Animator creation and did your captures after that. From that point on you should be able to use Animator while having your beard and apply the animation data to your non-bulbous MetaHuman, just as you would apply it to anyone other than you.
I have a massive mustache and metahuman thinks I have an overbite. Will have to shave it but it should grow back in one to two days.
My nba2k player always looked ridiculous because of that
@@44punk yeah, that’s how Freddie Mercury got those big front chompers. He had that mustache when his face was created.
Would like to see a detailed tutorial from start to finish, from a blank project to final render. Not just face mocap but also body animations. It is often quite complex and confusing in terms of the number of steps and different interfaces required. Many of the “tutorials” do not explain things properly and often skip steps or start with “I have already set up the…”. Also, the accuracy of the lip sync is often off and letters like B and P are not done with closed lips, etc. And how do you animate the eyes to be more focused on the camera when needed? I would ideally like a single tutorial instead of having to watch dozens of videos, each doing something different and not always working as expected.
That's the whole point. Don't you think it's strange that THEY ALL suffer from the same total arrogance, stupid jokes and half-info, plus sponsoring their mates' crap for you to buy and get stuck with, whilst they make dollars?
have you found one yet? looking for the same
@@LiterallyJord Me too, did you guys find anything?
+100 for the neat folder structure and detailed naming of assets! :D Great tutorial!
Great tutorial man! This does skim over a lot of important details though. For example, what you showed here is only enough to animate the face and play it back, nothing else. Would've been nice if you had talked about baking to the control rig to do some tweaks, or even how to animate the face together with a body performance (your final export showed body and face). Also, exporting the animation the way you did will only work on your specific MetaHuman, since you specified that mesh. If you exported to the MetaHuman skeleton instead, it could work on any MetaHuman.
This is correct, I am trying to figure out how to connect the body and the head together without the head floating away
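For anyone who wants to verify that export point before redoing everything: with Unreal's Python Editor Script Plugin enabled, you can check which skeleton an exported performance AnimSequence actually targets. This is only a rough sketch under that assumption - the asset path below is a made-up placeholder, not anything from the video.

```python
# Rough sketch (Unreal Editor Python). The asset path is a hypothetical placeholder;
# point it at your own exported performance AnimSequence.
import unreal

anim_path = "/Game/MetaHumans/MyCharacter/Face_Performance_Anim"  # placeholder
anim = unreal.EditorAssetLibrary.load_asset(anim_path)

if anim:
    # An AnimSequence is bound to exactly one Skeleton asset. If this prints a
    # character-specific skeleton, the clip only plays on that MetaHuman; if it
    # targets the shared face archetype skeleton, it should be reusable on others.
    skeleton = anim.get_editor_property("skeleton")
    unreal.log(f"{anim.get_name()} targets skeleton: {skeleton.get_path_name()}")
else:
    unreal.log_warning(f"Could not load {anim_path}")
```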
Exactly... instead of all the promoting of their friends' add-ons and soft/hardware, give us info that's realistic and complete... spice girl with a beard
I don't get it, all these people promoting how amazing this feature is, and literally no one's talking about how it's completely unusable for complex animations because of huge bugs with head rotation, body disconnection, baking, etc.
Don't forget, this is just the first step... in a few months or years this is gonna be even better... amazing!
You don't need LiDAR, just the TrueDepth face cam on an iPhone
😁 Oh yeah. A depth cam can't map objects, it just gives the depth of whatever is in front of it. You can't get movement without LiDAR. Otherwise VFX films and games would just use a depth cam for full-body movement instead of a full-body suit with tracking points.
@@SggQpwpqpq-vq3ds Knew that, I was just referring to MetaHuman Animator and the mistake included in this particular video
TrueDepth camera is lidar bruh
That’s just lidar
You guys are great, you show from start to finish for a tutorial at a followable pace.
how do i fix the floating head
This is mind blowing, does it work with Android as well, as long as it has LiDAR?
I wouldn't call this high quality but its better than what we used to have.
This is a game-changer for face mocap! 🙌
YESS CINECOM UPLOADS ALWAYS MAKE MY DAY!
Hey Jordy, AKA MASTER ARTIST 😁 your Skillshare courses are awesome man. Funny and straight to the point 😃 Thank you so much.
Thanks for showing...I just wish it was even easier...so many menus to Open, so many options to choose wrong, so many steps to remember
I thought i was alone in this lost adventure 🤣🤣🤣🤣
There was a company called Faceshift that Apple bought up. That software is embedded in all iPhones. That's why this works. It's not Unreal doing the heavy lifting.
Prior to that, Faceshift worked with a 50 USD Kinect.
Is it possible to do on an Android phone?
I wonder if this would work with any other 3D model with enough blend shapes. This would be a really nice and easy way to animate not just the MetaHumans. BUTTTT Jordie... I'm surprised you guys are only noticing this now?! Live Link has been a thing since UE 4.24 or 26, and MetaHumans since like the 5.0 beta lol. BUT you still made it simple and fast to ingest and start creating, thank you :)
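On the "any other 3D model with enough blend shapes" question: Live Link Face drives the standard ARKit blendshape curves, so a quick way to see whether a custom mesh could pick them up is to list its morph targets and compare the names. A minimal sketch, assuming Editor Python scripting is enabled and that get_all_morph_target_names() exists in your engine version; the mesh path and the shortened curve list are placeholders, not the full 52-curve ARKit set.

```python
# Minimal sketch: check whether a custom skeletal mesh exposes ARKit-style morph
# target names. The mesh path is hypothetical and ARKIT_CURVES is only a sample.
import unreal

ARKIT_CURVES = ["jawOpen", "mouthClose", "eyeBlinkLeft", "eyeBlinkRight",
                "browInnerUp", "mouthSmileLeft", "mouthSmileRight"]  # sample only

mesh = unreal.EditorAssetLibrary.load_asset("/Game/Custom/MyCharacter_Face")  # placeholder
morphs = {str(name) for name in mesh.get_all_morph_target_names()}

missing = [curve for curve in ARKIT_CURVES if curve not in morphs]
unreal.log(f"Morph targets found: {len(morphs)}")
unreal.log(f"Missing ARKit curves (from the sample list): {missing}")
```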
"Don't need expensive gear", "I'm using my iPhone"
Last time I checked, iPhones (with LiDAR, like the iPhone 11) don't fall in the category "not expensive".
That's all talk: you show a lot of great things, but I haven't seen a single video of yours with a proper explanation of how to get the desired results.
8:22 - Performance to existing MetaHuman downloaded from Quixel Bridge / Created from MetaHuman Creator / Custom MH. Thank you!!! I was looking for that everywhere.
Thanks for the quick funny workflow. ❤😂🎉
*Is there any way to do this live with your phone?*
Nice tut. Why did you mention that it requires the LiDAR sensor when you haven't used the LiDAR sensor in the video?
It's used automatically by the iPhone when he records with the MetaHuman app... it captures all the data, not just a normal video.
It was so unusual to see your neutral face since you are mostly very animated! Love your videos!!
Unreal looks almost real now!
Happy Friday!
Thank you. This was really helpful
so many steps. I think this can still be made more efficient
Author:
We don't need expensive gear.
Author also:
I'm using my iPhone
Which link is for how to create a meta human?
Thanks for uploading video 😊
How did you fix the floating head?
I cannot figure it out for the life of me.
God damn it, another iPhone tracking video. I'M NOT BUYING AN IPHONE JUST TO TRY SOMETHING FOR 5 SECONDS!
Metahuman Animator does not support iPhone 11, only 12 and above.
We don’t need expensive gear, just an iPhone 📲
😅😅 nice tutorial.. Thanks
Clickbait. They want you to buy an "iphone".
I just had to
Like just every good MoCap
you guys are the best
So good, thanks!
That was great, nice bruh. Love from Indonesia
I clicked on the face and then clicked on the asset in animation mode. And I clicked on the animation sequence (performance), but the video does not play. What's the problem?
Same problem mate
Is there a way to animate a non Metahuman face?
So cool!
Love the videos!
I never thought I needed to see jordy with shredded abs and chest 😂
Nightmare fuel!
Wait a sec, the iPhone 11 doesn't have a LiDAR sensor 🤔
Do you have one for combining body and facial animation like you did here?
how do i fix my separated head?
I love the section where you explain how movie special effects were created.
And I have one quick question:
Is a VFX certificate required for employment at a company like MPC, SUNRISE, WETA, etc.?
Absolutely not required. You "just" need to be good at what you're doing. They don't care about a degree. You can learn everything you need through YouTube, Google, online courses and a lot of trial and error on your own. The one advantage a school might bring is contacts and connections in the industry. But generally I wouldn't recommend studying VFX, unless it's at one of the top schools... and they are pretty expensive.
Here's me hoping that one day there will be a way to utilise these types of technologies in a much simpler way.
Hi, can this method work on an Android phone?
Thank you
Are there significant improvements for the later editions of phones?
Seems like Unreal Engine's native support for mocap is only available using an iPhone (besides the professional equipment), am I right?
I saw some webcam versions of face mocap software but they don't seem smooth.
Buying a used iPhone is the only cheap way to do face mocap, I guess.
ACTUAL GOLD
Very COOL! However, I have been looking for a way to do this with a character that is not a MetaHuman, a character that I have modeled in Cinema 4D... What is the best way to do this face MoCap with a character like that?
Thanks very useful
Haven't you skipped the FIT TEETH part? Or is it alright to bypass that button?
Everyone... with an iPhone.
Can't wait for a body mocap with the same 4D tech.
Still waiting for the Body Motion Capture 😅
My only question is:
can you do a full-body version of this process for creating a character's movement?
So fun!
So... just a video ad and also a bit of Unreal.
Hi, can I use Blender or Unreal Engine on an iPad? If yes, then which is the best iPad to use for rendering and editing in Blender / Unreal Engine?
Is it possible only with an iPhone?
Crazy! Could this be used for lip sync with just an image of a face?
And just like that……. I need to watch again. Lol
I would go with open lips when picking the open frame.
It would have been cool if you had also mentioned the neck floating problem at the end, etc.
*yup this world moves so fast*
Stop telling people how they should teach. That really bothers me. Take what you get and sit your happy ass down somewhere and put in the time. This stuff ain’t for people who want it easy. You’ll never forget what you spend time on learning. Thanks for the tutorial!!
What do you think about the possibility of using Apple's upcoming Vision Pro for virtual production? Could it make LED walls obsolete? Could actors and crew use this to work remotely?
I like how 70% of this video is random ads
Thanks for the great tutorial, but could anybody help: why doesn't the animation work when I track it to the face mesh in Sequencer, after all the steps have been done? Can't find any info on how to solve it.
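If the performance plays back in the MetaHuman Performance asset but not in your sequence, one thing worth checking is whether the animation section is actually attached to the Face binding in the Level Sequence. Below is only a sketch of how that could be wired up via Editor Python; the asset paths, the "Face" binding name and the frame range are assumptions about a typical MetaHuman setup, not steps from the video.

```python
# Sketch: attach a processed performance AnimSequence to the Face binding of a
# Level Sequence. Paths, binding name and frame range are placeholders.
import unreal

seq = unreal.EditorAssetLibrary.load_asset("/Game/Sequences/MyShot")             # placeholder
anim = unreal.EditorAssetLibrary.load_asset("/Game/MetaHumans/MyGuy/Face_Anim")  # placeholder

# Find the binding that owns the face skeletal mesh component.
face_binding = None
for binding in seq.get_bindings():
    if "Face" in str(binding.get_display_name()):
        face_binding = binding
        break

if face_binding and anim:
    track = face_binding.add_track(unreal.MovieSceneSkeletalAnimationTrack)
    section = track.add_section()
    params = section.get_editor_property("params")
    params.set_editor_property("animation", anim)
    section.set_editor_property("params", params)
    section.set_range(0, 300)  # frames; match your shot length
    unreal.log("Animation section added to the Face binding.")
else:
    unreal.log_warning("Could not find a 'Face' binding or the animation asset.")
```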
My MetaHuman doesn't have hair when I include it in my project
Same problem for me too
Nice ✨✨
Spoiler alert: Everyone (who is RICH) can do High Quality face Mocap now.
Does this only work on MetaHumans or can I use my own 3D model with this face mocap?
Feel like the whole video is an advert
It is lol
I posted a question about Unreal Engine 5.2 and if it is possible to use imported animals like a fully rigged gorilla made with 3ds Max. Maybe I should not post a link here. Sorry. This could change everything as a filmmaker - will buy your course
Yes
@@liampugh Could you answer in more detail please? I am totally new to and interested in Unreal Engine. I do have a film studio with a greenscreen and some good lights. Do you think it is realistic to set this up using the course? Is there support in the course?
Is this Unreal Engine guide or a promo video?
Promo video with a course commercial at the end lmao
Any chance of getting this to work on Mac? The MetaHuman plugin is Windows only 😢
When the MetaHuman has clothes on and the head is moving, it breaks through the shirt
Excuse me, what is the price of the suit and where can it be purchased? Or can you only record with an Apple phone? Greetings from Mexico
Any iPhone 11 or up works?
Thank you, but how do I do shape keys in a fast way? Is there any program for that??
Does this work on an iPad Pro with the TrueDepth sensor too?
Thank you for doing this video… as we usually have to search every UE YouTube video for this info. The problem is… in one year, UE will probably change the interface and procedure, making the above video obsolete!
Dude, what are you talking about? This was possible with the Live Link app on iPhone years ago - but it was live, and you had to record the whole thing, put it together in Sequencer and bake an animation... I made myself a lot of animations last year, same quality.
Live Link Face was available, yes. But the way the animation data is processed and the number of facial data points are far different. The use of the lidar is the key change, and the final product here is WAY better. I too animated lots of faces with Live Link Face and there is simply no comparison.
Wow! I wanna know how to make a MetaHuman that looks so much like yourself!!! When I use MetaHuman Animator or MetaHuman Creator, the face doesn't look that much like me.
So cool to see you make more UE5 stuff 🤩
How do you add clothes?
We don’t need expensive gear, just an iPhone
Uhh yeah! Its is 😅
Thanks for the tutorial😃
Did you ever make games?
Note: Same exact voice and cadence as the YouTube robo-drummer.
Sorry, might be a noob question... Does this mocap data work only with MetaHuman characters? Or can we use it on our own characters created in Maya/ZBrush? Because everyone is just mentioning MetaHuman characters.
So, explain to me why someone would even need an iPhone if Unreal Engine is doing the actual facial capture with the green tracking marks? I would also LOVE a video from you on the Unreal Engine and Character Creator 4 from Reallusion workflow for character animation. Both full body, facial, etc. It might actually be superior to Unreal Engine's, more universal, and easier to set up 🤔 but your opinion and a video would be awesome
My guess is that the iPhone app is bundling the RGB video data with the depth from the iPhone's camera (and possibly its ARKit 3D model as well). If that's happening, then Unreal is likely matching its analysis of the RGB video with the extra information the app provided, using both together to give a better result.
I'd love to see how this compares with just using RGB video, since I don't have an iPhone.
Is this "Live Link" on PC? Using a phone is a little bit unprofessional and cheap.
That's a lot of work AND you need an iPhone!!!! - Stuff that, I stream my Face-Cap directly (live) from an Xbox 360 camera, which is about £20. 🥳
The thumbs