Part 2 of the UE5 mocap animation workflow has arrived!! I think it's time to do an *actual* project using all of this. Any suggestions??
You're underrating your own acting.
I know this is sponsored, etc., but seriously, Unreal needs to employ you full-time as their animation evangelist. Any time we watch a pro animator using Sequencer, it gets us all excited about the possibilities. More vids please, Sir Wade!
The end result was actually pretty good. If the eye-line had been more consistent, it would have really sold it. If you had acted against a person or image off-camera, that might have fixed that issue.
We want more videos like this. Xsens, please give this man the suit!
Why am I getting such a Walter White vibe from the acting? YOU DID SO FREAKING GOOD
14:25 😂 I'm pretty sure a lot of triple-A studios could learn from this video. I swear 90% of video games look like the character on the left 😅 This explains so much.
You did an INCREDIBLE job! I think a big problem with a lot of MetaHuman videos is that there are too many over-exaggerated facial captures, which end up looking uncanny in the software. Your acting makes the model look much better!
Looks awesome Sir Wade!!
Hey @SirWade, I just found your channel and I love your easygoing way of explaining everything in just enough detail to make it clear. You have my subscription :)
I'm not sure if I understood you correctly, and perhaps you already know this, but when you export your facial animation from MetaHuman as an animation sequence, there's a checkbox there to remove the head movement.
Oh, and Xsens, let the man keep the suit. You cannot ask for a better ad!
You're really good at acting...
30 seconds in and I'm like "omg, he's got an Xsens! So freaking cool!"
Xsens, definitely let this man have 24/7 access to one. On loan, given for free - doesn't matter.
You can ignore the head and neck movement from the MetaHuman Animator solve by unchecking an option in the solved asset.
I wrote my own mocap system for Unreal because I really wanted to capture hands, body, face, and voice all at once, and getting all of that directly in-engine is super convenient. I think the Live Link Hub that's coming will help simplify the whole process, though. Anyway, great stuff!
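For anyone curious, here's a rough sketch of the core idea in plain Python (hypothetical names, port, and payload; this is not Unreal's actual Live Link API): sample body, hands, face, and voice on one clock and ship them as a single packet, so a custom in-engine receiver gets frames that are already in sync.

```python
import json
import socket
import time

UDP_TARGET = ("127.0.0.1", 54321)   # assumed address/port of a custom in-engine receiver

# Placeholder reads standing in for real sensor/SDK calls.
def read_body():  return {"hips": [0.0, 0.9, 0.0]}
def read_hands(): return {"left_index_curl": 0.2}
def read_face():  return {"jaw_open": 0.1}
def read_voice(): return {"level_db": -18.0}

def stream(fps=60):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    frame = 0
    while True:
        # One frame number and one timestamp per packet keep every stream aligned.
        packet = {
            "frame": frame,
            "time": time.time(),
            "body": read_body(),
            "hands": read_hands(),
            "face": read_face(),
            "voice": read_voice(),
        }
        sock.sendto(json.dumps(packet).encode("utf-8"), UDP_TARGET)
        frame += 1
        time.sleep(1.0 / fps)

if __name__ == "__main__":
    stream()
```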
That sounds awesome!!
7:45 That's a lot of RAM!
I was thinking the same: 128 GB of RAM is incredible 😄! But I'm also wondering about the GPU, the RTX 5000 with 16 GB of VRAM. I'm curious how it stacks up against a 3080 Ti / 4090 with the same amount of VRAM. I'm going to guess the 4090 laptop GPU will likely be the fastest option.
I was waiting for this tutorial!
This motion capture suit system looked really neat, Sir. It's also great to see a new upload from you again. Just wondering, do you think there will be more affordable versions of mocap suits in the future? I'd love to give this a try sometime. Looks fun.
Look into Movella's subscription pricing, man. It costs £1,300 (the minimum subscription plan) to reprocess data with this suit, on top of the suit cost. It's not for everyone.
I'm pretty sure the Rokoko or the Perception Neuron are the cheapest suits available currently.
But if you want cheaper, look at some of the AI solutions. Rokoko has one (pretty sure it's free?), and Radical is another; they've just added finger tracking.
Thanks for sharing!
Also, could someone mention the name of the interior asset used in that last cinematic shot?
Obsidium Dark Fantasy Modular Pack :) One of the more expensive things I've bought from the Marketplace, but it's SO cool!!
Hi, what AI software did you use to change the voice? Thanks, great work btw! Subbed!
Basically, these tutorials are becoming product demonstrations day by day, as most of us can't afford super expensive suits. Sigh!!
It's mainly so everyone can get familiar with the process, since this is what you'd be using at many studio jobs - that way you have a head start if you end up in a professional environment where you need the knowledge, or just want to understand how it works in production :) That's why the suit was on loan to me - it's out of my budget too haha
@SirWade Would appreciate an alternative, because I'm tired of watching a lot of channels doing the same thing while people like us with no budget have a hard time getting a decent output. I'm a sound engineer by profession; I do 3D for fun as a hobby and try to showcase my audio work through storytelling, but the limited free animations and not-so-professional output make me very frustrated, since I am by no means an animator (I wish)!!
Question: what's the AI tool you used to deepen your voice?
It was a free website; I think it was ElevenLabs?
@SirWade I'll check that one out, thanks!
The music is appropriate, basically the Wild West.
Hi Sir Wade, I would like to purchase this Xsens mocap suit from Xsens as a used product. Would you ask them if they would be interested?
Me too. Please let us know, Sir Wade!
You can reach out to their sales team and ask about it; I don't know what they do in terms of used hardware sales :)
Why do those animations have a glitch?
While the facial animation looks amazing, I think the acting itself doesn't fit this type of character.
You are pretty expressive with a lot of facial movement, but the character himself is a bit old, with a face marked by wrinkles. It gives the feeling the character would be stoic, or at least express himself with measured facial expressions.
Hi Sir Wade, I have a question: I have a walk cycle of 30 frames at 30 fps. If I want to import it into a game, do I import 29 frames so that frame 30 isn't the same as frame 0, or do I import the complete cycle and Unreal only reads from 0 to 29?
You export one frame less than the full cycle so that frames 30 and 0 don't both play right next to each other :) So if 0 and 30 match, export 0-29.
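If it helps, here's a tiny illustration of the loop math (plain Python, nothing Unreal-specific; the names are just for the example):

```python
# A 30 fps walk cycle where the pose on frame 30 matches the pose on frame 0.
cycle_length = 30  # exported frames 0-29, dropping the duplicate frame 30

def pose_index(global_frame):
    """Which frame of the exported loop plays on a given playback frame."""
    return global_frame % cycle_length

# Playback frames 28..31 map to poses 28, 29, 0, 1 -- the pose never repeats.
# If frame 30 were also exported, the identical pose would play twice in a row
# (...29, 30 (= pose 0), then 0, 1...) and the walk would stall for one frame
# on every loop.
print([pose_index(f) for f in range(28, 32)])  # [28, 29, 0, 1]
```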
Great job man and thanks for sharing!
Thank you!
I'd like to see you switch to Rokoko.
Just turn off the head rotation for the facial mocap. No need to blend.
You look like a carbon copy of Alex Hirsch (the guy who made Gravity Falls).
Goodbye hardware and more money; welcome AI that works with any simple video.
🥺
Bro, the quality is not good, for real. You can't use raw data; you have to clean the face up and do a lot of tweaking to make it look real. Just advising.