Loving this channel... the content is great and the way you explain everything is really clear. Keep it up.
Thank you for this tutorial, very helpful! Capturing facial performance using ARKit is really simple and nice, but I hope the resolution of this capture will improve in the future. It's nice for some stylized animation with exaggerated expressions, but not for anything realistic.
Thanks for the tutorial; it's the first thing I tried that worked without any setup. Next time make your UI fonts bigger, though. I can't see a thing in the menus.
Please increase the size of your fonts and panels; I'm unable to see anything, including which add-ons were used or the options that were tweaked.
Have you tried combining a Mixamo animation with ARKit on a character rigged with Human Generator? I'm considering purchasing Human Generator and haven't been able to find an answer to that question. Thanks.
Blender should have native AUDIO-to-FACIAL animation ASAP.
Hey thanks for the video.
In your corporate demo video, how did you get the arms and body to move?
Was wondering the exact same
If recording and baking data automatically from an iPhone to a PC fails for you, you just need to enter the correct IP address of your PC. I foolishly entered the same iPhone address in both fields :) To find the PC's IP address, press Win + R, type cmd in the window that opens, and click OK. Then type the ipconfig command and press Enter.
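If you'd rather grab that address from a script, here's a minimal Python sketch of the same idea (the exact field name in the app may differ by version). The UDP connect() sends no packets; it just makes the OS pick the outgoing interface:

```python
# Sketch: print this machine's local IP address (the one to enter as
# the PC address in the app). A UDP connect() sends no packets; it
# only makes the OS choose the outgoing network interface.
import socket

s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
s.connect(("8.8.8.8", 80))  # any routable address works
print("PC IP address:", s.getsockname()[0])
s.close()
```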
How does it deal with eye shape keys? Are the eyes rigged to begin with? Eye rotations can't be stored as shape keys, since they don't deform the mesh.
Wow! Your tutorial is very interesting. Does the Facemotion3d addon also work on characters from Daz3D imported via Diffeomorphic?
How do I set this up with any model? I need real time, please.
I'd like to use this method, but the head was built with the FaceBuilder addon and currently isn't rigged. If anyone can confirm that this is possible, I'd be stoked.
You've really helped me out here. Thank you so much.
Wow, that is cool! Thanks, subscribed)
Hello, and thank you for this video, which will help me enormously with the facial animations of my characters. My question is this: does it work only for Human Generator, or also with characters created on other platforms? I have already created my characters with Mixamo and other facial software; will it automatically animate those as well? Please answer.
It will work as long as the face is set up to work with ARKit. Human Gen just sets that process up for you instead of you having to make all the shape keys yourself.
Thank you very much. I have just tested it, and indeed ARKit must be configured on the character. I will buy the unlimited version to get more functionality from the software. Thank you again for your video, which encouraged me to use this system.
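For anyone whose character wasn't made with Human Generator, a hedged Blender sketch of where "making all the shape keys yourself" starts: stubbing out empty, ARKit-named shape keys. Only a few of the 52 ARKit blendshape names are shown, and each expression still has to be sculpted by hand:

```python
# Sketch: stub out empty ARKit-named shape keys on the active mesh.
# Only a few of the 52 ARKit blendshape names are listed here, and
# every key still needs its expression sculpted by hand.
import bpy

ARKIT_NAMES = [
    "browInnerUp", "eyeBlinkLeft", "eyeBlinkRight",
    "jawOpen", "mouthSmileLeft", "mouthSmileRight",
    # ...and the rest of the ARKit names
]

obj = bpy.context.object  # the face mesh
if obj.data.shape_keys is None:
    obj.shape_key_add(name="Basis")  # the rest shape must exist first
for name in ARKIT_NAMES:
    if name not in obj.data.shape_keys.key_blocks:
        obj.shape_key_add(name=name, from_mix=False)
```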
Super useful, thanks a lot! I was wondering if I could ask for your advice. I'm creating a character whose face is an animated texture: for example, you get different versions of the eyes, and the opacity of each version is controlled with a Mix Shader. I guess I could set up some control bones for these texture parameters. Can I somehow map the data from the app to these parameters instead of a fully fledged facial rig? Hope that makes sense lol
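This one never got an answer in the thread, but in principle yes: the mocap lands on shape-key values, and a Blender driver can pipe any of those values into a shader input instead of a bone. A hedged bpy sketch, where "Face", the node name, and the shape-key name are hypothetical placeholders; repeat per parameter you want driven:

```python
# Sketch: drive a Mix Shader's Fac input from the captured
# "eyeBlinkLeft" shape-key value, so the mocap blends textures
# instead of deforming the mesh. "Face" and the node name are
# hypothetical; match them to your own scene.
import bpy

obj = bpy.data.objects["Face"]
node = obj.active_material.node_tree.nodes["Mix Shader"]

fcu = node.inputs["Fac"].driver_add("default_value")
drv = fcu.driver
drv.type = 'AVERAGE'  # with a single variable this passes it through

var = drv.variables.new()
var.name = "blink"
var.type = 'SINGLE_PROP'
var.targets[0].id_type = 'KEY'  # the shape-key datablock type
var.targets[0].id = obj.data.shape_keys
var.targets[0].data_path = 'key_blocks["eyeBlinkLeft"].value'
```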
I noticed 'enable head rotation' was on in the app, but it didn't come across to Blender. Is there a way to get that? Or did you keyframe it manually afterwards? Thanks!
ARKit only captures face data, so anything else has to be keyframed after capture.
Ok gotcha, thank you
The iOS app worked OK. However, the Blender add-on just didn't work :-(. After "stop recording" it just said "Now Baking" with no activity on the screen or CPU usage. In the end, I went the export-and-convert route.
How long did you leave it in the "Now Baking" stage? I had a problem a couple of weeks ago where the link wouldn't properly bake, but a reinstall of the addon fixed it.
@@ZaneOlson I gave up after 20 minutes on the longest test, but I haven't reinstalled the add-on. I'll give that a go! Thanks
@@djmnx01 It should take only a minute or two at most. I'd also check that the network isn't interfering somehow; I didn't have any issues, but I imagine some network setups might be a little more restrictive.
@@ZaneOlson Thanks for the info. The audio transferred over fine, and the realtime streaming also worked, so I don't think it's the network. I'll reinstall tomorrow and see if that has any effect.
@@djmnx01 hopefully that takes care of it
What about eyeball movement/rotation?
I think it's supposed to, but for whatever reason it has never worked for me. That hasn't really been an issue, though, because I just remove it so I can animate the eyes to match whatever eye line I need in the scene. I doubt I could nail that through the mocap.
Is this only a Rigify thing, or is there Auto-Rig Pro compatibility too?
No, this is separate. Using ARKit removes the ability to pose the face with any other method. To get the best of both worlds, I create a backup of the original just in case.
@@ZaneOlson dude so hype you replied. Your work is insane. Been studying for months to get to a point where I can ‘begin.’
Sorry if my question here is dumb, just trying to clarify things:
So using ARKit essentially overwrites the original facial rig? This is why you're saying to back it up. It's overhauling the face rig, and for that reason it's usable with Auto-Rig Pro or Rigify. 🤔
Sorry, I’m really not that bright… just persistent as heck.
@@c0nsumption It's all good; this stuff is confusing as hell. It's decades of processes built up and implemented over time, so there's a lot of info and a lot of stumbling around to find solutions.
In HumanGen, ARKit overrides the Rigify face controls only, so the whole body is still animated from the Rigify rig, but the face needs mocap to animate. You can revert it, but I've had crashes trying, so I duplicate the whole folder and use one copy for ARKit and keep the original with the Rigify face controls. I've never actually had to use the Rigify face controls, but having the option is nice.
Good luck on your journey, and if you have any other questions or need clarification, feel free to ask anything. In one of my first weeks using Blender I spent a couple of days just learning how to properly select items in the scene, so there's no such thing as a stupid question here.
@@ZaneOlson thanks dude 🙏🏽
Legit. Will tag you when I have something worth showing for sure.
@@c0nsumption I look forward to seeing it!
I suppose this doesn't work with eye rotation and movement?
No, but for most workflows animating the body and eye movements separately is an advantage, especially since the mocap works without a reference and it's easy to mess up eye lines.
@@ZaneOlson Thanks for responding, Zane. Do you think you might make a video on how to bake physics, like wind FX, before animating other things? You mentioned this in your last video when you made the robe for the relic keeper.
@@ZaneOlson And I do enjoy manual animation as well, so no problem.
I have followed this, but when it comes to the recording step I am stuck on NOW BAKING. It also doesn't happen the same way for me: when I hit FINISH RECORDING on my iPad, it only gives me the option to export the FBX; I don't see the BLENDER IS PROCESSING step.
Everyone who has a similar problem: you just need to enter the correct IP address of your PC. I foolishly entered the same iPhone address in both fields :) To find the PC's address, press Win + R, type cmd in the window that opens, and click OK. Then type the ipconfig command and press Enter.
Which ARKit app did you use?
This is insane!!!
For Android or PC...
Faceware RT for iClone
$1,190.00–$1,590.00, not cheap; it costs about the price of the iPhone.
If you don't want to spend that much, you can get a used iPhone X for around $200–$300, and this will work on it.
Hi, I have a question, please. How do I make only a single character's face move in the scene when I have two or more characters? I noticed that when I start the stream, all the characters move their faces, but I need only one character to move when I start my stream. Thanks again.
That's a great question. I haven't had the need for two characters in one scene yet, so unfortunately I'm not sure. I think you could record and copy the keyframes from a separate file, maybe: if you have the master scene, you can record the mocap for character 1, then delete the keyframes from character 2 and paste in keyframes from a separate file. I'm not sure if there's a better solution, but if I find one I'll update this comment.
Thank you for responding so quickly. I'm still working on that. I did as you said, but the only solution I have found after trying a lot of tactics is to record my stream normally and then, in Blender, keep the keys just for the character concerned and erase the keys from the other characters. It's slow, but it works well. I'll continue researching. Thanks.
@@constantformation9205 I figured it out: at the bottom of the FACEMOTION3D panel there is an option, "Find all ShapeKeys in a scene". If you uncheck it, the mocap applies only to the selected mesh.
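And if a take has already been recorded onto every character before you find that checkbox, a small hedged bpy script can do the erasing described above; "Character2" is a hypothetical object name:

```python
# Sketch: unlink the recorded shape-key animation from one character
# so only the intended character keeps the mocap.
import bpy

other = bpy.data.objects["Character2"]  # hypothetical object name
keys = other.data.shape_keys
if keys is not None and keys.animation_data is not None:
    keys.animation_data.action = None  # drop its shape-key action
```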
Please tell me: how can I transfer keyframes to one character several times? I can only attach one set of keyframes to a character. How do I attach another set of keyframes?
Try using animation layers set to "Add"; YouTube it. This might be what you're looking for.
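Blender's built-in NLA can do that kind of additive layering too; a hedged sketch assuming the takes were baked to shape-key actions with the hypothetical names "Take1" and "Take2":

```python
# Sketch: layer two baked mocap takes on one character's shape keys
# using NLA strips; the second strip is set to 'ADD' so it stacks on
# top of the first instead of replacing it. "Take1"/"Take2" are
# hypothetical action names.
import bpy

keys = bpy.context.object.data.shape_keys  # facial mocap lives here
ad = keys.animation_data or keys.animation_data_create()

for i, name in enumerate(("Take1", "Take2")):
    action = bpy.data.actions[name]
    track = ad.nla_tracks.new()
    track.name = name
    strip = track.strips.new(name, int(action.frame_range[0]), action)
    if i > 0:
        strip.blend_type = 'ADD'  # layer instead of replace
```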
Hi, is it possible to transfer shape-key keyframes from one model to another? That is, both models have the same shape keys, and I need to transfer the animation from one model to the other. How do I do it?
Search for "shapekey transfer" by Royal Skies on YouTube.
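The core of that trick in script form, for reference; a hedged sketch assuming both meshes have identically named shape keys ("FaceA" and "FaceB" are placeholder object names):

```python
# Sketch: reuse one mesh's shape-key animation on another mesh that
# has identically named shape keys, by linking the same action.
import bpy

src = bpy.data.objects["FaceA"].data.shape_keys  # hypothetical names
dst = bpy.data.objects["FaceB"].data.shape_keys

# Shape-key keyframes live on the shape_keys datablock, not on the
# object, and the fcurve paths (key_blocks["..."].value) match by name.
dst_ad = dst.animation_data or dst.animation_data_create()
dst_ad.action = src.animation_data.action
```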
Why doesn't the head in Blender move like in the phone window? It surely has the data to do so.
ARKit only captures the face; everything else has to be captured another way or animated later.
I had that issue. In the FM app, go into Settings > Streaming Settings and change the Neck Bone Name and Spine Bone Name to match the names in your Human Gen character. I also changed the Armature Name.
@@odntemagnus did that solve the problem?
The link is broken; I can't get to it.
www.facemotion3d.net/english/download/
Looks like they changed the link recently. This one works; I will update the description.
Your screen resolution kills me.
The link isn't in the description though...
I just double checked, and the link is in the description. Did you click to reveal the entire description?
@@ZaneOlson oh I see it just wasn't blue
Thanks for the video... Do you know of any desktop face motion capture software that works with a webcam?
I'm honestly not sure. I think Rokoko has some good software. This method piggybacks on Apple's API, so integration with ARKit is locked to Apple since they developed the software, as far as I'm aware.
@@ZaneOlson Thanks for your answer. Another question 🙂 I tried to change the hair as you did with the weight manager in a previous video, but it didn't have any effect on the hair. Is there something I need to watch out for?
Hmm, I wonder, does the Live Link Face app work with Blender? It's free. Great videos btw 👌
Thanks! Live Link does work; this is just the method I prefer, simply because I don't have to import any FBXs and redoing takes is so much easier.
Yeah, only if you have an iPhone.