This looks incredible. I remember spending a lot of time with a Kinect in MikuMikuCapture years ago, and have been surprised to see no good all-in-one solution for face, body, and hand tracking released since. I hadn't known about MediaPipe Holistic until now.
The best of luck with this project! I think I will learn blender now :)
Man this is SO exciting, especially shaking hands with the FreeMoCap project is really cool to see! This is PRIMO
Dude, I love you. Thanks for creating an alternative flexible enough for DIYers (especially those that lack all the best equipment) and professionals.
This is opening up a lot of possibilities for a lot of artists. Thank you!
Worked great, installed everything. Here are some suggestions: a preview of the video detection progress, and the option to choose at which frame the video detection starts and ends. Congratulations on the excellent work!
I use this with Blender 3.6. I haven't used mocap before and this made it incredibly easy. The tutorial is a must but after that, pretty smooth sailing. Thanks!!
You are creating an open-source full motion tracking solution, it is insane. Thank you so much!
Subscribed! Looking forward to updates like Auto Rig Pro compatibility.
Your work is amazing; the pose and hand modes are the best ones.
I believe it's an AirTrack. BUT Holy crap! This is awesome!!! Great work!
This is really impressive! You deserve a lot of support and donations 👏
This is amazing! Thank you for all the work!!!!!!
You are a legend! Appreciate your work, thank ever so much. Keep well and keep up the excellent work.
Just found this few days ago... Just wanted to say thank you 🙏
Incredible job!! Looking forward to testing.
I have a suggestion. It seems the nonlinear editor cannot be used to make adjustments, right? Could you create a button that takes the final rig, creates keyframes for the rig, releases/deletes the empties, deletes the constraints, and leaves just keyframes?
That way Blender's nonlinear editor could be used. That would advance mocap. THANKS
*Everything works, but why are all movements tied to the center? The character is supposed to walk, but instead, it moves in place.*
Thanks for the update! Super
the app is gone from the iPhone App Store :( what happened?
I don't have any more words than just beautiful, amazing, and amazing.
Amazing! I was wondering when you could use videos for mocap, as my webcam isn't the best in quality haha
Looking forward to more updates on BlendArMocap!!!!
Videos have been usable for a while now; I made it the default as it had been overlooked by many people :0
Can you place your video in another part of the UI so we are able to see what options you select on the right side of the screen? Awesome tool you built!
Awesome work!
What should be done if your mesh overlaps with itself after applying mocap data?
Sad this isn't maintained anymore. If I knew how, I would offer to help, but I don't.
You need more followers, that's amazing, and with the GNU license it's a very helpful mocap add-on. Thank you!
This is a magnificent update
Thanks
Amazing stuff, truly! And the option to import from freemocap translates into multicam goodness for us DIYers! Question: Have you considered doing your own multicam setup for BlendArMocap within Blender?
don't plan to develop it further for now
@@cgtinker Oh that's sad, but understandable. Keep up the good work!
@@danyalghani7421 thanks :) hope you doing well!
This looks amazing. I read the docs. I didn't understand why blender needs to run as admin.
this looks amazing since I'm planning to do an animation project
Is it possible to remove the mocap data after transferring it?
After I reassigned the animation to rigify in blender, it did not capture the center of mass information. How can I solve the problem?
The fact that you were able to make real-time motion capture work in Blender is amazing! Any chance you could take this tech and turn it into something that works independently of Blender, so that it could be used, for instance, with other animation software, with video game engines, or as a tracker for vtubing software? Because if you could make this work in a video game engine, this could change the game for amateur v-tubing.
should work, but I guess I don't have the time to maintain stuff like that (kinda capped with my current tools..). Which game engines were you thinking of? just curious
It's great, but I have a little problem: the face has some issues. The markers show the precise movement of the face, but the face rig doesn't move the same way.
check out the distance driver setup, you can modify the results and maybe even enhance the transfer config.
My webcam doesn't work in Blender 4.0.2 on macOS 14.2.1, what should I do? I installed everything needed.
This is amazing !
Hi! I'm running Blender 3.4.1 and I have session data from freemocap. But when I click Load Session Data I get the error message "Session directory not valid. D:\videos\mocap session\session_2023-03-11-10_29_10\"
Wonderful work you are doing here (and Jon). If I can redirect more information to you let me know!
Thanks, uhm, does the folder contain the folders "DataArrays" and "SyncedVideos"?
@@cgtinker I see the folders "annotated_videos", "calibration_videos", "output_data" and "synchronized_videos"
Seems like the folders have been renamed on the freemocap side? Are there .npy files in the output folder?
@@cgtinker yes i see in "output_data" mediapipe_body_3d_xyz.npy mediapipe_face_3d_xyz.npy, etc.
in "synchronized_videos" Camera_003_binary.npy Camera_004_binary.npy
I'm also running the freemocap GUI to do the calibration / capture / sync / export to Blender, so I'm not sure if that has something to do with it.
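If you want to sanity-check those arrays outside the add-on, a minimal numpy sketch (the session path is just an example):

    import numpy as np

    # Load one of the freemocap output arrays, e.g. the body landmarks.
    data = np.load(r"D:\videos\mocap session\session_2023-03-11-10_29_10\output_data\mediapipe_body_3d_xyz.npy")

    # Expect something like (frames, landmarks, 3) XYZ positions.
    print(data.shape, data.dtype)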
Can you do a demo of the face with video and sound in Blender? I loaded BlendArMocap and an audio track, but it seems the mouth movement while speaking doesn't transfer. Please, thanks!
Android's blendarTrack cannot compress files for sending to a computer. How do I set up the application?
I noticed that all my recordings create uncanny results where the face is really shaky! What should I do?
How do I combine retargeting of blendartrack facial data with a Mixamo animation?
When I'm done transferring drivers, how do I adjust the Rigify pose? I can't keyframe or move the Rigify rig.
Thanks. Is it not possible to also use rigify rigs with the new face rig? Even if I just would like everything but the face mocap? Also, does transfer mean it gets linked via drivers, or does it bake the movements to the rigify rig? Thanks again!
For the new face you'd have to create a new config file - I think renaming targets should do. Linking is via drivers and constraints, this is intended. It allows you to modify the data more easily using the driver settings in the constraint panel.
Thanks for this amazing update!!!!
oh, also will blendARtrack get any updates?
the app is capping the framerate at 15fps unfortunately...
Also I wasn't able to set up a 1920×1080 resolution, but that isn't a big issue compared to the framerate.
(even with lower settings)
It also has some small tracking displacements when one uses different angles, points the camera up into the sky, or turns around.
An external stabilizer gimbal fixes a lot of the camera shake, luckily ☆
planning to switch the video recording subsystem in the future.. will take time - but let's call that a yes^^
Sorry, but I didn't find a way to transfer face animation to HumanGenerator models
The video covers everything you need to know to create a custom configuration to do so. It's some work though. However, once a config has been generated it can be shared, so hopefully the transfer config pool grows over time.
Have you figured out how to transfer face animation to Human Generator models? I am stuck at that part of the transfer too.
@@dandylion-evn7w2 no
Thank you for your video. How do you get the meta rig into a model? Thank you.
Does this system work with IClone character?
How do you add an actual model to all this
Hi - is there some way to see the frame rate of the motion capture data? I could see in the BlendAR app that it seemed to vary. I'm trying to figure out how best to sync it to audio.
for blendarmocap with video files that should basically work out of the box. I also synced audio with blendartrack (iOS) successfully - using a clapper helps a lot... as long as the frame rate is consistent there shouldn't be an issue
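For video-based captures, a minimal sketch for reading the clip's detected frame rate and matching the scene to it (the clip path is an example):

    import bpy

    # Load the capture as a movie clip and read its detected frame rate.
    clip = bpy.data.movieclips.load("/path/to/capture.mp4")
    print("detected fps:", clip.fps)

    # Match the scene so audio stays in sync; fps_base handles fractional
    # rates like 29.97 (effective fps = fps / fps_base).
    scene = bpy.context.scene
    scene.render.fps = round(clip.fps)
    scene.render.fps_base = round(clip.fps) / clip.fps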
@@cgtinker for blendartrack, does it allow you to record the audio? I ended up making weird faces to use as my clapper!...and was recording my audio into OBS - my problem was partially, for sure, having different frame rates between Blender and Premiere...and OBS for that matter...I just need to smooth out my workflow. Follow-up question: is the data from blendarmocap as high resolution as from blendartrack? What a great set of tools you've created!
@@Hepworks you're welcome :)
oddly enough I'd say blendarmocap is somewhere in between! arcore (android) < media pipe (desktop) < ARKit (iOS)
think I'll soon switch to shape keys, I think that will improve transfer to faces a lot, point-based is not too great. I'm considering making most of the tracking stuff standalone in a separate exe, as it's easier for me to maintain.
Hey, can we put the animation on another rig as well? Because when I start the animation transfer it stops the other armature's animation. Please reply 🙏🙏🙏🙏🙏
I don't fully understand sadly. Do you want to "retarget"? Also, due to the nature of the add-on, driver objects get deleted when a new config gets applied, the add-on is not intended to animate multiple rigs in one scene.
th-cam.com/video/ZfLU2NXWsUE/w-d-xo.html
Thank you so much for your reply.
Can I export an animated armature and import it again in Blender, so that I can use the animation on a different rig in Blender?
@@ArpittheFact you can use this rigify extension I made to make the export easier:
github.com/cgtinker/rigify_gamerig_extension
link the metarig to the control rig, then bake some animation(s), unlink it and, for example, export it as FBX once you are done
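A rough sketch of that export step in Python (the object name and path are examples; bake first as described):

    import bpy

    # Select only the baked armature ("metarig" is an example name).
    rig = bpy.data.objects["metarig"]
    bpy.ops.object.select_all(action='DESELECT')
    rig.select_set(True)
    bpy.context.view_layer.objects.active = rig

    # Export the selection with its baked animation as FBX.
    bpy.ops.export_scene.fbx(
        filepath="/path/to/mocap_animation.fbx",
        use_selection=True,
        bake_anim=True,
    )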
nice, thank you!
I don't understand how to edit the animation after the transfer is done. Help please.
All the data inside the collections uses keyframes (besides the objects with a .D suffix, which are just drivers based on it). You can modify driver values using the mask presented in the video; besides that, working with the keyframed data is good as well.
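If it helps, a quick sketch to list which tracking objects carry keyframes and which are the .D driver helpers:

    import bpy

    for obj in bpy.data.objects:
        if obj.name.endswith(".D"):
            print("driver helper:", obj.name)
        elif obj.animation_data and obj.animation_data.action:
            print("keyframed:", obj.name)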
Hi there, is a full tutorial planned? Like for a stickman or whatever; I have problems transferring it to a character -_- and I already tried to understand your videos.
Can you provide a separate video to introduce the general algorithm from MediaPipe to bones? I'm not very familiar with Blender
there isn't a general algorithm. I've started working on a pypackage but it's not done yet
github.com/cgtinker/mediapipe_rotations
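To illustrate the core idea (not the add-on's actual implementation): a bone rotation can be derived by comparing a tracked landmark direction against the bone's rest direction, e.g.:

    from mathutils import Vector

    def landmark_rotation(head_xyz, tail_xyz, rest_dir=Vector((0.0, 1.0, 0.0))):
        # Quaternion rotating the rest direction onto the tracked direction.
        tracked_dir = (Vector(tail_xyz) - Vector(head_xyz)).normalized()
        return rest_dir.rotation_difference(tracked_dir)

    # Example: shoulder at the origin, elbow tracked forward and down.
    print(landmark_rotation((0, 0, 0), (0.2, 0.8, -0.3)))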
amazing
Hi, there is a problem: when I transfer the animation, the arms of my rig don't move, but everything else does. Do you have any idea why, and how I can fix it? Thanks!
most likely the rigify rig has not been generated
Will BlendArMocap work with Mixamo rigs?
technically yes, the tools are there to do so within the add-on but I don't create custom configs
How would I bake the motion from drivers to keyframes?
baking works the same as usual, just bake with visual keying
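In script form, the equivalent of Pose Mode > Animation > Bake Action with visual keying enabled (the frame range is an example):

    import bpy

    # With the rig's bones selected in Pose Mode:
    bpy.ops.nla.bake(
        frame_start=1,
        frame_end=250,
        visual_keying=True,      # sample the constraint/driver-evaluated pose
        clear_constraints=True,  # drop the mocap constraints afterwards
        bake_types={'POSE'},
    )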
Awesome !
Can I use a video if I don't have room to make a dance mocap animation?
yes
It's great, cgtinker! Can you please make a video on how we can transfer the captured data onto another model? Or how we can export this data in FBX format for mocap animation?
Guess I'll make a video about exporting soon, as many seem to have issues doing so.
The video should actually give you the knowledge on how to transfer to another model.
@@cgtinker so if we select the armature of the model and apply the mocap to it, will it automatically work?
@@shoaibakhtar4934 the config is for rigify; you can change the config to support other / similar rigs. So if the armature is a generated rigify rig - yes
It is really cool!!
Could you show me how to fix the error "installation of mediapipe failed, check the system console output" when I try to install the dependencies?
create a GitHub bug report and give me some info there (hard to debug on YT). Let me know the following things:
1. Operating system, Blender version; copy the logs from the system console and either attach them as .txt or paste them
github.com/cgtinker/BlendArMocap/issues/new/choose
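A quick way to gather those details from Blender's Python console:

    import bpy, sys, platform

    print("Blender:", bpy.app.version_string)
    print("Python:", sys.version)
    print("OS:", platform.platform())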
thank you my brother, now it is safe!!!!
Just curious. What linux distro is that ?
manjaro
When I use it to make face animations, the animation is too smooth on the character - any idea why? It moves very little.
Probably you have to increase the influences. You can change up the settings of the mapping objects and it should be fine.
@@cgtinker Will try that, thanks. I scaled the empties to a large scale and that 'fixed it', but the tracking doesn't really convey the lip motion super well (on my model at least).
@@kendarr sometimes it requires some cleanup / change of settings - planning to improve the face transfer results in the future.
If you go into the "cgt_face" collection you will find empties named something like "cgt_lid.R.L" - in the object data props (where the constraints live) you'll find custom settings which help to improve the mapping
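A sketch for listing those empties and their custom settings (collection and naming scheme as above; the exact property names may differ):

    import bpy

    for obj in bpy.data.collections["cgt_face"].objects:
        props = {key: obj[key] for key in obj.keys() if key != "_RNA_UI"}
        if props:
            print(obj.name, props)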
Is it possible to transfer the data to another armature? Like Auto-Rig Pro or Mixamo, for example?
should work, yes, for autorig it may be sufficient to just change targets without touching the drivers. (I don't have autorig but it seems to use rigify).
@@cgtinker hey...thanks for the update...I want to try to create a target rig for the metahuman rig...let me know if you have an outline to get started or any other suggestions....if I can follow an outline for the rigify rig, let me know...
@@misha4sculpt the rigify config delivers a fairly nice outline to do so. Most should work by just switching targets
@@cgtinker where do I find that...
..never mind...found it
How can I export an FBX file including the animation from the rigged file?
Actually I tried it, but every time I failed...
You've got to bake before exporting.
Where do you want to export to?
@@cgtinker How can I explain where I tried to export to... I want to explain with a picture, because I cannot explain it exactly. Let me know your email address please...
@@haganji hello@cgtinker.com
@@cgtinker I have sent you email. Check it out please.
@@haganji I'll :) give me 2-3 days
I used 12 keyframes 😂😂 it took 2 min for 30 sec
I hope you get heat in your place.
16°C it's fine :'D