Nice, the side-by-side video with the Mocopi app on the right was super cool! The IK rig tips and animation processing workflows were great!
7:32 You can also add 3 copies of your animation and overlap them at the start and end; Unreal will blend the overlap for you. Mark your start and end so that they share the same pose in the middle of the 3 clips. Finally, bake those frames into a new animation.
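The overlap-and-blend trick above can be sketched numerically. This is a minimal illustration, not Unreal's API: assume each frame is a list of joint values, and a linear crossfade over the overlap region produces the blended frames that would then be baked into the new clip (the frame counts and pose format here are made up for the example).

```python
def crossfade(clip_a, clip_b, overlap):
    """Blend the tail of clip_a into the head of clip_b over `overlap` frames."""
    blended = []
    for i in range(overlap):
        w = (i + 1) / (overlap + 1)  # blend weight ramps toward clip_b
        frame_a = clip_a[len(clip_a) - overlap + i]
        frame_b = clip_b[i]
        blended.append([(1 - w) * a + w * b for a, b in zip(frame_a, frame_b)])
    # keep clip_a's unblended head, the crossfaded middle, and clip_b's tail
    return clip_a[:len(clip_a) - overlap] + blended + clip_b[overlap:]

# Three copies of a short "animation" (each frame is a single joint value here).
clip = [[0.0], [1.0], [2.0], [3.0]]
looped = crossfade(crossfade(clip, clip, 2), clip, 2)
print(len(looped))  # 3 copies of 4 frames, minus 2 x 2 overlapped frames -> 8
```

The blended middle copy is the part worth baking: its first and last frames sit on the shared pose, so the clip loops cleanly.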
Great video! Also I feel like "recording mocap in a public park" would be a great foundation for an Eric Andre sketch
I've been putting off learning Ik and retargeting for so long. It might be time to dive in. Thanks for this video and all the other ones too!
I think you are the channel with the most content about the mocopi. I wanna try it :'D
Great Video :)
Unfortunately, it seems like Mocopi data requires too much cleanup after recording. Something like iPiSoft or Rokoko's new dual-cam Rokoko Vision product is in a very similar price range and would give better results with less cleanup work in post. I do love the fact that Mocopi can be used outside so freely, though.
Thanks for saying this, I know he's sponsored but I do love honesty when it comes to checking out stuff like this.
I’m using Sony Mocopi and Live Link Face in Unreal Engine to drive the motion capture of a custom character that I built in Blender. I’ve been able to successfully save the Mocopi capture as an animation sequence, but I can’t figure out how to save/record the Live Link facial capture simultaneously. Please, if you have any tips, that would be SUPER helpful. Thanks
Would love to see a video that shows how to use Mocopi bvh files in Daz Studio.
There are plenty of tutorials on how to use BVH in Daz Studio on YouTube.
Hey Woody, I am hoping to test whether Mocopi will work with my ergonomic engineering software. Is there any way I can get example C3D/BVH files? Thanks in advance
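For anyone in the same spot: BVH is a plain-text format, so compatibility is easy to sanity-check by script. A file has a HIERARCHY section (ROOT/JOINT declarations with CHANNELS counts) followed by a MOTION section (frame count, frame time, then one line of values per frame). A minimal summary reader, assuming a standard BVH layout (the embedded sample file is made up for the example):

```python
def summarize_bvh(text):
    """Return (joint_count, channel_count, frame_count) from BVH text."""
    lines = [line.strip() for line in text.splitlines()]
    joints = sum(1 for line in lines if line.startswith(("ROOT", "JOINT")))
    channels = sum(int(line.split()[1]) for line in lines
                   if line.startswith("CHANNELS"))
    frames = next(int(line.split()[1]) for line in lines
                  if line.startswith("Frames:"))
    return joints, channels, frames

sample = """HIERARCHY
ROOT Hips
{
  OFFSET 0 0 0
  CHANNELS 6 Xposition Yposition Zposition Zrotation Xrotation Yrotation
  JOINT Spine
  {
    OFFSET 0 10 0
    CHANNELS 3 Zrotation Xrotation Yrotation
    End Site
    {
      OFFSET 0 5 0
    }
  }
}
MOTION
Frames: 2
Frame Time: 0.0166667
0 90 0 0 0 0 0 0 0
0 90 0 5 0 0 5 0 0
"""
print(summarize_bvh(sample))  # -> (2, 9, 2)
```

Each MOTION line should have exactly `channel_count` values, which is a quick way to spot a skeleton your software can't map.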
The only part that is complicated and possibly confusing is importing the BVHs into Blender and getting them into Unreal. Would it just make sense to record the animations directly into Unreal with the Mocopi plugin? It seems like that would cut out a significant amount of work.
If you’re recording inside, absolutely!
Can Mocopi be used in iClone 8?
Hi Woody, very new viewer, but I’ve always been so curious about motion capture. I use an animation program called Source Filmmaker; would Mocopi be good for funny little bits in it?
I know a lot of cool projects happen in Source. I don't use it personally, so my guess is you'd need to do some kind of conversion with the BVH.
@@WoodyDevs I figured as much, but my idea is using the Mocopi + some kind of finger tracking like Valve Index. Would that be possible? Converting to Source Filmmaker would require a DMX file, which I’ll figure out separately, but would you know how to pair Mocopi with a glove hand tracker or VR hand controller to capture fingers?
@@Slop_Box You would need to sync the hand mocap with the body mocap in post. Some people do it. It's not ideal, but it does do the job.
Interesting, I wish I had a camera
How does this system compare to Vive trackers regarding mocap accuracy? I am looking for a good mocap system myself.
Vive trackers are miles ahead of Mocopi. Mocopi is cheaper though.
THANKS
Does it work in blender?
sure does!
Any issues if I just wanted to use it with blender? Can I do a live capture? Thanks
@@EROSNERdesign You can absolutely do it in Blender and add in the animations. I saw this video as a good teaching opportunity to do a lot of the adjustments in Unreal.
Cool, thanks! @@WoodyDevs