FreeMoCap ROCKS!!!

  • Published 16 Nov 2024

Comments • 68

  • @ScifiSiShredaholic · 10 months ago · +9

    Excellent video, absolutely right about the turning too.
    If I see another promo video for a new mocap app where all the footage is just star jumps and waving at the camera...

    • @BracerJack · 10 months ago · +1

      I know, right? It is all too easy to detect those sudden, gesturally "loud" moves.
      Give me a system that detects foot placement and turning movements correctly and you have yourself a winner!

  • @genieeye5783 · 1 year ago · +12

    Could you please make a walk-through tutorial? I have zero Python experience, and the original tutorial is really confusing to me 🤣

    • @BracerJack · 10 months ago · +3

      Sounds like something I should do someday!
      Thanks for the suggestion :D

  • @user-beepbopbeep · 1 month ago · +1

    Hey there! Great tutorial on FreeMoCap and its functions. I know this one is a bit of an older video, but do you know what formats the mocap data can be exported in? Is it BVH? I tried looking at the documentation and the GitHub and I don't see that particular information about export types.

    • @BracerJack · 1 month ago · +1

      @user-beepbopbeep Good question, may I know what specific format you need and why?

    • @user-beepbopbeep · 1 month ago · +1

      Well, I suppose anything that could work within Blender. BVH is fine, but if not, I am still willing to work with other formats that are usable within Blender, especially since it's a good enough workflow for most of my 3D needs and can be retargeted with Rokoko.

    • @BracerJack · 1 month ago

      @user-beepbopbeep Then you will be VERY HAPPY to know that FreeMoCap works DIRECTLY with Blender.
      As a matter of fact, it literally OPENS UP BLENDER when the tracking is done, and there is a plug-in that converts the result to a Rigify rig!
      Congratulations, your wish has come true!

  • @tmlander · 18 days ago · +1

    I am curious if the glossiness of your calibration board impacted your crappy webcams... there is a shine in your video... but I suppose maybe you did not light the same way when you were doing mocap? Just a thought.... Thanks for the info!

    • @BracerJack · 18 days ago

      @tmlander You are absolutely correct! The lighting setup for the on-camera video is not the same as the one used for the mocap, but you already had a sense of that since you guessed it correctly.
      Hope you have some fun with FreeMoCap!

  • @LFA_GM · 10 months ago · +2

    Hi, thanks for this video. I am also looking for alternatives with multiple webcams. Have you tested with a Logitech 922? Also, there is a limit of two webcams per USB controller; otherwise the third will not work and the others will have their frame rate reduced. In laptops, that is a serious constraint. You'll need at least 30 fps at 640x480 resolution for acceptable results. For 3-camera setups, at least one webcam should be placed about 6 ft / 180 cm above the floor.
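    For a quick sanity check of that constraint, a minimal sketch (assuming OpenCV/cv2; the camera indices 0-2 are illustrative) that requests 640x480 @ 30 fps from each webcam and prints what the driver actually grants:

      import cv2

      for index in range(3):  # camera indices are illustrative; adjust for your setup
          cap = cv2.VideoCapture(index)
          if not cap.isOpened():
              print(f"Camera {index}: not available")
              continue
          # Request the 640x480 @ 30 fps baseline mentioned above...
          cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)
          cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)
          cap.set(cv2.CAP_PROP_FPS, 30)
          # ...then read back what the camera/USB controller actually delivers.
          w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
          h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
          fps = cap.get(cv2.CAP_PROP_FPS)
          print(f"Camera {index}: {w}x{h} @ {fps:.0f} fps reported")
          cap.release()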

    • @BracerJack · 10 months ago · +2

      I saw your videos and it is nice to see someone who is into mocap to this extent too.
      I bought the Kinect version 2 for mocap tests using the SDK; it was okay.
      I wanted to try ipisoft, but the price is a tad too much for me.
      I was looking at Brekel too, but then FreeMoCap came along and the quality just destroys everything I have seen before.
      You are also right about the webcam limit; I would have to use three laptops if I were to use webcams. But I think I will stick with actual cameras, because the webcams' frame rate is not stable. The output video's frame rate looks stable, but when I was syncing the clips in the video editor I realized they had skipped frame captures internally, and I just cannot trust those cheap webcams anymore.
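      One rough way to confirm that kind of internal frame skipping, as a sketch (assuming OpenCV and a hypothetical clip name): webcams that miss captures usually end up padded with duplicated frames at record time, so counting near-identical consecutive frames gives a hint.

        import cv2
        import numpy as np

        cap = cv2.VideoCapture("cam1_take01.mp4")  # hypothetical webcam recording
        prev = None
        duplicates = 0
        total = 0
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            # A mean absolute difference near zero means the frame barely changed,
            # which is what a padded/duplicated capture looks like (0.5 is a rough threshold).
            if prev is not None and np.mean(cv2.absdiff(gray, prev)) < 0.5:
                duplicates += 1
            prev = gray
            total += 1
        cap.release()
        print(f"{duplicates}/{total} frames look like duplicates of the previous frame")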

    • @LFA_GM · 10 months ago

      You're right, @BracerJack. Price tags nowadays are a limiting factor, and some "AI tools" pricing models are just delusional. Most of these techs are available as open source, but the setup is not beginner friendly. Also, this mocap business didn't grow as predicted, so we don't have many affordable providers. I'll be testing FreeMoCap with single and multiple cameras very soon and will record a video with my results.
      There is another free one, "EasyMocap", which also allows multiple cameras but has a much more complicated setup, not as "easy" as the name suggests: th-cam.com/play/PL1NJ84s5bryvhGJzcCjPMDJiI9KW9uJ-7.html&feature=shared

    • @BracerJack · 10 months ago

      @LFA_GM If you use Discord, you can join mine at discord.gg/AU7Rg6KD so that we can chat about mocap stuff :D

    • @LFA_GM · 10 months ago

      Thanks @BracerJack. Unfortunately I don't use Discord, but I'll update my results here. I tested two Kinect 360s on the same computer, and unfortunately my hardware specs couldn't handle the USB bandwidth requirements.

    • @MrGamelover23 · 6 months ago

      @BracerJack Have you ever tried XR Animator? It uses MediaPipe to track the body. Right now it only works with one camera at a time, but it allows you to record motion capture and export it into Blender. I'm curious how well it would work compared to this.

  • @kevinwoodrobotics · 2 months ago · +1

    I think you should be able to calibrate your cameras with a smaller chessboard. You would just need to move closer to the camera

    • @BracerJack · 2 months ago · +1

      @kevinwoodrobotics YES, that too, but remember the dilemma:
      1: If the board is small, then when you move closer to one camera, the other camera will not see it, or the board will appear too small to be of any use to that other camera.
      2: Attempting to rectify that by moving the cameras closer together removes the parallax that is the sole principle behind being able to figure out a point in 3D space.
      A bigger board allows the cameras, especially if you only have two, to be far apart from each other for maximum triangulation accuracy in 3D space while the board can still be detected by both cameras at the same time.
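      To make the parallax point concrete, a minimal two-camera triangulation sketch (toy projection matrices and point coordinates, assuming OpenCV; this is not FreeMoCap's actual pipeline): the disparity between the two synced views is exactly what pins down the depth of a board corner.

        import numpy as np
        import cv2

        # Projection matrices P = K [R | t] of the kind multi-camera calibration recovers.
        # Toy setup: identity intrinsics, camera 2 shifted 1 unit along X (the baseline).
        K = np.eye(3)
        P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
        P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

        # The same board corner as seen by each camera (normalized image coordinates, 2xN).
        pt_cam1 = np.array([[0.25], [0.10]])
        pt_cam2 = np.array([[-0.15], [0.10]])

        # Triangulate: the horizontal disparity (0.25 vs -0.15) fixes the depth.
        X_h = cv2.triangulatePoints(P1, P2, pt_cam1, pt_cam2)
        X = (X_h[:3] / X_h[3]).ravel()
        print("Recovered 3D point:", X)  # ~ [0.625, 0.25, 2.5]

      A wider baseline makes that disparity larger relative to detection noise, which is why spacing the cameras apart (and therefore using a board big enough for both to see) pays off.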

    • @kevinwoodrobotics · 2 months ago · +1

      @BracerJack I see. So they require you to calibrate all the cameras simultaneously instead of independently?

    • @BracerJack · 2 months ago

      @kevinwoodrobotics They cross-reference each other to figure out where exactly in 3D space the cameras and the board are.
      The more cameras catch a glimpse of the board at the same instant in time, the more certain the positions become.
      I believe your next question might be "So that means the videos all need to be synced up then?"
      The answer to that is: "Yes".

    • @kevinwoodrobotics · 2 months ago

      @BracerJack OK, that makes sense now! Thanks!

    • @BracerJack · 2 months ago

      @kevinwoodrobotics Glad I was able to help you! Be well on your journey.

  • @adamvelazquez7336 · 11 months ago · +1

    Do you have a walk-through of using FreeMoCap from capture to Blender?

    • @BracerJack · 8 months ago

      Not at the moment, maybe I will do that someday :D

  • @ok-hc4he · 5 months ago · +1

    I've set everything up and have exported my animation to Blender, but all I have is a walking skeleton. How do I transfer the animation to the human mesh I created?

    • @BracerJack · 5 months ago

      Well, first of all, good job 👍
      There is a Blender add-on created for this purpose by the FreeMoCap community; check their GitHub and/or Discord to get it. It automates the process down to a few clicks.

  • @BrianHuether · 8 months ago · +1

    I am about to try this to make a 3D animated model of myself just for shadow and reflection casting. I don't need it to be super accurate.
    Does it work if the subject is sitting down? I need it to capture me seated playing guitar. Not my hands, but the rest of my upper body. Well, and I also need to capture the guitar neck. Encouraging to watch your video!

    • @BracerJack · 8 months ago

      I don't think the AI would work if....
      Hold on, let me think about how to best explain this....
      First and foremost, understand that AI motion capture has difficulty detecting the body's pose if the outline isn't very strong.
      You sitting down and not really moving very much is probably going to kill the detection; to make matters worse, you are about to bring in a foreign object that the AI then has to decide is part of your body or not.
      This is a brave experiment, I wish you luck 😀
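      One way to see how an AI tracker copes with a seated, partially occluded pose is to inspect its per-landmark confidence. A minimal single-image sketch, assuming the MediaPipe Pose "solutions" API (the same family of tracker mentioned elsewhere in this thread) and a hypothetical frame.png:

        import cv2
        import mediapipe as mp

        image = cv2.imread("frame.png")  # hypothetical frame of the seated guitarist
        with mp.solutions.pose.Pose(static_image_mode=True) as pose:
            results = pose.process(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))

        if results.pose_landmarks is None:
            print("No pose detected at all")
        else:
            for i, lm in enumerate(results.pose_landmarks.landmark):
                # Visibility near 0 means the tracker is essentially guessing for that joint,
                # which is what to expect for hips/legs hidden behind a chair or guitar.
                print(f"landmark {i}: visibility {lm.visibility:.2f}")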

    • @BrianHuether · 8 months ago · +1

      Hi @BracerJack, I would think it should work. Am going to try today.
      Even if there are periods of no motion, that shouldn't cause it to fail, since camera trackers work even when there is no motion as long as there are good initial frames of trackable points. So if I am sitting with two cameras, one on each side of me, and my upper body is moving at first, that should be enough to generate data and set the tracking in motion.
      Interesting how the guitar will affect things, but I would think that too would not be a problem. AI is just large-dataset statistics, basically regression on high-dimensional datasets, in other words high-dimensional curve fitting. So if there are good initial frames, the mocap should understand where the arm joints are, and the nature of the training data should be such that the system would not produce, say, arm joints suddenly jumping and being mapped to the guitar, since such jumps would be absent in the training data. And if this system works like next-token prediction (the way OpenGPT does), then the likelihood calculations would be unlikely to choose a sudden deformation of a joint as the next token. I suspect the devs have some sort of constraint system so that frame to frame you wouldn't have the body geometry wildly deforming.
      I joined their Discord channel, will let them know how this goes.
      Also, I've been playing around with the great camera-tracking software SynthEyes; it has geometric hierarchical tracking, where it can track an object as a separate hierarchy from some other object (like a person holding the object). These FreeMoCap devs and the SynthEyes dev should collaborate! The SynthEyes manual describes a multi-camera motion-capture function, but I have yet to see an example of anyone using it. But if I get this working, then I won't bother trying to use SynthEyes for motion capture.

    • @BracerJack · 8 months ago

      @BrianHuether I wish I were there to conduct the experiment with you; this is interesting. I actually want to know the result.

    • @BrianHuether · 8 months ago · +1

      @BracerJack Will report back here! Going to print out that calibration image and follow the calibration steps.

    • @BracerJack · 8 months ago

      @BrianHuether You....you use SynthEyes for BODY MOTION CAPTURE?!!!

  • @Gillissie · 1 month ago · +1

    Let me know when this is a real app that doesn't require command line installation and reliance on Python. Couldn't get it up and running on my Mac after the installation.

    • @BracerJack · 1 month ago

      I am sorry to hear that, maybe they will create a precompiled binary someday.

  • @hotsauce7124 · 11 months ago · +1

    Question, have you also tried walking towards the camera?

    • @BracerJack · 8 months ago

      It should be fine AS LONG AS more than one camera is capturing your movement.

  • @terryd8692 · 1 year ago

    What is the output in? Is it the application itself? Is it simple to get it into Blender? It would be good to have a wireframe background and floor to get an idea of how much drift you're getting. Apart from getting up from the crouch, it was pretty damned good.

    • @BracerJack · 1 year ago · +2

      1: The output is a Blender file; there is even a Blender add-on that auto-converts the file into a Rigify Blender file.
      2: The sliding of the feet has a lot of room for improvement; hopefully having another camera will help.
      3: Yeah, maybe when the body is squeezed like that, the outline-recognition part of the AI simply gives up to some extent, but the fact that it works at all...wow.
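      If you ever want to pull that output .blend into an existing scene by hand instead of through the add-on, a minimal Blender-Python sketch (the file path is hypothetical; the add-on mentioned above automates this and more):

        import bpy

        blend_path = "/path/to/freemocap_session.blend"  # hypothetical exported file

        # Load the object datablocks from the exported file (append, not link).
        with bpy.data.libraries.load(blend_path, link=False) as (data_from, data_to):
            data_to.objects = data_from.objects

        # Link each appended object into the active scene's collection.
        for obj in data_to.objects:
            if obj is not None:
                bpy.context.scene.collection.objects.link(obj)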

  • @AbachiriProduction · 11 months ago · +1

    Hi, where can I get that large background image?

    • @BracerJack · 10 months ago · +2

      That image is included in the GitHub repo :D

  • @SalElder · 8 months ago

    This is very helpful and informative. Thanks for the video.

    • @BracerJack · 8 months ago

      You are welcome Sal Elder :D

  • @forthesun6563 · 7 months ago · +1

    How do I import the animation into Blender, please?

    • @BracerJack · 6 months ago

      There is a plugin for it; it comes with the mocap program. You can also join their Discord to get the latest version of the add-on.

  • @ryanrose7523 · 11 months ago · +1

    Great Job!

    • @BracerJack · 11 months ago

      Thank you Ryan.

  • @AronBaddoPacas · 18 days ago · +1

    The application seems to be revolutionary but hard to install...

    • @BracerJack · 15 days ago

      I also wish there were a one-click install exe, but well, until then :D

  • @mohamedharex · 11 months ago

    Nice demo. Where can I get that add-on?

    • @BracerJack · 8 months ago

      Go to their Discord; the person who makes the add-on is there, and you can get the latest version from them :D

  • @GabrieldaSilvaMusic · 10 months ago · +1

    Does this work for detecting two people?

    • @BracerJack · 8 months ago

      I have no idea, I have never tested it with two people :D

  • @terryd8692 · 1 year ago · +1

    Awesome, but can it dance? 😄

    • @BracerJack · 1 year ago · +2

      It can...when I can....someday ;p

  • @raymondwood5496 · 1 year ago · +2

    May I know what cameras you used?

    • @BracerJack · 1 year ago · +1

      I used the Canon G7X Mark II and the Sony ZV-1.

    • @raymondwood5496 · 1 year ago · +1

      @BracerJack OK, thanks. Great video. I am still having problems moving the data to Blender. I will try the Kinect with ipisoft first.

    • @BracerJack · 1 year ago · +1

      @raymondwood5496 Let me guess: when the program was attempting to export to Blender, it got stuck, right? That can be resolved: back up and then delete your Blender configuration folder, and that will do it.
      This happened to me as well; once you have cleared the folder, you are good!
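      A minimal sketch of that back-up-and-reset step, assuming a Linux-style config path and an illustrative Blender version number (on Windows the folder lives under %APPDATA%\Blender Foundation\Blender, on macOS under ~/Library/Application Support/Blender):

        import shutil
        from pathlib import Path

        # Hypothetical location: ~/.config/blender/<version>; adjust for your OS and version.
        config_dir = Path.home() / ".config" / "blender" / "4.0"
        backup_dir = config_dir.with_name(config_dir.name + "_backup")

        if config_dir.exists():
            # Moving the folder both backs it up and resets Blender to default settings.
            shutil.move(str(config_dir), str(backup_dir))
            print(f"Backed up {config_dir} to {backup_dir}")
        else:
            print(f"No Blender config folder found at {config_dir}")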

    • @raymondwood5496 · 1 year ago

      @BracerJack Thanks for the tips. Will try again tonight.

    • @BracerJack · 1 year ago · +1

      @raymondwood5496 I never got the chance to try ipisoft; the multi-cam option is really out of my budget.
      Good luck!

  • @The-Filter · 9 months ago · +1

    No, it does not! It is totally horrible to use! It needs a complete UI with an FBX export option!

    • @BracerJack · 9 months ago · +1

      Try discussing your issue with the creator on Discord; he is very helpful :-)

    • @xyzonox9876 · 5 months ago

      Command Line Interface is cool though

    • @The-Filter · 5 months ago

      @BracerJack I did already. He avoids making any improvements and just points to the existing solution.

    • @The-Filter · 5 months ago

      @xyzonox9876 If you only have one animation to do, yeah, maybe. But it's 2024! And this is not a solution, especially if you have to manage hundreds of animations, select the ranges that need to be saved, organize and rewatch them quickly, maybe redo the capturing again, and so on!

    • @andreww4751 · 1 month ago

      @The-Filter Yikes.