Metahuman Cinematics Captured with DragonFly Virtual Camera in Unreal Engine 5

  • Published Aug 25, 2024
  • It has been an amazing learning experience using the recently released DragonFly Virtual Camera by @GlassboxTech in @UnrealEngine 5.1 to capture this Metahuman performance.
    This #VCam allowed me to recreate real-world sensors and lenses in Unreal, combined with the @ZEISSCinematography CinCraft Mapper, which generated accurate lens-distortion ST Maps that I assigned to each lens (see the ST Map sketch after this description).
    Link to DragonFly Vcam: glassboxtech.c...
    Link to Zeiss CinCraft Mapper: cincraft.zeiss...
    The face and body performances were cleaned and fine-tuned entirely in Unreal with the Metahuman control rig, and the custom Metahuman was created using the Mesh to Metahuman plugin with a Paragon character head mesh.
    Gear Used:
    Xsens Link suit (body mocap) by @MovellaInc
    Xsens Metagloves (finger mocap) by @ManusMeta
    Mark IV HMC (facial motion) by @FacewareTech
    Live Client plugin (streams facial motion to UE5) by @GlassboxTech
    iPad and Gamevice (virtual camera)
    Powered by a custom @PugetSystemsPC workstation with an @NVIDIA RTX A6000
    Sensor/Lenses inspired by:
    ~ RED Gemini 5K S35 & ARRI/ZEISS Master Prime 100mm T1.3
    ~ ARRI ALEXA Mini LF & ZEISS Compact Prime CP.2 85mm T2.1
    ~ RED Gemini 5K S35 & ARRI/ZEISS Master Macro 100mm T2.0
    ~ Custom S35 sensor for a 16:9 image & Cooke-inspired 85mm Anamorphic/i (distortion coefficients)
    Special Thank You to:
    Unreal Engine artist @maxwarlond for providing the lighting/environment for this video!
    The incredible team at Glassbox: Johannes Wilkes, Norman and Jason Wang, and Natalie Fernandez, for all of your help and support throughout the making of this! I have learned so much from this experience and loved every moment of it!
    Arsène van de Bilt at Manus, for all of your feedback and support throughout this project with the glove data.
    Josh Beaudry at Faceware for your continued help, feedback and suggestions with the facial motion data.
    Kelly Shipman at Puget Systems for creating this incredible workstation with his amazing team!
    Emanuele Salvucci, Franco Vilanova & Daniel Langhjelm for your rendering help and answering my late-night messages for help!
    Thank you for your feedback:
    PixelUrge (Metahuman Grooms)
    Diana Diriwaechter & Sam Goldwater (lighting)
    Lenny aka uDraper (clothes)
    Marc Morisseau (MH Performance)
    Alvaro Garcia Martinez & @WoodyDevs (Cinematography)
    Simon Kay for being an amazing motion capture guide throughout my journey!
    Music by @epidemicsound
    #metahumans #virtualproduction #ue5 #unrealengine5 #motioncapture #mocap #digitalhuman #meshtometahuman #nvidiartx #cincraft #zeiss #zeisscinematography #ARRI #gemini5k #shotonred #thecookelook #filmmaking #xsens #manusmeta #pugetsystems #liveclient #glassboxtech #Vcam
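
For readers curious about the mechanics behind those ST Maps: an ST Map stores, in each pixel's red and green channels, the normalized (u, v) source coordinate to sample, so applying a measured lens distortion becomes a single remap pass. Below is a minimal offline sketch in Python with OpenCV; the file names are hypothetical, and this is generic ST-Map remapping, not the CinCraft Mapper or Unreal implementation.

    import os
    os.environ["OPENCV_IO_ENABLE_OPENEXR"] = "1"  # some OpenCV builds gate EXR reading

    import cv2
    import numpy as np

    # Hypothetical inputs: a rendered frame plus the ST Map exported for one lens.
    frame = cv2.imread("render_frame.png")  # 8-bit BGR
    stmap = cv2.imread("cp2_85mm_stmap.exr", cv2.IMREAD_UNCHANGED).astype(np.float32)

    h, w = frame.shape[:2]
    # OpenCV loads channels as BGR, so R is index 2 and G is index 1.
    map_x = stmap[..., 2] * (w - 1)  # R channel -> horizontal sample position
    map_y = stmap[..., 1] * (h - 1)  # G channel -> vertical sample position
    # Caveat: many ST Maps follow a bottom-up (Nuke-style) convention; if the
    # output is vertically flipped, use (1.0 - stmap[..., 1]) * (h - 1) instead.

    distorted = cv2.remap(frame, map_x, map_y, interpolation=cv2.INTER_LINEAR)
    cv2.imwrite("render_frame_distorted.png", distorted)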

Comments • 82

  • @JonathanWinbush · 1 year ago · +6

    So clean!

    • @FeedingWolves · 1 year ago · +1

      Aww thank you Jonathan! You are an amazing teacher!

  • @fracturedfantasy · 1 year ago · +10

    The movement and fidelity of the lips is fire. Nice work.

    • @FeedingWolves · 1 year ago

      Thank you so much!!!!!

    • @sputnickers · 1 year ago

      @@FeedingWolves Yes, wow! Mouth movements are incredible in every sense. And... absolutely no jitters here. It is as close to perfect as I've seen anywhere. How long did it take you to get to this state? Is this raw capture from Faceware, and if so, how is it so much better than other stuff I've seen? If you are cleaning it up, how long does that take? Thanks

    • @FeedingWolves · 1 year ago · +1

      @@sputnickers Aww thank you so much! This is not the raw data; it took some work to get this result. This was my first time going through this workflow in 5.1, using the IK Retargeter for the body/finger data and doing all the face cleanup and fine-tuning with the face control rig in Sequencer (see the curve-cleanup sketch after this thread). Hard to say exactly how long the face alone took, as I worked on the body and face simultaneously and did the eyes last, once the body/hand position was just right and matched the reference footage.

    • @user-jk9zr3sc5h · 1 year ago

      ​@@FeedingWolves I'd love to see your workflow separately for hands and face to get something this smooth
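
The cleanup described in this thread is manual keyframing on the face control rig, but the underlying idea of taming capture jitter can be shown with a simple low-pass filter over one animation channel. A self-contained Python sketch on synthetic data, offered as an illustration rather than the author's actual workflow:

    import numpy as np

    def smooth_curve(values: np.ndarray, window: int = 5) -> np.ndarray:
        """Moving-average low-pass filter over a single mocap channel:
        high-frequency jitter is averaged out, broad motion survives."""
        kernel = np.ones(window) / window
        return np.convolve(values, kernel, mode="same")  # 'same' keeps frame count

    # Synthetic example: a 120-frame wrist-rotation channel with sensor noise.
    frames = np.arange(120)
    clean = 30.0 * np.sin(frames / 20.0)                     # intended motion (degrees)
    noisy = clean + np.random.normal(0.0, 1.5, frames.size)  # capture jitter
    smoothed = smooth_curve(noisy, window=7)

    print(f"jitter before: {np.std(noisy - clean):.2f} deg")
    print(f"jitter after:  {np.std(smoothed - clean):.2f} deg")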

  • @SiddheshJadhav-z2j · months ago

    Great work

  • @MarisFreimanis · 1 year ago · +7

    Awesome. You really are getting ahead in motion capture. Results look amazing.

    • @FeedingWolves · 1 year ago · +1

      Thank you so much! You have always been so supportive! Means a lot!

  • @WoodyDevs · 1 year ago · +3

    Nice! It turned out well!

    • @FeedingWolves · 1 year ago

      Thank you so much! You taught me a lot while I was working on this! I really appreciate it!!

  • @VRDivision · 1 year ago · +2

    you're killing it!

  • @mocappys · 1 year ago · +2

    Fantastic work as always, Gabby. Some really lovely detail in the mouth and lips.

  • @bobhawkey3783 · 1 year ago · +6

    Amazing! This was all in real time? Well done. Yet another amazing creative tool.

    • @FeedingWolves · 1 year ago · +7

      Aww thank you. It's hard to differentiate which parts are technically considered real time and which are not. I use a real-time workflow to stream and record the face data in Unreal, and the virtual cameras were also captured in real time. However, I do work in post, such as cleaning up the mocap in Unreal. It would be hard for me to stream live mocap and control my camera at the same time, as it's just me hehe.

  • @marcusmanningtv · 1 year ago · +1

    This is beyond crazy!!

  • @IamSH1VA · 1 year ago · +1

    Woah 😳😳 facial mocap is so good

  • @charlytutors · 1 year ago · +1

    Very nice! Keep it up!

  • @diversityunityharmony · 1 year ago · +2

    SO beautiful!! I'm in awe of this one --

    • @FeedingWolves · 1 year ago

      Awww, you are the best Natalie!

  • @tbowren · 1 year ago · +1

    Looks great as usual!

  • @renegadealgorithmmedia1958 · 1 year ago · +1

    🎉🎉🎉🎉 this is insane! 🎉🎉🎉🎉

  • @kingsleyadu9289 · 1 year ago · +2

    Awesome, this is great. Tell us more!

  • @carloguayaba · 1 year ago

    This is just gorgeous. I can't wait to see characters like this in video games.

  • @QReviews412 · 1 year ago · +1

    What!!!! Yo, that's insane! Great work, I'm always inspired!

  • @maxwarlond · 1 year ago · +1

    Nice work 💪

    • @FeedingWolves · 1 year ago · +1

      Max, you are amazing! I am so grateful that you collaborated with me by sharing your gorgeous Metahuman lighting environments! You are a 💎

  • @CurtisKlope · 1 year ago · +1

    The smile is creepy but everything else looks amazing

    • @YacineChellat · 1 year ago · +1

      Yes very bad smile unfortunately

  • @Descalabro · 1 year ago · +1

    Pretty good. The eyes are _almost_ alive.

    • @FeedingWolves · 1 year ago

      Haha, yes, the eyes are hard to perfect

  • @MR3DDev · 1 year ago · +3

    Is this raw data from the Faceware helmet? Because this is very accurate.

    • @FeedingWolves · 1 year ago · +1

      Aww thank you, and no, not the raw data. Facial expressions are sooooo complex that I was grateful Metahumans have the face control rig board, so I was able to fine-tune everything directly in Unreal.

  • @onethingeverything · 1 year ago · +2

    Incredible work! Looks amazing!

  • @matchdance · 1 year ago · +1

    the lips!

  • @tmaintv · 1 year ago · +1

    Just awesome. Hey Gabby, what's your workflow with CinCraft? Is it applied in the DragonFly camera, or is it an after-render process? Would love to hear more.

  • @visualpoetry3d · 1 year ago · +2

    Wow, the facial mocap is impressive. I could really empathize with the character. I guess this is Faceware? Could you share how long it took you between capture and final result for the mocap?

    • @FeedingWolves · 1 year ago · +1

      Thank you so much! Hard to tell exactly how long the face data took to fine-tune, as I also worked on the body, lighting, cinematics, rendering, etc. hehe.

  • @AnnisNaeemOfficial · 1 year ago

    I watched it again and had another question. You say, "with an iPad connected to a Gamevice". I assume that's a device, but could you elaborate?

  • @axelgimenez9128 · 1 year ago

    This looks great. Thank you for sharing this and your process. It's super helpful. The one issue that gives it all away is the eyes. (1) The area where the eyeball meets the eyelids is too sharp/clean. (2) The eyeball itself is too bright; it doesn't look lit properly. Also, the whole eyeball should be wet. And the hair: I'd expect it to move a little as the character moves her head. I wonder how all that can be solved. :) Really great work though.

  • @AnnisNaeemOfficial · 1 year ago · +1

    This is great. Thanks. When you say the camera is by DragonFly, you don't mean the face tracking, right?

    • @FeedingWolves · 1 year ago · +1

      Thank you! DragonFly is a virtual camera that I used in Unreal and controlled with an iPad attached to a Gamevice. I was able to move the camera to add that subtle handheld movement and pull focus to match the reference video. It's not related to the face/body performance 😉 (see the handheld-sway sketch below).
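
To make that "subtle handheld movement" concrete: with DragonFly the sway comes from the operator's hands on the iPad, but a similar feel is often faked procedurally by layering low-frequency noise onto the camera transform. A rough, self-contained Python sketch of that idea (the frequencies and amplitudes are made-up placeholders, not DragonFly's implementation):

    import math

    def handheld_sway(t: float) -> float:
        """Cheap pseudo-handheld sway: a sum of low-frequency sines at
        incommensurate frequencies, so the pattern never visibly repeats."""
        return (0.50 * math.sin(1.1 * t + 0.3)
                + 0.25 * math.sin(2.3 * t + 1.7)
                + 0.10 * math.sin(5.9 * t + 4.1))

    # Sample per frame at 24 fps: small vertical and tilt offsets per frame.
    for frame in range(5):
        t = frame / 24.0
        print(f"frame {frame}: y-offset {handheld_sway(t):+.3f} cm, "
              f"tilt {0.4 * handheld_sway(t + 10.0):+.3f} deg")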

  • @synapzproductions · 1 year ago · +2

    Cool, going to try this out. Are you still retargeting Movella mocap to the Metahuman in UE4? Or do you have a new method?

    • @FeedingWolves · 1 year ago

      Thank you! For this project I did everything in 5.1, including retargeting 😉

    • @synapzproductions · 1 year ago

      @@FeedingWolves What process did you use to retarget? The finger spacing looks better than other methods I’ve seen.

  • @solofilmmaking · 1 year ago · +1

    is that the faceware new version ?

    • @FeedingWolves · 1 year ago · +2

      No, I wish I had used the new version (the Portal), as it would have sped things up a bit. I streamed the face data through Faceware Studio into UE with the Live Client plugin and recorded it directly in there.

  • @NoorFilms786 · 1 year ago · +2

    Wow, that looks insane! Is it all real time?

    • @FeedingWolves · 1 year ago · +1

      Thank you, part of my workflow is real-time, but the final was rendered using Movie Render Queue in Unreal.

    • @NoorFilms786 · 1 year ago

      @@FeedingWolves You're very talented. Could you tell me what sort of anti-aliasing samples you use at 4K?

    • @FeedingWolves · 1 year ago · +1

      @@NoorFilms786 I didn't go high with AA, fewer than 12 samples, plus a few cvars for motion blur, DOF, and Lumen (see the Movie Render Queue sketch after this thread). The downside to using AA was not being able to keep the hair physics enabled, as it would cause the hair to go a bit crazy.

    • @NoorFilms786 · 1 year ago

      @@FeedingWolves That is my biggest thing right now, trying to figure out the best render settings. If it's not too much to ask, would you mind sharing a screenshot of your render settings? I would be forever grateful. I'll send you a message on Discord for it; again, I would be forever grateful.
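
For context on the sample counts and cvars discussed in this thread: in Movie Render Queue, "samples" usually means the spatial and temporal sample counts on the Anti-aliasing setting, and render-time cvars go through a Console Variables setting. Below is a sketch using Unreal's editor-only Python API; the asset path and values are placeholders rather than the author's settings, and the exact class/property names are assumptions that can shift between engine versions.

    import unreal

    # Build a queue job for a level sequence (asset path is hypothetical).
    subsystem = unreal.get_editor_subsystem(unreal.MoviePipelineQueueSubsystem)
    queue = subsystem.get_queue()
    job = queue.allocate_new_job(unreal.MoviePipelineExecutorJob)
    job.sequence = unreal.SoftObjectPath("/Game/Cinematics/MyShot.MyShot")

    config = job.get_configuration()

    # Anti-aliasing: keep the total sample count modest (under 12, per the thread).
    aa = config.find_or_add_setting_by_class(unreal.MoviePipelineAntiAliasingSetting)
    aa.override_anti_aliasing = True
    aa.spatial_sample_count = 1
    aa.temporal_sample_count = 8

    # Example render-time cvars for motion blur and depth-of-field quality.
    cvars = config.find_or_add_setting_by_class(unreal.MoviePipelineConsoleVariableSetting)
    cvars.console_variables = {
        "r.MotionBlurQuality": 4.0,
        "r.DepthOfFieldQuality": 4.0,
    }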

  • @eliteartisan6733 · 1 year ago · +2

    Is the face animation tweaked, or does it come out like this straight out of the box? Also, apart from the helmet, what else do you use to process this?

    • @FeedingWolves · 1 year ago · +2

      I used the Metahuman body and face control rig in Unreal to fine-tune the data and learned a ton in the process 😉

  • @nicholass3430 · 1 year ago

    Hi, thank you for your wonderful tutorials. Do you know how to insert a custom iris texture? Or does anyone have a copy of the old eye shader (which has an iris color texture slot in the MI)? Help appreciated!!
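
This question goes unanswered in the thread, but the generic editor mechanism is overriding a texture parameter on the eye material instance. A hedged sketch with Unreal's editor Python API: the asset paths and the parameter name below are guesses (inspect your Metahuman's eye material instance for the real names), and the eye shader has changed across Metahuman releases.

    import unreal

    # Hypothetical project paths; adjust to your own assets.
    eye_mi = unreal.EditorAssetLibrary.load_asset(
        "/Game/MetaHumans/MyMH/Materials/MI_EyeLeft")
    iris_tex = unreal.EditorAssetLibrary.load_asset(
        "/Game/Textures/T_CustomIris")

    # The parameter name is a placeholder; it must match the material's slot name.
    ok = unreal.MaterialEditingLibrary.set_material_instance_texture_parameter_value(
        eye_mi, "IrisColorTexture", iris_tex)
    print("Iris texture override applied:", ok)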

  • @alinazaranimation557 · 1 year ago

    What do you use for performance capture and facial capture?

  • @evlax8316 · 1 year ago

    Hi, can you please tell me what equipment setup (how many cameras, microphones, etc.) you use to make this? I am just interested and trying to do it myself for educational purposes (I am a student) 😊

  • @gw244 · 7 months ago

    Gabriella, I'm looking to buy the bodysuit & face combo that you have. How much did you pay?

    • @gw244 · 6 months ago

      So I contacted Xsens. There are three versions of the bodysuit: MVN Awinda Starter, MVN Awinda, and MVN Link. Which one of these are you using, Gabriella?

  • @JaidevCGartist · 1 year ago

    I guess the newly announced Metahuman Animator will destroy this and all other systems.

  • @Billary · 1 year ago · +2

    Very cool system and your facial animation is great too!
    It would be an interesting experiment to try using their camera simulation in conjunction with the NvRTX Caustics 5.1 branch of UE that was recently released, purely because one of the experimental features is ray-traced depth of field. I've mostly been looking at the really cool caustics feature of the branch, so I haven't tried the RT DOF yet, but from the single screenshot they have at the bottom of the GitHub page, it seems like it might be the final puzzle piece in a fully virtualized camera system.
    Lol, I might even try out DragonFly's 30-day free trial just to see if both systems work together. I was already doing some extremely basic tests of replicating physical lenses last year with the 4.27 Caustics branch, because it's pretty much doing a light refraction & dispersion simulation in real time.
    I have astigmatism, and I thought it would be so cool to attempt to replicate the curvature of my glasses and see if the light distorts in a similar fashion to real life. I went all the way down the rabbit hole of trying to find optometrist software to reverse-engineer my glasses prescription before I eventually got too confused and moved on to a different project lol

    • @FeedingWolves · 1 year ago · +2

      This is very interesting. I have not heard of this Caustics branch. Mind sharing the link? And that idea of replicating the curvature of your glasses to see if the light distorts in the same way...you have to share the results if you do it!! I would love to know!

    • @Billary · 1 year ago

      @@FeedingWolves I'll put the link in a comment after this one in case YouTube withholds the comment for having a link. Btw, in order to view the page on GitHub, Epic requires your Epic Games account to be linked with your GitHub account; otherwise it'll just show a 404.
      And earlier I realized that I could try using ChatGPT to help me with some of the complicated math for my glasses/contacts. I told it what I wanted to do, gave it my contacts prescription, and then asked it to write a Blender script to make the geometry lmao. The script didn't come out 100% right, but it got surprisingly close to making a contact lens shape!

  • @Jrdn357 · 1 year ago · +1

    This is great. Would really love to see some tutorials on how we could get good results without the $50,000+ worth of equipment. I mean, your HMC alone is being sold at $28,000. I honestly have no idea how you make enough money to afford this stuff, haha. The whole point of Unreal Engine is to give smaller creators the chance to create amazing visuals and cinematics without the need for super expensive equipment, but getting even somewhat decent motion capture for Metahumans costs SOOOO much money.

  • @tylerweatherly2091 · 1 year ago

    Hi! I just stumbled across your video while looking for the new Metahuman Animator they showed for Unreal. I have been looking for a way to bring some of my ideas into animation. I didn't even know that virtual production and all this motion capture and 3D animation stuff existed in such an accessible way. I would love to learn more, but I'm finding it hard to find a comprehensive tutorial, course, or place to start. I would like to do what you do, the motion capture and the editing and all that in Unreal, but I don't even know how to use Unreal yet. Do you have any advice to point me in the right direction? Where should I start? Do you have a list of online tutorials or courses that you found really helpful? What are the key tools and workflows I would need to learn to put it all together?
    I hope that is not too many questions; honestly, just any pointers would be good! What you are doing looks so cool, and I have loved your work so far. You are encouraging in the way you talk about your own journey of learning. And inspiring: when I saw some of your videos after 3 months of doing this, and how you were new to all of it, I was really encouraged. I thought, "Wow, maybe I can actually do something like this too?" instead of what I normally feel, which is that all my creative aspirations are too daunting and learning them too challenging, so I just leave them in my head.
    I hope to be able to hear from you. Thanks so much for what you do.

    • @FeedingWolves · 1 year ago · +1

      Thank you so much for your kind words and I am so happy to hear you have found something that excites you!
      How I got started with zero Unreal experience:
      1. I got a PC laptop with an NVIDIA graphics card (I started with the Razer Studio, but there are a ton of new options out there now)
      2. Took the Jonathan Winbush Unreal course on Mograph.com, but Winbush has since re-released a new FREE version for the community that covers all of the same, if not more, in this series of getting-started tutorials: th-cam.com/video/RC3G0Ycv8jc/w-d-xo.html
      He is the reason I learned Unreal ;)
      Take your time with it and learn it inside out. It took me one month using UE all day, every day. From there, start working with animations.
      If you hit any walls, join the Unreal Discord community Unreal Slackers, and you are welcome to join mine as well.
      My discord: discord.gg/hthTPfT8

  • @Ifinishedyoutube · 1 year ago · +1

    That is the most horrifying smile I have ever seen.

  • @scienticus · 8 months ago

    No more videos?

    • @FeedingWolves · 8 months ago · +1

      They are coming 😉

    • @g3nius · 8 months ago

      great @@FeedingWolves

  • @cjmkdolphin8443 · 1 year ago

    Watched this twice now. Cool tech, but that smiling face at the end is disturbing.

  • @Ajay-kz9ns · 1 year ago

    And all you need is a 4090 on nuke!!!! 👌

    • @FeedingWolves · 1 year ago

      lol it certainly helps to have a beefy machine when using UE with Metahumans and mocap, especially since this workflow uses a decent amount of processing power.

  • @jumpieva · 1 year ago

    I really, really wish UE didn't give the middle finger to Android users who refuse to participate in the Apple ecosystem.

  • @Cloroqx · 1 year ago · +1

    "The face and body performances were cleaned and fine-tuned entirely in Unreal with the Metahuman control rig and the custom Metahuman was created using the Mesh to Metahuman plugin, with a Paragon character head mesh."
    So this is not the actual live capture?

    • @FeedingWolves · 1 year ago

      That is correct. Part of my workflow is real time, and for the rest I like to go the extra mile and fine-tune things 😺