UE Take Recorder Live Link Source & Control Rig Backward Additive Solver

  • Published Jul 10, 2024
  • Link to my Discord: / discord
    In this video, I go over how you can record your Live Link Source data from the UE Live Link Face in Take Recorder for cleaner facial motion capture data on the new Metahumans.
    I also show you how you can use the new Control Rig Additive Backward Solver to either fine-tune or add animations to your recording in Unreal Engine (a scripted sketch of opening a recorded take follows below).
    Created this using the Razer Blade 15 Studio Edition laptop with NVIDIA Creator graphics.
    #MadeWithBlade #UnrealEngine #EpicGames #3Lateral #FacialMotionCapture #UEControlRig #MotionCapture #VirtualProduction #Nvidia #RazerStudio #Animation #Raytracing #Quixel
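    As a rough illustration of the recording half of that workflow, here is a minimal Unreal Editor Python sketch (not shown in the video; the take path is an assumption) for locating a recorded Take Recorder level sequence and opening it in Sequencer before baking it down:

    import unreal

    # Assumed asset path for a Take Recorder recording; adjust to your project.
    TAKE_PATH = "/Game/Cinematics/Takes/MyLiveLinkTake"

    if unreal.EditorAssetLibrary.does_asset_exist(TAKE_PATH):
        take_sequence = unreal.load_asset(TAKE_PATH)
        # Open the recorded take in Sequencer so its Live Link subsequence can be
        # unlocked and baked down, as demonstrated in the video.
        unreal.LevelSequenceEditorBlueprintLibrary.open_level_sequence(take_sequence)
        for binding in take_sequence.get_bindings():
            unreal.log("Recorded binding: {}".format(binding.get_display_name()))
    else:
        unreal.log_warning("No take found at {}".format(TAKE_PATH))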
  • Film & Animation

Comments • 242

  • @CitizenMetaOne
    @CitizenMetaOne 3 years ago +16

    Take recorder is scary. Thanks for making this video! 🍻

    • @FeedingWolves
      @FeedingWolves  3 years ago +2

      Aww, thanks Matt....or should I say....MetaOne!

  • @ShadowSnake141
    @ShadowSnake141 2 years ago +1

    Amazing! Bake to control rig is the core bit of this we've been searching for for months - so glad you made this video! Thanks so much!

  • @woyame1
    @woyame1 2 years ago +1

    This has got to be the absolute best breakdown of this workflow I've ever seen on video! This is gold, diamond and platinum rolled into one. Thank you!

  • @errke
    @errke 3 years ago

    Wow. Thanks. This was exactly what I was looking for but couldn’t find anywhere else. Looking forward for more videos 😃

  • @JhulowNT
    @JhulowNT 2 years ago

    Thank you very much for this tutorial, also had a look on some of your vids it's inspiring and very motivating! Keep up the good work Gabriella!

  • @vfxsrini
    @vfxsrini 2 years ago

    Super clear. Take recorder was super confusing earlier;
    this video is very helpful. Thanks a lot.

  • @robot_collective
    @robot_collective 3 years ago +1

    YES, exactly what I was looking for right now while playing with my MetaHuman =) Thank you!!

  • @andrewjordan5925
    @andrewjordan5925 1 year ago

    I keep coming back to this video, it's so helpful, every time I run into an issue with livelink and MH. This video has become my checklist.

  • @MrToroburn
    @MrToroburn 3 years ago +3

    Wonderfully clear, I can't thank you enough, it would have taken me ages to figure this out. Keep the excellent videos coming!

  • @maksithompson7018
    @maksithompson7018 3 years ago +1

    Absolutely amazing! Thank you so much for this! It really explained everything I was missing! Keep up the great work!

    • @FeedingWolves
      @FeedingWolves  3 years ago

      I am so happy to hear that!! Thank you!

  • @mocappys
    @mocappys 3 years ago

    Great video, Gabriella! Interesting comparison between the results from the different recording modes at the end.

    • @FeedingWolves
      @FeedingWolves  3 years ago

      Thank you so much, Simon! Coming from you that is a huge compliment!!

    • @sawtom
      @sawtom 2 years ago

      @@FeedingWolves Hi,
      mocap is new to me. Would connecting the iPhone with a cable give a better result? Wifi has a lag for sure.

  • @workflowinmind
    @workflowinmind 1 year ago

    This is by far the most informative tutorial on the little gotchas about using ARKit in UE5, thanks a bunch

  • @Bumpitybumpbump
    @Bumpitybumpbump 2 years ago

    Thanks so much - I was pulling my hair out over the "read only" problem on the level sequences.

  • @veithtutorials
    @veithtutorials 2 years ago

    Thank you so much!!! Can't wait to see your other tutorials!!! It has helped me very much! :)

  • @Billary
    @Billary 2 years ago +1

    I've lost count how many times I've had to come back to this video for various new projects lol. Thank you so much for making such an informative tutorial!
    There's a lot of videos on recording Live Link with metahumans, but almost none of them seem to show how to properly do it right like in this tutorial! Great work!

    • @FeedingWolves
      @FeedingWolves  2 years ago +1

      Thank you so much for your kind words! This video is old but so happy it is still useful for people ;) p.s. love your channel, just subscribed

  • @wmka
    @wmka 1 year ago

    Searched for "livelink metahuman control rig".
    This is the first in the search result.
    Thank you and have a good one.

  • @BAYqg
    @BAYqg 1 year ago

    Love you! Thanks a lot! Saved my day by giving a hint on how to unlock the sequence! It is so strange that it is locked by default. At least I can't see why it is the primary option.
    You're a legend!

  • @ToysRUsKid_Critter
    @ToysRUsKid_Critter 2 years ago

    Excellent tutorial. A complex task. U walked us thru it like mom explaining to a child how to eat cereal or something. Keep the awesome vids flowing…

  • @StevenDiLeo
    @StevenDiLeo 2 years ago

    Thank you so much for making this!

  •  11 months ago

    Thanks a lot, finally I found a plausible explanation!

  • @timtsang3
    @timtsang3 2 years ago +1

    thank you!!! OMG what everyone said so many MH tutorials out there and this is the only one that has helped haha. thanks again :)

  • @codemeister9716
    @codemeister9716 2 years ago

    Amazing tutorial. Thank you!

  • @johnteddyJoe
    @johnteddyJoe 3 years ago

    wow perfect tutorial!!! Thank you so much

  • @buddhavibes
    @buddhavibes 3 years ago

    Amazing. Thank you for the tutorial.

  • @billywhitaker6658
    @billywhitaker6658 2 years ago +13

    So many metahuman tutorials, yet only yours gets us to where we want to go. Controlling and recording the animations. Thanks for what you do, I was at a wall and this took me from A to Z. 🙏

    • @FeedingWolves
      @FeedingWolves  2 years ago +2

      Wow! Thank you so much! If you ever hit a wall again, feel free to join my Discord channel ;)

    • @rickykngo
      @rickykngo 2 years ago

      Yes, very to the point. No BS talking.

    • @jamesonvparker
      @jamesonvparker 1 year ago

      So true.

  • @mjack7563
    @mjack7563 2 years ago

    Thanx a lot. This is the only way I can record my metahuman.

  • @TanChengLip
    @TanChengLip 11 months ago

    The only creator that teaches how to use the livelink animation

  • @elenasimonian
    @elenasimonian 1 year ago

    amazing tutorial! thank you!

  • @flufflepimp2090
    @flufflepimp2090 2 years ago

    Thank you soooo much for this!

  • @dow647
    @dow647 3 years ago

    Thanks, great video.

  • @tommyford1301
    @tommyford1301 1 year ago

    awesome just what i needed thanks

  • @A_RayChan_Joint
    @A_RayChan_Joint 3 years ago

    very helpful, thank you.

  • @Eclectic_Chicken
    @Eclectic_Chicken 3 months ago

    oooh yes, it helped me with audio 2 face !!! thanks a lot !!!

  • @JonJagsNee
    @JonJagsNee 1 year ago

    Thanks Gabby! :D

  • @siddharth122
    @siddharth122 2 years ago

    Very nice tutorial 😊

  • @user-lx6xb5sg9z
    @user-lx6xb5sg9z 3 years ago +1

    Very Helpful

  • @3lihsn367
    @3lihsn367 1 year ago

    Thank you so much!!

  • @ericbarrios1301
    @ericbarrios1301 3 years ago

    so freakin' helpful thank you thank you

    • @FeedingWolves
      @FeedingWolves  3 years ago +1

      My pleasure!! Glad it helps!

  • @davidbuenov
    @davidbuenov 3 years ago

    Great Video!!!

  • @sphangman
    @sphangman 3 years ago +1

    Great video, thanks!
    One thing that would improve it would be to name the buttons you are unchecking or checking so it's easier when following along. Apart from that, very helpful, thanks.

    • @FeedingWolves
      @FeedingWolves  3 years ago +1

      Hi! Thank you for the feedback. I will definitely do that, as I know how frustrating it can be to try and follow along only to get stuck on simple stuff with UE. Out of curiosity, which buttons did you have a hard time with?

    • @sphangman
      @sphangman 2 years ago +1

      Sorry, thought I'd replied already. It was at 1:46 when you were saying "these two". I was getting a whiplash effect trying to check my screen and your screen quickly, so naming the buttons would help in that regard. Thanks again for the video, I've got some nice stuff working with it and use your video as a quick setup guide when grabbing captures from my va's

  • @gameswownow7213
    @gameswownow7213 2 years ago

    wow, nice one. thanks

  • @RCL4958
    @RCL4958 2 years ago

    Thank you!

  • @cob666fuk
    @cob666fuk 2 years ago

    Thanks, life saver

  • @axlrosecnr
    @axlrosecnr 1 year ago

    thank you so much

  • @supremebeme
    @supremebeme 1 year ago

    thankyou FW!

  • @megalockge
    @megalockge 10 months ago

    You're the best

  • @dooleymurphy
    @dooleymurphy 2 years ago

    Thank youuuuuuuuuuu

  • @leonarrow
    @leonarrow 2 years ago

    thanks, very nice and easy to understand, as yet there is no UE5 equivalent, so I am going to try to follow this but find the equivalents in UE5

  • @Native_Creation
    @Native_Creation 3 years ago

    This is very awesome, thank you. I'm finally looking into Live Link to use with Metahumans so this is great timing, look forward to seeing more videos on these processes and details/tips. Any suggestions if I were to use another source for mouth? (i.e., Vive facial tracker, I'm assuming it'll be more accurate for mouth area)

    • @FeedingWolves
      @FeedingWolves  3 years ago

      Hi Rick James! If the facial tracker has a live link source, then it would be the same method. However, I believe @MetaOne would have more experience with the vive tracker workflow as I have never tried this out myself.

  • @lucdelamare8996
    @lucdelamare8996 3 years ago +1

    Thanks for this! Was starting to get frustrated with Take Recorder + the BP_Metahuman route...

    • @FeedingWolves
      @FeedingWolves  3 years ago

      I was just as happy to discover this as you are!

    • @lucdelamare8996
      @lucdelamare8996 3 years ago

      @@FeedingWolves I mean to ask, is there something I'm missing to get the face rotation from live link? I see that the data is recorded in the original LiveLink SubScene (HeadYaw, HeadPitch, HeadRoll), but when I bake the animation that data doesn't seem to transfer.

    • @lucdelamare8996
      @lucdelamare8996 3 years ago

      Never mind, just figured it out! Needed to have the animation mode in the BP set to animation asset.

    • @lucdelamare8996
      @lucdelamare8996 3 years ago

      Hmm so after some further testing, it doesn't look like there is a way to bake the face rotation (from livelink) to the control rig.

    • @FeedingWolves
      @FeedingWolves  3 years ago

      @@lucdelamare8996 that's interesting. So when you bake the face, you lose the head rotation?

  • @faddlewaddle2615
    @faddlewaddle2615 2 years ago

    This has been such a frustrating experience. Trying to deal with this on my own as well as following the numerous tutes and yet, still not getting to where I want/need to be.
    Your tute's the first to get me the farthest. Yet, even then, the results aren't great. In the end, it feels like the program's to blame with how finicky it is.
    At this point, I'm thinking it best to just try the control rig route since this week I'm just looking for head gestures w/smiles. I've your tute to thank (halfway) for getting me there but not without jitteriness in movements.
    Anyhow, will look through your channel to see if there are updated versions of this all. Presently using UE5.0
    And thanks btw for this. At least it DID get me somewhere. 🙂

    • @FeedingWolves
      @FeedingWolves  2 years ago

      We have all experienced those moments of frustration lol! UE Live Link Face is free which is great, but if you are looking for slightly higher-end quality, you might want to try out Faceware. Unreal Learning released a course on it and I am pretty sure you can sign up and get a free trial just to test it out. All you need is some video footage, and they provide some sample data so you can have something to work with. dev.epicgames.com/community/learning/courses/d66/metahuman-workflows-with-faceware-studio/Zd8v/faceware-studio

  • @user-ou1nc8or5e
    @user-ou1nc8or5e 1 year ago

    Thanks!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

  • @GerbearRadio
    @GerbearRadio 2 years ago +3

    Thank you for this! This video helped in a massive way with my project! Question for you, I noticed that there is a record button in the Live Link app. How would I import those recorded takes from my phone into Unreal. Is that were the "use source timecode" comes in at 1:36 Any input is appreciated. Thanks again and keep up the great work!

    • @rewwhiskas4234
      @rewwhiskas4234 1 year ago +1

      try this: th-cam.com/video/z99Fo9LXLHk/w-d-xo.html

  • @SProj-px7wm
    @SProj-px7wm 1 year ago +1

    I'm also interested in the baking of neck/head movement; are there some drive controls that need to be added to the blueprint?

  • @superdownwards
    @superdownwards 2 years ago

    This is amazing! If you do contracting engagements I'd be keen to chat

  • @weightednormal3682
    @weightednormal3682 2 years ago

    Thanks for sharing your knowledge. Is it possible to also play back the head tracking that exists in the recording? When I bake it into the face control rig, it is only driving the face bones. The head motion would need to be added to the body rig? I'm not sure.

  • @kobayz
    @kobayz 2 years ago

    Thank you for this vid! is this still applicable with the new Unreal Engine 5?

  • @binyaminbass
    @binyaminbass 2 years ago +3

    Hi! Thank you for your clear and to-the-point tutorial. So helpful! I have a problem still: for some reason, only the facial recording comes through in the baked animation, but not the head/neck. I tried baking a separate animation for the body, hoping that would cover the head/neck, but it didn't do anything! How do I get that head/neck movement?

    • @thebensimon
      @thebensimon 2 years ago

      Did you find a solution?

    • @binyaminbass
      @binyaminbass 2 years ago

      @@thebensimon no *sigh*, not yet

    • @thebensimon
      @thebensimon 2 years ago

      @@binyaminbass I can't find anything about it as well :(

    • @thebensimon
      @thebensimon 1 year ago +1

      Anything about the neck animation?

    • @binyaminbass
      @binyaminbass 1 year ago +1

      @@thebensimon I've turned my attention elsewhere...for now. A little disheartening that you are still having this problem. I'm going back to this soon.

  • @loregodd
    @loregodd 2 years ago

    Hey Gabriella! Thank you for sharing your knowledge :) I have a quick question - for non-meta human characters with blend shapes/morph targets, is it possible to combine body animation and face animation? I noticed that metahumans have face and body separated in the sequence track, but most 3D models don't have this. Let me know if you've already covered this else where!

    • @FeedingWolves
      @FeedingWolves  2 years ago +1

      Hi! Thank you! So are you trying to record the Live Link source for the body and face simultaneously, or the actual body/face data by dropping the entire character into Take Recorder? It also really depends on how you have your character set up. What kind of character are you using?

    • @loregodd
      @loregodd 2 years ago +1

      @@FeedingWolves I eventually figured it out! I appreciate you replying :D!! Thank you!

    • @Fiflo95
      @Fiflo95 2 years ago +2

      @@loregodd I'm very interested to know how you did :)

    • @IFTechFoundation
      @IFTechFoundation 1 year ago

      @@loregodd Also am curious what you did to solve this!

  • @letterl1840
    @letterl1840 3 years ago

    This is incredible! Do you have any tips for a college grad seeking a mocap job like your vids to watch

    • @FeedingWolves
      @FeedingWolves  3 years ago +1

      Checked out your mocap vids! Pretty awesome! Keep doing what you are doing! And thank you! I still have so much to learn. As far as tips, create something with your mocap and have fun with it;) See where it takes you!

    • @letterl1840
      @letterl1840 3 years ago

      @@FeedingWolves Oh my you actually responded! Would it be ok to ask you questions from time to time! I've recently added you to instagram!

  • @LamassuAnimations
    @LamassuAnimations 1 year ago

    nice tutorial.
    When I bake my animation into the rig for an additive animation layer, I am not able to move the head and neck controls anymore...
    How can I fix it?

  • @GregCorson
    @GregCorson 2 years ago

    I have a problem with take recorder maybe you have seen. I live-animate my metahuman using a mocap suit, which works fine. But if I use "from actor" in take recorder to record the character, the character comes out distorted when I play it back. Usually this is hunched shoulders and an enlarged upper body, but it varies depending on the metahuman used. The default character DHRUV is the worst; some of the female characters are only slightly distorted. Do you know a good solution? The ones I've found involve editing the skeleton, but this doesn't seem right because they change a lot of things that are explicitly set different ways in the Metahuman.

    • @FeedingWolves
      @FeedingWolves  2 years ago

      Hi Greg! So when you record only the body live link source in Take, the data looks good, but when you drop the entire character into Take Rec, the recording looks weird? Is this correct? What mocap solution are you using? Depending on your solution, does this require you to set your character in a T pose or to create a remap asset?

    • @GregCorson
      @GregCorson 2 years ago

      @@FeedingWolves Actually, you can see the problem without doing any mocap. Just load up a meta (DHRUV has the worst problem) and do a short take recorder recording "from actor", then review the take and you should see it. Here is a milder example th-cam.com/video/Tun5sID9ab4/w-d-xo.html but DHRUV's head sinks down till his chin is level with his shoulders! In that video I fix it by resetting the retarget in the recorded animation, but it seems like there should be a better way than this. Most of the female metas don't distort or get only slightly thinner. The tall male metas seem to distort the most. This only happens if you record "from actor"; some mocap setups require you to use this instead of recording the livelink source directly. Rokoko recommends the same fix I am using in their latest tutorial.

    • @FeedingWolves
      @FeedingWolves  2 years ago

      @@GregCorson It is difficult to figure out what is causing this. I would need a better idea as to how you are setting up your mocap on the metas first. What I can tell you is that depending on what you want to record on a Meta, you can drop your Meta into take recorder, and once you select the MH, there should be an options menu below (in take recorder). There, you can uncheck the root, and only check on the things you want to capture, such as Body-animation track, Face-Animation track. Is this a Rokoko set by the way? Are you using the facial capture built into the Rokoko pipeline or running the UE Live Link face app separately? In other words is the body/face mocap from Rokoko integrated into one Live Link Source? If you tell me more about how the Live Link pipeline for Rokoko is set up on the actual metahuman, I can tell you what might be causing these things. If I had to guess though, looks like you might need to go into the actual MH skeleton and change the retargeting options....again I would need more info to help resolve this

    • @GregCorson
      @GregCorson 2 years ago

      @@FeedingWolves You don't have to be doing mocap...it seems to happen whenever you do a record "from actor" on a meta. Doesn't matter if the character is standing still, animated from a control rig, by mocap or something live like Perception Neuron or Rokoko. Take recorder just seems to mess up the animation it records when "from actor" is used, apparently by not copying the retarget skeleton from the original meta into the recorded animation. The fix is to set the right retarget skeleton yourself, but you have to do it to every animation take recorder records, which is very inconvenient. If you look at my video linked earlier, you can see the whole setup and the fix.

  • @SonicTemples
    @SonicTemples 3 years ago

    Thanks, great video. I'm a wee bit stuck with Faceware and Glassbox using take recorder and the child blueprint. I can't seem to record data. When the sequencer comes up it disables the Faceware! Possibly something to do with control rig tracks perhaps?

    • @FeedingWolves
      @FeedingWolves  3 years ago +1

      So there is no live link source for Faceware/Glassbox. For that workflow, you need to add the actual character BP into take recorder. Go into play mode, then record. How are you recording?

    • @SonicTemples
      @SonicTemples 3 years ago

      @@FeedingWolves Thanks for getting back to me. Yes I was using the BP in the recorder however it was also bringing in control rig tracks which for some reason turned faceware off. Deleting the control rig tracks enabled me to record data so all sorted now thanks to your tutorial! Now just need a basic webcam solution for body mocap! On an additional note regarding the additive control rig section. Is there a way to not override the whole mocap track when making a change on the control rig? I noticed for example If I add a blink and the final keyframe is open eyes, the eyes stay open and it overrides any subsequent blinks on the mocap.

    • @FeedingWolves
      @FeedingWolves  3 years ago

      @@SonicTemples Hi! So I am not sure what you mean by the BP bringing in the control rig tracks as well when you add it in take recorder. Can you send me a snapshot of this if it happens again? Very curious about this. discord.gg/JhuRsgbu (link to my discord server for Meta troubleshooting). Now as far as the blink issue: when you make a change, what I can suggest is to add a keyframe before and after where you want the change to occur, and then make your adjustment in the middle. The reason being, when you just add a keyframe, it holds that expression instead of just adding one in between the mocap data. Let me know if that works. P.S. When you add the before and after keyframes, don't make any changes to those, just add a keyframe without any adjustments. Make the adjustments only in the middle keyframe. So A, B, C. B is your change. A and C you leave as is.
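      A minimal, engine-agnostic Python sketch of that A/B/C bracketing idea (the frames and values are made up for illustration; in Sequencer you would place these keys on the additive layer by hand):

      def bracket_adjustment(keys, frame, new_value, pad=2):
          """keys: dict of frame -> value for one control (stepped interpolation).
          Adds hold keys just before and after `frame`, then the edit at `frame`."""
          def sample(f):
              earlier = [k for k in sorted(keys) if k <= f]
              return keys[earlier[-1]] if earlier else 0.0

          keys[frame - pad] = sample(frame - pad)  # key A: hold the incoming value
          keys[frame + pad] = sample(frame + pad)  # key C: hold the outgoing value
          keys[frame] = new_value                  # key B: the actual adjustment
          return keys

      # e.g. add a blink at frame 120 without overriding the rest of the mocap:
      eyelid_keys = {0: 0.0, 200: 0.0}
      print(bracket_adjustment(eyelid_keys, frame=120, new_value=1.0))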

    • @SonicTemples
      @SonicTemples 3 years ago

      @@FeedingWolves Thank you, my error was bringing the child blueprint into the sequencer track rather than the take editor, it seems!

    • @FeedingWolves
      @FeedingWolves  3 years ago

      @@SonicTemples so glad you figured it out! Yes I did that once and then realized you can't use the actual child in sequencer, that the child is just for recording lol.

  • @byz2491
    @byz2491 2 years ago

    Thanks for the video! I just have an issue that the full animation does not bake - only the first couple of seconds. Do you know what I am doing wrong maybe? Thanks again

    • @byz2491
      @byz2491 2 years ago

      Figured it out :)

  • @shalup2530
    @shalup2530 2 years ago

    Great video. Can you please help me? Instead of saving in .uasset form, can we render out from Sequencer to AVI or PNG form, using Live Link?

  • @DesignsbyElement
    @DesignsbyElement 3 years ago

    At the 4:06 mark, when clicking the + sign on face, the option to choose animation is not there. Only template sequence and control rig are at the top. I'm using 5.0. If I click the + sign on the BP_name of Metahuman, the animation prompt is there, but the recorded animation isn't in the list to be chosen. Any ideas? Tried to join the discord but it says the invite is invalid
    update: I think it had to do with the timeline scrubber not being in front of the animation on the timeline ...

    • @FeedingWolves
      @FeedingWolves  2 years ago

      I am glad you figured it out. Here is the link to the Discord discord.gg/GehykCmvAB

  • @marc1137
    @marc1137 2 years ago

    Very cool. Any way to export control rig animation to Maya? If I export something, I actually get a group with all the attributes from the control rig but no keys. It would be amazing to be able to send the baked Live Link info to a Maya control rig.

  • @NicolasLekai
    @NicolasLekai 3 years ago +3

    When I 'bake animation to sequence' I end up with an empty animation in the blueprint with the bald head... I have no idea what's not working on my end. It also doesn't really make any sense to me how the face should know what it needs to bake to what in the first place with this technique. Thx one million for ur great tuts though. And any help on this issue would be greatly appreciated!

    • @FeedingWolves
      @FeedingWolves  3 years ago +2

      Ok so when recording the live link source, go into the subsequence of that recording. You should be seeing the keys in the timeline. Unlock the timeline, drop your character into the timeline, and if you recorded in Play mode, go back into Play mode and then bake to animation sequence. Let me know if that fixes the issue.

    • @FeedingWolves
      @FeedingWolves  3 years ago +1

      Also, when you drag the character into the sequence, make sure the timeline scrubber is a few frames before the actual animation.

    • @NicolasLekai
      @NicolasLekai 3 years ago +1

      @@FeedingWolves thx a million , that did it! u gotta love UE's mysterious ways...

    • @NicolasLekai
      @NicolasLekai 3 years ago

      @@FeedingWolves thx:)

    • @NicolasLekai
      @NicolasLekai 3 years ago +1

      @@FeedingWolves is there anything we did explicitly so there wouldn't be head rotation? I'm trying to find the option for the head rotation to be baked onto the control rig also... Unreal Engine is so amazing at not working in obvious ways. lol. I did record with head rotation switched on on the iPhone and it's there in the animation blueprint.

  • @robot_collective
    @robot_collective 3 years ago +2

    I noticed that the recorded head movements aren't there after baking the animation to Face. So would that be the same process for Body to also get the head movements from LiveLink?

    • @XaertV
      @XaertV 3 years ago

      This tbh, really need these for it to be useful.

    • @XaertV
      @XaertV 3 years ago +1

      Just noticed that if I do the sequence stuff at 3:51 in the original subsequence, instead of the level sequence, and then drag the subsequence into the level sequence, the head motion is retained.

    • @FeedingWolves
      @FeedingWolves  3 years ago +1

      Hi K! You are not the only person that has said this is happening to the head movements. Working on trying to replicate this issue to find out how to fix it.

    • @FeedingWolves
      @FeedingWolves  3 years ago +1

      @@XaertV are you then baking the face to the control rig and the head motion is retained? I ask because a few people are losing the head rotation when baking to control rig, and I am trying to replicate this issue so I can fix it.

    • @XaertV
      @XaertV 3 years ago

      @@FeedingWolves yeh, I basically do the face baking, then re-apply from the baked animation within the recorded sub-sequence. Then after creating the level sequence, I drag the sub sequence in there, and that appeared to work for me. Although, only if the current frame selector is outside of the sequence timeline when playing? Kind of weird still

  • @user-yq6hj5iq9i
    @user-yq6hj5iq9i 2 years ago

    Nice tutorial, but how can I get the head rotation in the control rig when baking face animation? The head control belongs to the body component...

  • @chpdigital
    @chpdigital 1 year ago

    Hello, awesome tutorial! But the same method doesn't work for UE5, I mean baking sequence animation in UE5. Can you explain how to do this? Thank you

  • @greylight1
    @greylight1 3 years ago +2

    How would you go about doing the same with translated head rotation? I lose that once I've applied the animation to the Face. Thank you!

    • @FeedingWolves
      @FeedingWolves  3 years ago +1

      Are you using the UE Live link face/neck data?

    • @FeedingWolves
      @FeedingWolves  3 years ago +2

      Ok just tested this out. If you recorded the head/neck rotation in play mode, when you bake to sequence, make sure you are also in play mode;) When you are not, you do lose the neck rotation. let me know if this works.

    • @greylight1
      @greylight1 3 years ago

      @@FeedingWolves Right, I see! I'm using face/neck data. That probably won't translate well if you bring cine cameras in, I imagine... I've been playing around today with this + retargeting animations from the starter kit - can you think of a way to combine those two with the proper head rotation? I've also noticed that there's no slider for the rotation in the "face" tab, I imagine that'd be in the body one... Let's say you have a retargeted body animation (with head rotation), is there a way to replace that head rotation with the one from the face app?

    • @FeedingWolves
      @FeedingWolves  3 years ago

      @@greylight1 So if I am understanding correctly, when you say it won't work with cinecams, do you mean you can't seem to hide the control rig? If that is what you mean, all you have to do is save the sequence you have your anims set up in, close the sequence and reopen it. The control rig should no longer be visible.
      As far as neck rotation via UE Live Link Face and how to deactivate it when combining body data that already has head/neck rotations: one thing you can do is just disable the neck rotation when recording. The other thing you can do is, if you have the animation with the neck rotation, when you bake that to a sequence, don't go into play mode when baking. It will remove the neck rotation. Let me know if this works.

    • @Max01912
      @Max01912 2 years ago +1

      ​@@FeedingWolves Hello Gabriella, first thing, thanks a lot for this tutorial, I've checked your channel, great videos, you've got a new subscriber :)
      Regarding this issue, I'm experiencing the same kind of thing. I've tried multiple tests, but I don't manage to get it working:
      The rotation is actually well baked in the animation. But when putting the animation back on the "Face" animation track in sequencer, it doesn't get played (all other facial animations play fine). Would you have any lead for me to look into?
      Cheers !

  • @davidbolivarg
    @davidbolivarg 3 months ago

    Hi Gabriella! Excellent tutorial! I recorded a video for Live Link, added it to a Sequencer... everything was perfect, except it is important that the character moves and tilts the head according to the gestures (as it was recorded), but when I use the Face Control Rig, the head won't move anymore. The gestures are fine but the tilt and all of that won't happen again. If I delete the control rig, the neck and head movements return without any problem. Is there any way to mix both? I've tried every "solution" but none works

  • @supremebeme
    @supremebeme 1 year ago

    in UE5.0.3 keep simulation mode running while you bake to animation sequence!

  • @brianchen8396
    @brianchen8396 1 year ago

    Very nice video, but I also can't bake the rotation of the head and neck to the animation...

  • @andresca1985
    @andresca1985 11 months ago

    Why can the UI controls of modifiers only be shown in Sequencer and not outside of it? Thanks

  • @7_of_1
    @7_of_1 3 years ago

    Super good. I'm dropping the character BP in the take recorder rather than the live link source, is that the wrong way to do it? It seems to work ok. When I add my live link source and record the take, the animation is empty for some reason. Great, thanks. EDIT: just watched the end and you explain it lol. I'm not doing any body capture at the same time so it seems to come in fine, and also can't get the live link data source to work, will try it again.

    • @FeedingWolves
      @FeedingWolves  3 years ago +1

      Make sure you uncheck timecode when you select the live link source in take recorder. Also, make sure you have the timecode boxes unchecked in the take recorder settings as well.

    • @FeedingWolves
      @FeedingWolves  3 years ago +1

      Also, by dragging the character into take, no, that is not the wrong way at all. However, when you record that way, when you look at the animation files you notice it is recording everything, the body, face, clothes, etc, so there is more processing power required, hence the data you are recording is not going to be as clean.

    • @7_of_1
      @7_of_1 3 years ago

      @@FeedingWolves good tip, will try it again today. Was hoping to try the Faceware 30-day demo this weekend as well, as they are going to be doing the indie license from next week, they tell me.

    • @FeedingWolves
      @FeedingWolves  3 years ago

      @@7_of_1 Yes, those timecodes being unchecked was how I got it to work. When they were checked on, I as well did not see anything in the subsequence file;)

    • @7_of_1
      @7_of_1 3 years ago

      @@FeedingWolves just a quick one, sorry to keep bothering you, but I don't find many people using the rigs as well. I did a couple of takes with Faceware over the weekend, pretty good out of the box. Is there anything special to get baked to the rig with it? I've got the sub scene take in sequencer and added the face and body animation tracks I recorded on the take, then baked to face and body rig. A couple of things happened: first, the head rotation from Faceware didn't work anymore without highlighting both anim tracks in sequencer, right clicking and selecting active; then I can see controls moving on the rigs etc, so all good. When I play back the sequence now it's gone from 70 FPS to 11 FPS in the viewport. It's all a bit weird, do I need to bake the control rig back down to an anim clip again after the adjustments I make? I can't seem to find the workflow to do it anywhere.
      Also I found on baking the face rig, a couple of controls were fired off at a higher value around the nose. Sorry to ask, but I can't find anyone else actually using the rig to try and refine the anim after recording the take, and now I have a sequence stuck at 11 FPS with the control rigs. Do you know where I can find decent documentation on the workflow for it? Sorry, again..

  • @DanielPartzsch
    @DanielPartzsch 3 years ago

    Thanks for the video. I tried to bake my recordings that have been streamed in via faceware as shown in your video but unfortunately I don't get any animation data baked this way (and thus can not use it via an animation track). Can you please make a tutorial on how to bake your faceware recordings as well? Thanks again!

    • @FeedingWolves
      @FeedingWolves  3 years ago +1

      Hi Daniel, because the Faceware/Glassbox pipeline to Unreal does not use a live link source (the data is transferred via the Live Client plugin instead), you won't be able to record the live link source. Instead, you will have to drop your child character BP into Take Recorder and record that way.

    • @DanielPartzsch
      @DanielPartzsch 3 years ago

      @@FeedingWolves Thank you for your quick reply :-) Yes, that's what I did and I can record this way as well. But the part where you bake this recording to an actual animation via right click >> Bake, and based on this bake it to the Metahuman Face Rig, I cannot get to work, unfortunately... :(

    • @FeedingWolves
      @FeedingWolves  3 years ago +2

      @@DanielPartzsch
      1. Add the character to sequencer.
      2. Delete the control rig board for the face.
      3. Add the animation you recorded.
      4. Right click on the face track, select Bake To Control Rig.
      5. Select the new control rig board, make an additive track.
      6. Right click on the additive track and select Key This Section.
      7. If you want to save this new animation, just right click the face again and select Bake To Animation Sequence. (A scripted sketch of steps 1 and 3 follows below.)
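      A hypothetical Unreal Editor Python sketch of steps 1 and 3 above; the asset paths, "Face" binding name, and frame range are assumptions, and steps 4-7 are done through the Sequencer right-click menu as described:

      import unreal

      # Assumed asset paths; replace with your own sequence and baked face animation.
      sequence = unreal.load_asset("/Game/Cinematics/MyShot")
      face_anim = unreal.load_asset("/Game/Mocap/Recorded_Face_Anim")

      # Steps 1/2 assumed done: the MetaHuman is in the sequence and its face
      # control rig track has been removed. Find the "Face" component binding.
      face_binding = next(
          b for b in sequence.get_bindings() if str(b.get_display_name()) == "Face"
      )

      # Step 3: add the recorded animation to the Face track.
      anim_track = face_binding.add_track(unreal.MovieSceneSkeletalAnimationTrack)
      section = anim_track.add_section()
      section.set_range(0, 300)  # assumed frame range of the recording
      params = section.get_editor_property("params")
      params.set_editor_property("animation", face_anim)
      section.set_editor_property("params", params)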

    • @DanielPartzsch
      @DanielPartzsch 3 years ago

      @@FeedingWolves sorry, I actually meant the part before that, starting at around 3 min, when you bake the animation from the recorded sub scene and save it to a new animation. When I try to follow this, nothing has been baked / saved in my case (contrary to what you get afterwards when you show the head playing the animation on its own in the animation editor; in my case the head is just doing nothing 😉). I will see if I can make a screencast tomorrow to maybe better show what I mean 😊

    • @FeedingWolves
      @FeedingWolves  3 years ago +1

      @@DanielPartzsch I started a small discord for this kind of troubleshooting. Feel free to join in here;)
      discord.gg/WUxwBYu3

  • @mithunkrishna3567
    @mithunkrishna3567 2 years ago

    What happens to the head rotation? How do I add it?

  • @elvismorellidigitalvisuala6211
    @elvismorellidigitalvisuala6211 3 years ago

    How do I add head movement? In my take I only got the facial expression.

    • @FeedingWolves
      @FeedingWolves  2 years ago +1

      If you had the UE Live Link Face neck rotation activated and lost the neck rotation after baking to an animation sequence, what you can do is: in sequencer, once you add your animation, you should be able to keyframe "custom mode" in the metahuman components section.

    • @elvismorellidigitalvisuala6211
      @elvismorellidigitalvisuala6211 2 years ago

      @@FeedingWolves thank you so much

    • @Great_Cardinal
      @Great_Cardinal 2 years ago

      @@FeedingWolves I added my animation, but I'm confused by what you mean here? I should add a keyframe and set the animation to "Custom Mode"? I did this, but when I go to bake to control rig, the head movement is still missing

    • @FeedingWolves
      @FeedingWolves  2 years ago

      @@Great_Cardinal I see, so you are losing head/neck rotation when baking to control rig?

    • @FeedingWolves
      @FeedingWolves  2 years ago +1

      When baking to control rig you will lose the head rotation. However, I do have a solution for this... it does involve changing a few settings on the BP level. You are welcome to hop into my Discord server and I can link you the screen recording for this that has been shared in there.

  • @dam55ian
    @dam55ian 11 months ago

    So for metahumans there is a Bake To Control Rig option for the face, but there does not seem to be a control rig class for the body. How do u get that?

    • @FeedingWolves
      @FeedingWolves  11 months ago

      If the body control rig does not show up when you go to bake an animation, you can locate the metahuman control rig asset itself, go to the preview tab on the top right panel, and assign the female medium normal weight preview mesh. Then go to the root hierarchy tab on the bottom, right click on it and select refresh (last option on the bottom), reselect female medium normal (it's the base skeleton), and compile.

  • @TorQueMoD
    @TorQueMoD 1 year ago

    That seems strange that you're not using the Timecode from the recording. Wouldn't you need that for everything to sync correctly? Great video though :)

  • @verygoodfreelancer
    @verygoodfreelancer 2 years ago

    Has anyone figured out how to get through this process and still keep the head rotation information from livelink? I can't figure it out 😢😢

  • @akbulutdes
    @akbulutdes 1 year ago

    There is a problem with the baking step. When I bake the animation and open the file, the face mesh is not previewing the animation. I tried to continue without seeing the animation but it's not working. I don't know how to get past that.

  • @xkigolx
    @xkigolx 2 years ago +1

    I am having an issue where when I "Bake to Animation Sequence" none of the motion is captured; it's just the face with no movement. Any ideas? Ty for the video btw, it's great!

    • @FeedingWolves
      @FeedingWolves  2 years ago

      Are you using the UE Live Link face app?

    • @xkigolx
      @xkigolx 2 years ago +2

      For anyone having this issue I solved it by hitting Play in Editor (not simulate) and THEN baked the animation.

    • @rkrehe20
      @rkrehe20 1 year ago +1

      @@xkigolx BRUHHHH this was driving me insane, thank you for sharing the hot tip!

  • @andyhair4665
    @andyhair4665 1 year ago

    Does anyone know how to combine a Livelink facial animation with a body animation. I've recorded a facial capture and want to use it in a scene where I have an existing body animation. When using both in the sequencer, the body moves but the head becomes detached and stays in place although the facial animation does play. How would I keep the head attached to the body so it all moves together? Any help is really appreciated

    • @FeedingWolves
      @FeedingWolves  1 year ago +1

      You might want to try using a fresh metahuman. This sometimes happens when you use the metahuman you recorded the facial performance with in the same level.

    • @andyhair4665
      @andyhair4665 1 year ago

      @@FeedingWolves thanks for the reply. It's a different scene that I migrated the animation to although the metahuman is the same blueprint. I'll try reloading the metahuman into the scene

    • @andyhair4665
      @andyhair4665 1 year ago

      @@FeedingWolves I managed to get the actual facial animation working in my scene, although head rotation doesn't work. I saw something on a forum about force custom mode, but when I tried that the head rotation works but the head detaches from the body. I've a feeling this should be easier than it is lol. Have you any suggestions?

    • @FeedingWolves
      @FeedingWolves  1 year ago +1

      @@andyhair4665 Do you already have body mocap or are you just trying to use the head and neck rotation?

    • @andyhair4665
      @andyhair4665 1 year ago

      @@FeedingWolves the animation for the body is retargeted from Mixamo. The head and face animation was recorded over livelink. I've baked the animation and migrated it to the required scene. Body animation is working fine, I can see the facial expressions moving but the head rotation isn't working. I had another scene where the metahuman was looking around, and because the body was idle I could use force custom mode, which got the head rotation working, but in this scene the metahuman is walking so I can't use that option. I'm new to unreal so I'm learning as I go, but it seems like 1 step forward and 2 steps back at the moment lol

  • @JenniferMcSpadden
    @JenniferMcSpadden 2 years ago

    I can't seem to get this to work with the male rigs. Been reading that the female average rig is the only one that plays well, and the others have scaling issues. Have you been able to replicate these results with a variety of Metahumans?

    • @FeedingWolves
      @FeedingWolves  2 years ago

      The live link source is not working on the males? Are you doing this in UE4 or UE5?

    • @JenniferMcSpadden
      @JenniferMcSpadden 2 years ago

      @@FeedingWolves sorry, to clarify, the recorded results with the males have very inconsistent results. In UE4 and UE5. Like scaling issues on the recorded data.

    • @FeedingWolves
      @FeedingWolves  2 years ago

      @@JenniferMcSpadden are you recording body or face data?

    • @JenniferMcSpadden
      @JenniferMcSpadden 2 years ago

      @@FeedingWolves both simultaneously. Optical body data and face from Faceware. Fingers from gloves.

  • @lwcinema
    @lwcinema 3 years ago

    Is there a button for putting down manual keys?

    • @FeedingWolves
      @FeedingWolves  3 years ago +1

      Hi! I believe S is what you mean? When you have "keyed this section" in the additive layer, when you add keys, you press S. Unless your question meant something else?

    • @lwcinema
      @lwcinema 3 years ago

      @@FeedingWolves thanks that’s what I meant.

  • @CaseyChristopher
    @CaseyChristopher 1 year ago

    When I "Bake to Animation Sequence" at 3:15... it doesn't create animation, just all tracks with flat lines when I open the created animation. I did right click on the face track when I did this too. Maybe the Unreal 5 version is different than this 4.x version? I feel like there has to be a missing step to tell the keyframes on the myphone track to actually drive the "face" track. The face animates with the live link correctly and when I look at the raw clip it animates correctly as well. Just getting it to jump from the myphone keyframe data to drive the "face" layer in the sequence we unlocked isn't happening. Bake produces tracks with flat lines. ---EDIT--- NEVER MIND... I forgot to press the play button so that it is running properly. Sorry, it works correctly now. :) Thanks

    • @FeedingWolves
      @FeedingWolves  1 year ago +1

      Are you using UE Live Link face?

    • @FeedingWolves
      @FeedingWolves  1 year ago +1

      In the comments here, Paul Grey mentioned keeping simulation mode running while you bake to an animation sequence in 5.0.3.

    • @CaseyChristopher
      @CaseyChristopher 1 year ago

      @@FeedingWolves Yeah it was the play mode not running. Sorry. Left the comment there in case anyone else makes the same slip up. Very useful info here. So many tutorials get the setup... but actually setting it up to include the iPhone data ... as well as baking it down to usable clips was essential for a traditional cinematics to render workflow. Thanks for the help.

  • @Scultore3D
    @Scultore3D 3 years ago

    Hey there,
    I can create 52 blendshapes on a custom character, like the ones on the metahuman. Hit me up if you need any.

  • @ApexArtistX
    @ApexArtistX 10 months ago

    No need to bake... just import the data into the main sequence.

  • @NicolasLekai
    @NicolasLekai 3 years ago

    how do u find the message u wanna reply to? I hate YouTube replies... lol

    • @FeedingWolves
      @FeedingWolves  3 years ago

      haha did you figure out the neck rotation issue being lost after being baked to the control rig?

    • @NicolasLekai
      @NicolasLekai 3 years ago

      @@FeedingWolves yes, I figured it all out... buggy shitshow that it is... in the end I forced stuff with blueprints... super annoying... don't get me started on Rokoko... lol.

    • @josem.norton4136
      @josem.norton4136 2 years ago

      @@NicolasLekai Any chance you can share the solution, please? Also, @feeding_wolves, is the discord server still up? Couldn't join for some reason. Thanks!

    • @NicolasLekai
      @NicolasLekai 2 years ago

      @@josem.norton4136 sorry Jose I already forgot...lol...been in bug world of UE ever since...

  • @gameswownow7213
    @gameswownow7213 2 years ago

    Is this "ARKit"?

  • @BorisMagazinnikov
    @BorisMagazinnikov 2 years ago

    Thank you!

  • @DesignsbyElement
    @DesignsbyElement 3 years ago

    Thank you!

  • @cinemyscope6630
    @cinemyscope6630 3 years ago

    Thank you!