Toby VirtualFilmer
United States
Joined Feb 26, 2020
Toby from VirtualFilmer.com
How to model a Teddy Bear for Unreal with Maya (with XGen fur)
If you want to model a teddy bear or any type of cartoon 3d creature, this is the tutorial for you!
I show you how to model the teddy bear with Maya 2025, add fur with XGen, texture with Substance 3D Painter and then end up in Unreal Engine.
Views: 321
Videos
Full body mocap test. Rokoko Smartsuit Pro II, Smart gloves, Head Mount etc (SEE DESCRIPTION)
655 views · 5 months ago
Just a test of the Smartsuit Pro II, Rokoko Smart Gloves, and Rokoko Head Mount. This is NOT with a Coil Pro, because my Smart Gloves are apparently too old to support the Coil Pro. I recorded it twice - once for each actor's role. I recorded facial performances with the Rokoko Head Mount and an iPhone 13 mini, and then processed the face animation with MetaHuman Animator (UE5.4.2), then voice cha...
Rokoko Coil Pro with Smart Gloves - shaking and not accurate?
200 views · 5 months ago
I'm getting very shaky results with the Coil Pro and my smart gloves. I've followed the directions, but haven't found any "how to stop it shaking" info yet!
Partial fix for hair grooms with shallow depth of field in Unreal 5.4 (see description)
676 views · 5 months ago
NEW UPDATE JULY 6, 2024: I'm trying to “fix” shallow depth of field with MetaHuman hair and fur in Unreal 5.4. I thought I had solved it, using a fast process of exporting the background and character separately. But it only works on daylight scenes - well, it works on those because the DirLight is the only light source affecting the character. So the character looks fine when rendered by itself...
Cascadeur 2024.1.2 is broken with Rokoko animations!
243 views · 6 months ago
Unfortunately, Cascadeur's animation import doesn't work with UE5 Mannequin or UE4 Mannequin animations exported from Rokoko Studio.
NVIDIA Omniverse Audio2Face 2023.2.0 with a MetaHuman (in Unreal 5.3)
1.3K views · 7 months ago
NVIDIA Omniverse Audio2Face has definitely improved! These results are with "2023.2.0". But I find that it still struggles with the Australian accent. Still, it's much much better (with Australian accents) than iClone's AccuLips, which unfortunately doesn't work at all. HOW WAS THIS CAPTURED? This was captured using Omniverse Audio2Face which is a part of Omniverse Launcher, a free program by N...
UE5.4 cinematic clothing! (Marvelous Designer via USD) ***UPDATE JUNE 29 2024 in description!!**
11K views · 9 months ago
UPDATE JUNE 29, 2024: You can now download the full demo! And sign up for free Marvelous Designer etc. I've written out the full instructions for how to get the full UE project with the clothing simulated here: virtualfilmer.com/gdcdownload/ And if you want to sign up for a free one-year Marvelous Designer license, here's the info: create.fortnite.com/news/new-talisman-demo-templates-now-available-f...
Move.AI motion capture problems (Beta feedback incl. uploading problems, calibration failure, etc)
264 views · 1 year ago
I love MoveAI - I can't wait for it to work really well! So far, I've found the app experience a little frustrating, but I'm sure they will fix these issues soon. I'm a paid customer ($50 per month for 5 minutes of capture with 2 cameras) and am hoping that I can figure out ways around all of the problems in this video, because it really has the potential to be awesome for solo creators like me...
How to make a CC4 (CC3+) body in iClone that matches a MetaHuman
1.1K views · 1 year ago
Here's a quick and dirty method to make a CC4 (CC3+) body in iClone 8/Character Creator 4 that matches a MetaHuman. AND it gives you an animatable face!! You can use this method so that Live Link works much better. Check out my new Live Link tutorial for details: th-cam.com/video/ZQZ3fMwaUZY/w-d-xo.html
How to livelink a CC4 character in iClone 8 to a MetaHuman in UE5.2 (LINK MOVED SEE DESCRIPTION)
3.3K views · 1 year ago
UPDATE AUGUST 2024: For a technical reason I had to move the ZIP file on my Virtualfilmer site (it stopped allowing me to put files in the root directory). It's now here: virtualfilmer.com/downloads/StyleMarshall_CC4_UE52_LL_Kit.zip IMPORTANT INFO: The amazing StyleMarshall/Bassline303 weighed in after I posted this video with some important new info - please see this post on the official forum...
Export MetaHuman Animator from Unreal as a CSV (SEE DESCRIPTION!! UPDATE SEPT 2 2023)
1.2K views · 1 year ago
Just a quick, very rough video on how to export MetaHuman Animator animation from Unreal as a CSV file, for use in my FACE ANIMATION IMPORTER plugin for iClone. To use this YOU MUST HAVE THE "FACE ANIMATION IMPORTER" BETA DOWNLOADED. This process REQUIRES a script that is in those files. You can sign up FREE at www.virtualfilmer.com The full instructions are here (new URL): virtualfilmer.com/iclone...
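If you are curious what handling a facial-animation CSV like this looks like in code, here is a minimal Python sketch for loading one. It assumes a Live Link Face style layout (a Timecode column, an optional BlendShapeCount column, then one column per blendshape); the column names and the file name mha_export.csv are illustrative assumptions, not the plugin's required format - check the Face Animation Importer instructions for the exact layout it expects.

```python
import csv

def load_blendshape_curves(path):
    """Read per-frame blendshape values into {curve name: [values]}."""
    curves = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            for name, value in row.items():
                # Skip non-curve columns (names assumed; adjust to your export).
                if name in ("Timecode", "BlendShapeCount"):
                    continue
                try:
                    v = float(value)
                except (TypeError, ValueError):
                    continue  # ignore blank or non-numeric cells
                curves.setdefault(name, []).append(v)
    return curves

if __name__ == "__main__":
    curves = load_blendshape_curves("mha_export.csv")  # hypothetical file name
    frames = len(next(iter(curves.values()), []))
    print(f"{len(curves)} blendshape curves, {frames} frames")
```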
MetaHuman Animator to iClone 8 to Unreal Engine 5.2 test
1.7K views · 1 year ago
Just a quick test, not perfect yet! I'm working on iClone 8 plugins that import MetaHuman Animator animation and then process/adjust it. Then you render the result inside Unreal Engine 5.2. HOW WAS THIS DONE? I recorded this with my iPhone 14 Pro Max, using Epic Games' LIVE LINK FACE app, set to 'MetaHuman Animator' mode. I then transferred the result clip to my PC. Then I changed my voice usi...
MetaHuman Animator in iClone (see description)
937 views · 1 year ago
Development continues on my iClone plugin. The Beta (on virtualfilmer.com) still only includes Live Link ARKit and Rokoko face support, HOWEVER I'm preparing to release a new Beta with MetaHuman Animator support soon. As you can see above, the visemes still need work. I only have weekends to work on this project, and I'm coding it myself, so it's taking a while to problem-solve.
New FREE iClone 8 plug-in: FACE ANIMATION IMPORTER (Open Source)
6K views · 1 year ago
Ever wanted to import facial animation into iClone? Now you can! With the new free plug-in FACE ANIMATION IMPORTER, currently in final stages of development and releasing soon. It is going to be open source, which means everyone out there can contribute to making it the best we can get it. :-) Check out this video showing how easy it is to import facial animation in various formats. At launch, ...
ANNOUNCEMENT: NEW MetaHuman iClone 8 method (with Unreal 5)!
6K views · 1 year ago
Say goodbye to having to use a SEPARATE head and body for a Metahuman character in iClone! This new animation technique makes it fast and easy to animate metahumans in iClone 8 and see exact results (in real-time) in Unreal 5. I call it a "Franken-MetaHuman" technique LOL. You'll see why! :-) Check it out and let me know if anyone else has been doing something similar. I find it makes animating...
Maya and MetaHumans - My own custom FACE and BODY tools
1.3K views · 2 years ago
Faceware Retargeter In Maya With iClone and Character Creator Characters
551 views · 2 years ago
TUTORIAL: Simple METAHUMAN Animation (Maya/UE5.0.2) Part 5
682 views · 2 years ago
TUTORIAL: Simple METAHUMAN Animation (Maya/UE5.0.2) Part 4
1.1K views · 2 years ago
TUTORIAL: Simple METAHUMAN Animation (Maya/UE5.0.2) Part 3
842 views · 2 years ago
TUTORIAL: Simple METAHUMAN Animation (Maya/UE5.0.2) Part 2
1.3K views · 2 years ago
TUTORIAL: Simple METAHUMAN Animation (Maya/UE5.0.2) Part 1
4.8K views · 2 years ago
UE HOW TO: Move blueprint actors together at runtime, using parent / child, in UNREAL ENGINE
7K views · 3 years ago
Help! The new Unreal camera shake blueprint (MatineeCameraShake) as of 4.26 refuses to edit :-)
150 views · 3 years ago
Animating with Metahumans in Unreal: Converting an animation clip to keyframes with Control Rig UE4
16K views · 3 years ago
Good one!
how about buttons etc?
Hey I just got my suit and it's extremely bad. Did you find a solution? My legs drag, body shifts... almost like my character is stuck in mud / drunk
@@PainnGames The best thing to do is to make a video of the mocap you’re getting and contact support. They take a few days and are on Greek time (which from USA can be very frustrating) but it’s worth it to just step through it with them. It’s never going to be “perfect”, but definitely can be improved with their help. Do you have gloves and coil pro? Both are necessary for better results because then there is some sort of positioning for the maths to work from. (The suit otherwise is just using math to work out how you’re moving)
@@PainnGames also it’s vital to learn how to use the cleanup features in the Rokoko studio program. You’ll notice that every time he captures, Sam always goes through and adjusts lots of foot movement etc in the software. It really makes a big difference.
@@virtualfilmer thanks for your reply, I have the suit and gloves. I heard the coil pro is ONLY for the gloves and has little to no influence on the suit. I'll record a side by side comparison of my mocap recording and send it to them for help
@PainnGames The Coil Pro, you're correct, is only for the gloves. However, once Rokoko Studio “knows” where the gloves are, it really massively improves the suit in my opinion. I've seen a big difference. It's not perfect but it's good. I think inertial mocap suits always need a lot of cleanup. I use “Cascadeur” mostly for that, and it makes a big difference to the result.
@@virtualfilmer you export your "cleaned up" mocap animation from Rokoko to Cascadeur to further clean it up again?
😂😂 I cancelled my Rokoko order.
Have you looked at the electromagnetic interference at the place you're recording? There's a video on the Rokoko channel that recommends some apps on iPhone and Android, and they recommend values < 4 in whatever unit the app uses. Apparently anything more than that starts messing with the IMUs of the suit.
@@npatch I tested the EMI but yeah it was OK. For me, the bottom line is that I didn't understand how “approximate” inertial mocap is. It has no idea where you “are” in a space, it only knows the rotations of individual sensors. The rest is math. The Rokoko algorithm is fine but not great. It is trying hard to guess where the suit “is” but of course it's only approximate; so it gets it wrong a lot.
@@virtualfilmer yeah. I think I heard somewhere that positions are relative to the router, but I haven't verified it. That said, accuracy is the whole reason they added the Coil Pro. Also for interactions between multiple suits, which seemingly have problems when touching (handshake) etc.
Exactly. I have the coil pro now (since I recorded the video) and I don’t know why my hands don’t “match” accurately for hand touch. They still don’t clap together accurately for example. And it doesn’t really help much with walking etc.
Did you contact Rokoko? Because the demos they show, even the ones that were live streamed, were very accurate. The only things I'd expect to have an impact on this are the WiFi not being WiFi 6 and using the 5GHz band, and then the interference. Otherwise I'd contact Rokoko for support or even a refund on the Coil.
such a Greatttttttttttttttt
FRANKEN-HUMAN!
This was cool until I tried to download the StyleMarshall files, which have been removed from the site!
@@BlkAresGaming That’s so weird, the file was there even a month ago! Looks like it’s gone and also the page with descriptions. Sorry about that, I’ll look when I’m next at my computer. Very odd
Ok I've fixed the file - it seems that I'm not allowed to put files on my site in the root directory anymore. So it's now here: virtualfilmer.com/downloads/StyleMarshall_CC4_UE52_LL_Kit.zip
@@virtualfilmer YOU ARE AMAZING MAN!
I am having the exact same problem, I hope you have the time to make a tutorial. Earned a subscriber either way.
@@phoenixstudios7520 I wish I did too. I’m trying to find the time but I am doing a huge amount of overtime etc and can’t find the time on weekends cause of chores. But yes I’m trying.
Thank you for posting this test. When they stand still, the results are pretty neat. But you are right that, when they walk or turn, the results seem unusable (to me at least). If the suit is showing 100% green indicators in Rokoko Studio, as you say in your description, is there nothing else you can do to fix this?
@@phoenixstudios7520 In Rokoko Studio, turn off locomotion and the other filters, and you'll see the core issue. The suit isn't getting any data at all about movement in 3D space, it's purely getting rotation data from the sensors. So it's then using maths to figure out where the suit is in 3D space, and yeah it's not… good. It's better than nothing but not good. There are two options to improve it. Option 1: you can get gloves and a Coil Pro. The Coil Pro helps the math because it at least has a reference to where your hands are, but it's still not amazing. Option 2: get a Vive tracker and base stations. But there's no easy way to feed its 3D location and rotation into Rokoko Studio. You have to make your own custom API program (!) to feed the location of the tracker in. And it can only be assigned to the hip bone. That's it.
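To illustrate the "it's purely rotation data, the rest is math" point above, here is a toy Python sketch (my own illustration, not Rokoko's actual solver): a two-segment leg whose foot position is derived purely by chaining joint rotations over assumed bone lengths. Even a degree of error per sensor moves the computed foot position by centimetres, which is where the sliding and drift come from.

```python
import numpy as np

# Toy 2D forward kinematics: an inertial suit measures joint rotations only,
# so any world-space position has to be derived from them.
# Bone lengths are illustrative values, not Rokoko's body model.
BONE_LENGTHS = [0.45, 0.45]  # thigh, shin (metres, assumed)

def foot_position(joint_angles_rad):
    """Chain rotations from the hip down to get the foot position."""
    pos = np.zeros(2)
    heading = 0.0
    for length, angle in zip(BONE_LENGTHS, joint_angles_rad):
        heading += angle  # rotation errors accumulate down the chain
        pos += length * np.array([np.sin(heading), -np.cos(heading)])
    return pos

true_angles = np.radians([20.0, -35.0])
noisy_angles = true_angles + np.radians([1.0, 1.0])  # 1 degree of error per sensor

print("foot (true) :", foot_position(true_angles))
print("foot (noisy):", foot_position(noisy_angles))
```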
@@virtualfilmer Thanks for this detailed reply. I am thinking about getting a Rokoko Smartsuit, but now I am having some doubts. Maybe it's best to save some money and buy a more expensive suit.
Rokoko is fantastic for single motions for games and short animations. I only have issues because I'm trying to record actual acting scenes. So check out all of Sam's great Rokoko videos - he shows off the suit, and if his recordings are good enough for you, then buy. They are great people and trying hard. And there's nothing at this cheap price point that comes close. But yeah, it's not good for actual performance capture for dialogue scenes.
@@virtualfilmer I want to make animated films using UE, so our purposes are similar. I need solid mocap performance. I actually think that, as AI suitless mocap is developing, suits will gradually become cheaper. Maybe I will wait a while and go with the flow.
Can you explain pls how to do this with Sequencer?
can you share the characters maybe?
Oh no! How complicated it is... There is so much to do and this is just one character. What to do if there are many characters in the scene? My opinion is that all these 12 steps need to be automated. Only who will do it...
@@digor72noname86 Absolutely! It’s useful to know about but not a good solution
@@virtualfilmer There is no official workflow provided by Reallusion on how to do it?
Would love to see FBX Blendshape data support!!!
Can you get the MD license and use it just with UE5? Not UEFN. I click the link you provided for the one-year MD license but don't see where to get a code. Thoughts?
Thanks for the info! Whenever I try to set the skeletal mesh in the transferSkinWeights node the app crashes. I'm trying to use a MetaHuman body skeleton (w/o head and with a custom shapekey), do you know why this happens?
@@KotobaProject-KOTOBA It tends to crash for me all the time. No, I don’t know why it happens. It seems to be quite random. I think honestly we are better off waiting for 5.5 at this point. But yeah if you want to keep trying, download the UEFN sample that includes PIA (metahuman). m.th-cam.com/video/xjhTVSuawoA/w-d-xo.html
Thanks for sharing this great info! I'm now trying to find how to automatically transfer the physics properties of materials from MD to UE, but can't find the exact workflow. Can you help me please, have you figured this out? How do you do this exactly?
@@lovagrus In MD, in the export options, you tick the checkbox so that it sends the info across. But it doesn't work properly. If you go through the official Epic tutorials (the pages they wrote about these topics), it specifically says that they haven't implemented physics settings for fabrics properly yet. It's all basically alpha or beta at this point, and doesn't work.
It works normally, but whenever I click run it doesn't animate according to the Live Link.
Thank you for your test! You are right, the hand gestures look creepy; I prefer to use the hand gesture library or puppet-like control in iClone. Also, the balance of the character while wavering looks bad. I think Rokoko is more into marketing their product than support and development; they should use AI to clean and balance the mocap like in Cascadeur.
Same problems with the legs... I have the Smartsuit Pro 2 and gloves. Not bad as raw data, but a cleanup is really needed; maybe Cascadeur can help with this? Looking for an automatic cleanup solution.
@@elvismorellidigitalvisuala6211 Yes, I'm learning Cascadeur Indie to try to do that. It's tough, because part of the issue is the unnatural and random “leaning” of the characters. And there's no easy fix for that, because Cascadeur just tries to balance the character (auto physics) rather than “fix” it :)
@@virtualfilmer I see... ok, let's see if we find a solution; if I find one I'll post a video and tag you :) Thanks!
Hi Toby. I participated in their Kickstarter at the time and was able to buy a suit for $700. I find it extremely complex to implement (the WiFi connection is a PITA). I have never managed to get a usable result. I posted a video on the Reallusion forum and Rokoko asked me to remove it. I have 2 Neuron mocap Gen1 suits and I am satisfied, however they still require cleanup work.
@@Tontonmarto I really love their gloves and I like the upper body mocap. I just wish the walking wasn’t so unnatural!
Hello there, I am in the process of deciding if I should buy a Rokoko or a Perception Neuron suit. Can you please elaborate on the part about it needing cleanup work? You mean it's jittery? Or does it have other problems as well?
Thanks for sharing
I bought the Smartsuit Pro I years ago and tried it for a month, but it was so bad, and even cleaning the mocap was time consuming and frustrating, so I admitted that I lost my money, hoping one day they would have some type of upgrade in the software or in the firmware, but they never did anything about it, then they came up with the Smartsuit Pro II..... "fool me once, shame on you, fool me twice, shame on me"
lol the girl dragging her leg at the end was priceless.
Read your description, yes it looks very robotic at times.
Was this adjusted at all? If this was raw it’s okay. Some of it looks fine but there’s some “swooping” for sure
@@ryanisanart It was cleaned up in Rokoko Studio using their filters (foot contact segments etc) but that's all. It's so jerky though, no?
This really was to test body mocap. I am looking for any hints and tips. So far I am not pleased with the Smartsuit Pro 2 for walking or turning etc. Jerky and fake looking to me, for cinematics. See description for info.
Don't buy Rokoko - it is one of the worst motion systems on the market. If you bought one - sell it. Buy Xsens or Perception Neuron.
What if you import to UE and then export animation?
@virtualfilmer hi toby, is there an update on this issue yet?
@@MrMarniche Yes, Rokoko support has confirmed my gloves are the original type - the ones that aren't compatible with the Coil Pro. So I have to send them to Denmark to be adjusted. Then they will send them back. Then they should work.
@@virtualfilmer Happy for you. Good to know. Quite a hassle but at least there is a solution 🤷♀
Wow, I was under the impression that facial animation could be imported with the rest of the body. You may have just saved me thousands of dollars. I'm in the middle of planning the workflow for my animated series and have been setting up to go the route of using Mocap Fusion to simultaneously record the body mocap, face mocap, voice, and position in space, as this program allows your actors to do all the above by wearing VR headsets and see each other in the actual environment the scene takes place in, which is revolutionary. However my intention is to get that animation data into iClone for refining/cleanup, then Unreal Engine for the mastering/final renders etc.
@@WillyCash69 happy to chat it through buddy if you need any advice. Email me at Toby at virtual filmer dot com
@@virtualfilmer Cool just emailed u !
If you're using Rokoko in an apartment, good luck with all that WiFi noise. Might as well move to a shack in the middle of nowhere - that's the real sweet spot!
@@Metarig 😂 It’s so tough. We all need to win the lottery and get Xsens - but it’s out of my budget unfortunately
Hi Toby, I know you are already in contact with support about this, but as you have been informed, you have an older version of the gloves that is not compatible with the Coil Pro. There are few of these older gloves on the market and they have a component on them that doesn't work with the Coil as intended. I'm sure our support is helping you with an upgrade to the correct version of the gloves. What you are showing in this video is of course not the performance you should see from the Coil Pro.
@@RokokoMotion Hi! Oh that’s so interesting!! Yes I contacted support a couple days ago but haven’t heard back yet. But a user on discord said it was possible that I had an older glove. Let’s hope support sorts it out. I wish they had checked what gloves I have before sending me the coil. :( I purchased in 2021 and have been a regular purchasing customer since then.
@@virtualfilmer Hi Toby, great that you are in touch already. I'm confident they can sort it out quickly. We have attempted to reach out to all customers with this generation of gloves when we see the Coil order come in, but it sounds like it wasn't done here. Again, it only applies to a small % of gloves we have on the market, and we were caught by surprise by it after we started shipping and therefore have had to catch up a bit since. We apologize for the inconvenience. Looking forward to getting you up and running again and hearing your take. Thanks for being one of the early adopters of this tech. There's such massive potential to unlock in future software/firmware upgrades. Great to have you on the ride :)
@@RokokoMotion How do I know if my gloves are too old as well? I am seeing similar issues and I purchased the gloves in Oct 2021. Could you confirm a way to know if they are the reason for my issues with my Coil Pro?
Did you input the height of your Coil Pro into Rokoko Studio? Also try to put your Coil on a wooden surface.
Wooden surface, interesting - no, haven't heard of that as a fix! The demo with Sam is on a tripod so I did that. Will try wood. :-) And yes, did the height - entered it manually.
Have you tried the new MD > UE Live Link plugin?
Yes! It doesn’t save me much time - it’s fine but it’s just transferring baked, pre-computed simulations. I really want my simulations to be “live” inside of Unreal. I was hoping the chaos cloth new process with 5.4 would be better.
@@virtualfilmer I've seen a live sync demo transferring real-time cloth to a player character, so it definitely isn't limited to baked anims.
Okay, I haven't seen that. “Live Sync” in Marvelous Designer, on their original TH-cam, is shown to transfer only baked anims. So I'd love to see where you saw real-time cloth transfer? The only realtime cloth transfer I've seen is via USD, then imported into UE5.4, but the data flow layers don't work very well. Here's that process: th-cam.com/video/Nf6OUDoPCPs/w-d-xo.html&ab_channel=MarvelousDesigner
@@virtualfilmer I'm sorry but I think you're right. The realtime character controlled demo I saw was using the USD workflow, not the livesync. It does look like the livesync is only for baked/cached sims. 😬
Check out the description for more info! Updated June 29, 2024 :-)
Yes, looks promising! Really interesting 👍👍 well done!
❤❤❤ Where's the tutorial?
I’m working on it don’t worry! :) thanks for watching!
@@virtualfilmer I'm waiting too!
Hello, thanks for the help. When I add the animation to my MetaHuman in the sequence, the head doesn't move... only the face. Could you help? Thanks
This might help: virtualfilmer.com/iclone-face-animation-plugin/iclone-plugin-face-animation-importer-mha-head/
Did you ever manage to get good results out of this? I tried the Rokoko AI and wasn't very impressed, but this seems to be far more expensive now. Ended up with a Rokoko suit, works great!
No, I tried and tried, but it needs many cameras to be good, I think. And it just had so many bugs! I cancelled my cheap (relatively) multicam subscription because it just never worked.
This is sort of like when a ventriloquist controls a puppet, you see the puppet's mouth somewhat moving but the voice just doesn't match it LOL
😂 so true
Very cool
After I "Load Skeleton Definition" I get an error that I'm "Missing Required Bones". Any idea what i can do? I havne't adjusted my metahuman rig at all?
MetaHumans have changed their bone setup since this tutorial - so I'm not sure, sorry :(
Does it export with thickness?
All my clothes appear to be overly glossy.
Is it possible to get head movement data?
Well that's unfortunate. Hopefully they fix this soon.
The cloth asset is not visible in Physics in Unreal Engine... please tell me what I can do.