Thank you for the nice tips, especially about using both Live Link and Autosetup for rendering. Also, congrats on your very useful Ultimate Expression bundle. You’re a great help.
Thank you for the kind words about the video, Argus. Sincerely appreciate it.
And I’m glad to hear that you’re finding the Ultimate Expression pack helpful 🙏🏽.
I also appreciate your contribution on the Reallusion forum 😊
Just started watching your videos. I would be interested in an iClone to Unreal course for beginners, beginning to end. You know what you are doing…
Thanks so much. Very kind of you to say. If time allows, I will definitely consider doing that.
You should get a job making tutorials for RL. This was way easier to follow and more informative than any of their live link videos. I do wish you'd mentioned the render queue thing at the start as tbh that is a deal breaker!
Thanks, Kerb. Appreciate your positive feedback. Yes, the Movie Render Queue is why I use UE5.
Amazing job! Thank you for sharing ❤
Thank you 🙏🏽
Great Tutorial 👍
Thank you. Glad you liked it 😊
Thank you.. Do you have any video on getting a Disney-style skin colour? I hope to hear from you.
Thanks.
I would suggest applying the skin of one of the brand-new CC4 toon-style base characters that come free with the latest 4.4 update. (forum.reallusion.com/545554/【Christmas-Gift】Download-New-CC-Toon-Base-for-Free)
Then you can change the skin colour, tone, or texture, either inside CC4 itself or by exporting the skin texture map image to a photo editor like Photoshop, Sketchbook (which is free), Krita, etc. and changing it there.
I covered changing the skin colour in this TH-cam video (th-cam.com/video/NesqH1H_RYE/w-d-xo.html ). You would just need to skip to 3:11 to see how this is done.
I hope this helps
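If you prefer scripting the colour change rather than painting it in a photo editor, the core idea is just a hue rotation on every pixel of the exported texture map. Here is a minimal pure-Python sketch of that idea; the `shift_hue` helper and the 120-degree example are my own illustrative assumptions, not part of any CC4 or iClone API:

```python
import colorsys

def shift_hue(rgb, degrees):
    """Rotate the hue of one (r, g, b) pixel (0-255 per channel) by `degrees`.

    Applied to every pixel of an exported skin texture map, this changes
    the overall skin colour while keeping the shading (saturation and
    brightness) of the original texture intact.
    """
    h, s, v = colorsys.rgb_to_hsv(rgb[0] / 255, rgb[1] / 255, rgb[2] / 255)
    h = (h + degrees / 360.0) % 1.0  # wrap around the colour wheel
    r, g, b = colorsys.hsv_to_rgb(h, s, v)
    return (round(r * 255), round(g * 255), round(b * 255))

# A pure-red pixel rotated 120 degrees becomes pure green:
print(shift_hue((255, 0, 0), 120))  # (0, 255, 0)
```

In practice you would run this over every pixel of the exported texture (for example with an image library such as Pillow) and re-import the result into CC4.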
Thank you so much :-)
@@shivangipriya4153 Pleasure. Let me know if you get stuck. Happy to assist, if necessary.
Ohh, thank you very much.. you are the first person to reply in such a good and kind way. Could you please provide your email so that I can contact you? Thank you.
Thank you very much for your tutorials for iClone -> Unreal. I have a question though. When you do the voices in iClone, with lip sync etc., how do you get the sound into Unreal? Is it just the same audio files added to the sequence / track?
Hey Thomas,
So, unfortunately the audio/voices don’t automatically transfer to Unreal; you would need to import them separately.
However, if the voices sync in iClone, I found that they sync in Unreal too (provided your iClone animation and Unreal import are using the same frames per second).
I usually just reapply the voices in post, during editing.
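The frames-per-second condition above is easy to sanity-check with simple arithmetic. This little sketch (the `frame_to_seconds` name and the frame/fps numbers are just illustrative) shows why a matching frame rate keeps audio re-applied in post lip-synced:

```python
def frame_to_seconds(frame, fps):
    """Timestamp in seconds at which a given frame plays, at a given frame rate."""
    return frame / fps

# A mouth shape keyed at frame 120 in a 60 fps iClone project lands at 2.0 s.
# If the Unreal sequence also runs at 60 fps, the same frame lands at the
# same 2.0 s, so audio dropped onto the timeline in post stays in sync:
print(frame_to_seconds(120, 60))  # 2.0

# With a mismatched 30 fps sequence, that same frame would land at 4.0 s,
# drifting steadily further out of sync with the audio:
print(frame_to_seconds(120, 30))  # 4.0
```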
Live Link is really handy. I tried to retarget a mannequin in UE but couldn’t succeed. Anyway, it is also nice to use with CC4 characters.
Yes it is very convenient @designomomento.
That is why I prefer bringing my characters in via CC4 where they can be automatically retargeted for most 3D animation software or game engine.
Does the camera transfer work with Live Link? It’s not working for me, and neither is the light transfer.
No, I prefer to set up my cameras and lights inside Unreal. I found that I have a lot more control that way, and it’s much easier to make changes.
However, I used to transfer the cameras and lights quite easily when I did it that way before.
Which version of iClone and Unreal are you currently using? Let me see if I have any problems transferring them and get back to you.
@@wholesome3d Thank you. It’s clear I need more knowledge of Unreal.
I believe that we learn from each other 😊
So, I’m happy to pass on the knowledge that I have and answer any questions, or struggles you might be having.
Nice tutorial. I really could render in Movie Render Queue. The only problem I found was that no physics works in Movie Render Queue. Even so, Auto Setup is more consistent than Live Link. But Live Link is faster.
Yes, I forgot to mention the physics. Thanks for sharing your experience, 24vencedores11. Really appreciate it.
@@wholesome3d They say it is now working fine in Unreal 5.3. I didn’t even try. Can you give it a try?
Just updated to iClone 8.4. Will give it a go and let you know 👍🏽
I am having an incredibly annoying problem that maybe you can help me with. Here is the scenario: I have a character in iClone that is Live Linked to Unreal. I have set up all the necessary timecode settings, they are both on 60 fps, and I have gone through all the steps many, many times. I need to record in "by frame" instead of "realtime" because I’m getting a lot of frame drops when recording in realtime. The only thing is, once I have recorded the animation in Unreal, it plays back exactly as it was recorded: very slowly, playing every frame.
My question: how do you record in "by frame" mode but still be able to play back the animation in real time in Unreal once it’s recorded?
Hope you can help. Thanks!
Hey Cole,
It’s hard to say for sure without seeing your scene and your process, but it could be related to a lack of PC memory.
What are your computer specs?
Try everything to save memory resources during the recording, e.g.:
* Switch to Unlit in Unreal and the lowest visual setting in iClone.
* Delete everything else in the iClone scene except the character or object you’re animating, then record the animation.
It could also be a setting.
So, another thing you could try (which I needed to do a few times) is to:
1. Save your character and his motion in iClone.
2. Open up the last iClone project that worked properly when you transferred the animation to Unreal.
3. Save a copy of the project in iClone.
4. Delete everything else in that scene.
5. Bring in the character/s and the animation that you want to transfer correctly.
6. Finally, re-transfer the character and record the animation, as per the steps in my video.
As a workaround and last resort, you can adjust the speed of your character’s animation inside the Unreal Engine Sequencer.
I hope this helps. Please let me know if that works. I want to help you get it right.
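For that last-resort speed adjustment, the multiplier you need is just the ratio of the two durations. A hedged sketch (the function name and the example numbers are made up; Sequencer itself exposes this as a play-rate/time-scale value):

```python
def sequencer_play_rate(recorded_duration_s, intended_duration_s):
    """Speed multiplier to apply to a take in Sequencer so that a slow
    'by frame' recording plays back at its intended real-time length.

    Only reliable if the slowdown was uniform across the whole take.
    """
    return recorded_duration_s / intended_duration_s

# A 10-second animation that came out as a 25-second take needs to
# play 2.5x faster to recover real-time speed:
print(sequencer_play_rate(25.0, 10.0))  # 2.5
```

Note the caveat raised later in the thread: if frames were dropped unevenly, a single multiplier won’t recover lip sync.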
@@wholesome3d Thank you for this, but the issue really isn’t in iClone. It’s in Unreal. I noticed in your video that you record from iClone to Unreal with "By Frame" checked on in iClone. When you do that process, does your animation play in "realtime" in Unreal once the take has been recorded? That’s what I am trying to figure out.
I have a super powerful machine, so that shouldn’t be the issue. I have an RTX 4080.
I see, you said adjust the speed in Unreal. That could work, but for lip sync it would never line up. I’m so surprised there isn’t an official workflow for this. The frame-dropping issue has been a thing for a long time with Live Link.
@@colehiggins111 To comment on your initial question - yes, the recording is done ‘by frame’, but it plays back properly at normal speed in Unreal. The only time I’ve experienced dropped frames was when my recording was made with ‘glitches’.
I would simply re-record that animation (with fewer PC resources in use) and it plays back perfectly.
With your system you shouldn’t be having this problem, and you may be right - it could be a setting in Unreal Engine, under the Live Link or recording window.
If you are willing to WeTransfer me a video recording of your process and settings, you are welcome to send it to Wholesome3D@gmail.com. I have, however, just stepped into my office (for my day job) and will only be able to look at it this evening.
Thank you so much for offering to help. I will record my process and then send you a video. Much appreciated!
In every aspect, Auto Setup is better than Live Link as far as animator liberty, control over the character, and render quality are concerned, but Auto Setup has some big disadvantages, like the relative positions between characters and the manual positioning of characters within the set. I hate Live Link and am still using Auto Setup, as recently you can do some fine-tuning of your animation through the Modular Control Rig system. Please make a tutorial on those problems in Auto Setup and how we can overcome them. I was actually forced to shift to Blender Eevee ray tracing, as there is a wonderful plugin, CC Blender Tool, which acts as both Auto Setup and Live Link between iClone and Blender. A similar single plugin should be implemented in the Unreal-iClone pipeline too, so that animators could take the highest advantage of both modules (Live Link and Auto Setup).
Thanks for taking the time to comment.
Yes, I totally agree with you - the Autosetup plugin is better.
My workflow has changed since I made this video.
Since the introduction of the automatically generated Control Rig in UE, I don’t actually use Live Link anymore. Now I fix all the minor face and body animations using the Control Rig (after bringing my characters in with Autosetup).
I haven’t actually tried the new Blender plugin, but will give it a go! Thanks for the recommendation.