Yeah, it's not exactly perfect yet... But it's good enough for general purpose.
You can get uLipSync here: github.com/hecomi/uLipSync
You can get NEBPro here: ko-fi.com/s/c47de71369
You can get NEBLite here: ko-fi.com/s/ad2b08fa67
My Github Project Download: github.com/ReForge-Mode/Live_Lipsync_Examples
The best way to test lip sync is by reading the phonemic chart:
Longing. Rusted. Seventeen. Daybreak. Furnace. Nine. Benign. Homecoming. One. Freight car.
Good morning, soldier.
Thank you for the tips!
That's what I do off-recording with the IPA chart. Unfortunately, this plugin won't be able to capture all the phonemes because it only listens to vocal sounds. It does look "good enough" for VRoid models due to the limitation in tongue/teeth movement. Maybe next time I'll do an even better plugin!
Sorry if I was being unintentionally unclear; it's stupid o'clock in the early morning and I was being goofy. The words I suggested are just the activation codephrase for the Winter Soldier's mental conditioning from Marvel. :3
---
Being serious instead, I will say that I very much enjoy your highly technical content coupled with very down-to-earth explanations, PLUS your general tendency towards providing user-friendly tools to those who are following along.
It's a refreshing combination of factors that make me glad to have discovered your content, both for features that I want to explore myself and some that are just fun little peeks into development underpinning them.
You are one of my favorite resources for absorbing new information about Unity, modeling (even without actually "doing" the modeling by hand oneself), scripting, and the possible applications of VRoid in more general fashion.
I think it's wonderful to be able to take a very specific point of enthusiastic interest for those without fully-developed technical knowledge or skills as a motivating factor to learn more across multiple disciplines and grow technical understanding through practical application and the challenges that will arise naturally from doing so.
Keep up the good work!
Another incredible tool! Thanks for sharing
Glad I could help! I'm looking forward to whatever you made out of this!
As FREE software - it is really, really great. Thanks for sharing
Cheers! Glad I could help anyone who needs it!
Thanks @ReForgeMode for sharing free plugin ! ❤
No problem! Always happy to help!
Sweet thanks
Hehe, glad I could help!
Even though I have selected a different option in the Skinned Mesh Renderer, the dropdown menu comes up but it still shows the None option.
Hmmmm, that's weird, since it works for me in the video. You might need to try out different options to refresh that component. Try switching other settings in that component, or re-drag and drop the GameObjects.
Is there a way to mirror head movement and blinking?
Mirror head movement? What do you mean?
Does this translate into vrchat? Or is this just meant for presentation in Unity play mode?
No, it's just a Unity-only plugin. In theory, you could make your own VRChat-like application with this.
Can you sync lips with music?
I don't see why not. Go ahead and try it!
Though, keep in mind that this lipsync plugin is not accurate enough for a full song lipsync. Your mileage might vary.
does this work with custom meshes, like custom avatars, or only with vroid models?
Yes, a custom mesh will work as well. Just make sure you assign the right mouth blendshapes.
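Roughly, the wiring looks like this. This is only a hedged sketch, not verified against a specific uLipSync version: the component and method names follow uLipSync's README, and the "MouthA"-style blendshape names are placeholders for whatever your mesh actually exposes.

```csharp
// Hedged sketch: hooking uLipSync up to a custom (non-VRoid) mesh by
// mapping phonemes to that mesh's own mouth blendshapes.
using UnityEngine;

public class CustomMeshLipSyncSetup : MonoBehaviour
{
    [SerializeField] SkinnedMeshRenderer faceRenderer; // mesh with the mouth blendshapes

    void Start()
    {
        // Analyzer: listens to the AudioSource on this GameObject.
        var lipSync = gameObject.AddComponent<uLipSync.uLipSync>();

        // Driver: converts recognized phonemes into blendshape weights.
        var blendShape = gameObject.AddComponent<uLipSync.uLipSyncBlendShape>();
        blendShape.skinnedMeshRenderer = faceRenderer;

        // Map each phoneme in your profile to the matching mouth blendshape.
        // "MouthA" etc. are placeholders for your avatar's actual names.
        blendShape.AddBlendShape("A", "MouthA");
        blendShape.AddBlendShape("I", "MouthI");
        blendShape.AddBlendShape("U", "MouthU");
        blendShape.AddBlendShape("E", "MouthE");
        blendShape.AddBlendShape("O", "MouthO");

        // Forward analysis results to the blendshape driver.
        lipSync.onLipSyncUpdate.AddListener(blendShape.OnLipSyncUpdate);
    }
}
```

You can also do the same mapping in the Inspector instead of code; the script is just the runtime equivalent.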
I don't have the uLipSync script showing for me to add in the component so I'm not sure what I'm doing wrong.
Have you installed the uLipSync plugin properly? Is it in the Project window?
@@ReForgeMode Yeah, it's in the project window.
@@Swagberryy Hmmmm... That's weird. If you're still stuck, just download my Github project in the description and open it in Unity.
Master, can you help? When I import a VRoid model into the scene it's okay, and its scale is 1, 1, 1. But it's too little for the scene, so I set it to 500, 500, 500 to fit the scene, and then I get problems with the animation! The hair and skirt go up into the air like a Dyson is blowing right at the character. What should I do about that??
Well, you're supposed to use UniVRM to import your VRM model directly into Unity. Do not import it as FBX. I have a video on how to do that right here: th-cam.com/video/aaks0yg9ZyU/w-d-xo.html
Importing through UniVRM means you don't need to tinker with the scale at all.
@@ReForgeMode Thank you for the answer, but you said in the video about Mixamo animation that we need to make it FBX first to create an animation. Does it mean that I use the FBX model only for creating the Mixamo animation, and then I put those animations on the VRM model?
@@aneliaalabaster340 No, you don't actually need to export it as FBX. I've been meaning to update this video for a long time, but never got a chance to do it.
Basically, you just need to skip the exporting-to-Mixamo part and just download whatever animations you want from Mixamo. Put that Mixamo model into Unity, set it as Humanoid, and you can immediately use it with your VRoid models.
I can't get it. How can I download a Mixamo animation without uploading an FBX model to the Mixamo site?
@@ReForgeMode I don't know what to do. When I import the VRM into the project it's too small. When I change its scale, the VRM model stays still without any animation, and in Play Mode her hair and skirt go up like they have their own physics!! How do I fix that?? In your video the sizes are kind of okay, but in my project they're not.
Does it support WebGL builds?
I haven't tried it yet, but I'm pretty sure it does. I didn't see anything specific that limits it from being exported to WebGL.
Very cool, looks like a powerful plugin. I will be sure to check it out. Also, I have sent you an email. Greetings, Stefan :)
Got it! Let's talk more on Discord!
Hello brother, can you make lipsync for Zepeto in Unity?
In theory, this plugin should work for any Unity project. If you require a dedicated service, DM in the Discord. I do take custom commissions for various works.
@@ReForgeMode I have tried, but it doesn't work, because the Zepeto avatar only appears in Unity once it's in Play Mode.
@@LMCREATIVE24 Then you just need to add the uLipSync components to the avatar after it's spawned in.
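Something like this could do it. This is a hedged sketch under assumptions: `OnAvatarSpawned` is a placeholder name — call it from whatever "avatar loaded" callback your SDK provides — and the uLipSync component/field names follow its README rather than a verified version.

```csharp
// Hedged sketch: attaching the uLipSync components to an avatar that only
// exists at runtime (e.g. a Zepeto avatar spawned in Play Mode).
using UnityEngine;

public class RuntimeLipSyncAttacher : MonoBehaviour
{
    [SerializeField] uLipSync.Profile profile; // phoneme profile asset, assigned in Inspector

    // Placeholder entry point: hook this into your SDK's avatar-loaded event,
    // passing the spawned avatar's root GameObject.
    public void OnAvatarSpawned(GameObject avatar)
    {
        var renderer = avatar.GetComponentInChildren<SkinnedMeshRenderer>();

        var lipSync = avatar.AddComponent<uLipSync.uLipSync>();
        lipSync.profile = profile;

        var blendShape = avatar.AddComponent<uLipSync.uLipSyncBlendShape>();
        blendShape.skinnedMeshRenderer = renderer;
        // ...then call blendShape.AddBlendShape(...) for each phoneme,
        // mapping it to the avatar's actual mouth blendshape names.

        // Forward analysis results to the blendshape driver.
        lipSync.onLipSyncUpdate.AddListener(blendShape.OnLipSyncUpdate);
    }
}
```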
Sir, does it work for VSeeFace?
This is Unity-only. As far as I know, VSeeFace doesn't support mods/plugins.
But if you're talking about the models, yes, all VSeeFace models will work with this plugin.
@@ReForgeMode Oh I see, that's a little sad. I hoped it would work for VSeeFace once the VRM is imported. Also, I've seen your other tutorials as well, they're amazing. I also wanted to ask: can I make eye jiggle movement using the NEB tool, like how we do with Hana Tool?
@@mimiko_chunn Oh, as long as you set your model using blendshapes through NEB or Hana Tool, it should work fine in VSeeFace. I will need some examples of the eye jiggle to see if it's possible. Do you have any link to it?
Outstanding! Now lets do the same with CC4 :)
CC4 as in closed caption?
@@ReForgeMode Character Creator 4, I think, is what he is speaking about... CC4 creates characters for Unity in many cases, though, so I don't get what they mean by "do the same". Create the character, then send it to Unity, where you can apply this.
@@realestatekas I mean, you can just assign the lipsync blendshapes (a, i, u, e, o) and this tool will do the same thing.
The plugin is not even VRoid-based.
Is there one for Blender?
There must be one somewhere out there. But this one is exclusively for Unity.
@@ReForgeMode thank you