Usually not an Apple fan, but I'm amazed that even old iPhones are the best option for cheap motion capture. It seems like no one else managed to release a 3D camera in that entire timeframe.
Can you use a recording instead of live capture?
No I don’t think so. We wanted to capture the recording of the sound session but it wasn’t possible.
The one on the right seems to be cleaner. Not much cleanup work needed after.
It’s better for the mouth but not for the eyes and expressions.
Everybody knows somebody with an iPhone.
Nice, thanks for sharing!
Biggest flaw: according to the forums, Live Face on iPhone has had a bug for generations where single eyebrow movement isn't registered. The iPhone can only move both eyebrows at the same time; if you raise just one, it doesn't work.
Yeah!
Thank you for this comment. I spent hours looking through settings/tutorials thinking I had set some kind of mirror until NOW! TY
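In case anyone wants to verify that mirroring behavior on their own capture, here's a minimal Blender Python sketch that measures how far the left and right brow curves ever diverge. The object name "Face" and the ARKit-style shape-key names are assumptions about how the capture was baked, not anything shown in the video:

```python
# Minimal Blender (bpy) sketch: check whether a baked capture ever moves
# the two brows independently. The object name "Face" and the ARKit-style
# shape-key names are assumptions about how the capture was imported.
import bpy

obj = bpy.data.objects["Face"]                      # hypothetical mesh name
action = obj.data.shape_keys.animation_data.action  # baked shape-key keyframes

def brow_curve(name):
    # Shape-key values animate on this data path in Blender.
    return action.fcurves.find(f'key_blocks["{name}"].value')

left = brow_curve("browOuterUpLeft")
right = brow_curve("browOuterUpRight")

start, end = (int(f) for f in action.frame_range)
max_diff = max(abs(left.evaluate(f) - right.evaluate(f))
               for f in range(start, end + 1))

# A max divergence near zero means the capture never separated the brows,
# which is consistent with the mirroring bug described above.
print(f"max left/right brow divergence: {max_diff:.4f}")
```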
I see your video was recorded this same week, but the Live Face app hasn't looked like that for a long time. Have you not updated the app? The new version doesn't have that button section showing the sliders; you only see the mesh on the face. Also, I'm getting 100% the opposite results from your video. With Accuface/webcam it looks just like the left: nearly no mouth movement but lots of eyes and expression. Then with Live Face I get single eyebrow movement and not as much expression, but good mouth movement. I wonder if you perhaps confused the two.
Really? Then maybe I should keep the old version.
Actually, I can confirm: at 4:16 in your video, Accuface is on the left. I'm able to tell because when you selected the character on the right, it's clearly titled f_O, and that character is assigned to iPhone in Motion Live 🙂🙂
@@BlenderBob It turns out there are two apps with the same name: LIVE FACE for Unreal and LIVE FACE for iClone. I didn't know that. You do have the correct app. But what I mentioned about you confusing the two cameras is still true, at 4:16.
The one on the right is much better. The one on the left looks very bad. Not sure why you said the one on the left is better; it's not even close.
Did you see the addendum clip?
Great video, thanks!
The TH-cam link in the video description doesn’t work.
Fixed! Thanks!
Can this face mocap retarget to a CC3 rig?
I don’t know. Most likely.
I wonder if this is good enough for a deaf person who can lip-read to see the sounds/words. For the hearing world it's OK: the lips flap around in time with the sound, which seems good enough. But lip readers can really see a lot of sound, though not as much as silly FBI TV shows suggest.
Parts of the video (including the very beginning) are not correctly mono'd, coming from only the left channel for some reason.
I hope you can hear on both sides. ;-)
@@BlenderBob it's just very obvious when using headphones ;)
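For anyone who wants to repair that kind of left-channel-only audio on a local copy, here's a small sketch; it assumes ffmpeg is installed, and the file names are made up:

```python
# Duplicate the left audio channel to both channels with ffmpeg's pan
# filter, leaving the video stream untouched. File names are hypothetical.
import subprocess

subprocess.run([
    "ffmpeg", "-i", "clip.mp4",
    "-af", "pan=stereo|c0=c0|c1=c0",  # left channel (c0) feeds both outputs
    "-c:v", "copy",                   # don't re-encode the video
    "fixed.mp4",
], check=True)
```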
With Unreal Engine you can do the same, and for free...
But only with MetaHumans?
@@BlenderBob Generally speaking, yes, but you can customize the MetaHuman mesh. I haven't tried these workflows, but you can do it using Maya and this free tool:
th-cam.com/video/RBLw4q9Hcrc/w-d-xo.htmlsi=PkM3qm19PhfxBiIy
And with Blender, the MeshMorpher plugin, and Wrap3:
th-cam.com/video/HOSPDn1j9Jw/w-d-xo.htmlsi=DzOQB4LoXPPgjCF5
There are other workflows as well, but these two look the most customizable.
@@balsonsash3875 But it can be used only in Unreal. You cannot export MetaHumans.
The mouth is terrible with the iPhone one.
It’s probably because my prefs were set up for my characters, not this one; that’s what we use on Tiki, and there I get much better results.
@@BlenderBob Ah OK
Great, but making it in more detail would be greatly helpful.
I'm not sure if it's intentional, but the audio sometimes cuts out on the right channel in your clip.
thanks for sharing this!
👍👍
Are you OK? The right one is better than the left. The left is jittery and didn't even open the mouth while you were talking. Also, Bob, I think it's time to go with weaker sunglasses; they didn't look good. Secondly, you can do this easily in Blender with Faceit or Rigify. Third: the best facial capture right now is Unreal Engine, because the 52 iPhone shapes used in Blender Faceit, iClone, etc. lack shapes that Unreal has. Those 52 iPhone shapes are for kids, trust me; in iClone and Blender Faceit they're good for facial expressions only, not for talking. But I'm impressed with that NVIDIA Accuface. If the left is the iPhone, it's really bad: the eye tracking is really bad, the eyebrows too, and even the mouth is way off. XD
Well, you have to understand that I was recording from three inputs at the same time on my laptop. Hence the bad sound in some places; the signal was probably lost while recording. We didn’t get into any issues with Live Face on Tiki.
And capturing the screen too
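Since the thread above keeps coming back to the 52 ARKit blendshapes, here's a rough sketch of how per-frame weights like those could drive shape keys in Blender. The object name, the weights dict, and the helper itself are illustrative assumptions, not the actual API of any of the apps mentioned:

```python
# Rough Blender (bpy) sketch: drive ARKit-style shape keys from one frame of
# captured weights. The object name, the weights dict, and this helper are
# illustrative assumptions, not the actual API of any app mentioned above.
import bpy

def apply_arkit_weights(obj_name, weights, frame):
    """Set and keyframe shape-key values from ARKit-style weights (0..1)."""
    key_blocks = bpy.data.objects[obj_name].data.shape_keys.key_blocks
    for name, value in weights.items():
        block = key_blocks.get(name)
        if block is None:
            continue  # a character may not implement all 52 shapes
        block.value = max(0.0, min(1.0, value))      # clamp to valid range
        block.keyframe_insert("value", frame=frame)

# Example: one made-up captured frame.
apply_arkit_weights("Face", {"jawOpen": 0.6, "mouthSmileLeft": 0.2}, frame=1)
```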
3D animators are all updating their resumes...