Thanks for the tutorial. I couldn't find the hand gestures sample scene because I didn't have experimental packages enabled. It's amazing how Unity has made complex logic like pose detection so accessible.
Perfect tutorial. Complete concept from start to finish, efficient and fast. Very, VERY useful. Thank you.
Yess!! Just what I needed, since I don't work with the Oculus package anymore.
Thanks! Great tutorial!
It works on WebXR as well🤩
How does that work on WebXR?
@@ironandsilk There's a WebXR Export package. The same tutorial here can be applied to that package.
Great tutorial as always Valem!
Great tutorial on inputs via hand gestures.
Will you create a tutorial on how to use Meta's new inside-out body tracking (IOBT), for instance, to detect if the user is leaning forward (rotating their torso) rather than walking forward? This would be useful so the character doesn't move forward when the player just leans forward (e.g., leaning over to grab something on a desk without having the character's body move forward and collide with it).
Would love a tutorial on how to make good-quality UI in VR / on Quest! I think you go through it in another video, but I believe it's outdated.
Hi Valem, can you make a video on how to navigate using hands?
If you take the output of the "Static Hand Gesture" script (Gesture Performed) and have it send a signal (i.e., a boolean of true) to a locomotion script, you can then trigger the virtual camera (i.e., the VR rig) to move until another gesture is performed (which sends a boolean of false).
You can then have the player move at a constant speed along a direction, for instance the virtual camera's forward direction.
I haven't seen a tutorial showing how this is done, but if you combine what I said with what this tutorial shows and with a continuous locomotion (i.e., joystick locomotion) tutorial, you should be able to create a simple locomotion system based on gestures (a rough sketch is below).
I hope this helps.
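A minimal sketch of that idea in Unity C#, assuming you wire the "start" gesture's Gesture Performed event to StartMoving() and a "stop" gesture's event to StopMoving() in the Inspector (the class, field names, and speed value here are placeholders, not from the tutorial):

```csharp
using UnityEngine;

// Rough sketch: gesture-toggled continuous locomotion.
// Wire a "start" gesture's Gesture Performed event to StartMoving()
// and a "stop" gesture's event to StopMoving() in the Inspector.
public class GestureLocomotion : MonoBehaviour
{
    [SerializeField] Transform m_CameraTransform; // the VR rig's main camera
    [SerializeField] Transform m_Rig;             // the XR Origin to move
    [SerializeField] float m_Speed = 1.5f;        // constant speed in m/s

    bool m_IsMoving;

    public void StartMoving() => m_IsMoving = true;
    public void StopMoving() => m_IsMoving = false;

    void Update()
    {
        if (!m_IsMoving)
            return;

        // Move along the camera's forward direction, flattened so that
        // looking up or down doesn't make the player fly.
        var forward = m_CameraTransform.forward;
        forward.y = 0f;
        m_Rig.position += forward.normalized * (m_Speed * Time.deltaTime);
    }
}
```

Flattening the forward vector keeps the movement on the ground plane, which matches how most joystick locomotion tutorials handle it.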
@@ivanaguilar2856 Thank you so much, it's completely logical. I will try it.
@@Besttechnology you're welcome. I hope it works out well.
Here are some other forms of locomotion that could also be used through hand tracking. They might be useful if you really want to use your hands to control navigation, but know that they aren't very precise or fast - th-cam.com/video/VjiuICpRkRk/w-d-xo.htmlfeature=shared
th-cam.com/video/3GizXDgB-kY/w-d-xo.htmlfeature=shared
I think using your body's lean to control locomotion might be a better solution, as your hands stay free for other actions, like interacting with virtual objects, instead of being used to control locomotion (like in real life) - th-cam.com/video/jzoaBAd6gPY/w-d-xo.htmlfeature=shared
Leaning to fly: th-cam.com/video/IRZIytR4Wkw/w-d-xo.htmlfeature=shared
I hope this helps.
Damn, just when I thought I was done adding features to Rogue Stargun (cough, coming February 1st to Applab). Thanks for the tutorial, Valem!
Did you upload it to Applab?
Hi! Is there a way to display hand poses while editing them in Unity? That would be cool!
Does XR Hands work with AR?
Hi, wonderful video. Could you make a video on teleporting using hand tracking, please? Thanks a lot.
Hi Valem, I have a problem: it always tells me "NullReferenceException: Object reference not set to an instance of an object" when I try to use the text display. Do you have any tips for me?
Same issue bro, any solutions?
Comment out the lines that are showing an error: double-click the error to open the script, then add a double slash (//) at the beginning of lines 195 and 207, and replace line 156 with m_BackgroundDefaultColor = Color.black; (without the quotation marks).
@@davidosorio1585 thanks bro you saved me
@@iABOoDxZ You need to assign the Background in the "Static Hand Gesture" script in the scene view. I just figured it out; it took me two days to find this solution.
@@davidosorio1585 That's not an ideal solution; you're not really supposed to change existing scripts, since a package upgrade would bring the issue back. What you need to do is this: assign the "Background" in the "Static Hand Gesture" script in the scene view. I just figured it out; it took me two days to find this solution.
Finally I can make my VR Rock Paper Scissors game without having to do the hand tracking myself.
Does it work with the Quest 3 as well?
Hi Valem! I have a question... Why do you choose Unity over Unreal Engine? I'm really curious... I like Unity, of course; I've never even touched Unreal Engine. But everywhere I look, Unreal Engine seems to have a massive advantage if you want realistic VR graphics. Is there any specific thinking behind that? I'm really debating whether I should learn Unreal at this point...
Thanks.
Can you (or anyone here) note what replaced the TrackedPoseDriver component for sticking a mesh onto your hand anchor?
(Say I wanted to put a custom mesh prop onto my wrist.)
Also, can this mesh stay in the same spot when you put the controllers down and switch to gestures?
(The Interaction SDK suggests the HandJoint component, but that doesn't keep the mesh in the same spot when you cross over between controller and raw hand input.)
I'm debating joining everyone in the XR Toolkit world after watching this video, but it's still not clear how to best stick props onto my custom VR hands. I want to support switching between controller and controller-free usage at will. I'm guessing I need two setups and to switch which one is active when I detect the change (a rough sketch of that idea is below); curious what more experienced folks would suggest. Mostly I'm curious whether there's an established "this has officially replaced Tracked Pose Driver, everyone should use it."
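Not an official answer, but here's a minimal sketch of that two-setup toggle idea, assuming the com.unity.xr.hands package and its XRHandSubsystem.trackingAcquired/trackingLost events (the class name and prop references are placeholders):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Rough sketch: keep one prop parented under the controller anchor and one
// under the hand-tracking rig, and toggle them when hand tracking starts/stops.
public class PropSetupSwitcher : MonoBehaviour
{
    [SerializeField] GameObject m_ControllerProp; // prop parented to the controller anchor
    [SerializeField] GameObject m_HandProp;       // prop parented to a hand/wrist joint

    XRHandSubsystem m_Subsystem;

    void Start()
    {
        var subsystems = new List<XRHandSubsystem>();
        SubsystemManager.GetSubsystems(subsystems);
        if (subsystems.Count == 0)
            return;

        m_Subsystem = subsystems[0];
        m_Subsystem.trackingAcquired += OnHandTrackingAcquired;
        m_Subsystem.trackingLost += OnHandTrackingLost;
    }

    void OnDestroy()
    {
        if (m_Subsystem == null)
            return;
        m_Subsystem.trackingAcquired -= OnHandTrackingAcquired;
        m_Subsystem.trackingLost -= OnHandTrackingLost;
    }

    void OnHandTrackingAcquired(XRHand hand)
    {
        // Hands are tracked: show the hand-anchored prop.
        m_ControllerProp.SetActive(false);
        m_HandProp.SetActive(true);
    }

    void OnHandTrackingLost(XRHand hand)
    {
        // Back to controllers: show the controller-anchored prop.
        m_ControllerProp.SetActive(true);
        m_HandProp.SetActive(false);
    }
}
```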
Can you do a tutorial on how to add fishing to a Unity VR project?
XR Hands works fine in the Unity Editor, but in the build it doesn't detect my hands. Is this normal, or do I have to make some changes in the configuration?
I had no idea you could change the value of a text element through an event.
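For anyone curious how the same thing looks from the script side, here's a minimal sketch, assuming TextMeshPro: expose a public method that takes a string, then wire it to the Gesture Performed UnityEvent in the Inspector (the class and field names here are hypothetical):

```csharp
using TMPro;
using UnityEngine;

// Rough sketch: a public method that a UnityEvent can call with a
// string argument typed into the Inspector.
public class GestureLabel : MonoBehaviour
{
    [SerializeField] TMP_Text m_Label; // drag a TextMeshPro text object here

    // Wire this to the gesture's Gesture Performed event in the Inspector
    // and type the gesture name into the event's string field.
    public void ShowGesture(string gestureName)
    {
        m_Label.text = gestureName;
    }
}
```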
Can I use a $300 laptop from Amazon to do this?