Finally! Now the XR Toolkit actually feels complete.
It would be nice if they enabled full-body tracking along with hand tracking. I don't know if you can do that now, since the body only follows the controllers.
Hands don't appear in Windows PC builds.. so.. no
FINALLY! Thanks for sharing Valem! This is so cool!
You bloody star mate. Working perfectly.
Awesome!
Thanks for the donation man, it means a lot!!!! 😍
You are saving my thesis
Fantastic tutorial, Sir! Not only did you help me make hand tracking work, but by following everything I also got Oculus Air Link to work again. The trick was to switch to OpenXR. Goodbye Oculus plugin, you've become obsolete and we won't miss you.
(FYI, I used Unity 2022.2.17f (latest version), and it worked identically to the tutorial.)
We won't miss it, but it would be great to know how to do hand-pose grabbing and physics hands with OpenXR! 🙏
Excellent orientation, thanks for sharing!
12:01 I love that energy
For anyone unable to get the hands to display when you place your controllers on the table: make sure the Oculus app is set as the default OpenXR runtime, not Steam. This held me up for hours. You can change it by opening the Oculus app, going to Settings, and changing "OpenXR Runtime".
Thank You So Much!!! I was going bonkers looking for this
I added the hand tracking stuff to the "XR-Interaction-Toolkit-Examples" project. The materials at the end were already set to the Standard shader, but still pink. I had to set them to "Universal Render Pipeline/Lit".
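For anyone hitting the same pink materials, here's a minimal sketch of doing that swap in code (assumes URP is installed; MaterialFixer and ConvertToUrpLit are just illustrative names, and URP's built-in material upgrade wizard can also do this in bulk):

```csharp
using UnityEngine;

public static class MaterialFixer
{
    // Point each broken (pink) material at the URP Lit shader.
    public static void ConvertToUrpLit(Material[] materials)
    {
        var urpLit = Shader.Find("Universal Render Pipeline/Lit");
        foreach (var m in materials)
            m.shader = urpLit;
    }
}
```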
Great tutorial as always :) would be nice to know how to detect different hand gestures through code
I also want to know how to do that.
Fantastic tutorial! Any chance you are planning on making a tutorial to show how to swap the default VR hands with a custom model that works? I've tried to do it myself but can never get the visuals for the hand to actually appear.
Hey Valem Tutorials, thank you so much for all the tutorials!
I do have a question, though. If I want to drag an object (possibly a very large box or a model), how can I do so?
I think this is a bit different from grab, or can similar mechanisms be applied to this situation as well?
Grabbing just holds the object on top of my hands 😅
@@jbjbjb_ XRGrabInteractable should work, just set it to use a dynamic attach point.
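If you'd rather wire that up in code than in the inspector, a rough sketch (assuming XRI 2.3's XRGrabInteractable API; DraggableBox is just an illustrative name):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

[RequireComponent(typeof(Rigidbody))]
public class DraggableBox : MonoBehaviour
{
    void Start()
    {
        var grab = gameObject.AddComponent<XRGrabInteractable>();
        // Grab the object wherever the hand touches it instead of snapping
        // it to a fixed attach point, which reads as "dragging" a large object.
        grab.useDynamicAttach = true;
        // VelocityTracking moves the body through physics, so a heavy box
        // lags slightly behind the hand instead of teleporting to it.
        grab.movementType = XRBaseInteractable.MovementType.VelocityTracking;
    }
}
```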
Great and clear explanation, thank you!
So goood, this update to the XR Toolkit. Thanks Valem for the tutorial 🤘😁 btw nice hairstyle 😁
Aha thank you :D
JUST YESTERDAY I WAS WONDERING IF I COULD MAKE HAND TRACKING
Excellent video. Thanks
Thank you so much for the tutorial! Is there any chance of you doing a tutorial on enabling passthrough with this openXR hands setup? I can't seem to get it to work no matter what I try
Love your tutorials!
How would you go about setting colliders for these hands? I don't want them to go through objects...
very cool tutorial bro
... We just need full-body IK tracking added to this, and nothing else is needed so far. 🤘🤘🤘
Thanks Valem!
Hey Valem! Thanks for the tutorial, it was very informative :)
I noticed an error occurs when I switch from hand tracking to the controllers that says "could not find active control after binding resolution". Any tips on fixing this error?
Seems this error only appears in the Unity editor's Play Mode.
When I build my game for Android, sideload it onto my Oculus Quest 2, and open it on the Quest, everything works fine and I can easily switch between controllers and hands.
Same issue here, and my editor pauses when this message pops up... Unity 2021.3.20f1, XR Interaction Toolkit 2.3.1, XR Hands 1.1.0, OpenXR Plugin 1.7.0, Oculus XR Plugin 3.2.3, XR Plugin Management 4.2.1.
I have the same error
same here
I love Valem Tutorials 😀😀😀😀
Your tutorials are super easy to follow. Your channel is the best way for my VR games to go to the next level. Please keep posting videos. 😄😄😄
Thank you. Just in time
I have literally been waiting for this!!! I really don't like the Oculus SDK.
Is there any way to detect a grab pose in order to grab and throw objects, like a basketball?
Can you make a tutorial about hand poses with this stuff? Or can we use the exact logic from your previous video about the custom grab hand pose?
Can we detect hand poses with this package? For example grabbing an object with the fist closed instead of just using a pinch gesture?
Hello, I don't know if you still care or have found a solution to this, but from what I've seen, it only recognises pinching with your fingers by default. You'll have to implement gesture-recognition logic yourself, or use an asset store plugin.
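If you do want to roll your own, here's a minimal sketch of custom gesture detection with the XR Hands package (assuming com.unity.xr.hands 1.x; the 5 cm threshold and the "fist" rule are illustrative and need tuning):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

public class FistDetector : MonoBehaviour
{
    static readonly List<XRHandSubsystem> s_Subsystems = new List<XRHandSubsystem>();
    static readonly XRHandJointID[] s_TipIds =
    {
        XRHandJointID.IndexTip, XRHandJointID.MiddleTip,
        XRHandJointID.RingTip, XRHandJointID.LittleTip
    };

    XRHandSubsystem m_Subsystem;

    void Update()
    {
        // Lazily grab the running hand subsystem.
        if (m_Subsystem == null || !m_Subsystem.running)
        {
            SubsystemManager.GetSubsystems(s_Subsystems);
            if (s_Subsystems.Count == 0)
                return;
            m_Subsystem = s_Subsystems[0];
        }

        var hand = m_Subsystem.rightHand;
        if (!hand.isTracked)
            return;

        if (!hand.GetJoint(XRHandJointID.Palm).TryGetPose(out var palmPose))
            return;

        // Treat "all four fingertips within ~5 cm of the palm" as a closed fist.
        foreach (var tipId in s_TipIds)
        {
            if (!hand.GetJoint(tipId).TryGetPose(out var tipPose) ||
                Vector3.Distance(tipPose.position, palmPose.position) > 0.05f)
                return;
        }

        Debug.Log("Right fist detected");
    }
}
```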
@10:24, is there a way of turning off the controllers' mesh and having just the hands?
thank you very much!!!
I was in AI world for some time.. time to implement this future :D
I have an issue where the OpenXR plugin is said not to be selected as the plug-in provider when I press Play, even though it is selected in Project Settings for both desktop and Android. I made sure I had all three packages (up to date): Input System; XR; XR Core Utilities (I didn't have this one); and of course XR Plugin Management. (You can find this info under the Hands package dependencies.) It didn't work. Valem, do you maybe know what is going on?
Hey Valem! Great tutorial! In my case, the "Complete XR Origin Set Up" from the XR Interaction Toolkit doesn't show my hands; I see the controllers all the time :(
But when I did the first step, using the hand visualizer script, I could see them. Any idea what is happening?
Thank you!
Did you activate the auto-switch from controllers to hand tracking in your Quest settings?
Any idea if this works for Quest 3? I've followed the general setup from this video and it doesn't appear to be working. I don't have a Quest 2 on hand, so I'm trying to determine if it's the versions of the packages I used or if it's simply not supported.
Hi Valem, is there any chance you could show how to use these new features along with Passthrough from Oculus integration?
Agreeeeed
I'm currently attempting to put the two together, let me know if you're interested in being kept up to date on what I find!
Is there a way to have the hand model display on the controllers when holding the controllers?
Hi, can you make the next video about making a custom scene like this? Also, did you have a chance to look at the TactGloves by bHaptics?
This is awesome! Anyway, is anyone else experiencing strong input lag when using hand tracking? Is that normal?
Does anyone know how to solve this error:
"Could not find active control after binding resolution"
UnityEngine.InputSystem.InputManager.OnNativeDeviceDiscovered (int,string)
It appears after trying to switch from hand tracking to controllers at runtime.
running into the same error ye
Same here
The hand visualizer component is not popping up for me
Hi, is there a difference between using the XR Interaction Toolkit and using the Oculus Integration package?
Any idea how to make the hands able to push objects?
Nice tutorial! Thx a lot! One question: how did you test it in the Unity editor's Play Mode? Was that via Quest Link?
can we use teleport locomotion with hand tracking?
You should make a tutorial about the buttons on the controllers (A, B, X, Y).
OpenXR 1.8 is not compatible with XR Hands; it shows the error "Hand tracking subsystem not found or not running." XR Hands still causes a lot of trouble to set up... Plenty of information is missing from this tutorial, like Left Hand Stabilized... The XR Input Modality component doesn't work at all... Generally, all the prefabs that ship with XR Interaction Toolkit 2.4 are full of bugs. The XR Origin is supposed to be a ready-to-go complete setup, but of course it's not working... Ehhh, how are we supposed to create games when the basic stuff is a mess?
now this is cool
The newer version doesn't have the Complete XR Origin Set Up?
OpenXR is finally usable for something. It just lacks controller models.
They seem to have updated things. When I get to the 10:00 point and hit Play, nothing shows up: no controllers and no hands. They have also changed some of the names and serialized fields (one seems to be deleted), so it could be that I just made the wrong guesses.
I've been caught too. Which versions of the relevant XR packages do you have in your project?
Nice video! Having a weird bug when I follow along, or even when I use some of the sample scenes' default values: my hands are oddly duplicated and appear different in the left and right eye. Is this normal?
Hi, when I use hand tracking and close my left eye, my hands display in the wrong position.
I found the problem: set the OpenXR Render Mode to Multi Pass.
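For reference, that setting lives under Project Settings > XR Plug-in Management > OpenXR > Render Mode. If you'd rather flip it from an editor script, something like this should work (a sketch assuming the OpenXR plugin's OpenXRSettings API):

```csharp
using UnityEngine.XR.OpenXR;

public static class RenderModeFix
{
    // Same effect as choosing "Multi Pass" in the OpenXR project settings.
    public static void UseMultiPass()
    {
        OpenXRSettings.Instance.renderMode = OpenXRSettings.RenderMode.MultiPass;
    }
}
```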
Wait. So we don't need Leap Motion anymore?
Hey Valem, I just tried it. I feel I followed every single step you mentioned. I don't have any errors. But when I put my Oculus controllers down and wait for a few seconds, I don't see my hands. I have no clue why my hands don't show :(
I also have the same problem
Check whether automatic switching to hand tracking is on in the settings of the Quest itself. I had this problem too.
@@TheArmatt this fixed it for me, thanks!
I have the same issue, plus the "could not find active control..." errors. I guess it's a preview package for a reason. I backed it out for now and may try again in a new project. The hand-tracking in Meta's own toolkit worked great, and had a lot more functionality with poses and pose recognition, as well as other Oculus features like voice commands. I like Unity's toolkit, but if you know you're developing for Quest, then Meta's toolkit has some advantages [but unfortunately, crappy documentation].
@@TheArmatt thanks for the suggestion. I tried that too. And then the VR scene doesn’t even load for some reason.
Hello, I am currently running OpenXR 1.7.0 and it won't load the OpenXR tab below the XR Manager. It keeps spitting out an error about the Meta Quest Pro controllers. I've been stuck on it for a while and can't figure out what to do. Any advice would be great! I'm using Unity 2021.3.23 at the moment.
Edit: I found that going back to an earlier version fixed my problem. Great tutorial, btw.
Is there a way to use hand tracking and controller tracking at the same time?
Did you find any solutions?
I have tried everything, but I cannot get this to work. I am using XR Hands 1.3 and there is no slot for XR Origin. My Oculus app's OpenXR Runtime is set to Meta Quest. I have tried a new blank project; I have tried the VR Core template. I cannot get this to work no matter what I have tried. Has anyone else got this to work with the newer stuff on a Quest 2?
Did you find out what went wrong? I'm facing the same issue.
I see the mesh hands, but how can we also make those mesh hands act as colliders?
Am I right that the physical hand in XR Interaction Toolkit 2.3 (Complete XR Origin Hands Set Up) is the same thing as an XR Hand?
How would you make a convincing rollercoaster like in Switchback VR?
What laptops do you suggest for VR development?
Finally!!
Is there a way to use Meta passthrough along with this solution? I tried using the new Oculus Integration Package along with this to enable passthrough but it seems to break switching between hands and controllers with the Complete XR Origin Hands Set up prefab. Any suggestions would be great!
Hi, I'm making a VR project for my university. My first idea was to do something with the controllers but then I saw your videos about hand tracking. Do you think hand tracking is capable of different actions/interactions like a controller? Or is it limited? What would be some good practises to adapt a VR game to a fully hand tracking VR game? Like menus, exiting the game, movement, adapting buttons, recognizing hand gestures/poses?
It can be very good if you use pose tracking to trigger events. Simple gestures can be tracked and used to trigger functions through Unity's events/Actions system. It's not as accurate as I'd like, but it's enough to move virtual controls when you pinch or close your fist, etc.
@@NanashiNokaze How can I achieve that? I want to use hand movements for continuous movement and teleportation.
@@soya8983 Basically, I used the Meta hand tracking package and tracked the bones, then compared the detected positions to an array of stored positions over time. If the bones matched a gesture I had saved, it fired a method that read classes defining what to do.
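In case it helps anyone, that approach looks roughly like this as a sketch (assuming Oculus Integration's OVRSkeleton; the Gesture struct, savedGestures list, and 5 cm threshold are all illustrative and would need tuning):

```csharp
using System.Collections.Generic;
using UnityEngine;

// Illustrative gesture type: one stored position per skeleton bone,
// captured in the hand root's local space.
[System.Serializable]
public struct Gesture
{
    public string name;
    public List<Vector3> jointPositions;
}

public class SimpleGestureRecognizer : MonoBehaviour
{
    public OVRSkeleton skeleton;        // hand skeleton from Oculus Integration
    public List<Gesture> savedGestures; // recorded ahead of time
    public float threshold = 0.05f;     // per-bone tolerance in metres

    void Update()
    {
        if (skeleton == null || !skeleton.IsDataValid)
            return;

        foreach (var gesture in savedGestures)
        {
            bool matched = true;
            for (int i = 0; i < skeleton.Bones.Count; i++)
            {
                // Compare in the skeleton's local space so the hand's world
                // position and rotation don't matter.
                var current = skeleton.transform.InverseTransformPoint(
                    skeleton.Bones[i].Transform.position);
                if (Vector3.Distance(current, gesture.jointPositions[i]) > threshold)
                {
                    matched = false;
                    break;
                }
            }

            if (matched)
            {
                // Fire a UnityEvent / C# Action here to trigger your logic.
                Debug.Log($"Recognized gesture: {gesture.name}");
            }
        }
    }
}
```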
@@soya8983 If you'd be interested in hiring help, I do contract programming work in Unity with C#.
So it can read input. Then why exactly do we need headset plugin support?
I cannot see the Origin property in the Hand Visualizer script... I think that's why the hands don't appear when I put down my controllers.
Hey Valem! Thanks a ton for your help!
I have a problem where my hands don't show up in the build!
(I updated an older project and might have some other issues)
Is it mandatory to set up the Android settings, or was that just to show how it works on both platforms?
Update: I enabled developer options in the Oculus settings and it started working in the build 🙌
@@halbzwilling Do you mean it works in the .exe build? Could you tell me the version of your editor? I am stuck on hand tracking in a PC build.
Can you use Auto Hand with the XR Device Simulator in the XR Interaction Toolkit? I can't get it to work.
Hi Valem, can you tell me something about UltimateXR?))
A good idea for a new video would be to show how to do a full hand grab!
I only have an Oculus Quest 2 headset. Is it impossible to use hand tracking with only the headset? Do I have to have controllers?
I have the following error: "Hand Tracking Subsystem not found, can't subscribe to hand tracking status. Enable that feature in the OpenXR project settings and ensure OpenXR is enabled as the plug-in provider." I have the subsystem enabled as it should be, and I don't see any hands.
Don't forget to set the tracking origin to Floor on the XR Origin if your controller doesn't move.
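Same thing in code, for anyone who wants it (a sketch assuming the XROrigin component from com.unity.xr.core-utils; normally you'd just pick "Floor" in the inspector's Tracking Origin Mode dropdown):

```csharp
using Unity.XR.CoreUtils;
using UnityEngine;

public class FloorOriginSetter : MonoBehaviour
{
    void Start()
    {
        // Equivalent to setting Tracking Origin Mode to "Floor" on the XR Origin.
        GetComponent<XROrigin>().RequestedTrackingOriginMode =
            XROrigin.TrackingOriginMode.Floor;
    }
}
```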
Valem, I need help. How do I detect if a finger is closed? I'm working on a game that uses a virtual Guitar Hero guitar in VR!
How do you switch from controllers to hands?
How do I run the Unity app on the Quest when I enter Play Mode?
Same issue
Are hand poses already usable?
LETS GOOOO
Cool! But there is some deviation when my right index finger touches my left hand. How can I fix it?
Help me please: everything works fine, it just doesn't work in PC builds.
It lets me use my hands, but it doesn't track my fingers. Any fixes?
Will it support the Index?
Can the Pico 4 or Pico 3 do these things?
How do you get physics with it?
When I press any button on my controllers, they both fall out of the world, and I cannot get them back unless I restart Play Mode. Also, the hands are not showing. How can I fix this?
I couldn't make it work in a Windows .exe Build. Any thoughts?
How can I make my hands collide so they don't go through each other?
Can this be used on the Pico 4?
My HandVisualizer class won't load, anyone else have this issue? I'm using the same version of Unity and have all the current packages and samples. Totally empty project otherwise.
Can I use it with HTC Vive?
Has anyone else had the problem of hand tracking not working in a build, only in the editor?
such wow
I want to add hand physics to the XR Interaction Hands Setup in the HandsDemoScene, please make a tutorial 🙏🙏🥺🥺
Hand tracking with physics in the XR Interaction Toolkit, please!
rotation is not working for me
I am having an issue where I install the Interaction Toolkit and the complete rig isn't there. Did they remove it in a newer version?
The name has changed, it's something like Hand Interaction Rig or something.
Would it still be in the XR Interaction Toolkit? Because if so, I don't see that either @@ValemVR
@@anon3020 It is XR Origin Hands (XR Rig) inside Samples > XR Interaction Toolkit > 3.0.3 > Hands Interaction Demo > Prefabs
Hey Valem, it would be cool to see a "virtual desktop" tutorial with XRI 2.3 where you can interact with your desktop using hand tracking!
👏👏👏👏👏
Can't even enter the VR scene with this error: DllNotFoundException: UnityOpenXRHands assembly: type: member:(null)
same here :/ if you find a fix let me know please
@@script4724 With a Quest Pro, the problem solved itself.