Oh guys, by the way, the Unity XR Interaction Toolkit is not in pre-release anymore, so you can download it directly from the Unity Package Manager without the little steps I did at the start of the project. :)
Hello, can you help me? I don't have the Gaze Interactor script in the Inspector, even though I installed Unity XR.
@@radari7180 Could you solve it?
Valem's channel is the gift that keeps on giving. Thank you very much, man, for all the information you have provided us with, and all the info you will in the future. The work and effort you put into editing these videos and explaining the concepts shows; we all appreciate it, and hopefully a lot more people will find it useful in the near future, especially with VR getting bigger.
Thanks man, you're going to make me emotional. Thank you for watching my content, it means a lot!
You keep surprising me with all the features you come up with in Unity. Love it.
Aha thanks, I've wanted to talk about this one for a while now. :)
This video helped me fix my pause menu. Thank you so much Valem, you have been a great help with my VR project.
just in time. thanks
the fact that he doesn't have more than a million subscribers is CRAZY!
Soon! :D But I'm already happy with the number of subs I have. :)
Valem, I love your tutorials
Hi, I must admit I have never tried to make a game or even opened any program to try. But when I bought the Quest 3 I thought maybe I could make a simple app that shows some 3D models in XR and lets me choose different models from a menu. I'm not there yet due to lack of time, but I feel the tutorials made by Valem are exactly what I need. Super thanks!
A tip on good and free converters that bring along the texture and material of models from SolidWorks would be nice 🙂
I don't get it. I followed it a few times, step by step, and it doesn't work. I still see the menu on my hand, and the gaze doesn't work. Can someone explain what is going on?
Are you going to make more XR Toolkit physics tutorials? I would love one about guns/combat!
This tutorial is not compatible with the new "Complete XR Origin Hands Set Up" used in the video "Hand Tracking with Unity XR Interaction Toolkit". It looks like it is necessary to have an XR Controller component on the GameObject that contains the HandMenu GameObject; otherwise it does not follow the hand movements. I found a workaround: putting the HandMenu GameObject inside the Ray Interactor GameObject AND an XR Controller component on the Left Hand parent GameObject. Although there are some problems with that. Maybe someone knows a better way ... Whatever, great videos! Really appreciate it. Fantastic!
Did you find any smoother solution for that?
@@matebarkoczi9792 I was able to get it working just by making an empty GameObject called HandMenuParent, adding an XR Controller (Action-based) component to it, and then dropping the hand menu in as a child of this new GameObject. I placed the parent as a child of the left hand.
@@barakawins I tried your solution and finally the display shows up... but it flies to the back of my virtual hand, while I placed it in the palm of my hand (like a phone). Any idea?
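(For both of the issues in this thread: a minimal alternative to the XR Controller component is to pin the menu to the hand with a small follow script. This is only a sketch under assumptions: the class name HandMenuFollow, the handAnchor field, and the positionOffset value are all hypothetical names you would set up yourself, and tweaking the offset is what moves the menu from the back of the hand onto the palm.)

```csharp
using UnityEngine;

// Hypothetical helper: pins the hand menu to a tracked hand/controller
// transform without requiring an XR Controller component on the menu.
public class HandMenuFollow : MonoBehaviour
{
    [Tooltip("The tracked left-hand (or controller) transform to follow.")]
    public Transform handAnchor;

    [Tooltip("Local offset so the menu sits on the palm, not inside the hand.")]
    public Vector3 positionOffset = new Vector3(0f, 0.05f, 0f);

    void LateUpdate()
    {
        if (handAnchor == null) return;

        // Update after tracking so the menu doesn't lag a frame behind.
        transform.position = handAnchor.TransformPoint(positionOffset);
        transform.rotation = handAnchor.rotation;
    }
}
```

Attach it to the hand menu object and drag the hand transform into handAnchor in the Inspector; if the menu appears on the back of the hand, flip the sign of the offset.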
How do you make the poke animation, btw?
Thanks for this! Your videos are amazing. Btw, I have tried to recreate this using the OVR / Meta Blocks, but I can't get it to work. Can you use the XR approach explained here together with the Meta Blocks? Thanks
Can the gaze interactor be adapted so that a ray is enabled when the hand is pointing at some object?
Love this, but I'm new and am driving myself crazy! I can't get the hands to point! Following your Hand Presence video, the hand makes a fist, but I cannot get it to point to push the buttons!
Can you add options to the menu, like sound?
Hi, really cool concept! I'm currently using the official Oculus Integration package, and in my project I have 'OculusInteractionSampleRig' and 'OVR Camera Rig'. Do you think the Unity XR Interaction Toolkit would work well together with them?
These two packages conflict with each other, and a bug appears: StackOverflowException: OVRPlugin+UnityOpenXR.HookGetInstanceProcAddr (System.IntPtr ….
I get a lot of knowledge from your tutorials, so thanks for that. But man, I have so many issues. I have followed this video 10 times now, deleting the Gaze objects and recreating them, and I cannot get it to work; it's too irregular. I tried setting the sphere to 2 units in all dimensions, and it still only triggers when my hand is pointing towards my other hand or away from the camera or something. I made sure the gaze interactor's red, green and blue vectors point in the same directions as in your video and all. I can't understand why this is so difficult to get working (it seems everything in Unity is just fiddling for hours until, by sheer luck, you find the right combination of checkboxes, object hierarchy, etc.). I thought of just taking the vector that points up from the menu and another vector from the menu to the camera, calculating the angle between the two, and using that to enable or disable the hand menu, but that also feels wrong when these gaze interactors should do it for you.
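(For anyone who wants to try the fallback this comment describes, here is a rough sketch of that angle check: compare the menu's up vector with the direction from the menu to the camera and toggle the menu based on the angle. The class name PalmMenuToggle and the menuRoot and angleThreshold fields are hypothetical, and it assumes the menu's up axis points out of the palm and the headset renders through Camera.main.)

```csharp
using UnityEngine;

// Hypothetical fallback to the Gaze Interactor: show the menu only when
// the palm (this transform's up axis) is angled toward the camera.
public class PalmMenuToggle : MonoBehaviour
{
    [Tooltip("The menu visuals to enable/disable.")]
    public GameObject menuRoot;

    [Tooltip("Maximum angle (degrees) between palm-up and the camera direction.")]
    public float angleThreshold = 40f;

    void Update()
    {
        Camera cam = Camera.main;
        if (cam == null || menuRoot == null) return;

        // Direction from the menu toward the headset.
        Vector3 toCamera = cam.transform.position - transform.position;

        // Angle between the palm normal and the view direction.
        float angle = Vector3.Angle(transform.up, toCamera);
        menuRoot.SetActive(angle < angleThreshold);
    }
}
```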
@Valem Tutorials can you make a video where we can grab objects?
This is a wonderful tutorial. I used it in my game (a school project), and it worked fine with my Quest 3 connected to the PC. However, when I built it for Android and uploaded it with SideQuest to my Quest 3, one button (submit) for saving the settings of my menu did not work. All other buttons seem to work, and over Quest Link everything works. Do you have any idea why it is not working on the Quest?
"This will work 100%" I'll be the judge of that
Hi, which headset did you use while testing this scene?
Hello, it's impossible to get just "hands"; I have to use the physics controllers... Any help? Thx
How do you make this using MRUK or OVR?
Hey Valem, how would I go about implementing a hand menu on a hand tracking system?
Can you make a video explaining how to animate a slider, like on a gun, using the XR Interactable events?
I don't understand how I am supposed to poke something. Do I press a button, or do I just get close to the UI element? My XR Poke Interactor -> Attach Transform is set to index_ignore, but the hand isn't changing its shape to make a poke pose.
Hi :D, non-Euclidean game tutorial coming soon?
Can ya do one about how full body for VR works?
Why did you put the Gaze Interactor on the menu and not the camera? It feels a bit backwards once you add other elements in the world that you want to interact with using your gaze.
Cause I don't want the UI to show when I look at it; I want the UI to show when it looks at me.
@@ValemTutorials Thank you for the reply! I understand what you mean. Preface: I'm fairly new to VR, so I might be wrong here. To me it feels a bit wrong; if you are not looking at the hand but it is turned toward you, you might accidentally interact with the menu without intending to. So I would argue it is more correct for the player to look at the menu, rather than the other way around (of course with some extension so that backside collisions are occluded, or some dot-product math to validate the interaction). It still feels backwards to me when you consider interacting with other things in the world. Suppose you want to place multiple UIs in the world and interact with only one: with this setup you would need an XR Interaction Group to do some prioritization, or fix it some other way, instead of letting the first object hit by the gaze decide.
I solved it by duplicating the Interactable, changing the float affordance target, and setting up each interaction to happen on a separate layer. It sounds nasty but it works if you just need a menu on each hand.
I have the exact same question, and I think I know what Valem is talking about. I agree that the gaze interactor "should" be the eye/head/cam, and if you swapped the two, it would still work. However, it's a little more complicated than common sense suggests. Here is why:
- If the eyes are the gaze interactor, the raycast from the eyes is always shooting out and detecting collisions with the hand menu, meaning if you want to see the menu only when you flip your palm towards you, you need to implement additional axis-alignment logic in C#.
- If the intended interaction is that the hand menu appears when you flip your palm, then the menu should show even when you don't look directly at it. Using the eyes as the gaze interactor will not allow that easily.
- Eyes and cam/head don't always point in the same direction. Your head/cam can be pointing towards the hand while your eyes are looking up at the sky as you flip the palm out, so your hand SHOULD show the menu. But if you use the eyes as the gaze source, the menu won't show.
Shooting the gaze ray out of the menu eliminates all those issues, but I agree it feels like a quick hack.
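(For completeness, the "additional axis-alignment logic" mentioned in the first point is basically a pair of dot-product checks. A hypothetical sketch, assuming the head/camera is the gaze source and the palm normal is the hand transform's up axis; the names GazeAlignment, IsMenuGazedAt and minAlignment are made up:)

```csharp
using UnityEngine;

// Hypothetical axis-alignment check for an eyes/head-based gaze setup:
// treat the menu as "gazed at" only when the head is actually looking
// toward the palm AND the palm is facing back toward the head.
public static class GazeAlignment
{
    public static bool IsMenuGazedAt(Transform head, Transform palm,
                                     float minAlignment = 0.7f)
    {
        Vector3 headToPalm = (palm.position - head.position).normalized;

        // Head must be looking roughly toward the palm...
        bool looking = Vector3.Dot(head.forward, headToPalm) > minAlignment;

        // ...and the palm normal must point back toward the head.
        bool facing = Vector3.Dot(palm.up, -headToPalm) > minAlignment;

        return looking && facing;
    }
}
```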
Hello... I want to ask: can this hand menu be interacted with using hand tracking (Hand Visualizer)?
Does this work with the Hurricane VR framework?
I can't get the Gaze Interactor to work. I have everything set up according to the video, so interaction with the menu works and everything. However, the gaze interaction isn't working: when I hover over and exit, nothing changes.
I figured it out. It isn't checked in the video, but on the Gaze Interactor, under the selection configuration, you need to check "Allow Hovered Activate".
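(If you would rather set this from code than from the Inspector, the checkbox corresponds to the allowHoveredActivate property that XRGazeInteractor inherits in XRI 2.x. A minimal sketch, assuming that version of the toolkit:)

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Hypothetical one-time setup: enable "Allow Hovered Activate" from code,
// mirroring the Inspector checkbox mentioned above.
public class EnableHoveredActivate : MonoBehaviour
{
    void Awake()
    {
        var gaze = GetComponent<XRGazeInteractor>();
        if (gaze != null)
            gaze.allowHoveredActivate = true; // same as the Inspector checkbox
    }
}
```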
I have a problem: when I make a Quest build and open it, it opens as a non-VR game?
Did anyone get the gaze interactor component from this video working?
Have you found any solutions? I copied the exact steps, but it doesn't work.
After following your tutorials, my hands make fists instead of poking the index finger out! How do I fix that?
Not working at all
here early, fourth!
First
Hello, I am a student in college and I want to ask you something regarding my college project. Where should I contact you?