This is a lifesaver, thank you. I'm still confused about getting values of things (like the trigger, grip button, etc.) out of the input map actions and into scripts. That would be a great video to clarify.
Thanks for watching and the suggestion! 🍤
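For anyone looking for a starting point in the meantime, here is a minimal sketch of one way to read those values in a script with Unity's Input System. The field names and the example action paths in the comments are assumptions; you would drag your own trigger/grip value actions into the Inspector slots.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Minimal sketch: polls analog trigger/grip values from Input Actions each frame.
// The two InputActionProperty fields are assumed to be wired up in the Inspector
// to the trigger and grip "value" actions from your input action asset.
public class ControllerInputReader : MonoBehaviour
{
    [SerializeField] private InputActionProperty triggerAction; // e.g. a right-hand trigger value action
    [SerializeField] private InputActionProperty gripAction;    // e.g. a right-hand grip value action

    private void OnEnable()
    {
        // Actions need to be enabled before they report values.
        triggerAction.action.Enable();
        gripAction.action.Enable();
    }

    private void Update()
    {
        float trigger = triggerAction.action.ReadValue<float>(); // 0..1
        float grip = gripAction.action.ReadValue<float>();       // 0..1

        if (trigger > 0.1f || grip > 0.1f)
        {
            Debug.Log($"Trigger: {trigger:F2}  Grip: {grip:F2}");
        }
    }

    private void OnDisable()
    {
        triggerAction.action.Disable();
        gripAction.action.Disable();
    }
}
```

If you would rather react to button-style presses than poll every frame, you can subscribe to the action's performed callback instead of calling ReadValue in Update.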
This demo arrived at the perfect time, thank you for this!
You're super welcome!
Thanks for the video. 4:11 What will happen if I don't check the Generic XR Controller?
This is a great question! The Generic XR Controller setting, from my understanding, allows the Input System to use a control scheme that provides broad compatibility across a variety of devices. Essentially, it is a way for the Input System to abstract the hardware specifics and use a generalized input model. Will your inputs work if you don't use this setting? Maybe. Will they work on a variety of devices without this setting? Probably not.
The UI issue when selecting the binding control is due to a misspelling in the UI theme that was introduced in recent versions of Unity. You can see the list is rendered too high up, covering the input field and the Listen button.
I was honestly not expecting someone to answer this question that I was thinking of when making this video. It was driving me nuts!!! THANK YOU!!! 🍤🍤🍤
W Shrimp. This tutorial came in handy for the second time :)
Helped me a lot! Thank you so much!
you are a literal saint. thank you for these videos!!
Thank you for another amazing video 🎉❤🎉
What a wonderful shrimp lady you are!!!
THANK YOU!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
Thank you so much. I feel like I'm finally a step closer to learning and figuring out how the heck I can assign the X/Y/A/B buttons of my Quest 2 in the Unity project I'm making.
All I want it to do is provide a small UI pop-up with a list of different button options.
Pressing the X button brings up the choices, and then the user can use the thumbstick and the Y button to select an option. I want that small UI to stay attached to the controller object, like in VRChat, where you can tilt your controller and see the UI of the live stream chat.
I can't find any successfully working tutorial on implementing the specific X/Y/A/B buttons and how to use them in code. The best I found was OVRInput, but that also didn't work.
This has been the closest I've found, so I'm going to try to find the buttons I'm looking for.
Thank you
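For anyone in a similar spot, here is a rough sketch of one way to listen for the X button with the Input System and toggle a menu object parented to the controller. The buttonMenu field is a placeholder for your own world-space UI, and the hard-coded binding path is an assumption based on the generic XR controller layout (on the left controller, X maps to primaryButton and Y to secondaryButton).

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Rough sketch: toggles a controller-attached menu when the X button is pressed.
// "buttonMenu" is a placeholder for your own UI GameObject parented to the
// left controller; the binding path below targets the X button on the
// left-hand controller (use secondaryButton for Y).
public class ControllerMenuToggle : MonoBehaviour
{
    [SerializeField] private GameObject buttonMenu; // world-space UI, child of the controller

    private InputAction xButtonAction;

    private void Awake()
    {
        xButtonAction = new InputAction(
            type: InputActionType.Button,
            binding: "<XRController>{LeftHand}/primaryButton");
        xButtonAction.performed += OnXPressed;
    }

    private void OnEnable() => xButtonAction.Enable();
    private void OnDisable() => xButtonAction.Disable();

    private void OnXPressed(InputAction.CallbackContext context)
    {
        // Show or hide the menu each time X is pressed.
        buttonMenu.SetActive(!buttonMenu.activeSelf);
    }
}
```

In practice you would probably put these bindings into the same input action asset shown in the video and reference them via the Inspector rather than hard-coding the path, but the idea is the same.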
DUDE THANK YOU SOOOOOO MUCH FOR THIS TEMPLATE! But can I make a game with this template?
Hey, thanks for the video! I am currently working on a research project about gaining insights into users' usage of different menu types within VR, and I'm stuck choosing between Unity's XR Interaction Toolkit and Meta's XR All-in-One SDK. In your opinion, with your experience with either toolkit, and considering I'm going to need data such as time taken to select an item, checking certain inputs, etc., which toolkit would you suggest? I am developing for an Oculus so I can go with either, but if you have the time, which of the two would you use? Do both toolkits support a decent amount of features for observing user input? Thanks :)
This is a bit of a toughie because I think both could work just fine. I do find Meta's to be a bit messy and confusing most of the time, but it does seem to offer more in-depth specifics for Oculus devices. The XR Interaction Toolkit is way easier to get projects moving along and implement with. Both could be used easily for the current things you've listed that you're going to collect. I'd personally go with the XR Toolkit because I'm more experienced with it and I enjoy developing with it, but if I was going to want every little bit of data specific to only Meta devices, then I'd consider going with Meta's Toolkit. Hope that helps!
@FistFullofShrimp Thanks for the insightful response! Yeah, that makes sense. Just in terms of getting things moving along, I'd stick with the XR Interaction Toolkit and then see down the line if I need a wider variety of specifics, but there are also probably third-party libraries that may help, like cognitive3D.
Have you finished the research yet? I'm interested in what your findings are/were
@suldra915 Funny enough, yeah, there is a paper written and submitted to IEEE 2025. If it's accepted, then I'll be able to release it and hopefully the public can see it.
@ohokcool3119 Oh, that's awesome! I'll keep an eye out for it.
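On the time-to-select data mentioned earlier in this thread, here is a very rough sketch of how that could be measured with the XR Interaction Toolkit's interactable events. The class and event names (XRBaseInteractable, selectEntered, SelectEnterEventArgs) come from XRI, but the MarkShown hook and the logging are placeholders you would replace with your own study logic, and the namespace may differ slightly between XRI versions.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit; // in newer XRI versions the interactable types live in a sub-namespace

// Very rough sketch: logs how long the user took to select this interactable
// after the menu (or trial) was shown. Call MarkShown() whenever your menu opens.
public class SelectionTimer : MonoBehaviour
{
    [SerializeField] private XRBaseInteractable interactable; // the menu item being measured

    private float shownTime;

    private void OnEnable()
    {
        interactable.selectEntered.AddListener(OnSelected);
    }

    private void OnDisable()
    {
        interactable.selectEntered.RemoveListener(OnSelected);
    }

    public void MarkShown()
    {
        // Record when the menu item became visible/selectable.
        shownTime = Time.time;
    }

    private void OnSelected(SelectEnterEventArgs args)
    {
        float timeToSelect = Time.time - shownTime;
        Debug.Log($"{name} selected after {timeToSelect:F2} seconds");
    }
}
```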