Mapping Inputs to Actions in Unity VR!

  • Published on 25 Nov 2024

Comments • 23

  • @DrunkAncestor • 7 months ago +4

    This is a lifesaver, thank you. I'm still confused about getting the values of things (like the trigger, grip button, etc.) out of the input action map and into scripts. That would be a great video to clarify.

    • @FistFullofShrimp • 7 months ago

      Thanks for watching and the suggestion! 🍤
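
For the question above about reading trigger and grip values into a script, here is a minimal sketch using the Input System; the field names are hypothetical, and the InputActionReferences are assumed to point at actions in your own Input Actions asset (e.g. bound to <XRController>{RightHand}/trigger and .../grip):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Minimal sketch: polls analog trigger/grip values and reacts to a button press.
// "triggerAction", "gripAction" and "primaryButtonAction" are hypothetical names;
// assign InputActionReferences from your own Input Actions asset in the Inspector.
public class ControllerInputReader : MonoBehaviour
{
    [SerializeField] private InputActionReference triggerAction;       // e.g. bound to <XRController>{RightHand}/trigger
    [SerializeField] private InputActionReference gripAction;          // e.g. bound to <XRController>{RightHand}/grip
    [SerializeField] private InputActionReference primaryButtonAction; // e.g. bound to .../primaryButton

    private void OnEnable()
    {
        // Actions must be enabled before they report values
        triggerAction.action.Enable();
        gripAction.action.Enable();
        primaryButtonAction.action.Enable();

        // Event-driven style: fires when the button is pressed
        primaryButtonAction.action.performed += OnPrimaryButton;
    }

    private void OnDisable()
    {
        primaryButtonAction.action.performed -= OnPrimaryButton;
    }

    private void Update()
    {
        // Polling style: read the current analog values every frame
        float trigger = triggerAction.action.ReadValue<float>();
        float grip = gripAction.action.ReadValue<float>();
        Debug.Log($"Trigger: {trigger:F2}  Grip: {grip:F2}");
    }

    private void OnPrimaryButton(InputAction.CallbackContext context)
    {
        Debug.Log("Primary button pressed");
    }
}
```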

  • @ThomasSimonini-d9s • 9 months ago +2

    This demo arrived at the perfect time, thank you for this!

  • @Lemon-dh4fz • 9 months ago +6

    Thanks for the video. 4:11 What will happen if I don't check the Generic XR Controller?

    • @FistFullofShrimp • 8 months ago +2

      This is a great question! From my understanding, the Generic XR Controller setting lets the Input System use a control scheme with broad compatibility across a variety of devices. Essentially, it is a way for the Input System to abstract away the hardware specifics and use a generalized input model. Will your inputs work if you don't use this setting? Maybe. Will they work on a variety of devices without it? Probably not.
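
As a rough illustration of that abstraction, an action bound against the generic <XRController> layout resolves on any vendor's controller; a minimal sketch, assuming the Input System package is installed (normally the binding would live in your Input Actions asset rather than be created in code):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Sketch only: creates an action in code against the generic XRController layout,
// so the same binding resolves on Quest, Index, WMR, etc. Normally you would
// define this binding in the Input Actions asset instead.
public class GenericTriggerExample : MonoBehaviour
{
    private InputAction triggerAction;

    private void OnEnable()
    {
        // Generic binding path: no device-specific layout such as OculusTouchController
        triggerAction = new InputAction(
            name: "GenericTrigger",
            type: InputActionType.Value,
            binding: "<XRController>{RightHand}/trigger");
        triggerAction.Enable();
    }

    private void OnDisable()
    {
        triggerAction.Disable();
        triggerAction.Dispose();
    }

    private void Update()
    {
        Debug.Log($"Trigger: {triggerAction.ReadValue<float>():F2}");
    }
}
```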

  • @GwynPerry • 9 months ago +4

    The UI issue when selecting the binding control is due to a misspelling in the UI theme that was introduced in recent versions of Unity. You can see the list is rendered too high up, covering the input field and the Listen button.

    • @FistFullofShrimp • 8 months ago +1

      I was honestly not expecting someone to answer the question I was wondering about while making this video. It was driving me nuts!!! THANK YOU!!! 🍤🍤🍤

  • @DrDrasticVR • a month ago

    W Shrimp. This tutorial came in handy for the second time :)

  • @ryoxxyz • 2 months ago

    Helped me a lot! Thank you so much!

  • @pickledOnionGal • 2 months ago

    you are a literal saint. thank you for these videos!!

  • @ladyupperton23 • 9 months ago +3

    Thank you for another amazing video 🎉❤🎉

    • @FistFullofShrimp • 8 months ago +1

      What a wonderful shrimp lady you are!!!

  • @rodneywheeler7764 • 8 months ago +4

    THANK YOU!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

  • @E-0n-exe • 2 months ago

    Thank you so much. I feel like I'm finally a step closer to learning and figuring out how the heck I can assign the X/Y/A/B buttons of my Quest 2 in the Unity project I'm making.
    All I want it to do is show a small UI pop-up with a list of different button options.
    Pressing the X button brings up the choices, and then the user can use the joystick and the Y button to select an option, because I want that small UI to stay attached to the controller object, like in VRChat, where you can tilt your controller and see the UI of the live stream chat.
    I can't find any successfully working tutorial on implementing the specific X/Y/A/B buttons and how to use them in code. The best I found was OVRInput, but that also didn't work.
    This has been the closest I've found, so I'm going to try to find the buttons I'm looking for.
    Thank you
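
For the X/Y button idea described above, a minimal sketch using the Input System instead of OVRInput; the action names are hypothetical and are assumed to be bound to <XRController>{LeftHand}/primaryButton (X) and secondaryButton (Y) in an Input Actions asset, with the menu as a world-space Canvas parented to the controller so it stays attached:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Sketch only: X toggles a small world-space menu that is a child of the left
// controller; Y "confirms" whatever option is currently highlighted.
// "toggleMenuAction" (X) and "confirmAction" (Y) are hypothetical action names --
// bind them to <XRController>{LeftHand}/primaryButton and secondaryButton yourself.
public class ControllerMenu : MonoBehaviour
{
    [SerializeField] private InputActionReference toggleMenuAction; // X button
    [SerializeField] private InputActionReference confirmAction;    // Y button
    [SerializeField] private GameObject menuCanvas;                 // world-space Canvas parented to the controller

    private void OnEnable()
    {
        toggleMenuAction.action.Enable();
        confirmAction.action.Enable();
        toggleMenuAction.action.performed += OnToggleMenu;
        confirmAction.action.performed += OnConfirm;
    }

    private void OnDisable()
    {
        toggleMenuAction.action.performed -= OnToggleMenu;
        confirmAction.action.performed -= OnConfirm;
    }

    private void OnToggleMenu(InputAction.CallbackContext context)
    {
        // Because the Canvas is a child of the controller, it stays attached
        // and tilts with the controller, VRChat-style.
        menuCanvas.SetActive(!menuCanvas.activeSelf);
    }

    private void OnConfirm(InputAction.CallbackContext context)
    {
        Debug.Log("Selected the currently highlighted option");
    }
}
```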

  • @Vice_vr24 • 4 days ago

    DUDE THANK YOU SOOOOOO MUCH FOR THIS TEMPLATE! But can I make a game with this template?

  • @ohokcool3119 • 8 months ago +4

    Hey, thanks for the video! I am currently working on a research project about gaining insights into a user's usage of different menu types within VR, and I'm stuck choosing between Unity's XR Interaction Toolkit and Meta's XR All-in-One SDK. In your opinion, with your experience with either toolkit, and considering I'm going to need data such as time taken to select an item, checking certain inputs, etc., which toolkit would you suggest using? I am developing for an Oculus, so I can go with either, but if you have the time, which of the two would you use? Do both toolkits support a decent amount of features for observing user input? Thanks :)

    • @FistFullofShrimp • 8 months ago +2

      This is a bit of a toughie because I think both could work just fine. I do find Meta's to be a bit messy and confusing most of the time, but it does seem to offer more in-depth specifics for Oculus devices. The XR Interaction Toolkit is way easier to get projects moving along and to implement with. Both could easily be used for the things you've listed that you're going to collect. I'd personally go with the XR Interaction Toolkit because I'm more experienced with it and I enjoy developing with it, but if I wanted every little bit of data specific to Meta devices only, then I'd consider going with Meta's toolkit. Hope that helps!

    • @ohokcool3119 • 8 months ago

      @FistFullofShrimp Thanks for the insightful response! Yeah, that makes sense. I also think, just in terms of getting things moving along, I'd stick with the XR Interaction Toolkit and then see down the line if I need a wider variety of specifics, but there are also probably third-party libraries that may help, like cognitive3D.

    • @suldra915 • a month ago

      Have you finished the research yet? I'm interested in what your findings are/were

    • @ohokcool3119 • a month ago

      @suldra915 Funny enough, yeah, there is a paper written and submitted to IEEE 2025. If it's accepted, then I'll be able to release it and hopefully the public can see it.

    • @suldra915 • a month ago

      @ohokcool3119 Oh, that's awesome! I'll keep an eye out for it.
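
On the data-collection question in the thread above (e.g. time taken to select an item), a minimal sketch of logging selection durations with XR Interaction Toolkit events, assuming XRI 2.x; the component and field names are hypothetical:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch only: attach next to an XRBaseInteractable (e.g. a menu item) and log
// how long it stays selected. Wire this into whatever study logging you use.
[RequireComponent(typeof(XRBaseInteractable))]
public class SelectionTimer : MonoBehaviour
{
    private XRBaseInteractable interactable;
    private float selectStartTime;

    private void Awake()
    {
        interactable = GetComponent<XRBaseInteractable>();
    }

    private void OnEnable()
    {
        interactable.selectEntered.AddListener(OnSelectEntered);
        interactable.selectExited.AddListener(OnSelectExited);
    }

    private void OnDisable()
    {
        interactable.selectEntered.RemoveListener(OnSelectEntered);
        interactable.selectExited.RemoveListener(OnSelectExited);
    }

    private void OnSelectEntered(SelectEnterEventArgs args)
    {
        selectStartTime = Time.time;
    }

    private void OnSelectExited(SelectExitEventArgs args)
    {
        // Duration from grab/select to release, plus which interactor did it
        Debug.Log($"{name} selected for {Time.time - selectStartTime:F2} s by {args.interactorObject.transform.name}");
    }
}
```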