Mapping Inputs to Actions in Unity VR!

  • Published on Sep 21, 2024

Comments • 15

  • @DrunkAncestor · 5 months ago · +4

    This is a lifesaver, thank you. I'm still confused about getting values (like the trigger, grip button, etc.) out of the input action maps and into scripts. That would be a great video to clarify.

    • @FistFullofShrimp · 5 months ago

      Thanks for watching and the suggestion! 🍤
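
    For anyone wondering the same thing, a minimal sketch of reading analog trigger and grip values from Input System actions inside a script could look like the following. The class and field names are placeholders, and the two InputActionReference fields are assumed to be assigned in the Inspector to actions such as the ones in the XRI Default Input Actions asset.

      using UnityEngine;
      using UnityEngine.InputSystem;

      // Minimal sketch: reads analog trigger and grip values from Input Actions.
      // "triggerAction" and "gripAction" are placeholder fields -- assign your own
      // actions (e.g. ones bound to <XRController>{LeftHand}/trigger and /grip).
      public class ControllerInputReader : MonoBehaviour
      {
          [SerializeField] private InputActionReference triggerAction;
          [SerializeField] private InputActionReference gripAction;

          private void OnEnable()
          {
              // Actions only report values while enabled.
              triggerAction.action.Enable();
              gripAction.action.Enable();
          }

          private void Update()
          {
              float trigger = triggerAction.action.ReadValue<float>(); // 0..1
              float grip    = gripAction.action.ReadValue<float>();    // 0..1

              if (trigger > 0.9f)
              {
                  Debug.Log($"Trigger {trigger:F2}, grip {grip:F2}");
              }
          }
      }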

  • @ThomasSimonini-d9s · 6 months ago · +2

    This demo arrived at the perfect time, thank you for this!

  • @GwynPerry · 6 months ago · +3

    The UI issue when selecting the binding control is due to a misspelling in the UI theme that was introduced in recent versions of Unity. You can see the list is rendered too high up, covering the input field and the listen button.

    • @FistFullofShrimp · 6 months ago · +1

      I honestly wasn't expecting someone to answer this question that I was thinking about when making this video. It was driving me nuts!!! THANK YOU!!! 🍤🍤🍤

  • @Lemon-dh4fz · 6 months ago · +6

    Thanks for the video. 4:11 What will happen if I don't check the Generic XR Controller?

    • @FistFullofShrimp · 6 months ago · +2

      This is a great question! From my understanding, the Generic XR Controller setting lets the Input System use a scheme built for broad compatibility across a variety of devices. Essentially, it's a way for the Input System to abstract away the hardware specifics and work with a generalized input model. Will your inputs work if you don't use this setting? Maybe. Will they work on a variety of devices without it? Probably not.
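
    To make that concrete, here is a rough illustration of what the generalized model looks like at the binding level. The <XRController> layout and the {LeftHand} usage are standard Input System XR binding paths; the action name is a placeholder, and <OculusTouchController> is one example of a device-specific layout that the generic path avoids.

      using UnityEngine;
      using UnityEngine.InputSystem;

      // Sketch: one binding on the generic <XRController> layout covers any controller
      // that exposes a trigger, instead of a device-specific path such as
      // <OculusTouchController>{LeftHand}/trigger.
      public class GenericBindingExample : MonoBehaviour
      {
          private InputAction trigger;

          private void OnEnable()
          {
              trigger = new InputAction("Trigger", InputActionType.Value,
                                        "<XRController>{LeftHand}/trigger");
              trigger.Enable();
          }

          private void Update()
          {
              Debug.Log(trigger.ReadValue<float>()); // 0..1 on most controllers
          }

          private void OnDisable()
          {
              trigger.Disable();
          }
      }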

  • @ladyupperton23 · 6 months ago · +3

    Thank you for another amazing video 🎉❤🎉

    • @FistFullofShrimp · 6 months ago · +1

      What a wonderful shrimp lady you are!!!

  • @rodneywheeler7764 · 6 months ago · +4

    THANK YOU!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

  • @E-nfileexe · 21 days ago

    Thank you so much. I feel like I'm finally a step closer to learning and figuring out how I can assign the X/Y/A/B buttons of my Quest 2 in the Unity project I'm making.
    All I want it to do is show a small UI pop-up with a list of different button options.
    Pressing the X button brings up the choices, and then the user can use the joystick and the Y button to select an option. I want that small UI to stay attached to the controller object, like in VRChat, where you can tilt your controller and see the UI of the live stream chat.
    I can't find any successfully working tutorial on implementing the specific X/Y/A/B buttons and how to use them in code. The best I found was OVRInput, but that also didn't work.
    This has been the closest I've found, so I'm going to try to find the buttons I'm looking for.
    Thank you
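
    In case it helps with the question above: on the Quest 2 touch controllers, X and Y come through the Input System as the left controller's primaryButton and secondaryButton (A and B are the right controller's equivalents). A minimal sketch of toggling a controller-attached menu that way might look like this; the class, the field names, and the menuRoot object are placeholders, and the InputActionReference fields are assumed to be bound to those button paths.

      using UnityEngine;
      using UnityEngine.InputSystem;

      // Sketch: X toggles a popup menu, Y confirms a choice, via Input System actions.
      // On the Quest touch controllers, X/Y are the left controller's primaryButton and
      // secondaryButton (A/B are the same usages on the right controller).
      // "menuRoot", "toggleMenu" and "confirmChoice" are placeholder names.
      public class QuestButtonMenu : MonoBehaviour
      {
          [SerializeField] private GameObject menuRoot;                // popup UI, e.g. parented under the controller
          [SerializeField] private InputActionReference toggleMenu;    // bind to <XRController>{LeftHand}/primaryButton   (X)
          [SerializeField] private InputActionReference confirmChoice; // bind to <XRController>{LeftHand}/secondaryButton (Y)

          private void OnEnable()
          {
              toggleMenu.action.Enable();
              confirmChoice.action.Enable();
              toggleMenu.action.performed += OnToggle;
              confirmChoice.action.performed += OnConfirm;
          }

          private void OnDisable()
          {
              toggleMenu.action.performed -= OnToggle;
              confirmChoice.action.performed -= OnConfirm;
          }

          private void OnToggle(InputAction.CallbackContext ctx)
          {
              menuRoot.SetActive(!menuRoot.activeSelf); // show/hide the popup
          }

          private void OnConfirm(InputAction.CallbackContext ctx)
          {
              Debug.Log("Y pressed: select the highlighted option here.");
          }
      }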

  • @ohokcool3119 · 6 months ago · +3

    Hey, thanks for the video! I'm currently working on a research project about gaining insights into how users interact with different menu types in VR, and I'm stuck choosing between Unity's XR Interaction Toolkit and Meta's XR All-in-One SDK. In your opinion, with your experience with either toolkit, and considering I'm going to need data such as time taken to select an item, checking certain inputs, etc., which one would you suggest? I'm developing for an Oculus, so I can go with either, but if you have the time, which of the two would you use? Do both toolkits support a decent amount of features for observing user input? Thanks :)

    • @FistFullofShrimp · 6 months ago · +2

      This is a bit of a toughie because I think both could work just fine. I do find Meta's to be a bit messy and confusing most of the time, but it does seem to offer more in-depth specifics for Oculus devices. The XR Interaction Toolkit is way easier to get projects moving along and implement with. Both could be used easily for the current things you've listed that you're going to collect. I'd personally go with the XR Toolkit because I'm more experienced with it and I enjoy developing with it, but if I was going to want every little bit of data specific to only Meta devices, then I'd consider going with Meta's Toolkit. Hope that helps!

    • @ohokcool3119 · 6 months ago

      @FistFullofShrimp Thanks for the insightful response! Yeah, that makes sense. Just in terms of getting things moving along, I'd also stick with the XR Interaction Toolkit and then see down the line if I need a wider variety of specifics, but there are also probably third-party libraries that may help, like cognitive3D.