Thanks for the video.
Can we detect the hand pose that is meant to hold a specific object? For example, detecting whether the user is holding a pen correctly. Or detecting whether the user is holding a drill correctly.
That could definitely be possible! However, it would probably require very accurate finger shapes and tighter tolerance values.
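For anyone curious what "checking a holding pose" could look like in code: here is a rough, untested sketch using the XR Hands joint API. The `penGripPoint` transform, the 2 cm tolerance, and the tripod-grip logic are all assumptions for illustration, not anything shown in the video.

```csharp
using UnityEngine;
using UnityEngine.XR.Hands;

// Hypothetical sketch: thresholds and the penGripPoint reference are assumed.
public class PenGripCheck : MonoBehaviour
{
    public Transform penGripPoint;             // where fingers should rest on the pen
    public float maxFingertipDistance = 0.02f; // 2 cm tolerance (a guess to tune)

    public bool IsHoldingPenCorrectly(XRHand hand)
    {
        // A tripod grip roughly means thumb, index and middle tips
        // are all close to the pen's grip point.
        return FingertipNearGrip(hand, XRHandJointID.ThumbTip)
            && FingertipNearGrip(hand, XRHandJointID.IndexTip)
            && FingertipNearGrip(hand, XRHandJointID.MiddleTip);
    }

    bool FingertipNearGrip(XRHand hand, XRHandJointID id)
    {
        if (!hand.GetJoint(id).TryGetPose(out Pose pose))
            return false;
        // Note: joint poses are reported relative to the XR origin, so a
        // real project would transform them into world space first.
        return Vector3.Distance(pose.position, penGripPoint.position)
               <= maxFingertipDistance;
    }
}
```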
Very Good Video.
Can I use XR Hands in my MR project with the Meta Quest 3?
Yes, you can!
Thank you Man @@blackwhalestudio 💓💓💓
I got you! :)@@giuseppelauria8288
Hi, thanks for this tutorial. I've got it working on my Apple Vision Pro. The gesture is detected, but I don't know how to grab something with that gesture. Is there a tutorial available for this?
Great explanation. How can I grab an interactable object with my new custom pose/shape? I want the object to only be grabbable with the custom pose. Can you provide a link to a video if you have already covered it? I watched the previous setup video, and although the input action map was mentioned in it, I couldn't extract the needed information as I'm quite unfamiliar with it. Can you help me out in this regard? Thanks
Hi, have you found a solution for this? I'm looking for the same info.
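One possible approach (a sketch, not an official recipe): assuming XR Hands 1.4+ with its `StaticHandGesture` component and the XR Interaction Toolkit, you can let the gesture's performed/ended events toggle `allowSelect` on the hand's interactor, so grabbing only works while the custom pose is held. Component and property names below are from those packages as I recall them; verify against your installed versions.

```csharp
using UnityEngine;
using UnityEngine.XR.Hands.Gestures;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch only: gate grabbing on a custom hand pose.
public class GrabOnlyWithPose : MonoBehaviour
{
    public StaticHandGesture customGesture; // detects your custom hand shape
    public XRBaseInteractor handInteractor; // e.g. the XRDirectInteractor on this hand

    void OnEnable()
    {
        // Block grabbing until the custom pose is performed.
        handInteractor.allowSelect = false;
        customGesture.gesturePerformed.AddListener(OnPosePerformed);
        customGesture.gestureEnded.AddListener(OnPoseEnded);
    }

    void OnDisable()
    {
        customGesture.gesturePerformed.RemoveListener(OnPosePerformed);
        customGesture.gestureEnded.RemoveListener(OnPoseEnded);
    }

    void OnPosePerformed() => handInteractor.allowSelect = true;
    void OnPoseEnded()     => handInteractor.allowSelect = false;
}
```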
Hi sir, thanks for the video.
When you redirect us to a different video of yours (like you did at 2:17), please add the video link in the description.
Hi! I did, the link is always in the top right corner!
That's the video: th-cam.com/video/uGq10Tcl3Ns/w-d-xo.html&t
Thank you
Thanks for the info. One question: how do you perform the Shake pose? I tried waving with fingers, palm and other movements, but no feedback on the Shake button... BTW, I'm using a Quest 3.
We can only design static gestures at this point, unfortunately! So for shaking, you would need a kind of half-closed shape on each hand and then maybe check for a collision between the two hands, or come up with a different solution.
I read it as "shake" at first too... but it actually says Shaka: thumb and pinky out.
@@poolguymsj Ohhhh, well done, you're right, thank you, works well!!!
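For anyone who really does want a dynamic shake rather than the static Shaka pose: since the built-in gesture system only handles static shapes, one workaround is to track the palm joint's velocity yourself and count direction reversals over a short window. A rough, untested sketch; all thresholds here are made-up values to tune.

```csharp
using UnityEngine;
using UnityEngine.XR.Hands;

// Sketch: treats a "shake" as several rapid horizontal direction
// reversals of the palm joint within a short time window.
public class ShakeDetector : MonoBehaviour
{
    public float minSpeed = 0.5f;      // m/s before movement counts (a guess)
    public int reversalsNeeded = 4;    // direction flips for one shake
    public float windowSeconds = 1.0f; // reversals must happen this fast

    Vector3 _lastPos;
    float _lastDirection, _windowStart;
    int _reversals;

    // Call once per frame with the tracked hand.
    public bool UpdateAndCheck(XRHand hand)
    {
        if (!hand.GetJoint(XRHandJointID.Palm).TryGetPose(out Pose pose))
            return false;

        Vector3 velocity = (pose.position - _lastPos) / Time.deltaTime;
        _lastPos = pose.position;

        float dir = Mathf.Sign(velocity.x);
        if (velocity.magnitude > minSpeed && dir != _lastDirection)
        {
            if (_reversals == 0) _windowStart = Time.time;
            _reversals++;
            _lastDirection = dir;
        }

        // Too slow: reset and start counting again.
        if (Time.time - _windowStart > windowSeconds) _reversals = 0;

        if (_reversals >= reversalsNeeded) { _reversals = 0; return true; }
        return false;
    }
}
```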
I wanted to run it on HoloLens, but encountered a different configuration issue.
Can we do this in vision pro?
yes!
@@blackwhalestudio I tried using the custom hand gesture to activate teleportation on the Apple Vision Pro, but the gestures are not detected. Can you redirect me to any resource that could help achieve this?
Can Pico be used?
I'm not sure whether Pico supports hand tracking through OpenXR. But if it does, this should work, yes!
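A quick way to verify this at runtime on any headset is to ask Unity whether a hand-tracking subsystem is actually loaded and running under the current loader. This sketch uses the standard `SubsystemManager` API from the XR Hands package; it should behave the same on Quest, Pico, or any other OpenXR device.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

public class HandTrackingCheck : MonoBehaviour
{
    void Start()
    {
        // Collect any loaded hand subsystems; if one is running,
        // the device exposes hand tracking under the current loader.
        var subsystems = new List<XRHandSubsystem>();
        SubsystemManager.GetSubsystems(subsystems);

        bool available = subsystems.Count > 0 && subsystems[0].running;
        Debug.Log(available
            ? "Hand tracking subsystem is running."
            : "No running hand tracking subsystem found.");
    }
}
```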
Why are GUI elements only visible in one eye? :"""(