Why Throwing Often Does Not Work In VR

  • Published Oct 14, 2024

Comments • 56

  • @gennoveus
    @gennoveus 2 years ago +34

    What a fantastic idea. I was using the "averaging solution" in COMPOUND but going forward I'll be using this. Thank you!

    • @okreylos
      @okreylos  2 years ago +6

      Awesome! I really hope there is a way to combine this approach (properly using what the tracking system already gives you) with how pre-built physics engines do their things, as another commenter pointed out. I haven't played too many VR games, but like I said in the video, whenever there was throwing, it was a mess.
      I've recently been playing Super Hot VR on Quest 2, and while overall it's fantastic, the completely broken throwing physics drive me up the wall, because throwing is supposed to be an integral part of gameplay, but it SIMPLY. DOES. NOT. WORK.

    • @okreylos
      @okreylos  2 years ago +2

      I just realized I forgot to mention the most important reason why averaging is not a good approach in this context: averaging can get rid of *random* noise, but this noise is *not* random. It's not even technically noise, it's drift correction.
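
A toy illustration of that point, in plain Python with made-up numbers: a moving average cancels zero-mean jitter, but a drift correction is a one-sided jump, and it biases every averaging window that contains it.

```python
import random

random.seed(1)

def moving_average(samples, window):
    """Average the last `window` samples (the common 'throw smoothing' hack)."""
    return sum(samples[-window:]) / window

# Hand speed ramping up smoothly toward release (1D, m/s):
true_velocity = [0.5 * t for t in range(1, 11)]  # ends at 5.0 m/s

# Zero-mean random jitter mostly cancels under the average:
noisy = [v + random.uniform(-0.2, 0.2) for v in true_velocity]
jitter_error = moving_average(noisy, 5) - moving_average(true_velocity, 5)

# A drift correction is a one-sided step, not zero-mean noise.  A single
# corrected sample of +8 m/s poisons every window that contains it:
corrected = list(true_velocity)
corrected[-2] += 8.0
spike_error = moving_average(corrected, 5) - moving_average(true_velocity, 5)

print(jitter_error)  # small: the jitter averages out
print(spike_error)   # ~1.6 m/s of bias at release: the throw goes wild
```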

    • @gennoveus
      @gennoveus 2 years ago

      @@okreylos I think your video did a very good job of intuitively explaining the flaws of that approach, even when you weren't explicit about every detail. Luckily my game doesn't really use throwing all that much, but in future games I'd like it to be as perfect as possible. Your video has been a huge help. Thank you for inspiring me to get into this industry, BTW. It would not be an understatement to say it has completely changed my life!

    • @okreylos
      @okreylos  2 years ago +1

      Awesome!

  • @TundraFightSchool
    @TundraFightSchool 2 years ago +8

    well presented, useful info, solid avatar/IK, sweet data visualization. nicely done, sir! thanks for sharing.

    • @okreylos
      @okreylos  2 years ago

      Glad you think so, hope it'll be useful to some.

  • @SamMotioneer
    @SamMotioneer 2 years ago +8

    Glad to see that you got your step dancing robot under control! Have you implemented automatic calibration?
    Also I was very impressed with the IK when you demonstrated the wrist flicking. Absolutely stable, amazing.

    • @okreylos
      @okreylos  2 years ago +5

      I gave it leg extension surgery. :) I am working on an automatic calibration procedure, but it's not finished yet.
      Thanks for noticing the wrist flick, I too thought that worked out well. I noticed only one point in the video where the IK conked out and the poor robot dislocated its elbow, but I believe that's an actual software bug, not a fundamental problem.

  • @Skarredghost
    @Skarredghost 2 years ago +5

    Great video. I can confirm what you said: in our boxing game for Quest, HitMotion, I at first used a software solution to get the velocity of punches, and it was totally unreliable. But when I started using the hardware-provided velocity, everything started going much better. Just a little warning, though: different systems detect speed in different ways, so the speed detected by SteamVR was different from the one detected by Quest standalone. The same between Focus and Quest. Don't ask me why, I don't know.

  • @AndrigeEU
    @AndrigeEU 2 years ago +1

    Excellent video, this should be shared to all VR devs because accurate throwing physics is just night and day

  • @StreamwaveProduction
    @StreamwaveProduction 7 months ago

    This is an unbelievable tutorial and explanation of how these VR systems work. Great job. This is exactly what I was looking for. My default UE5 game does not have throwable items. How did you do the rotation of your character to be so lifelike!? Is that IK?

    • @okreylos
      @okreylos  7 months ago

      Glad you got something out of it. Yes, the avatar is IK-controlled, based solely on head- and controller-tracking and lots of heuristics.

    • @StreamwaveProduction
      @StreamwaveProduction 7 months ago

      TY for the response @@okreylos. I just implemented the logic tonight like you said (get motion controller data, calculate distance/linear velocity) and the code works. I THINK I'm doing it right.
      The stock UE5 mechanics do EXACTLY what you talk about where the item just drops to the floor after you throw it. I have to double check the velocity with a graph like you did. Absolutely brilliant video thank you so much for the help.

  • @0LoneTech
    @0LoneTech 2 years ago +7

    Funny how it boils down to use what the VR system is already providing, just as with the view projections where you get entire matrices and shouldn't go rebuilding them from estimated fields of view.

    • @okreylos
      @okreylos  2 years ago +2

      Exactly. The VR headsets are carefully calibrated at the factory; don't go making up your own view matrices. The tracking system is carefully optimized to give the best possible instantaneous positions/velocities for the controllers; don't redo that work by tracking yourself, only poorer.
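
As a sketch of "use what the tracking system gives you": the runtime already reports both linear and angular controller velocity (OpenVR, for instance, exposes vVelocity and vAngularVelocity in TrackedDevicePose_t). The hypothetical helper below, plain Python with made-up numbers, combines the two for an object held away from the controller's tracking origin, instead of finite-differencing positions across frames:

```python
def cross(a, b):
    """3D cross product of two vectors given as 3-tuples."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def release_velocity(ctrl_lin_vel, ctrl_ang_vel, ctrl_pos, grab_pos):
    """Launch velocity of an object held at grab_pos on a controller whose
    runtime-reported linear/angular velocities are given.  Rigid-body rule:
    v_point = v_controller + omega x (p_point - p_controller)."""
    r = tuple(g - c for g, c in zip(grab_pos, ctrl_pos))
    return tuple(v + w for v, w in zip(ctrl_lin_vel, cross(ctrl_ang_vel, r)))

# Wrist flick: modest linear velocity, strong rotation about the wrist.
v = release_velocity(
    ctrl_lin_vel=(1.0, 0.0, 0.0),   # m/s, reported by the runtime
    ctrl_ang_vel=(0.0, 0.0, 8.0),   # rad/s, also reported by the runtime
    ctrl_pos=(0.0, 0.0, 0.0),
    grab_pos=(0.0, -0.25, 0.0),     # ball held 25 cm from the tracking origin
)
print(v)  # (3.0, 0.0, 0.0): the flick adds 2 m/s on top of the hand's 1 m/s
```

Note how the rotational term dominates in a wrist flick, which is exactly the motion that position-difference schemes capture worst.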

  • @HansMilling
    @HansMilling 8 months ago

    Great explanation. I will keep this in mind. With my Quest 3 I often move my hands out of view of the cameras when throwing, and it is so bad for throwing objects. Using controller input only seems a lot better.

  • @Amni3D
    @Amni3D 2 years ago +1

    Really darn good video.
    While I think some smoothing solutions are decent, a lot of the time (especially action games) that margin of error can be painful.

  • @benja_mint
    @benja_mint 2 years ago +3

    this is pretty great, and deserves more views

    • @okreylos
      @okreylos  2 years ago

      Sure hope so, thanks! :)

  • @PoboyMusic
    @PoboyMusic 6 months ago +1

    Echo VR had throwing down to a science.

  • @macratak
    @macratak 2 years ago +2

    good presentation format lol.

  • @kazioo2
    @kazioo2 2 years ago +1

    Are you using Bullet or another big physics engine for the throw, or did you write simple custom physics for it? This could be another difficulty for game developers that you theoretically may not have in a non-game tech app (if you wrote all the physics yourself) - the impulse-based physics engine that is integrated into the game engine can be a nightmare (PhysX, Havok, Chaos, Bullet etc.). In a tech demo you can use any math you want to get a proper trajectory, but a game developer usually has no choice but to launch objects with force impulses (instead of velocity) if the dev wants thrown objects to properly interact with the physical world. And relying on those force impulses can be very tricky. It's often more stable to do custom velocity-based integration, but then you can't properly combine it with the giant, overly optimised (==compromises) physics engine, so devs usually only do that for some visual effects (e.g. a dangling rope made with Verlet integration is 100x more stable than a rope made with PhysX joints, and games had it 20+ years ago, but if you want your character to pull that rope with force then you have to use that horrible PhysX method, even if you have source code access, because the algorithms are incompatible - a big dilemma in modern gamedev).

    • @okreylos
      @okreylos  2 years ago +2

      Yes, I agree that trying to fit a square peg into a round hole with physics engines that don't have avenues for kinematics is probably a large part of the problem. This, like all my other VR applications, is entirely home-grown, and this testbed (I can hardly call it an application) is super simple.
      At first glance, and I haven't thought about it yet, there should be a way to combine a kinematic approach (assign velocity / momentum to an object from outside) with what a physics engine does internally. One could potentially track an object's intended instantaneous momentum via the tracking system like I do it here, and then nudge a physics object towards that ideal frame by frame by adding "delta" impulses in the physics engine, getting the best of both worlds.
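
One possible sketch of that "delta impulse" idea, in plain Python with a hypothetical helper: per physics tick, compare the tracked target velocity with the body's current velocity and apply the impulse that closes the gap, leaving collision response to the engine.

```python
def nudge_impulse(mass, v_target, v_current):
    """One-frame impulse that moves a dynamic body toward the kinematically
    tracked target velocity: J = m * (v_target - v_current).  Applied every
    tick, the body converges on the tracked motion while the physics engine
    still resolves its own collisions and constraints."""
    return [mass * (t - c) for t, c in zip(v_target, v_current)]

# A 200 g held object lagging behind the controller-reported velocity:
J = nudge_impulse(0.2, v_target=[3.0, 1.0, 0.0], v_current=[2.0, 0.5, 0.0])
print(J)  # [0.2, 0.1, 0.0] kg*m/s, handed to e.g. Bullet's applyCentralImpulse
```

A real engine integration would presumably also clamp the impulse magnitude and suspend the nudge while the object is in resolved contact, so the engine's own collision response isn't fought frame by frame.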

  • @mg1632
    @mg1632 2 years ago

    One thing that is unaccounted for is the role of the fingers during throws of spherical objects. When you "flick your wrist forward" you also relax your finger tips. The fingers and their tendons stretch and act as a spring, and allow the ball to roll up and off of them at a tangent to the arc of the throw at the point where you STARTED the release, during this wrist flick, and are perpendicular to the ground when your controller's angular velocity may be slightly downward. They add an extra little push when they get to this point, as the potential energy from them going backwards releases. So the actual origin of the trajectory is closer to where you started the release, not where it finished, AND with a little more energy than just the velocity of the controller says. This is why throws, even such as yours, still feel unnatural; they don't simulate the fingers that we've grown so accustomed to. Our throws still feel slightly weaker than we are used to, and at odd angles.

    • @johnwagner7144
      @johnwagner7144 2 years ago

      I was thinking about something similar to this during the video too. While this video clearly shows how throwing in VR can be done BETTER, I don't think it can be achieved *perfectly* with our current technology, because in real life, the shape of an object often determines the way you hold it and release it from your hand.
      A perfect example is throwing an American football. Your wrist and fingers need to release from the ball in a very specific way if you hope to throw a perfect spiral, but because you are actually holding a VR controller, there's no way to translate that into a VR game or program.
      I think what this video shows is simply how to program throwing an object as accurately as possible with our current technology. The way you expect an object to be released from your fingers is a small enough factor that you could argue it does not matter with our current technology, especially since the user isn't gripping the object they are throwing, they are gripping a VR controller and will therefore be throwing slightly differently than they normally would to begin with.

  • @pip.turner
    @pip.turner 2 years ago +1

    Absolutely brilliant video! Thanks so much, definitely going to be playing with this.

    • @okreylos
      @okreylos  2 years ago +1

      Thank you! I was hoping it wasn't just going to be interesting to VR game players who might be as frustrated with throwing as I am, but also to developers.
      I've had discussions about throwing for years now, mostly on reddit in text form which doesn't really work for this, so I figured I'd finally put together a visual demonstration.

    • @pip.turner
      @pip.turner 2 years ago +1

      @@okreylos the visuals were wonderful! I especially loved the in-scene graphs, they were so effective!
      I make xr hand tracking interactions for a living & we're looking at throwing later this year so I'm definitely going to refer back to this, thank you!

  • @Zaptruder
    @Zaptruder 2 years ago

    Interesting ideas here - but I think it's largely redundant going into hyper specifics - simply because real world object throwing behaviour varies in a number of ways - some which can be captured in an algorithm, others that are more difficult to capture.
    Specifically, I think the biggest difference between VR throwing and real world throwing is the feeling of the release point - where VR is quite on and off, real world is more reliant on slip and friction - the release point can cause the angle that the object is launched at to be quite significantly different (i.e. release 1-2 frames later than you intended, and your hand is now pointing down).
    In the real world, we might hold an object in such a manner as to allow its velocity to overcome the held friction to launch with more robustness.
    What seems to be more important to me is consistency - given that real world objects vary substantially in thrown behaviour, but that we can learn to accurately throw them, it seems the same can occur for VR - so long as the system can consistently capture intent and transform that accordingly, then we have a good throwing system in VR.
    As a result, some form of simply averaging out the frames isn't a bad way of proceeding with throw functionality. There's a good example of various throwing algorithms available in the SteamVR Unity project that you can test on - I've found the middle average of the last frames provides a consistent and predictable result for throwing in VR - much better than naive throwing implementations that simply use the velocity of the last frame before release to launch the object.

  • @AJGaS
    @AJGaS 1 year ago +2

    I hope the devs of Contractors VR see this, hahah.

  • @bassett_green
    @bassett_green 2 years ago

    Ahhhh, this makes sense - irl what matters here is the total energy. Mass cancels out, so it's the potential energy of the point of release, and kinetic energy of the velocity at the instant of release. There is inherently some noise in the VR position data, which makes a relatively large difference in the calculated velocity?

    • @okreylos
      @okreylos  2 years ago

      It's better to think of it in terms of momentum (mass times velocity) rather than energy. Momentum is the original conserved quantity, and it's also a vector quantity, unlike kinetic energy. It encodes both speed and direction in one quantity.
      The controller position makes no difference in calculating velocity / momentum if you use the reported velocity and don't recalculate velocity from position, and that's what leads to the predictable and natural throwing behavior I show at the beginning of the video.

  • @Wombatus
    @Wombatus 2 years ago

    I don't have a game development background and I kinda got used to throwing in VR (even tho I usually don't do it because it is janky). I don't have much to say here other than that I want to leave a comment so that this video is picked up by the YouTube algorithm and more people see it.

    • @okreylos
      @okreylos  2 years ago +1

      I am replying to your comment solely for the algorithm to think that I am engaging with my viewers, and will hopefully promote this video. :)

  • @manuelkimpel3788
    @manuelkimpel3788 1 year ago

    Thank you for the video!
    Is it possible to achieve an accurate throw through software? I have little strength in my right hand, but there is a VR game which forces me to throw only with the right hand. Now I'm stuck there because my right arm isn't strong enough to make the required throw.
    It's about throwing a grenade at a specific boss to beat it.

    • @okreylos
      @okreylos  1 year ago

      That's very bad design right there. VR software has no business enforcing handedness on its users, given that all current VR headsets have two hand-held controllers that are either identical or mirror images of each other.
      It's a fundamental design decision in my VR toolkit that applications don't enforce user interfaces. For example, the grabbing/throwing interaction I'm showing in this video isn't bound to a specific hand, or even a specific button on the controller. Users can bind the functionality to any button on any device at run-time, and even change the binding in the middle of using an application. That's how it should be done.
      Regarding your specific issue: it might sound silly, but have you tried temporarily using the right-hand controller with your left hand just for that specific point in the game?

    • @manuelkimpel3788
      @manuelkimpel3788 1 year ago

      @@okreylos yes I've tried that. The thing is that you also have to use the other controller to dodge the shooting attacks from the boss.
      The VR game I'm talking about is Ven VR Adventure. It's a third-person jump and run game where you're controlling a creature similar to Crash Bandicoot.
      But sometimes you have to help him with your hands. And the first boss level requires you to dodge the shots and throw the grenades at the same time, and as I said there is no option to switch between left and right. I don't really know why the devs designed something like this.
      But thank you for your help. :)

  • @atkearns
    @atkearns 2 years ago +2

    You had me laughing when you threw the bow

    • @okreylos
      @okreylos  2 years ago +1

      We didn't need that right then.

    • @mobslaya_4547
      @mobslaya_4547 2 years ago

      @@okreylos What is a bow, but a more complex way of throwing?

    • @okreylos
      @okreylos  2 years ago +1

      Indeed. :)

  • @SonicTheCat
    @SonicTheCat 2 years ago

    Most of your failure cases here were due to the trigger not releasing as you expected. Can you think of a way to deal with that problem? Different VR controllers probably release the trigger differently too.

    • @okreylos
      @okreylos  2 years ago

      Fix the triggers. :) Seriously, my Index controllers' triggers are slightly broken. They sometimes register a release while I'm still holding the button, or don't register one when I actually release it. I think it's a hardware fault.

    • @SonicTheCat
      @SonicTheCat 2 years ago

      @@okreylos I don't think that really deals with the issue, because different people will release the trigger at different speeds and at different times. Plus some VR hardware has a different travel distance on the trigger and a different activation point, which would also alter the timing of the release.

    • @okreylos
      @okreylos  2 years ago

      One reason I'm not too concerned about the exact timing of release etc. is that while it does influence the final trajectory, it is still predictable between individual throws. The real issue is things that are "random" between throws, like the velocity spikes I show in the middle of the video. Due to controller release timings etc. being consistent between throws, you might not be able to exactly predict where the first throw ever ends up, but you'll be able to correct from there, like when I aim for the middle distance and the first throw ends up slightly short, and the second one is on the spot.

  • @RC-1290
    @RC-1290 2 years ago

    In Unreal and Unity (PhysX) you need to have an intermediate physics object between the kinematic controller, and physics objects. When you make this the same mass as the controller (and perhaps your hand) it tends to give pretty accurate momentum purely using the built-in physics engines. Most games seem to go for some kind of custom estimation of momentum based on locations at previous frames (or at even higher temporal resolution in the case of Half Life Alyx), combined with lower gravity. I'm very curious if this is going to be something you mention in this video.

    • @RC-1290
      @RC-1290 2 years ago

      I guess my approach here is similar to averaging (though averaging using a constraint, which is inherently springy), but correcting for the offset (because the physics engine does that automatically through the constraint)
      Interesting points about using the sensor data more directly, and compensating for the offset from the controller manually.

    • @thaytan
      @thaytan 2 years ago

      Half-Life Alyx uses the velocity reports from the controllers like Oliver describes. When testing my OpenHMD Rift CV1 driver, throwing didn't work at all there until I added those.

    • @okreylos
      @okreylos  2 years ago

      @@thaytan Thank you for confirming that. I didn't know, and was wondering how throwing works in HL:A since I've heard it works really well. I have to admit I haven't played it yet. Too little time.

    • @asdpls9231
      @asdpls9231 2 years ago

      Another few things to pay attention to are how view direction seems to have some influence over throwing direction, and how object mass has less of an influence on throwing (there will be some visual lag to hand movement depending on the mass, but actual throwing behavior matches something of lower mass). Both of these things align with Valve's "know your simulation" philosophy, i.e. basically favoring what matches player expectations better instead of an unadulterated simulation. There are so many elements of feedback missing from motion controllers (that would otherwise help keep the player more "in sync" with the simulation), so trying to sort of guess the player's intent compensates for their absence. I don't know if you've watched the Kerry Davis talk on VR Doors, but it goes into this philosophy in good detail and you see it repeated across many of HLA's interaction systems.

  • @SakosTechSpot
    @SakosTechSpot 2 years ago

    Brilliant!

  • @SuperKittyPancake
    @SuperKittyPancake 2 years ago

    thanks PVC tube man.

  • @kroysteamvr
    @kroysteamvr 2 years ago +3

    Interesting video, but the background noise in your audio is super distracting. Should've recorded it on your Index. The mic on that HMD is high quality. Add in NVIDIA Broadcast tools for noise cancellation, and you're golden.

    • @okreylos
      @okreylos  2 years ago

      I was going to use the Index microphone, which is great, but guess what -- it completely refused to work yesterday. I could open the device, I could record audio, but I only got complete silence. As in zero amplitude, not even background hiss. So I had to improvise.
