Holy shit man, the possibilities this tech is providing is just unreal. This is so smart, I'll be following this project!
Thanks so much for your support and interest!
This is awesome. Excited for you to release this!
Thank you!
My thought went to controlling lights. Imagine seeing your light spots in space, on the ground or in the air, and the knobs to adjust them are right there. MIDI to DMX. I don't see a good use case for a slider when I can use the physical one, but all the base framework you already thought of is golden for other use cases that people will submit to you now! Great job with getting a bunch of features that would normally come later built in.
Exactly! Just the beginning and many more types of widgets will be added.
The future potential of this tech is blowing my mind :) Super exciting.
It really is super exciting! 🙂
Cool idea. But only one manipulation at a time, like working with one mouse cursor, right? Because you have to look at, say, a specific fader before manipulating it, and you can only look at one fader at a time.
Actually I found out a way to do more than one, you can have one with each hand now 🙂
Excellent! I had sketched this with a friend like 20 years ago using SDL, and today you're building it in MR; the technology is finally here :) The result looks so intuitive. You can easily scale it up with single controls or premade panels and help with shortcuts. If the Vision Pro weren't so expensive, it could help build a huge virtual studio cheaply. Keep up this insanely good work, with crazy fast coding speed.
Thank you!
Incredible!
Wow, really cool. Like MainStage for Vision Pro!
This is amazing. Please let us know when you release it!
Thank you, not long now, just finishing writing the manual and I should release this weekend.
I can't believe what this would mean! Especially for the Synthstrom Deluge with its very hard-to-use internal synth engine. Just being able to create a panel to your own design to control all of its internal functions (assuming they could all be mapped that way).
Exactly, the possibilities are mindblowing!
Apple really should have released a VR/XR version of GarageBand with spatial audio for Vision Pro at launch. If they ever implement full hand/finger tracking, then I think you could play any virtual 3D instrument using the headset.
This is cool! Would love to see widgets that leverage the MR space: I could imagine a cube with an XYZ space control for modifying mappable filters. Similar to Electronauts in VR (check it out!).
Can you elaborate on what you mean by Electronauts?
Thank you! Yes, an XYZ cube is on the list for a future update 🙂
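To make the idea concrete: an XYZ cube widget boils down to normalizing a grabbed point's position inside the cube's bounds and scaling each axis to a 7-bit MIDI value. A minimal sketch of that mapping (the bounds and the 0–127 scaling are illustrative assumptions, not the app's actual implementation):

```swift
import simd

// Sketch: map a hand position inside a cube to three 7-bit MIDI values,
// one per axis. Cube bounds are whatever the widget was placed/sized to.
func xyzToMIDI(position: SIMD3<Float>,
               cubeMin: SIMD3<Float>,
               cubeMax: SIMD3<Float>) -> (x: UInt8, y: UInt8, z: UInt8) {
    // Normalize to 0...1 per axis, clamping points outside the cube.
    let t = simd_clamp((position - cubeMin) / (cubeMax - cubeMin),
                       SIMD3<Float>(repeating: 0),
                       SIMD3<Float>(repeating: 1))
    return (UInt8(t.x * 127), UInt8(t.y * 127), UInt8(t.z * 127))
}
```

Each of the three values would then be sent as its own CC (e.g. three mappable controller numbers), which is what would make it usable for filter cutoff/resonance/drive-style controls.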
Could you simultaneously have a window with your Mac output and maybe the iPad Logic Remote app in another window? Or does this like take over everything so you can't have another app running in the same space?
Sadly, at the moment in order to be able to do world anchoring, you need to switch to immersive mode and other windows disappear. I've submitted a feedback request to Apple to remove that restriction and then this would be possible.
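For context on why that restriction exists: on visionOS, world anchoring comes from ARKit's WorldTrackingProvider, and ARKit data providers only run inside an ImmersiveSpace, not in the shared space with other apps' windows. A minimal sketch of that setup (the app and view names here are hypothetical, not from this project):

```swift
import SwiftUI
import RealityKit
import ARKit

// Sketch: world anchoring requires an ImmersiveSpace, which is why
// other windows disappear while it is open.
@main
struct ControlSurfaceApp: App {   // hypothetical app name
    var body: some Scene {
        ImmersiveSpace(id: "controls") {
            ControlSurfaceView()
        }
    }
}

struct ControlSurfaceView: View {
    let session = ARKitSession()
    let worldTracking = WorldTrackingProvider()

    var body: some View {
        RealityView { content in
            // World-anchored control entities would be added here.
        }
        .task {
            // Only succeeds while the immersive space is active.
            try? await session.run([worldTracking])
        }
    }
}
```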
Can I use this as a MIDI controller with a DAW, without needing a real MIDI controller connected to my laptop?
Yes, if your computer supports BLE MIDI
@ thanks for the speedy feedback, brilliant app, thank you 🫡
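For anyone wondering what that looks like under the hood: an app like this would publish a virtual MIDI source through CoreMIDI, which the computer then sees over the paired BLE MIDI link like any other controller. A minimal sketch, assuming a hypothetical port name "VisionControl":

```swift
import CoreMIDI

// Sketch: publish a virtual MIDI source and emit a CC from a fader value.
// "VisionControl" is a hypothetical name, not the app's actual port.
var client = MIDIClientRef()
MIDIClientCreateWithBlock("VisionControlClient" as CFString, &client, nil)

var source = MIDIEndpointRef()
MIDISourceCreateWithProtocol(client, "VisionControl" as CFString, ._1_0, &source)

func sendCC(channel: UInt8, controller: UInt8, value: UInt8) {
    // One 32-bit Universal MIDI Packet word: type 2 (MIDI 1.0 channel voice),
    // group 0, status 0xB0 | channel (control change), controller, value.
    let word: UInt32 = (UInt32(0x2) << 28)
        | (UInt32(0xB0 | (channel & 0x0F)) << 16)
        | (UInt32(controller) << 8)
        | UInt32(value)
    var list = MIDIEventList()
    let packet = MIDIEventListInit(&list, ._1_0)
    _ = MIDIEventListAdd(&list, MemoryLayout<MIDIEventList>.size, packet, 0, 1, [word])
    MIDIReceivedEventList(source, &list)
}

sendCC(channel: 0, controller: 7, value: 100)  // CC 7 = channel volume
```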
Dear Sir. Can you send NRPN messages from those virtual faders and knobs?
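Whether the app exposes NRPN is for the author to confirm, but NRPN itself is nothing exotic: it is a fixed sequence of four ordinary CC messages, so any control that can emit CCs could in principle emit NRPNs. A sketch of the framing (the resulting pairs would be fed to a CC sender like the hypothetical one above):

```swift
// NRPN = select the 14-bit parameter number with CC 99/98 (NRPN MSB/LSB),
// then send the 14-bit value with CC 6/38 (Data Entry MSB/LSB).
func nrpnMessages(parameter: UInt16, value: UInt16) -> [(controller: UInt8, value: UInt8)] {
    [(controller: 99, value: UInt8((parameter >> 7) & 0x7F)),  // NRPN MSB
     (controller: 98, value: UInt8(parameter & 0x7F)),         // NRPN LSB
     (controller: 6,  value: UInt8((value >> 7) & 0x7F)),      // Data Entry MSB
     (controller: 38, value: UInt8(value & 0x7F))]             // Data Entry LSB
}
```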
Does this work with Logic Pro/Ableton?
Can you do a demo of you using them?
The Future has arrived. Unbelievable
It stays anchored even after a reboot? Didn't think that was possible
If you hook into the right APIs, it's possible 😀
@gbevin I didn't know about that, but I found the correct API now, thanks!
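For the curious, the API in question is presumably ARKit's persisted WorldAnchors: visionOS stores them and replays them through the world-tracking provider on later launches, so an app can key saved panel layouts to each anchor's stable UUID. A hedged sketch (the place callback and function name are hypothetical):

```swift
import ARKit
import simd

// Sketch: visionOS persists WorldAnchors across launches (and reboots)
// and replays them via anchorUpdates, so saved panel layouts can be
// looked up by the anchor's stable UUID and re-placed in the room.
func restorePanels(worldTracking: WorldTrackingProvider,
                   place: (UUID, simd_float4x4) -> Void) async {
    for await update in worldTracking.anchorUpdates {
        switch update.event {
        case .added, .updated:
            place(update.anchor.id, update.anchor.originFromAnchorTransform)
        case .removed:
            break  // forget the saved layout for this anchor
        }
    }
}
```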
Need a version for the Quest, man
Hard to say from a video whether the Subsequent 37 and the acoustic guitar are actually there or are virtual models :)
They're there 🙂
Looks cool, but honestly, it is most likely unusable. The main problem is not even the absence of haptic feedback on the fader/knob (so your fingers don't feel they are on the knob or fader); the main problem is... do I get it right that to move the fader, you have to _look_ directly at it? That's precisely what faders are designed to avoid: looking at them while controlling.
Take a very simple real usage scenario and try to do it. I won't even ask for something like "you are at a large console, looking at the VU meters, and raising 8 faders at once with 8 fingers, without even looking at them". No, something simpler: you have two sounds, and you simultaneously decrease one fader and increase another. Is it doable with the current Vision Pro tech?
With the Vision Pro you can also touch objects to interact with them, not only by looking and pinching. Of course you won't get the haptic feedback, but simultaneously decreasing one fader while increasing another without looking would be fairly trivial.
What he said ^. You can control knobs or faders without looking, but there's no haptic feedback, just sound (if programmed).
@AndrewSouthworth, does it work reasonably well? Many reviews of Vision Pro point out that it is too dependent on "actually looking at a thing" (and most people don't really realize how much and how often we interact with something without even looking at it).
Without haptics, one has to make the controls much larger (to give enough leeway for imprecise control). But if actually interacting with multiple controls at once is doable, it may work better than suspected.
And what about latency? It looked horrible in this demo, not instant enough for musical expression moves. How can you accurately move a control in time with the music?
I would want to use this to quickly add controls from multiple hardware synths to a tablet for Liine Lemur, although there's a faster way to do that without VR if you set up GUIs for all the hardware synths you have.