This is good work. It shows this stuff is not easy. It will be a fine day when we can design our own synths easily using this hand-detection technology... and build things in AR like SynthEdit has done for VSTs. I can also imagine buying an AR headset that can only run one specific plugin. Being able to see those fingertips and the stretch distance is a good idea in this demo.
I hope the hands can be swapped for us left-handed folks. Also, I would love something that emulates a string instrument, where you finger notes with one hand and use the other to emulate a bow or pick.
Yeah, definitely swappable hands. At the expense of range, one hand could trigger the notes of the other hand, if there are requests for that.
I just love the cylinder dragging after finger contact!
How's the latency?
I don't know yet; I'll have a sense for that once I put a sound engine in there. However, judging from my experience with Animoog Galaxy, it should be good.