💡 Question: Let me know how you feel about licensing after watching my explanation about it. I will pass your feedback to Unity.
💻 GitHub repo to help you get started is available here: github.com/dilmerv/HelloVisionOS
Thanks everyone!
Great video! I'd like to know more about why we are being charged this amount. Is this only for developing on Apple Vision Pro, or can you use it with other devices too? If other devices, are there any advantages to using this vs other plugins? Thanks
When you get Unity Pro, you pretty much get access to additional features. This amount represents the license required to use visionOS PolySpatial (including the installation of its packages/tools). However, Unity Pro offers more than just Vision Pro support. You can compare the different plans at the following link: unity.com/pricing/compare-plans
Great question Nick!
They need to make all this stuff work straight out of the box. From what I've seen with development, eventually stuff like that does happen.
Honestly, working with Swift, RealityKit, and the Xcode IDE isn't my favorite thing to do... but after 2 days of trying to work with this jerry-rigged solution from Unity, it bothers me so much! Especially knowing they are forcing you to pay to use it.
When I started the free trial my first thought was: "wow, this must be such a smooth experience if they are making me pay for it! Let's try!" But after 2 days of begging Unity not to crash on build, throw random errors, or just produce corrupted builds, I really hope this doesn't become the standard way of developing for Apple Vision.
Definitely a big turn-off that the cost of entry for experimenting with this new technology is crazy high. Luckily I already have an Apple silicon Mac, and Apple offers a simulator for the headset, but Unity creating this new barrier to entry with that additional cost is a bit frustrating. I think building with the technology should be free, and publishing could be charged. Let the incredible tools that people build be what brings Unity more customers and money, not the devs themselves.
Great video and a good tip on the Simulator SDK!
Sweet I am glad you found this helpful!
Hi Dilmer, really nice video. Is it possible to show how to build the project for visionOS and generate the Xcode project? And also how to set it up in Xcode and launch it on the Vision Pro simulator? You skipped that part in the video, and I think it is important because that is where I am having a lot of issues.
Hey, thanks for your feedback. I would recommend watching this video, th-cam.com/video/LeqVHfqRq_I/w-d-xo.html, which walks you through the Xcode part, but here is a summary of what you need to do in Unity:
1- Change your platform to visionOS under Build Settings.
2- Under Player Settings for the visionOS platform, make sure the SDK is set to "Simulator SDK" and not "Device SDK"; unless you are deploying to the physical device, that setting must be Simulator.
3- Click Build Settings > Build & Run (make sure your current scene is added to the scenes to be included, and once Xcode opens, do the next step).
4- You need to have your dev account set up in Xcode; also make sure visionOS is installed as a platform.
5- Double-check that you have a simulator set up; if not, you can watch the video above for the steps to do so.
6- Click Play and it should run in the simulator.
Also, I will be writing a blog post about this process today; if you don't get it working, that may help too, and there's a small editor-script sketch below that automates steps 1 and 3.
Thanks for your question!
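For anyone who wants to script steps 1 and 3, here is a minimal editor-script sketch (my own example, not part of the HelloVisionOS repo). It assumes a Unity version with the visionOS build support module installed, where BuildTarget.VisionOS exists; the class name and output path are placeholders, and it leaves the Simulator vs. Device SDK setting (step 2) to the Player Settings UI.

```csharp
// Hypothetical helper (not part of the HelloVisionOS repo): place this script under an Editor/ folder.
using System.Linq;
using UnityEditor;
using UnityEngine;

public static class VisionOSBuildHelper
{
    [MenuItem("Build/Build and Run (visionOS)")]
    public static void BuildAndRun()
    {
        // Step 1: switch the active platform to visionOS.
        EditorUserBuildSettings.SwitchActiveBuildTarget(BuildTargetGroup.VisionOS, BuildTarget.VisionOS);

        // Step 3: build every enabled scene and auto-run, which generates and launches the Xcode project.
        var buildOptions = new BuildPlayerOptions
        {
            scenes = EditorBuildSettings.scenes.Where(s => s.enabled).Select(s => s.path).ToArray(),
            locationPathName = "Builds/visionOS", // placeholder output folder for the generated Xcode project
            target = BuildTarget.VisionOS,
            options = BuildOptions.AutoRunPlayer
        };

        var report = BuildPipeline.BuildPlayer(buildOptions);
        Debug.Log($"visionOS build finished with result: {report.summary.result}");
    }
}
```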
I learned a lot in this video that was hard to find elsewhere, thank you. I do think the repo might need some updating: playing in the editor complains about a missing XR Origin, and AR lighting does not seem to be working; everything seems a lot dimmer when built.
Well, at least the AR lighting turned out to be as easy as changing the virtual environment.
Thanks for your feedback and for letting me know about the issues; open a PR and I will merge it into master 😉
Could you make a video about how to set up the simulator? I wonder what steps should be taken on the Xcode side.
This video goes through the entire visionOS dev setup process, including the visionOS simulator: th-cam.com/video/LeqVHfqRq_I/w-d-xo.html. Thanks for your feedback.
That's a lot of money to ask for, given that the Apple Vision Pro is a new, unproven (and expensive) platform that will have limited adoption for the foreseeable future. This kind of price barrier for developers is EXACTLY what could kill it in the cradle. No apps = no platform. Unity needs to get a subsidy from Apple and remove this price barrier for developers who already need to buy an expensive device that may or may not have a viable user base in less than 5 years.
Thanks for your feedback, Christopher, and I will pass that along to Unity. I agree this is a lot to ask of a dev or dev team; I would suggest testing the tools with the 30-day trial, which at least could give you an idea of what's available before making any type of commitment to Unity Pro.
I appreciate the feedback once again.
Finally, it is here!!
It is here 🎉 thanks for sharing your excitement!
So helpful, thanks for making the video
Glad it was helpful! Thank you for your comment.
What do I press?
This project is using the new input system package but the native platform backends for the new input system are not enabled in the player settings. This means that no input from native devices will come through.
I just checked the project on GitHub, and it is currently set to "Both" under "Active Input Handling," so it should handle input just fine. Here's the project: github.com/dilmerv/HelloVisionOS
Thanks for watching!
Why doesn't every developer consider the actual testing and user experience as the endpoint? I'm really eager to understand the complete development process, but the final output often seems somewhat lacking.
I have a question about how to use the raw XR input system to achieve the following: drawing a picture in the air. I want to develop a feature in my application that can be used on various devices such as HoloLens, Vision Pro, Pico, and Quest 3.
I did something similar for VR which, with a few adjustments, can be brought over to the other devices:
th-cam.com/video/SYY0BYP0ncE/w-d-xo.html
Great question 😉
@dilmerv Hi, I saw your video, but how do I use AR Foundation or the Unity Input System to implement drawing a picture in the air? Not just in VR where it's easy, but so that it can also be implemented on HoloLens 2, Quest 3, and Pico 3.
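Here is a rough, device-agnostic sketch of the air-drawing idea (my own example, not from the video) using the plain UnityEngine.XR API: it reads the right hand/controller position and the trigger/select state each frame and appends points to a LineRenderer. The component name, prefab field, and distance threshold are assumptions; how "select" maps to a pinch or air tap varies per device, and on Vision Pro with PolySpatial you would swap the position/select source for the pinch data it provides.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Hypothetical example: draws a line in the air while the "select" (trigger/pinch) input is held.
public class AirDrawing : MonoBehaviour
{
    [SerializeField] LineRenderer linePrefab;          // assign a prefab that has a LineRenderer
    [SerializeField] float minPointDistance = 0.005f;  // assumed spacing (in meters) between recorded points

    LineRenderer currentLine;
    readonly List<Vector3> points = new List<Vector3>();

    void Update()
    {
        var hand = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        if (!hand.isValid)
            return;

        bool selecting = hand.TryGetFeatureValue(CommonUsages.triggerButton, out bool pressed) && pressed;
        bool hasPosition = hand.TryGetFeatureValue(CommonUsages.devicePosition, out Vector3 position);

        if (selecting && hasPosition)
        {
            if (currentLine == null)
            {
                // Start a new stroke when the select input first goes down.
                currentLine = Instantiate(linePrefab);
                points.Clear();
            }

            // Only add a point once the hand has moved far enough, to keep the stroke clean.
            // Note: devicePosition is in the rig's tracking space; parent the lines under the
            // XR Origin (or convert to world space) if your rig is moved or scaled.
            if (points.Count == 0 || Vector3.Distance(points[points.Count - 1], position) > minPointDistance)
            {
                points.Add(position);
                currentLine.positionCount = points.Count;
                currentLine.SetPosition(points.Count - 1, position);
            }
        }
        else
        {
            currentLine = null; // finish the stroke when the select input is released
        }
    }
}
```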
I never thought I'd be paying homage to Zuckerberg for making it easy for small developers to create something for his devices. I don't have a Mac or Unity Pro, so it would cost me thousands of dollars to develop for Vision Pro. All hail Meta, all hail Unreal.
That is for sure a huge advantage over Apple, thanks for your comment!
Hi, I was wondering: if I want to develop a VR game (which uses XR Hands and the XR Interaction Toolkit), do I need to use PolySpatial?
Do I have to get a Mac? I only have a Windows computer.
You can do development on Windows, but a Mac is required for device or simulator deployment. Good question!
Could you make a video about using windows to present 2D or 3D content in visionOS with Unity?
Will do, thanks for letting me know!
How do I make dynamic lighting work when testing on the simulator or device? The light only shows up in the editor.
Dynamic lighting works by default when I deploy to the device or simulator. If you're having issues, try restarting Unity and rebuilding your project.
Hey man, I downloaded the official Vision Pro template project and it can't be seen in the Hub. Any ideas? I have the 30-day Pro trial too.
The Vision Pro template only shows up on Apple silicon, that is, Unity installed on a Mac mini, MacBook, etc. Sorry, I didn't find a way to develop on Windows.
Nice video and really helpful!
Now that we have a device with no controllers, how would you move in a fully immersive space (like we used to with teleport locomotion in VR)? Any idea?
Hey, thanks for your feedback. I believe that will happen with custom gestures; you can create custom gestures with ARKit natively, and I am sure that will also be available in Unity. You could then detect a specific gesture and activate a locomotion system, either smooth locomotion or a point-A-to-point-B type of thing.
@dilmerv So, by combining the XRIT and XR Hands packages, right?
Thank you for the hint!
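To illustrate that idea, here is a minimal sketch (my own, not an official sample) of gesture-driven teleportation: any detector that sets a boolean, whether from XR Hands, an ARKit custom gesture, or something else, can trigger a raycast from the head and move the XR Origin to the hit point. The gestureDetected field, the 10 m range, and the layer mask are assumptions.

```csharp
using UnityEngine;

// Hypothetical gesture-driven teleport: when a custom gesture is detected, raycast forward
// from the head and move the XR Origin to the hit point on the floor.
public class GestureTeleport : MonoBehaviour
{
    [SerializeField] Transform xrOrigin;      // the XR Origin (rig) root
    [SerializeField] Transform head;          // the main camera / head transform
    [SerializeField] LayerMask teleportMask;  // floor layer(s) you allow teleporting onto

    // Replace this with your real gesture source (XR Hands, an ARKit custom gesture, etc.).
    public bool gestureDetected;

    void Update()
    {
        if (!gestureDetected)
            return;
        gestureDetected = false; // consume the gesture

        // Aim where the user is looking and teleport to the first floor hit within 10 meters.
        if (Physics.Raycast(head.position, head.forward, out RaycastHit hit, 10f, teleportMask))
        {
            // Preserve the user's head offset inside the rig so they land where they aimed.
            Vector3 headOffset = head.position - xrOrigin.position;
            headOffset.y = 0f;
            xrOrigin.position = hit.point - headOffset;
        }
    }
}
```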
That's a nice chair. What chair is it?
Thanks, I really love that chair. It's the Herman Miller Sayl chair.
How about eye tracking? Is it possible to implement eye-tracking features in Unity + Vision Pro?
Looks like Apple doesn't provide a gaze vector for privacy reasons. Take a look at what they do provide, which I believe is just the pointer ray initialized when you do a pinch: discussions.unity.com/t/how-to-handle-vision-os-sight-direction-from-code/319404
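For reference, a small sketch of reading that pinch pointer ray in a PolySpatial app, based on the pattern in Unity's PolySpatial input samples. The type and member names (EnhancedSpatialPointerSupport, SpatialPointerState, Kind, startRayOrigin, startRayDirection) are from memory and may differ by package version, so treat them as assumptions and check the samples shipped with your PolySpatial packages.

```csharp
using Unity.PolySpatial.InputDevices;          // assumed PolySpatial input namespace
using UnityEngine;
using UnityEngine.InputSystem.EnhancedTouch;
using UnityEngine.InputSystem.LowLevel;
using Touch = UnityEngine.InputSystem.EnhancedTouch.Touch;

// Logs the gaze-initialized pointer ray that PolySpatial exposes when the user pinches
// (there is no continuous raw gaze vector available).
public class PinchRayReader : MonoBehaviour
{
    void OnEnable() => EnhancedTouchSupport.Enable();
    void OnDisable() => EnhancedTouchSupport.Disable();

    void Update()
    {
        foreach (var touch in Touch.activeTouches)
        {
            // SpatialPointerState carries the ray captured at pinch time (assumed API).
            SpatialPointerState state = EnhancedSpatialPointerSupport.GetPointerState(touch);
            if (state.Kind == SpatialPointerKind.IndirectPinch)
            {
                Debug.Log($"Pinch ray origin: {state.startRayOrigin}, direction: {state.startRayDirection}");
            }
        }
    }
}
```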
Hi Dilmer, thanks for the amazing tutorial. I am trying to build a fully immersive scene. It's running OK in the simulator, but I see a window that says "Loading" always in front of the user.
Hey, thanks for your question. The window that shows up at the beginning can be closed for now. I talked to Unity and it looks like it is a bug; for now just close it and ignore it, and later on it should go away.
Thanks again!
Can you do a video tutorial about anchoring to indoor locations with Google Geospatial?
Google has persistent Cloud Anchors that let you "scan an indoor location and use this saved location later in your augmented reality experience."
However, I've not been able to figure it out for indoors.
For example, creating an art tour that can be shared on XR devices,
so that people walking through the same room can see the same 3D models in the same location. E.g., there may be a model in each corner of the room and in the centre of each wall, or a different model appearing in different rooms.
I know how to do that with Geospatial Creator for a Google anchor point outside, based on lat and long, but inside I'm having trouble figuring it out.
I've seen some people do this through an NFC code or a QR image on the wall, but it would be much better if it could be done solely on the device, like it is with Geospatial Creator, with a stock solution or something universal,
so that everyone using the app can see the same things indoors, just like you would in Street View.
Hey, thanks for your feedback, and this is something I could look into. I haven't done anything with Geospatial and Vision Pro, so this could be interesting; keep an eye out for future content about it ;)
@dilmerv Thanks, that would be awesome!
Do they have a hand poser for objects?
There are a few hand poses (custom gestures) available but not a way to customize the hand pose per object. Good question!
Unity doesn't even mention visionOS/Apple Vision Pro as a benefit for purchasing a Unity Pro subscription on their pricing page. That is not encouraging.
Unreal 5 is starting to look nicer by the minute. Apple and Epic aren't friends, but UE5 has so much power that they'll be frenemies soon enough.
That's probably because it is still in beta; they haven't moved it to production yet, which I agree is not encouraging. I personally have a feeling that pricing may change, and getting this only as part of Unity Pro is not worth it to me (at least not until it is in production). The UE5 branch with visionOS support is already out there (still a work in progress), but it only supports VR; I know for a fact that mixed reality won't be supported, based on what I discussed with the guys working on it.
Does it work on Windows 10?
I'm not going to lie, Unity is really leaving a bad taste in my mouth by forcing people to pay over $2000 just to try the software. I'm a designer thinking about transitioning into the industry, and I'm just trying to get my feet wet right now. And they don't give you the option to pay $165 per month and cancel if it's not for you. They lock you in for the full year. That really comes off as slimy corporate greed to me.
Wow, I didn't know the monthly option was no longer available; that's news to me. I agree that they should provide those options, especially to give us a sense of the tools before deciding on an entire year's subscription; the 30-day trial to me is not enough!
I still don't get how to turn a real table into a digital table just by detecting the size of the planes with a raycast x.x
Feel free to join my Discord here, and I am sure the community can help out: discord.gg/bhgnC6hQ. Thanks for your comment.
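For anyone else wondering about the plane-detection part, here is a rough AR Foundation sketch (my own, assuming AR Foundation 4.x/5.x with an ARPlaneManager in the scene): it listens for newly detected planes and logs horizontal planes large enough to be a table, giving you the center and size you would use to place a digital table. The component name and minimum size are assumptions.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Hypothetical example: watch detected planes and report horizontal ones big enough to be a table.
public class TablePlaneWatcher : MonoBehaviour
{
    [SerializeField] ARPlaneManager planeManager;                     // the ARPlaneManager in your AR scene
    [SerializeField] Vector2 minTableSize = new Vector2(0.5f, 0.5f);  // assumed minimum size in meters

    void OnEnable() => planeManager.planesChanged += OnPlanesChanged;
    void OnDisable() => planeManager.planesChanged -= OnPlanesChanged;

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        foreach (var plane in args.added)
        {
            bool horizontal = plane.alignment == PlaneAlignment.HorizontalUp;
            bool bigEnough = plane.size.x >= minTableSize.x && plane.size.y >= minTableSize.y;

            if (horizontal && bigEnough)
            {
                // plane.center and plane.size tell you where to place and how to scale a digital table.
                Debug.Log($"Possible table plane at {plane.center}, size {plane.size}");
            }
        }
    }
}
```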
Could you help me with text recognition in AR?
You could definitely ask in my Discord, and the community or I can provide you with feedback: discord.gg/Gtt8uatz
Why can I develop for Oculus for free but AVP is $2k a year? No thanks.
I wish it were free, my friend. I have provided that feedback to Unity many times; hopefully that happens soon. Thanks for your time.
Looking for someone to sponsor a Unity Pro license 😂
Oh man, I will see if Unity wants to give me a few licenses to give away to the community.
hello
👋
Too bad they are pricing the developer path way out of reach for 90% of Unity devs... Afraid this policy will set AR back another 10 years. It's quite a shame really...🕶
That's something I am really pushing for. I agree that it shouldn't require a Unity Pro license; perhaps Unity Plus would have been a good fit, but it sounds like Unity Plus is not even available anymore 😅
@dilmerv Guess the more we talk about it, the more they are willing to see reason... 😁 BTW, it made me make up my mind and buy a Quest 3 instead of waiting for the fruit loops...
What do you think about developing for Quest 3? I don't have money for a Pro license or an Apple Vision device @dilmerv
Jesus Unity ... fuck off with the $2k / year barrier to entry. I'm just a dude who's literally trying to learn how to do this. How am I going to ever learn, and ever build something, if the barrier to entry is that I need to shell out $2k / year just to test things out?
Is it possible to develop and run (test) apps with no Mac, just a Vision Pro?
Unfortunately, a Mac is required to develop applications; you need Xcode to push your code to the device or simulator, and that app only runs on macOS. Good question!
@dilmerv Thanks for the prompt response. It is getting even more expensive this way.