Hope you guys enjoyed this tutorial. There are more to come and I can't wait to show the final result. :D
Big shoutout to Cognitive 3D for sponsoring this series. You can try the tool by registering for free and test the pro version here: app.cognitive3d.com/get-started?&
WAKE UP VALEM POSTED!!
Your tutorials have been such a huge benefit to my XR development. Thank you!
I just want to say thank you for your great tutorials! I worked as an XR Developer for a year. Now I'm working as a scientific associate in the field of VR/AR at a university. Your tutorials still help me a lot during development, and I think you are well known among other XR Developers. Your work has a positive impact on the whole XR community!
Thanks Dave, it means a lot!
Many thanks for supporting the XR dev community, your tutorials are the best!
Thank you Valem this is exactly what I was hoping you’d make
Thank you for this tutorial, I learned so much by watching it!
Thank you for your tutorials. Love it so much. 👏
Completed your Space VR Game Playlist, now it's time for MR 👻 Game
This is awesome!!
Never cease to amaze me with these tutorials! Thank you for all that you do! :D
What a great video!!
It was easy to follow the steps and the final results are very cool! Perfect for a beginner just like me.
Really happy with this kind of content, I really appreciate it
Excellent Tutorial. Very well explained.
Hey Valem, nice tutorial as always! A feature that I think a lot of developers could use is how to implement DLC in our games. Can you make a tutorial about that?
Very good tutorial, thank you!
Hi, thank you very much for all your hard work!
Is there a way to add room scanning, or does it only work on an already-scanned room at the moment?
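A note for this question: Meta's Core SDK can launch the room-scan flow from inside the app, so you don't have to rely on an already-scanned room. Below is a minimal C# sketch assuming the Meta XR Core SDK's OVRSceneManager component used in this series; the member names match that SDK, but verify them against your version.

```csharp
using UnityEngine;

// Minimal sketch: if the headset has no scanned room to load, prompt the
// user to run the room-scan (space setup) flow. Assumes the Meta XR Core
// SDK's OVRSceneManager is present; verify member names for your version.
public class SceneCaptureFallback : MonoBehaviour
{
    [SerializeField] private OVRSceneManager sceneManager;

    private void OnEnable()
    {
        // Raised when no scene model exists on the device.
        sceneManager.NoSceneModelToLoad += OnNoSceneModel;
    }

    private void OnDisable()
    {
        sceneManager.NoSceneModelToLoad -= OnNoSceneModel;
    }

    private void OnNoSceneModel()
    {
        // Asks the OS to start the room-scan flow, after which the
        // scene model can be loaded again.
        sceneManager.RequestSceneCapture();
    }
}
```

Drop it on any GameObject next to the scene manager; if a room is already scanned, OVRSceneManager loads it as usual and the fallback never fires.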
Hi Valem, thanks for your great tutorials! Maybe you could make a video on VR and UI interaction? Specifically using the standard Unity UI with the Quest 3 controllers, using the building blocks framework. There seems to be support for using hands with the regular Unity UI, but not controllers for some reason
Heya, I've been following this tutorial and it's so good, thank you so much! I was just wondering: would there be any setup adjustments I could make to get this working on the Vive XR, or is it not possible?
Pretty helpful to start with, but a couple of issues. No translation info comes back to Unity for the camera coords (like at 15:45), despite having the options enabled in the Meta Quest Link config and rebooting. Also, scene (script) updates are never reflected dynamically when entering Play mode in Unity as described; a full rebuild/run gets them working... but that is painfully slow.
Anyone else have fixes for these?
Never mind. The issues above were limitations from not having switched the libs from OpenXR to Oculus in the project config. Enabling "Oculus" support does not provide the SDK libs needed (yet). You'd hope OpenXR will accept the translation data soon.
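If you'd rather script that switch than click through Project Settings, here is a rough editor-side sketch using Unity's XR Plug-in Management API. Treat it as an assumption-flagged sketch: the settings key, the Manager property, and the "Unity.XR.Oculus.OculusLoader" type name are from the public XR Management and Oculus XR Plugin packages, but double-check them against your package versions.

```csharp
#if UNITY_EDITOR
using UnityEditor;
using UnityEditor.XR.Management;
using UnityEditor.XR.Management.Metadata;
using UnityEngine;
using UnityEngine.XR.Management;

// Sketch: assign the Oculus loader for Android builds from code, which is
// what ticking "Oculus" in XR Plug-in Management does behind the scenes.
public static class SwitchToOculusLoader
{
    [MenuItem("Tools/XR/Use Oculus Loader (Android)")]
    public static void Assign()
    {
        // XR Plug-in Management stores its per-build-target settings
        // as an editor config object under this key.
        if (!EditorBuildSettings.TryGetConfigObject(
                XRGeneralSettings.k_SettingsKey,
                out XRGeneralSettingsPerBuildTarget perTarget))
        {
            Debug.LogError("XR Plug-in Management settings not found.");
            return;
        }

        XRGeneralSettings android =
            perTarget.SettingsForBuildTarget(BuildTargetGroup.Android);
        if (android == null || android.Manager == null)
        {
            Debug.LogError("No XR settings for the Android build target.");
            return;
        }

        // Fully qualified type name of the Oculus XR Plugin's loader.
        bool ok = XRPackageMetadataStore.AssignLoader(
            android.Manager, "Unity.XR.Oculus.OculusLoader",
            BuildTargetGroup.Android);
        Debug.Log(ok ? "Oculus loader assigned." : "Failed to assign loader.");
    }
}
#endif
```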
@davec7934 How can this be solved? I have the same problem: when I connect the Quest 3 by Air Link or cable and press Play, it doesn't work, the camera doesn't follow the head movement... I have to Build & Run for it to work fine.
Hi! Is the tutorial for the table tennis game shown at 1:31 already available? Ty 😉
Can you please make a tutorial on how to have a selective passthrough window (seeing passthrough only through a specific grabbable plane) using OpenXR and ARFoundation (NOT OVR)? I know that it's relatively simple if you go all-in on Meta's SDK, but our project is using OpenXR and I cannot seem to figure out how to make this work.
Thanks!
When I move the ray gun, the hand anchor moves as well. How do I fix it?
OK, for future generations: just switch from Center to Pivot in the Scene view toolbar (top left corner).
Great tutorial! I'm running into one issue though. When I try using Meta Link, all I see is a black screen in my Quest with the blue circle at my feet. However, everything works fine when I use the Build & Run method. Any idea what I might be doing wrong? I followed all the steps in your tutorial - thanks in advance!! (I'm new to this)
Hey bro, I am learning VR development from your channel and your tutorials are amazing.
I need some help regarding rotation. I want to implement a functionality where, when I grab an object, I rotate it on its pivot axis without changing its position. How can I implement that? I've searched for it but didn't find any tutorials about it; a little help would be highly appreciated. Thanks a lot.
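One SDK-agnostic way to get this behavior: apply the grabbing hand's per-frame rotation delta to the object while never touching its position. A minimal sketch; handTransform and the BeginGrab/EndGrab hooks are placeholders you'd wire to your own grab events.

```csharp
using UnityEngine;

// Sketch: rotate an object around its own pivot from a hand/controller's
// rotation, without letting its position change. SDK-agnostic; you decide
// when "grabbed" is true and which Transform is the grabbing hand.
public class RotateInPlaceOnGrab : MonoBehaviour
{
    [SerializeField] private Transform handTransform; // placeholder: the grabbing hand
    private Quaternion lastHandRotation;
    private bool grabbed;

    public void BeginGrab()   // call this from your grab event
    {
        grabbed = true;
        lastHandRotation = handTransform.rotation;
    }

    public void EndGrab()     // call this from your release event
    {
        grabbed = false;
    }

    private void Update()
    {
        if (!grabbed) return;

        // Rotation the hand made since the last frame.
        Quaternion delta = handTransform.rotation
                           * Quaternion.Inverse(lastHandRotation);
        lastHandRotation = handTransform.rotation;

        // Apply the delta to the object; its position is never touched,
        // so it spins around its own pivot.
        transform.rotation = delta * transform.rotation;
    }
}
```

If you're using the XR Interaction Toolkit, disabling Track Position while keeping Track Rotation on an XR Grab Interactable should give much the same result without custom code.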
Which is better for developing MR apps, Standard RP or URP? (I have a lot of heavy 3D models)
If the Game scene is not open, a black screen is visible in the Quest 3 instead of passthrough. This comment is to help others with this issue: just open the Game scene and you'll be good to go.
Good morning sir, I have a sort of off-topic question: do you, or will you, make a tutorial on how to create a Quest version of something like Horizon Workrooms, where you can conduct a class, a speech, etc. with other interactive avatars, show videos, write on boards, and take notes, all without an agent running on your PC? Thank you sir!
Is there a reason why my scene is loading 2 rooms when I go into play mode?
I keep getting a "Value cannot be null" error when I click on Oculus in the XR Plug-in Management.
Great, but I can't figure out how to get the scanned room data from my headset to use as the Effect Mesh. All I get is one or the other of the randomized pre-existing prefabs, and I can't work out how to use my own room as the effect mesh model. I've scanned my room in my headset, but it's not showing up in this scene.
I have the same issue, did you manage to resolve it? Or does someone else have a solution for this?
I ran into this issue last month. I still don't know if I have a perfect solution, but make sure that the computer is detecting your device through SideQuest or Meta Link. Try to get Meta Link up and running first; in other words, do you see the infinite white scene with the Unity applications listed? I don't fully know what got it working for me beyond making sure everything was updated and the headset was in developer mode.
@lindoryel2637 Yeah, sorry, I still haven't found a solution for getting it to work in the headset during a playtest. If I build and run on the headset, it finds the correct scene data and works as it should. I only ever got the scene data to properly show up over the Link cable during a playtest once or twice; all the other times it came up collapsed, and I got lots of error messages about missing anchors and whatnot, for which I could not find any really useful info online. So for the remaining videos in this tutorial series, I'll just build to the headset and run it from there.
All I get when I try to run the game is a black screen with a spinning hourglass. Please advise. Thanks
If only developing for Quest would it be better to use Meta's toolkit over OpenXR?
Currently, if you're only targeting the Quest, I would suggest Meta's own SDK because it has more capabilities and allows you to use the latest features of the Quest. On the other hand, the Unity XR Interaction Toolkit is a bit simpler to understand and to use.
Anyone running into the error "Deprecated Gradle features were used in the build" causing it to fail? I'm currently stuck there. I did run this with the same version of Unity as in the tutorial.
Solution: Edit > Preferences > External Tools > make sure "Gradle Installed with Unity" is checked.
Thank you, Valem, great tutorial! Curious: at 14:40, why do you have crosses/grids? Mine does not show them. Thank you.