Hope you guys enjoyed this first Quest 3 tutorial. Go send some love to Wonderland Engine for sponsoring this video! You can watch my Wonderland Engine tutorial to get you started on making your own Web VR game here: th-cam.com/video/97rtpteFYco/w-d-xo.html
@valemTutorials
At 10:00 I only have 3 options to select from, and none of them is for enabling passthrough.
Please, someone help me.
Hey, thanks for the tutorial. One thing is different: when I go to the OVRCameraRig's manager I have no option to select any Quest device. In your video it's at 8:16.
For anyone choosing to use the Meta XR All-in-One SDK instead of the Oculus Integration package, which is shown as deprecated now, you can still find the prefabs he is using by clicking either "All" or "In Packages" next to the "Search" field in the editor and searching for the same names. That's because the prefabs are now within the Packages folder instead of Assets (which is where you search by default).
Just saved me a good amount of time with that one, thanks buddy ♥
This should be pinned
Thank you Sir!
The prefabs are in Packages > Meta XR Core SDK > Prefabs, just in case the search function didn't work for others (not sure if it's my Unity version lol)
Thanks 👌
As someone who's just got a Quest 3 and has been following you on here and Patreon, I can't wait to see more mixed reality tutorials.
Nice. 9:57 Passthrough over Quest Link is now in Settings > Beta > Developer Runtime Features > Pass-through over Oculus Link
Is it still there? I can't see Developer Runtime Features.
@@naimgg7806 me too
Love the tutorial!
I realized you have to set the OVR Hand script's Show State to Always in order to see hands. It worked when I changed the settings!
Overall thank you for this tutorial! I'm excited to watch the next ones
This comment saved me
Yet again, an outstanding video tutorial Valem - looking forward to the others! Got my Quest 3 earlier this week and the MR tech is really impressive, far better than I expected - looking forward to seeing more games making use of it.
Hi man, thanks! Same here, really cool to finally have some good color passthrough :D
@@ValemTutorials First! Great explanation, thank you very much! But one question:
What if I want to do PC VR and not an "Android" mobile game? Android is not interesting to me as a developer. I don't do standalone applications, only SteamVR / PC VR support.
Fantastic video as always. I am really looking forward to the next video.
The editing made it look much quicker to click through all of those dialogues after importing. It took me around 10 minutes on my computer with all of the loading and compiling!
WOW! I was waiting for this, but I did not think you were going to come out with it this quick! I wasn't ready; next time give me a heads up lol
Aha thanks :D
I was just working on a VR game that I learned from you, and I just saw you upload this video! I have never been so early!
Aha thanks for watching the video so early man. :)
Thanks! I love this tutorial. You made my day!
"Oculus Integration" asset was deprecated and changed for "Meta XR All-in-One SDK" asset.
Will this affect the tutorials?
Please, more of these, can't wait!
Will do ! :)
Nice video, thanks. Got it working, but I chose the 3D URP template (hands and controllers render as purple, but I'm sure that's easy to fix). And I used the new Meta XR All-in-One package instead of Oculus Integration. All the steps are the same except that the prefabs are located in the Packages folder, not the Assets folder. I also had a problem with a very slow framerate; I found that separating the Scene and Game windows helps a lot with in-HMD performance (don't keep the Scene and Game views on the same screen, move them to separate tabs). PS: I also tried recording video using the Quest camera app - passthrough is visible from Unity's in-editor play mode.
Oculus Integration on the Unity Asset Store is now deprecated. Meta is now offering a bunch of different XR SDK Unity packages; which should we use now? Can you do a tutorial using the new system? What benefits do the new Meta XR SDKs bring?
Great question, has anyone found an answer to this?
@@Kurokage130 I just got through this tutorial after several attempts over the months. You can use the Meta XR All-in-One SDK. Another tip added on here that helped me avoid errors: you have to set the OVR Hand script's Show State to Always in order to see hands. Without this it did not work. I did have a passthrough issue before as well, but after a fresh install of Unity it went away. I am not sure what caused it.
What a perfect primer. Thank you very much. Have a wonderful weekend
You too James thanks for watching. :)
Exactly what I need right now for a work project. Massive thanks.
Thank you! Merci! Really easy to understand. I will be trying this shortly and am very excited. Appreciate it very much.
Excellent tutorial and got me up and running right away. Keep them coming! And thanks again!!!
Thank you sooo much!! I'll jump on Patreon. I'm super into learning how to have in-game objects interact with the real world. Again, thanks for all the amazing YouTube content!!
As usual, just great! Everything gets easy to do with your tutorials! By the way, have you made a tutorial on gaze interaction, for example to teleport? Again, thank you for making XR simple.
Fantastic tutorial, thank you! I had a few hiccups, like Unity not recognizing the Android device when I went to build and run. But every time I had an issue the resolution was always to just reset the headset, haha, sometimes multiple times. Anyway, thanks again!
Great stuff Valem - agree with others, looking forward to more MR for sure!
Thanks for the amazing tutorial! Can't wait for the next two episodes. Subscribed.
nice tutorials, keep it going. Good Luck!
Can't wait for the next one! Awesome work with the tutorial
Awesome! Worked perfectly. Thanks.
Awesome, thank you! Still moving through this. Which cable do I need to link the Quest 3? A basic USB-C sync cable, or the more expensive $40-70 Link cable?
Wondering the same thing; my current cable results in high lag and low-fps performance, so I definitely want to get a good one. Last time, building on the Quest 2, you had to build to the device every time, so this would be a luxury. Thanks for the video!
For anyone interested, looking at the passthrough API overview (passthrough over oculus link) they recommend "A compatible USB-C cable to use for Meta Quest Link. For Color Passthrough, the USB connection should provide an effective bandwidth of at least 2 Gbps." I ended up going with a decently cheap 3.1 type C with 10Gbps data transfer and it works great!
Great video! I'm having a bit of a problem: when I launch it on my Quest I only see black and not my passthrough.
same! did you ever find a fix?
@@4Aces. Did you guys find a fix?
@@UnityDevJOY I wasn’t able to so I discontinued the project I was going to start sorry:
@@4Aces. I managed to get it to work. I don't really know what did it, but I switched my Meta account to developer mode, turned on developer mode in the phone app, and built the game. I also tested by clicking Build and Run, which is probably the one that did the trick. If you ever wanna try it again and it doesn't work, send a reply and I'll see if I can help. This also goes for anyone else reading this.
Hi! Thanks for the tutorial. I ran into a problem here: there's no "enable oculus" option in the Beta tab, and so when I clicked Play, nothing happened in the Quest. Any ideas?
Does it work with Virtual Desktop instead of Quest Link? My PC doesn't meet the requirements for Quest Link.
This is excellent!! Is there a way to have the Oculus Integration system DYNAMICALLY look for geometry during runtime so it can generate geometry as you play?
Hello, nice tutorials, as always. Sorry for my English, I'm French :)
I have a question: when testing on PC, why is the camera working? Is it always like this?
Using the APK on the Quest it works well.
it'd be cool to see this tutorial updated for URP
Awesome video! Appreciate all your hard work. Cool to see it set up in Oculus Integration as well. 😊
Does URP support passthrough placed as an Underlay? It seems only Overlay works on URP.
Hey, do you know why the objects kind of follow the camera? Almost like they are attached to it?
Hey, could you make a video on the new Oculus SDK package? This one is deprecated.
Thank you for the tutorial, but I have a major issue with the passthrough. I have the latest Oculus software on my PC, and the cube and hands work great, but not the passthrough (all black in the headset). The reason is probably that in the software I only get 3 options in the beta settings, not a bunch like you have??? SOLVED: Make sure you have enabled developer options IN the headset settings! Now to figure out why controllers show up but hand tracking does not appear to work :)
Help! 🙏🏻
I want to create a VR game and I'm failing again like I did right at the beginning.
I followed almost everything as in the video, except that I use Virtual Desktop as the connection instead of a Quest Link cable. Still it doesn't work 😩
It does not connect to the headset.
But this only happens with the Oculus Integration asset.
If I start the project straight away with VR Core, everything works. I just want to work with Oculus Integration...
Hi @ValemTutorials, I did exactly everything according to your video, but for some reason I can’t see anything on my Meta Quest 3. I’m using a powered Link Cable, I activated the options indicated in the Meta App on PC in Beta and also put my Oculus in developer mode through the Android App, but when I click Play, the project only works in Unity but I can’t see anything directly on the Meta Quest 3. Could you explain what the problem might be? I’m using Unity 2022.3.21f1 LTS.
I'm also having this. Did you find the solution?
@@PedroFerreira-vn9qk Hi bro, I needed to find the version of Unity he was using; unfortunately the current versions seem to have problematic support.
A few problems: it's showing Quest 2 controllers instead of Quest 3 ones, and my hands aren't visible; instead I see the controllers even in hand tracking. I'm using Virtual Desktop for this because my GPU isn't supported for Link.
Is it possible to create an app that will draw some 3D object right in the main room of the Oculus? Some kind of widget. For example, if I want to place my lovely teddy bear somewhere in my workspace.
I was able to get through the entire tutorial, and build and run works perfectly. The only trouble I'm having is that nothing happens when I hit the Play button in Unity. I've also enabled passthrough over PC link in the Oculus app. What should happen: does it require that I wear the device, or does it simply play in the Game window? What's the expected behavior?
It should play in the Game window normally, but sometimes the passthrough does not show, I believe.
I get the same problem: I hit Play, then I get a loading screen for the app, but after 1 second it kicks me out and nothing happens.
How can you upload it to your Quest 3 library?
So you can play it without a cable.
It says that Oculus Integration is deprecated now and you should use the Meta XR Interaction SDK OVR Integration instead.
Why did you use the oculus integration package and not Unity’s own provided AR Foundation package with the Unity OpenXR Meta package? It allows for cross platform with iOS and Android as well.
Can we create an app that automatically loads every time I turn on the headset? I would like to display paintings on the wall
Good question. I know that there is a "showcase" option to kind of lock a VR headset into only one application.
If not, then I hope there will be something like that in the future. Would be cool to also be able to place shortcuts etc on walls. 🙂
It says Oculus Integration is deprecated; what do you use instead? Please provide any references to documentation. Thank you!
That's the updated one; use that.
Hi Valem, Great tutorial! How would this work with OpenXR? :)
The Oculus Integration appears to have been deprecated at v. 57 or so and replaced by about a dozen different Meta branded packages. I've not yet been able to find the one that builds on the Oculus Integration package specifically. I've imported about half a dozen, and the "Oculus" folder that appears in Assets is always empty. I have a clean template project with the deprecated version, but I'd like to ensure I can take advantage of new functionality as it is added. Anyone have any idea where I'm going wrong?
There's a "Meta All-in-One SDK" which seems to closely resemble the entire Oculus Integration from before. Also, their documentation tells you how to upgrade from the Oculus Integration package, so it might be worth checking that out as well (if you're upgrading a current project).
Hello, are you on Air Link or a Link cable? Everything works, but the passthrough is super slow (3 fps) when in Play mode. I'm on Air Link since the battery would deplete with the cable. It works fine in the build.
Finally here
Thank you, great tutorial video. My project is working perfectly, except I can only see my hands and the cube through my left eye???
Do we need any specific hardware for this to work?
Hey just followed this tutorial - thank you for this!
One problem I have, though, is that the cube renders in double vision (in the Quest 3), even while the hands/controllers render just fine. This only occurs when the central OVR camera is not set to Skybox (i.e. when it is set to Solid Color), OR when passthrough is enabled. If passthrough is not enabled, or Skybox is selected, no such double vision of the cube occurs.
As far as I can tell I've followed all instructions precisely, except that (as mentioned elsewhere in the comments) the OVRCameraRig was not found in the same directory location as demonstrated in this tutorial (but in a different one), so presumably something has changed in a version since this tutorial was recorded.
Any suggestions welcome!
Hi Paul, did you find a solution? Do you have the same problem in a build? Maybe it's something related to "Single Pass Instanced" in your Project Settings > Oculus. You should set it to Multi Pass; it might fix the problem.
Hi, can I please use this tutorial with a Meta Quest Pro?
Ahhh, fantastic video
So without the Oculus Integration package the hand tracking won't work on deployment? I'm using a MacBook and Unity, and my app deploys fine, but I couldn't get the hands to show at all, and my app depends on them for micro-interactions.
Hello Valem, how useful is Wonderland Engine for rendering larger models with a high level of detail? Like 2M polygons, etc.
Nice work.
Hey Mr Cactus, congrats on porting your game to Quest 3! What do you think about the comparison between the Q2 and Q3 in terms of graphics for your game?
@@ValemTutorials Funny you asked, as I made the latest improvements just yesterday. At least for my game, the Q3 pushes the graphics fidelity to almost PSVR2/PC levels. Even 90 Hz mode is now possible. Places where the Q2 struggles are no issue for the Q3, even at higher refresh rates.
Thank you very much, will you also be doing something with UI?😊
Will passthrough work when running in the editor?
Thanks for sharing such a detailed tutorial Valem! Can you update your video with an annotation as Meta have now deprecated the package? Thanks.
My Game view doesn't show anything. Could you help me?
A million thanks for this nice tutorial.
Any advice on using URP and UltimateXR?? :)
Hey, I have a Quest 2 project but I want to extend it to Quest 3 as well, and I'm getting height issues. Is there any fix for this?
Hi, thank you so much for the introduction to VR development. I already installed everything necessary and was testing with just a simple box and a plane as the floor. I was wondering if there is a way to make these props stable; I mean, when I move around with the headset, the box and floor move a bit.
Hi, I followed your tutorial, but when I press Play the scene is not showing in my headset. I am new to VR development, please help me with this...
Thank you. The tutorial was great and it worked perfectly. Later I changed to URP and I got a black background. Does anyone have an idea why?
Found the solution. The passthrough does not support post-processing :(
I am curious: is it possible to follow the same tutorial on a Quest 2, since you're not using depth sensor capabilities... yet?
I guess yes for now, but without the color passthrough.
Quick question. Can I do this and test mixed reality on a Quest 2?
I have a Quest 3 but can't set it up until Christmas, and I'm just wondering if I can get into dev early using a Quest 2.
Did you also try this with HDRP? I cannot get passthrough to work with HDRP VR :(
Somehow it's not working for me... when I click Play I don't see my scene in the headset.
Cool, thanks from Belgium! :-)
Hey Valem! Great video. This comment is in reference to your VR audio video regarding using spatializers... Please make a video about audio spatializers! My use case is that I want a room to be soundproof, users inside the room can hear each other but can't hear outside users, and users outside cannot hear the inside users. This is actually not easily answered anywhere online at all. Spatializers affect and dampen the sound, but nothing really talks about completely cutting off the sound all together conditionally. Various raycasting solutions are not an option for me as it is not scalable, optimized, or functional with a large amount of users. You are great at explaining things so I would really appreciate it!
Would it be possible to "cheat"? Assuming that you absolutely need people to hear you within a certain radius with your current setup, say 5 meters, and it needs to stay that way:
1. Create a very, very thick wall so the spatial sound is unable to carry through because of the distance.
2. Teleport the outside players so they are not in the same spatial location as the inside players.
The first one is very easy to make; just set it up so the radius won't reach the outside, or barely does.
The second one is aesthetically better but requires a lot of vector calculation, because you need to create two scenes and two walls that each mirror the other.
I'd say these are a lot easier than adjusting the audio effect on other objects.
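For anyone curious, here is a minimal Unity sketch of the first suggestion: clamping the spatialized source's rolloff so it is effectively silent past the room boundary. The 5 m radius and the component name are just taken from the example above for illustration; this is a sketch of the idea, not an official solution.

```csharp
using UnityEngine;

// Sketch of option 1: clamp the spatialized voice source so it cannot be heard
// beyond the assumed 5 m room radius (value taken from the example above).
[RequireComponent(typeof(AudioSource))]
public class RoomLimitedVoice : MonoBehaviour
{
    [SerializeField] private float roomRadius = 5f; // assumed room size

    void Start()
    {
        AudioSource source = GetComponent<AudioSource>();
        source.spatialBlend = 1f;                     // fully 3D / spatialized
        source.rolloffMode = AudioRolloffMode.Linear; // volume falls off to zero at maxDistance
        source.minDistance = 1f;                      // full volume up close
        source.maxDistance = roomRadius;              // silent at and beyond the room boundary
    }
}
```

With linear rolloff the volume reaches zero at `maxDistance`, so listeners outside the radius hear nothing without anyone having to touch other players' audio settings.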
@@jimmyding335 Thanks for your input! There is a doorway to the room, so it would be really weird to have very thick walls, and the teleportation thing is not an option since the goal is to make a seamless room experience: you enter the room and you can leave the same way you came, no teleportation shenanigans :P Also, the goal isn't really to adjust audio effects on other objects, but rather to have a persistent environment property that intrinsically blocks sound, if that makes sense. You enter the room, and people outside it can't hear you, all without accessing each individual's audio source. Seems like this isn't possible though :(
So is the passthrough only replacing the black color? So none of my objects can be black? Can I choose a different color, like green?
How do you activate the passthrough if you are using the Unity XR Interaction Toolkit and not Oculus Integration?
That's what I'm trying to figure out as well. There's some new Meta OpenXR plugin that does that, but it needs AR Foundation as well. There's no simple hack for integrating XRI and passthrough without going through the Oculus plugin.
@@MohitShukla-g5i Actually I figured it out. The newer Unity Editor version, 2022.3.14fx, has an option to create an MR (mixed reality) project. If you use that template it has the newer, fancier XR Hands and is also capable of enabling passthrough when built for the Quest. That's as far as I got.
@@A_walk_in_the_park That's awesome! I'm going to try having the OVRManager and the passthrough layer somewhere in the scene and see if that works. Glad to know you got it working, but one drawback is that you don't get passthrough over Oculus Link with that method.
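For reference, a rough sketch of the OVRManager + passthrough layer idea mentioned above. The property names (`isInsightPassthroughEnabled`, `overlayType`) are how I remember them from the Oculus Integration / Meta XR Core SDK, so treat them as assumptions and check them against your SDK version.

```csharp
using UnityEngine;

// Rough sketch: enable Insight passthrough at runtime and add a passthrough
// layer rendered as an underlay behind the virtual content. Assumes an
// OVRCameraRig with an OVRManager already exists in the scene.
public class PassthroughBootstrap : MonoBehaviour
{
    void Start()
    {
        // Equivalent of ticking "Enable Passthrough" on the OVRManager (assumed property name).
        if (OVRManager.instance != null)
            OVRManager.instance.isInsightPassthroughEnabled = true;

        // Passthrough shows through wherever the eye buffer is transparent.
        var layer = gameObject.AddComponent<OVRPassthroughLayer>();
        layer.overlayType = OVROverlay.OverlayType.Underlay; // shown as "Placement" in the inspector
    }
}
```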
I followed all of the steps but I can't get it to show in my headset when I press play. None of the tutorials I have followed allow my Quest 3 to play from within Unity. I am sure it's a setting or something I have done wrong. I did install the Meta Quest Developer Hub originally. Would this stop it from playing? I installed the Oculus app on my desktop. Both MQDH and Oculus app can see my Quest 3 via link cable. Just not displaying anything when I press play. Any ideas? I have confirmed my headset is in developer mode too.
SOLVED: I had to turn on Meta Quest Link in the Meta Quest Developer Hub. When I did this the home interface on my Quest 3 changed and it allowed me to hit play and see it all in action.
That package is already deprecated 2 months later, and the OVRPlayer object does not exist anymore in the SDK package they now provide. Bummer.
You’re the best
Thanks for being a subscriber for so long man it's cool to see you here :D
Do you guys know how to get the Quest to rescan the global mesh data? If I leave the room that I developed in, it loses the data and doesn't ask to rebuild it.
My Quest arrived 2 days ago, but I can't manage to pair the headset and the Oculus desktop app. They simply cannot see each other, either via cable or Air Link.
Having the same problem; if you find a solution please let me know. Thanks.
@@sarq4470 Unfortunately, no solution yet. I think my PC is simply not supported. I have an 11th-gen i5 with Iris Xe... probably that's the issue.
@@daviderapisarda3833 My PC is also not usable for Quest Link, so I have tried the Virtual Desktop app on the Quest. It can run the Unity app at low quality, but without hand tracking and passthrough; there is just a cube in darkness.
Your videos are the best of the best. You made me wait to buy the Quest 3 since March!
Now I've got it and I want to experiment with Unity, of course... What specs does your computer have? I'm a little concerned mine won't be able to do the job...
Does it work with the Quest 2?
How do I playtest the game? Whenever I hit Play I don't get to move my hands; it just changes the screen to the same thing as on the computer.
At 8:03, I was not getting passthrough until I lowered this black color's alpha channel (opacity). It was at its highest, so it was completely opaque. You can set it lower, or to 0, to make it transparent. In this example, he has it at a very low 5.
I'm still seeing an all-black background even though all the cameras I have are black with 0 alpha.
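If it helps, here is a small sketch of the same clear-color setup done from code instead of the color picker. It assumes `rigCamera` points at the camera on your rig's CenterEyeAnchor and that the passthrough layer is configured as an underlay; the real world then shows through wherever nothing opaque is rendered.

```csharp
using UnityEngine;

// Sketch of the tip above: the rig camera must clear to a solid color with a
// low/zero alpha, otherwise the opaque background hides the passthrough underlay.
public class TransparentClearColor : MonoBehaviour
{
    [SerializeField] private Camera rigCamera; // e.g. the camera on CenterEyeAnchor

    void Start()
    {
        rigCamera.clearFlags = CameraClearFlags.SolidColor;
        rigCamera.backgroundColor = new Color(0f, 0f, 0f, 0f); // black, fully transparent
    }
}
```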
I'm using a different package (Oculus XR Plugin) and don't see the Oculus thing.
Works nicely with Oculus Integration v57; with the new Meta All-in-One v59 I'm getting stuck. The Oculus folder that has things like InvisiblePlane (in part 2 of this series) is missing; in fact the whole Assets/Oculus folder only has 1 file in it. Any ideas?
Yeah, looks like they forgot to add the plane, which is annoying. I would stick with the Oculus Integration package for a bit before switching. :)
@@ValemTutorials Ah, thanks for confirming this. I've learned that anything new in software is probably broken! So why did I choose XR? Ha ha.
Do you think the changes they made are to make things super lightweight for developers? No examples, just all the tools you need to do the job, with very light repos?
Does anybody know if the Quest 3 is able to do image tracking? It would be awesome to implement that to position/align models with the real world.
Hey, another great video. I noticed in your build settings you didn't have to change the texture compression to ASTC (I think I spelled that right) and it's on ETC2. Is that standard now with the new versions of the OVR stuff?
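Side note in case anyone does want ASTC (the format the comment above is referring to): the Android texture compression subtarget can also be switched from an editor script rather than the Build Settings dropdown. A quick sketch, assuming the file lives in an Editor folder; the menu path is made up for illustration.

```csharp
using UnityEditor;

// Editor-only sketch: set the Android texture compression subtarget to ASTC,
// the same as changing "Texture Compression" in Build Settings.
public static class TextureCompressionMenu
{
    [MenuItem("Tools/Set Android Texture Compression To ASTC")] // hypothetical menu path
    private static void UseAstc()
    {
        EditorUserBuildSettings.androidBuildSubtarget = MobileTextureSubtarget.ASTC;
    }
}
```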
3:33 Excuse me Valem, the 'Oculus Integration' asset was deprecated. What should I do? I couldn't start this Unity tutorial T_T,,,
The latest version I see is 2022.3.4f1; there is no 2022.3.7?
Make a masterclass, we'd pay you, bro xD I LOVE you
Thanks for all your support.
How far out are we from being able to use MR for experiences in the real world??? I have been walking around Denver using passthrough for around a week and can't help but want a TRUE MR experience wherever I go!!!! Keep it up!
Thanks, but I was hoping you'd opt for OpenXR rather than Oculus integration since many of us also want to support other headsets.
I will talk about OpenXR integration after! :)
@@ValemTutorials Hey! Thank you for the amazing tutorials :) I was wondering if there are any updates on the OpenXR version of these tutorials? Very excited to switch from VR to AR and I want to dev for cross platform! Thank you :)
I followed your guide step by step, but at some point I must have done something wrong. I only see Quest 2 controllers with my Quest 3, and my hand tracking is working, but instead of tracked hands I again only see Quest 2 controllers... Or might it be because I use Virtual Desktop as the connection?
I had linked the Quest 3 to Unity via Quest Link, but the hand tracking didn't work. How am I supposed to solve this problem?
If possible, could you create a guide for interactions in AR? Pretty please :D The VR ones don't seem to apply.
I have a current Quest 2 project on 2021.3.2 LTS. I do have some issues getting the Quest 3 to work; would I need to update the engine to the 2022 LTS version? Hope you can help!
What issue did you have?