You need to create many more VR tutorials and mobile game tutorials. That's why people prefer to use Unity!! Please understand this, guys. We love you!! Please don't forget us!!
Unity is junk, full of bugs
Yes, what this guy says!
2nd that motion
3rd that motion
Same here, only Unity has tutorials… 😢
Epic really needs to release more tutorials for VR and AR. That's why most people still depend on Unity. Also, try to explain WHY you're setting up the Blueprints that way. Just connecting stuff and assuming that everyone understood doesn't make for a great tutorial.
Totally
How are you doing? Do you have any documents or anything similar? I need help with AR.
Same here, only Unity has tutorials… 😢
That's the problem with more than half of the tutorials about anything here on YouTube.
I was waiting for AR and VR tutorials from you guys! Please keep them coming. This has been my main focus for the last couple of months. I started with simple photo VR tours, moved to full CG 360 tours with Twinmotion, and now I want to learn the real deal with Unreal Engine. I had used UE before, when I imported my Twinmotion environments into it, but only for compositing through its multi-layer rendering capabilities. Now the next step is in front of me. Thank you #UnrealEngine team. #Greattimes
Same here, only Unity has tutorials… 😢
We need more tutorials like this. Really straightforward and clear. Please make more of them for AR.
Thanks ....
I LOVE the Unreal VR videos and I am SUPER grateful for them! Thank you :*:*
Victor's tutorial style is flawless. He's seen so many tutorials executed that he's able to eliminate all their weaknesses and make his own concise and smart.
This tutorial is gold. Thank you very much.
The video covers the creation of AR and VR projects using Unreal Engine's templates, including a desktop VR experience and a mobile AR experience. It demonstrates how to extend the templates' capabilities through custom functionality, such as using Control Rig to create contact shadows and leveraging ARKit's environment capture for realistic reflections.
Key moments:
00:10 The video discusses XR workflows using Unreal Engine, showcasing two projects based on Unreal Engine templates for VR and AR experiences, highlighting features like movement, grab system, hand menu, and spatialized audio in the VR template.
-OpenXR becoming production-ready in Unreal Engine 4.27 and its features like movement, grab system, and VR Spectator Blueprint.
-Customizing VR projects using Unreal Engine templates, adding Player Start capsule, aligning VRPawn, and adjusting navigation mesh for VR scenes.
-Creating custom Actors for VR, adding Static Mesh Component, Grab Component, and selecting objects for interaction in the VR environment.
07:23 The video demonstrates creating a VR interaction using Grab Component, Input Framework, and Blueprints. It showcases how to set up a variant remote with different actors and how to add a new option to a menu in the VR template.
-Setting up a VR interaction involves aligning Grab Component and simulating physics for a remote, demonstrating how to interact with objects in a virtual environment.
-Creating a variant remote with different actors and utilizing Blueprint logic to cycle through variants, highlighting the importance of Construction Script for easy switching.
-Adding a new option to a menu in the VR template by adjusting button components and logic in Blueprints, emphasizing the versatility of menus for in-game options and debugging.
13:59 To customize an AR project, ensure SDK license acceptance, name the project package, and understand the lighting model. Blueprint logic and UI elements are crucial for project flexibility and customization.
-Understanding the lighting model and Blueprint logic in AR projects is essential for customization and flexibility.
-Utilizing Blueprint for logic and UI elements allows for easy customization and extension of AR projects.
-Adding custom geometry to AR experiences involves adjusting static meshes and UI elements for a personalized touch.
20:05 The video demonstrates deploying an AR application on Android and iOS devices, showcasing placement, gestures, UI customization, and Control Rig for character animation with shadows in Unreal Engine.
-The AR template project allows scanning the environment to define placement, use gestures for object manipulation, customize UI, and take snapshots in the application.
-The advanced example involves animated characters with Control Rig for generating shadows, utilizing ARKit for refined reflection captures in an iOS application, and extending functionality with Blueprint code.
-Using Control Rig in Unreal Engine involves adding virtual bones for tracking feet and hips, setting up hierarchy, creating contact shadows, and manipulating bone positions and rotations for character animation.
26:12 The video demonstrates how to duplicate virtual bones, set their positions, and use Animation Blueprints to update the Control Rig at runtime for character animation. It also shows how to attach polygons to virtual bones to create shadows.
-Duplicating and setting positions for virtual bones is shown, followed by using Animation Blueprints to update Control Rig for character animation.
-Attaching polygons to virtual bones to create shadows is explained, along with the process of using Blueprint to achieve this effect.
32:16 Creating a Control Rig to generate shadow contact patches in AR involves adding virtual bones and planes, adjusting their positions, and integrating them into the BP_Placeable for a cohesive AR experience.
-Adding virtual bones and planes to create shadow contact patches in AR by adjusting their positions and integrating them into the BP_Placeable for a complete AR setup.
-Scaling down virtual bones and planes, duplicating components, and wiring them up to refine the AR shadow effects for a more realistic visual experience.
-Utilizing Child Actors and scaling down objects like Robot_A_Shadow to enhance the AR visual elements, positioning them strategically for an immersive AR environment.
38:24 The video demonstrates creating Blueprint code to refine the Environment Capture probe using ARKit, enhancing the scene with chrome robots and contact shadows, followed by a brief overview of iOS deployment for the project.
-Creating Blueprint logic to refine the Environment Capture probe using ARKit for a more immersive scene with chrome robots and contact shadows.
-Overview of iOS deployment requirements, including setting up signing certificates and identifiers, essential for compiling the project on Xcode.
44:28 The process involves creating a Project Identifier, adding devices, saving a profile, importing into Unreal, and managing certificates and keys for iOS development.
-The importance of matching the Bundle Identifier with the Project Identifier for iOS development to ensure validation and functionality.
-The significance of following the documentation provided to understand and successfully navigate through the development process.
Generated by sider.ai
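The 00:10 section above describes building a custom grabbable Actor out of a Static Mesh Component plus the template's Grab Component. For anyone who prefers C++, a minimal sketch of the static-mesh/physics half might look like the following; the class name and collision profile are just reasonable defaults, and the template's Grab Component is itself a Blueprint component, so it would still be added on a Blueprint subclass of this Actor.

// GrabbableProp.h -- hypothetical example, not part of the VR template itself
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/StaticMeshComponent.h"
#include "GrabbableProp.generated.h"

UCLASS()
class AGrabbableProp : public AActor
{
    GENERATED_BODY()

public:
    AGrabbableProp()
    {
        // Visible mesh that acts as the root of the prop.
        Mesh = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("Mesh"));
        RootComponent = Mesh;

        // Simulate physics so the prop drops and can be thrown once released.
        Mesh->SetSimulatePhysics(true);
        Mesh->SetCollisionProfileName(TEXT("PhysicsActor"));
    }

    // Exposed so a Blueprint subclass can assign the actual Static Mesh asset.
    UPROPERTY(VisibleAnywhere, BlueprintReadOnly, Category = "Grab")
    UStaticMeshComponent* Mesh = nullptr;
};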
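Likewise, the package/bundle naming steps at 13:59 and 44:28 end up in Config/DefaultEngine.ini (normally edited through Project Settings under Platforms > Android and Platforms > iOS). A sketch with placeholder identifiers; the iOS Bundle Identifier has to match the App ID in your provisioning profile, or signing validation fails.

[/Script/AndroidRuntimeSettings.AndroidRuntimeSettings]
PackageName=com.YourCompany.HandheldAR

[/Script/IOSRuntimeSettings.IOSRuntimeSettings]
BundleIdentifier=com.YourCompany.HandheldAR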
Amazing, it's so straightforward, I really enjoyed following this one!
Really cool stuff!
That was an amazing video and I feel like I can actually get started using the software. Thank you so much!
Awesome.
Thanks for sharing this video, this has cleared my so many doubts.
Thank you! 🙏
you deserve a lot more subs
Wow 🤩😍
I don't know if this has been covered, but it would be really cool to keep the hands from the old Unreal template. They were really handy.
Please also do architectural and structural AR/VR.
Yes, more please. I want to get away from Unity, and the lack of mobile-focused AR content / an AR Unreal community is really holding me back. The same applies to using Unreal for machine learning: right now I use Unity to generate synthetic training images to detect solar panel defects (clean tech)… Please hit the pedal on this.
Nice video. It would be nice to also see the Xcode side of things for AR (the full end-to-end process); the docs are a little dated.
I hear you, I was running out of time. The only part that tripped me up going to the Mac was getting the Bundle Identifier set.
Hope you get to drop us another video showing off some of the fun AR features in UE that I'm sure are worth diving into. Making AR in UE work nicely gets my vote over more videos, mind you; I'm sure you're super busy 👍
Amazing advice wow
can't wait for psvr2
Please make an AR multiplayer webinar. It's a pain in the private parts.
Thank you for this video. Does this work in UE5EA?
If you are watching these types of videos, please keep in mind that these guys are not developers or programmers; they are designers. The developers create the framework (i.e. Unreal Engine), then the designers arrange artifacts that were coded by the developers. So no, you are not watching a game coding tutorial; this is a game design tutorial.
Hmmm...yes, let's create a lesson of sorts for people to follow along with to gain a better understanding, but then not provide them any links to the assets or something similar to use. Very helpful!
Do you want them to model stuff for you?
Nice!
Nice Webinar! The environment capture thing on iOS doesn't work for me though. Metals stay black. I did the exact same thing he did at 37:00.
Did you find a solution?
Great tutorial! What is the scene on the roof? Where can I download it?
VERY COOL! I've been waiting for these tuts! Question: what about image targets, a Vuforia equivalent, and AR books??? =)
Have a play with AR Image Detection
@antinnit Only in Unity; I heard it was unusable in UE and haven't found any demonstrations of it... I would greatly like to be able to achieve it in UE.
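For what it's worth, UE's AugmentedReality module does expose image tracking through the generic geometry query; a hedged C++ sketch, assuming a running AR session whose UARSessionConfig lists the candidate images to detect:

// Hypothetical helper: logs the world position of every image the AR session
// is currently tracking. Requires the AugmentedReality module.
#include "CoreMinimal.h"
#include "ARBlueprintLibrary.h"
#include "ARTrackable.h"

static void LogTrackedImages()
{
    // GetAllGeometries returns every tracked geometry (planes, images, ...).
    for (UARTrackedGeometry* Geometry : UARBlueprintLibrary::GetAllGeometries())
    {
        if (const UARTrackedImage* Image = Cast<UARTrackedImage>(Geometry))
        {
            const FVector Location = Image->GetLocalToWorldTransform().GetLocation();
            UE_LOG(LogTemp, Log, TEXT("Tracked image at %s"), *Location.ToString());
        }
    }
}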
Thanks for this great webinar, but it seems to apply only to v4; under Unreal 5, assets are missing from the Content Drawer (e.g. "BP_startmenu", or the "world hit test" in BP_arpawn's Event Graph) and some things were different... I started directly with v5 and it's not so easy to find them :D
Not enough tutorials, I think I will go to Unity this time. 😅
When will you guys fix the Volumetric clouds issue in VR?! :(
I must confess that learning how to use the software is what keeps me procrastinating about starting to make stuff. It's scary, lol
Thanks. How do you use or activate the Xbox controller with the Rift? Any tutorials on that? Appreciate it, and keep rocking. Cheers.
The only thing I learnt by myself in the software is pressing Tab on the keyboard to bring up the channel rack.
Nice! Is it possible to have a character with ARKit blendshapes talking in the AR template?
My device is not letting me scan at 20:24...
HELP! When I press "Begin Scan" on my Galaxy S21 Ultra, nothing happens. Looking at the ARSessionStatus, the result says "FatalError", even though I've given the app permission to use the camera. Any fix for this problem?
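For anyone else hitting the FatalError above: on Android it is also worth confirming that Google Play Services for AR (ARCore) is installed and up to date, since the camera permission alone is not enough. A hedged C++ sketch that logs the session status plus its extra info (the same data the ARSessionStatus Blueprint node reports), assuming the AugmentedReality module:

// Hypothetical debugging helper -- call it after tapping "Begin Scan".
#include "CoreMinimal.h"
#include "ARBlueprintLibrary.h"
#include "ARTypes.h"

static void LogARSessionStatus()
{
    const FARSessionStatus SessionStatus = UARBlueprintLibrary::GetARSessionStatus();
    // Status is the enum shown in Blueprint (Running, FatalError, ...);
    // AdditionalInfo often carries the human-readable reason.
    UE_LOG(LogTemp, Warning, TEXT("AR session status %d: %s"),
        static_cast<int32>(SessionStatus.Status), *SessionStatus.AdditionalInfo);
}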
Anyone know how to get an image sequence to play once an object has spawned into the scene in AR?
When I try to test this template it gives me an error. I have followed all the steps and it can build. I haven't changed anything; everything is at the template defaults. Any help is appreciated.
Please teach us how to create AR videos... music videos in particular would be soooooo helpful for me.
I've redone the vid like 6 times ahahah. I get this when trying to run the Windows batch file: ".android\repositories.cfg could not be loaded." I am on a work computer; could this affect the cmd prompt? I didn't change the default install location and I have installed the SDK tools.
How do you make dynamic shadows like you used to in older videos?
Choose to keep the loop playing in any order; DJs will love this.
Would it be possible to put Metahumans into an AR Scene?
For the iOS developer certificates, do I need to get the paid developer account, which costs around $99?
Hi, is it possible to do a spectator view in a mobile AR project? Any reply would be very much appreciated.
Can you make a video on creating a metaverse-like world, please?
I was very excited to try this and got all the way to Launch. I then get the message: "SDK Verification failed. Would you like to attempt the Launch On anyway?", which I did, and unsurprisingly it did not work. I'm using UE 5.1. Seems like the Android SDK from 2 years ago may be a bit dated? Any suggestions? Thank you!
18:53 In my UE5 my devices are not listed. What am I missing?
Unreal engine forever
Is there any difference in the AR Template across Games / Automotive / Architecture?
It's the same; it just has different static meshes.
I hope UE will help us develop projects and will make tutorials for AR. So far only Unity is providing this! 😢😢😢
I followed the instructions (Setting Up Android SDK and NDK for Unreal) for setting up Android Studio (ver 4.0), left everything at default while installing, installed the command-line tools in the SDK Tools settings, restarted the PC, and then ran Epic's SetupAndroid.bat file... and Unreal says it can't find the sdkmanager.bat file.
Right-click the .bat file, choose Edit, and change:
Original:
set SDKMANAGER=%STUDIO_SDK_PATH%\tools\bin\sdkmanager.bat
Modified:
set SDKMANAGER=C:\Users\YOURNAME\AppData\Local\Android\Sdk\tools\bin\sdkmanager.bat
@MrPangahas Hey, thanks for this! Despite changing the bat as you suggested (changing the username to my own), it seems it is still unable to find sdkmanager.bat. It keeps asking: "Did you run Android Studio and install cmdline-tools after installing?" I did do this; in fact I installed a couple of different versions in case not all of them have the .bat. Any help with this would be enormous! Thanks!
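One likely culprit, for anyone stuck at the same point: recent Android SDK installs may not have the old tools\bin folder at all, and sdkmanager.bat instead lives at C:\Users\YOURNAME\AppData\Local\Android\Sdk\cmdline-tools\latest\bin\sdkmanager.bat. Pointing the SDKMANAGER variable at that path, after installing "Android SDK Command-line Tools (latest)" from Android Studio's SDK Manager, is worth trying.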
Can you provide any information on creating input devices for virtual environments with sensory output devices? Computer vision to human vision using CGI automation.
Can a Microsoft Surface run AR programs? I packaged an exe, but when I run it and tap on Scan Environment, nothing happens (camera access for apps is enabled on Windows).
Is there a place we can access the project files for the AR robot basketball template? I can't seem to find this anywhere.
In particular, I am interested how the translucent-looking shadow material was achieved in AR. In my experience with UE4, if I use a Translucent material in an AR/VR app, the material will not be shown unless I have another actor's pixels shown behind the translucent actor. I am developing an MR app for Oculus Quest, and I cannot use Translucent materials in MR-passthrough mode because they do not show up unless opaque actor pixels are shown behind them.
I see in the AR robot basketball template project, the shadows look translucent and render properly. Maybe this is because in this case, using iOS, Unreal has access to the pixels of the camera and properly composites them. In the Oculus MR-passthrough case, Unreal does not composite the passthrough camera pixels as this is handled by the Oculus OS. For the Oculus case, I bet Unreal just throws out any translucent pixels as an optimization since it does not know of any valid opaque pixels behind the translucent pixels. Additionally this causes problems for Oculus Mixed Reality Capture (MRC) as translucent actors do not show up in the MRC foreground layer since the MRC's SceneCapture2D removes any actors in the background of its render, making translucent actors invisible.
Does anyone know how to render translucent materials in UE4 regardless of there being valid pixels behind the actor? Maybe this requires modification of the Engine's rendering system?
It seems this is not an issue in Unity, so it is definitely possible.
Hi Timothy, have you come to any conclusions in the 9 months this since this was posted? I would very much like to know as well.
The attached components are linked with the bone. Even when I use only 2 vectors, X and Y are fine, but Z, which shouldn't be linked, goes wrong.
Can you please guide me on the minimum system requirements (PC, graphics card, and RAM) to run VR in Unreal Engine?
Do we agree that we need an Apple Developer account ($99/yr) just to test an AR experiment on a device?
It's actually possible to build to iOS without a Developer Account. The trick is to use the "auto signing" feature.
I'm having so many crashes working in VR. I follow all the videos and I still get crashes. Please help! Running 5.0.1. Could this be the issue? Is there any way to roll back to 5.0?
What is it about the Epic office that makes real objects like that coffee cup look like they are being rendered by Unreal?
Where would I start to place objects on a wall instead of on a ground plane?
Hi there - I'm trying to use my own static mesh: I import it (via Datasmith) into a new level, then bring the static mesh into the BP_Placeable, and on trying to compile I'm met with the message 'This blueprint (self) is not a StaticMeshComponent, therefore Target must have a connection'. I merged the actors when I imported them and it is one unified mesh in the BP_Placeable viewport. Not sure what I need to do to fix it, and there's very little about it online. Thanks for any help.
What I'm hoping to understand is how to bring in your own custom mesh, as at 18:00, but with Rhino models.
AR features don't work with Android SDK 30 as of yet... When can we expect support?
My phone's camera is not starting; it shows a tiled pattern instead. What should I do?
I have the same issue. Did you find a solution? I'm on 5.2.
This method of using Control Rig to attach shadows to the virtual bones works fine when you put an animation asset directly into the Control Rig and then into the Output Pose, but I can't get the Control Rig to function when I plug a State Machine into it and then into the Output Pose... Anyone able to help?
I'm having a problem: my Samsung Android phone is not showing in the Launch tool. I have successfully installed the SDK and accepted the terms, but my phone still isn't showing in the Launch tool. I have also enabled USB debugging. Please help.
Hello, I'm on UE 4.26 and I have this problem when launching the template on my Samsung A10:
LogPlayLevel: Error: ERROR: cmd.exe failed with args /c "C:\Users\Abdelhak\Documents\Unreal Projects\ARproject\Intermediate\Android\armv7\gradle\rungradle.bat" :app:assembleDebug
Hi, I'm a student preparing for a competition about VR development (make a scene that has interactive functions, characters, animation, and level design, and at the same time has a UI). The subject is learning English, and my idea is to grab letters from a box to combine them into a word on a screen. As a beginner I don't know how long it would take (I must finish it in 1 month). Are there any courses that could help me study this?
On iOS, plane detection isn't working (UE 5.3 AR template).
Thank you! Just what I needed. Does Quest2 OpenXR Runtime work with Mac?
Hi, I can get the template model running with no issue, but when I bring my own models in they don't stay anchored to a point. I can look left and right fine, but when I try to move in or out, the model stays at the same position. Does anyone know what might be causing this?
Nailed it: it turns out the model I was bringing in was too big. I scaled it down in Blender, then reimported it, and it runs great.
Hello all, can anyone help with a dark camera feed background when using the Handheld AR template on Android? How do I increase the exposure?
Is it possible to make web-based AR in Unreal Engine 5?
How do you change the static mesh and the colors of a texture in VR?
It's amazing, but it's not working in the new Unreal 5+ :(
I don't understand: here it's VR and AR, and AR is used on a mobile phone... but if I get a Quest 2 or 3, I've seen demos that do Mixed Reality. That's not the same, right? To do mixed reality I need to be in VR, and from there I can switch to the real world... Some explanation of this would be appreciated.
Is the source code for this project released anywhere?
It's a pity packaging is so hit and miss. Take a single project: one day it works flawlessly, the next it cannot preview in VR. Forget the AR side of things; so many errors...
Brother do you have any band
@UnrealEngine when I try to accept the SDK License it doesn't grey out?
I'm having that same problem as well 😒
Really, they do a tutorial and then vanish? The SDK License doesn't grey out...
Hey, I got this error when I try to open SetupAndroid.bat: "Unable to locate sdkmanager.bat. Did you run Android Studio and install cmdline-tools after installing?". Could anyone please help me?
Right-click the .bat file, choose Edit, and change:
Original:
set SDKMANAGER=%STUDIO_SDK_PATH%\tools\bin\sdkmanager.bat
Modified:
set SDKMANAGER=C:\Users\YOURNAME\AppData\Local\Android\Sdk\tools\bin\sdkmanager.bat
Why the project files are sometimes not made available is beyond me!!
How do you record a video in Unreal AR like this?
08:35 bye
11:44 Zero answers to most of the questions...
Victor's tutorial is on point! But this Autodesk guy is driving me crazy. 1: He doesn't explain the "why" of what he is doing at the start of each step; he just does random fancy stuff. 2: He is way over-complicating things with the virtual bones; just use Get Bone Location/Rotation by Name in the Actor Blueprint. 3: Please explain the "why" at the beginning before you start coding a Timer for the Skylight Capture instead of just using Event Tick for it, and don't code a Timer just because it is fancy.
Get Socket Rotation is an option too; it can take bone names.
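For reference, a minimal sketch of the by-name approach the two comments above suggest, assuming the placeable character Actor has a SkeletalMesh component member, a bone named foot_l, and a FootShadowPlane component for the blob shadow (all placeholder names, and ticking must be enabled on the Actor): follow the foot in X/Y every Tick and pin the shadow's height to the Actor's base so it stays on the detected plane.

// Hypothetical Tick on the placeable AR character actor.
#include "Components/SkeletalMeshComponent.h"
#include "Components/StaticMeshComponent.h"

void ARobotPlaceable::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    // Follow the foot bone in X/Y but clamp Z to the actor's base so the
    // shadow patch stays glued to the detected ground plane.
    const FVector Foot = SkeletalMesh->GetBoneLocation(TEXT("foot_l"));
    FVector ShadowLocation = Foot;
    ShadowLocation.Z = GetActorLocation().Z;
    FootShadowPlane->SetWorldLocation(ShadowLocation);

    // Only the yaw of the foot matters for a flat shadow patch.
    const FRotator FootRotation = SkeletalMesh->GetSocketRotation(TEXT("foot_l"));
    FootShadowPlane->SetWorldRotation(FRotator(0.f, FootRotation.Yaw, 0.f));
}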
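And on the Timer point: a hedged sketch of the pattern only, not the template's actual code, showing a Sky Light that is re-captured from a looping timer instead of Event Tick (the 0.5 s interval is an arbitrary choice).

// ARLightingRefresher.h -- hypothetical helper, not part of the AR template
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/SkyLightComponent.h"
#include "TimerManager.h"
#include "ARLightingRefresher.generated.h"

UCLASS()
class AARLightingRefresher : public AActor
{
    GENERATED_BODY()

public:
    AARLightingRefresher()
    {
        // Sky Light that captures the scene to drive image-based lighting.
        SkyLight = CreateDefaultSubobject<USkyLightComponent>(TEXT("SkyLight"));
        RootComponent = SkyLight;
        SkyLight->SourceType = SLS_CapturedScene;
    }

protected:
    virtual void BeginPlay() override
    {
        Super::BeginPlay();
        // Re-capture on a timer rather than every Tick; the capture is far too
        // expensive to run per frame.
        GetWorldTimerManager().SetTimer(
            RecaptureHandle, this, &AARLightingRefresher::RefreshCapture, 0.5f, true);
    }

    void RefreshCapture()
    {
        // Rebuild the captured cubemap so reflections track the current scene.
        SkyLight->RecaptureSky();
    }

private:
    UPROPERTY(VisibleAnywhere, Category = "Lighting")
    USkyLightComponent* SkyLight = nullptr;

    FTimerHandle RecaptureHandle;
};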
first
22:07
Create an AR game
How do you spell "false advertising"? Thank you UE for making me falsely believe, "Hey! This AR thing is totally cool and not so difficult. I will do that." HAH! This video is very well made and I'm a real UE fan, but after diligently following all the steps, AR did not work. Not at all. Zero. Honeymoon-with-a-corpse level of action. Fortunately I found @Nils Gallist and his Android / VR / AR Setup in Unreal Engine 5 / 4.27 video! Follow that 17-minute video down to the last mind-blowing detail (and the comments!) and then... if you are fortune-favored, UE AR on Android will work. And then you can wait a year until UE AR becomes more like a real feature and not an angry, hungry, rabid dog you have to train to stay and roll over while you wear a sirloin hat. Kind of like that.
not very helpful
- 10 secs
No need to waste time on VR; better to do something that's really needed, like full Linux support in UE5.