Ludic Worlds
Czechia
Joined 22 Feb 2021
Learn the Art & Science of Creating Virtual Reality Experiences
I am a professional VR developer with over 20 years' experience in games and interactive media. My passion for VR began back in 2013, when I acquired my first VR headset: an Oculus DK1. I remember loading up the Tuscany demo and being completely awestruck!
Since then, I have been working primarily in VR. A decade on, and my passion for VR has not abated.
Indeed, I’m keen to help others join in on the fun!
This has led me to create this channel, which features easy-to-understand tutorials on XR development (using the Unity game engine). Whether you're a beginner or a seasoned developer, my goal is to help you understand the fundamentals of XR development, and show you how to create amazing experiences for your users.
Get Started with ML-Agents in Unity - Part 1: Setup & Installation
In this tutorial, we walk through the setup process for Unity ML-Agents, covering both the Unity and Python components. You'll learn how to install the necessary tools, configure the Python environment, and run a basic ML-Agents training example within Unity.
📜 Doc containing the 'Commands' used in this video : github.com/LudicWorlds/ml-agents-course-assets/blob/main/conda-commands.md
→ Next Video • Coming Soon!
▬ Support My Work ▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
❤️ Support me on Patreon: www.patreon.com/ludicworlds
☕ Buy me a coffee: ko-fi.com/ludicworlds
Thank you for your support!
▬ Timestamps ▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
0:00 - Intro
1:22 - Setup Overview
2:07 - Install Visual C++ Redistributable
2:40 - Install CUDA Toolkit
5:14 - Install cuDNN
6:11 - Install Anaconda
7:08 - Anaconda PowerShell Prompt
8:05 - Unity & Python: Key Roles
9:03 - Set Up Virtual Environment
10:21 - Install Dependencies
11:57 - Download ML-Agents Source Code
12:57 - Install ML-Agents Python Package
13:57 - Open Unity Examples Project
15:18 - Import ML-Agents Unity Package
16:01 - Open Basic Example Scene
16:57 - Start Python Backend
18:18 - Play Basic Example in Unity
19:07 - Training Session Output
19:30 - Resume or Restart Training Session
20:09 - Outro
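Once the Python environment is working, ML-Agents training runs are driven by a YAML configuration file passed to `mlagents-learn`. As a rough, illustrative sketch only — the behaviour name 'Basic' matches the example scene, but these hyperparameter values are placeholders, not the ones used in the video:

```yaml
behaviors:
  Basic:
    trainer_type: ppo          # PPO is the default trainer in ML-Agents
    hyperparameters:
      batch_size: 32
      buffer_size: 256
      learning_rate: 3.0e-4
    network_settings:
      hidden_units: 20         # the Basic scene is tiny; a small network suffices
      num_layers: 1
    reward_signals:
      extrinsic:
        gamma: 0.9
        strength: 1.0
    max_steps: 500000
```

You would then start training with something like `mlagents-learn path/to/config.yaml --run-id=BasicRun`, where the config path and run ID are your own choices. See the linked 'Commands' doc for the exact commands used in the video.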
▬ Useful Links ▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
► ML-Agents Installation Instructions: unity-technologies.github.io/ml-agents/Installation/
► Microsoft Visual C++ Redistributable: learn.microsoft.com/en-us/cpp/windows/latest-supported-vc-redist
► CUDA Toolkit: developer.nvidia.com/cuda-toolkit
► CUDA Compatibility: developer.nvidia.com/cuda-gpus
► cuDNN Downloads: developer.nvidia.com/cudnn-downloads
► cuDNN → CUDA Toolkit Support Matrix: docs.nvidia.com/deeplearning/cudnn/latest/reference/support-matrix.html
► Miniconda Download: docs.anaconda.com/miniconda/
► ML-Agents GitHub: github.com/Unity-Technologies/ml-agents
► Example Learning Environments: github.com/Unity-Technologies/ml-agents/blob/develop/docs/Learning-Environment-Examples.md
▬ Credits ▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
► Music by - CO.AG Music: www.youtube.com/@co.agmusic
#Unity #AI #mlagents #reinforcementlearning
Views: 800
Videos
Offline Speech Recognition on Meta Quest: Testing Unity Sentis + Whisper AI
883 views · 2 months ago
In this video, I walk you through my Meta Quest project featuring offline speech recognition using Unity Sentis and OpenAI’s Whisper model. Learn how I integrated Whisper for local transcription and optimized the app for performance. 📀 Download my Unity project for FREE from here: www.patreon.com/posts/project-source-107788463 ← Previous Video • VR Meets AI: Introducing Unity Sentis: th-cam.com...
VR Meets AI: Introducing Unity Sentis
708 views · 2 months ago
In this video, we explore some examples of how AI is being applied in VR. We also take a look at Unity Sentis to see how it can help integrate AI into our VR Apps. → Next Video • Offline Speech Recognition on Meta Quest: Testing Unity Sentis Whisper AI: th-cam.com/video/mOosvi54we0/w-d-xo.html ▬ Support My Work ▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬ ❤️ Support me on Patreon: www.patreon.com/ludicworlds ☕ Buy me a cof...
Get 'AR Foundation Samples' Running on Meta Quest 3
1.5K views · 4 months ago
Learn how to get Unity's 'AR Foundation Samples' project running on your Meta Quest 3. This tutorial covers configuring build and project settings, then deploying the project to your Quest 3. ▬ Support My Work ▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬ ❤️ Support me on Patreon: www.patreon.com/ludicworlds ☕ Buy me a coffee: ko-fi.com/ludicworlds Thank you for your support! ▬ Timestamps ▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬ 0:00 - Intro 1:13...
How to make a Mixed Reality app for the Quest 3 - Part 6: Bounding Boxes
1.8K views · 5 months ago
This is the sixth part of a short tutorial series, in which you will learn how to create Mixed Reality experiences on the Meta Quest 3. We will be building our app in the Unity game engine, using Unity's native 'XR Interaction Toolkit'. In this video, we take a look at Bounding Boxes. 📀 Get the completed Unity Project here: www.patreon.com/posts/project-source-3-105477211 → Next Video • Coming ...
Upgrade to 'Unity OpenXR: Meta 2' - Part 2
1K views · 6 months ago
Version 2 of the 'Unity OpenXR: Meta' package has been released! This is the second part of a tutorial, in which I show you how to upgrade your Meta Quest project to this new version. ← Part 1: th-cam.com/video/N6A4bEzUf8g/w-d-xo.html 📀 Get the fully upgraded Unity Project here → www.patreon.com/posts/project-source-3-102940793 ▬ Support My Work ▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬ ❤️ Support me on Patreon: www.pat...
Upgrade to 'Unity OpenXR: Meta 2' - Part 1
1.8K views · 6 months ago
Version 2 of the 'Unity OpenXR: Meta' package has been released! In this tutorial I show you how to upgrade your Meta Quest project to this new version. This tutorial is divided into two parts: this part will guide you through updating the necessary packages and project settings. The second part will show you how to restore some missing functionality (due to changes in AR Foundation's API). → ...
How to make a Mixed Reality app for the Quest 3 - Part 5: Raycasts & Anchors
4K views · 7 months ago
This is the fifth part of a short tutorial series, in which you will learn how to create Mixed Reality experiences on the Meta Quest 3. We will be building our app in the Unity game engine, using Unity's native 'XR Interaction Toolkit'. In this video, we take a look at Raycasting and Spatial Anchors. → Next Video • Coming soon! ← Previous Video • How to make a Mixed Reality app for the Quest 3 ...
How to make a Mixed Reality app for the Quest 3 - Part 4: Using Planes
6K views · 8 months ago
This is the fourth part of a short tutorial series, in which you will learn how to create Mixed Reality experiences on the Meta Quest 3. We will be building our app in the Unity game engine, using Unity's native 'XR Interaction Toolkit'. In this video, I will show you how to spawn some interactive cubes on top of a 'Table' plane. We will set up a Touch Controller, so that you can grab the cubes...
How to make a Mixed Reality app for the Quest 3 - Part 3: Plane Visualization
7K views · 9 months ago
This is the third part of a short tutorial series, in which you will learn how to create Mixed Reality experiences on the Meta Quest 3. We will be building our app in the Unity game engine, using Unity's native 'XR Interaction Toolkit'. In this video, I will show you how to create your own custom 'Plane Visualization'. I'll also show you how to set up a Touch Controller button, so that you can ...
How to make a Mixed Reality app for the Quest 3 - Part 2: Plane Detection
12K views · 10 months ago
This is the second part of a short tutorial series, in which you will learn how to create Mixed Reality experiences on the Meta Quest 3. We will be building our app in the Unity game engine, using Unity's native 'XR Interaction Toolkit'. In this video, we will look into the Quest's 'Space Setup', and the 'plane' data that is generated from this room scan. We will begin to explore how to visuali...
How to make a Mixed Reality app for the Quest 3 - Part 1: Passthrough
40K views · 11 months ago
This is the first part of a short tutorial series, in which you will learn how to create Mixed Reality experiences on the Meta Quest 3. We will be building our app in the Unity game engine, using Unity's native 'XR Interaction Toolkit'. In this first video, I will show you how to set up a Unity project for Quest 3 Mixed Reality development. I will then show you how to activate 'Pass-through'. L...
Introduction to Unity's XR Interaction Toolkit - Part 6: XR Grab Interactable
3.2K views · 1 year ago
This is the sixth part of a short tutorial series, in which I cover the basics of using Unity's XR Interaction Toolkit (version 2). In this video, we will take a look at the 'XR Grab Interactable' component. We will go through its basic settings, and see what effect these have on how the Interactable responds to an Interactor. → Next Video • Coming soon! ← Previous Video • Introduction to Unity's ...
Create a GPT driven NPC using Inworld AI - Part 3: Unity VR Setup
3K views · 1 year ago
Create a GPT driven NPC using Inworld AI - Part 3: Unity VR Setup
Create a GPT driven NPC using Inworld AI - Part 2: Import Character into Unity
3.4K views · 1 year ago
Create a GPT driven NPC using Inworld AI - Part 2: Import Character into Unity
Create a GPT driven NPC using Inworld AI - Part 1: Character Creation
3.4K views · 1 year ago
Create a GPT driven NPC using Inworld AI - Part 1: Character Creation
Exploring the Fusion of Unity & Chat/GPT AIs
2.2K views · 1 year ago
Exploring the Fusion of Unity & Chat/GPT AIs
Create a Skybox using AI (and import into Unity)
21K views · 1 year ago
Create a Skybox using AI (and import into Unity)
Introduction to Unity's XR Interaction Toolkit - Part 5: XR Direct Interactor
3.9K views · 1 year ago
Introduction to Unity's XR Interaction Toolkit - Part 5: XR Direct Interactor
Cryptocurrency "Landscape" - Data Visualization in VR
426 views · 1 year ago
Cryptocurrency "Landscape" - Data Visualization in VR
XR Interaction Toolkit Version 2.3 Released (Unity)
3.1K views · 1 year ago
XR Interaction Toolkit Version 2.3 Released (Unity)
Set up Unity's new XR Hands Package - OpenXR based Hand Tracking
10K views · 1 year ago
Set up Unity's new XR Hands Package - OpenXR based Hand Tracking
Introduction to Unity's XR Interaction Toolkit - Part 4: Grabbing Objects
5K views · 1 year ago
Introduction to Unity's XR Interaction Toolkit - Part 4: Grabbing Objects
Introduction to Unity's XR Interaction Toolkit - Part 3: Action-based Input
6K views · 1 year ago
Introduction to Unity's XR Interaction Toolkit - Part 3: Action-based Input
Introduction to Unity's XR Interaction Toolkit - Part 2: XR Rig Setup
7K views · 2 years ago
Introduction to Unity's XR Interaction Toolkit - Part 2: XR Rig Setup
Introduction to Unity's XR Interaction Toolkit - Part 1: Import & Setup
11K views · 2 years ago
Introduction to Unity's XR Interaction Toolkit - Part 1: Import & Setup
This Outline Effect works great on Meta Quest 2! (Free Unity Asset)
4.5K views · 2 years ago
This Outline Effect works great on Meta Quest 2! (Free Unity Asset)
How to Optimize Unity Project Settings for Meta Quest 2
25K views · 2 years ago
How to Optimize Unity Project Settings for Meta Quest 2
How to use the Unity Asset Store (and get FREE Assets!)
5K views · 2 years ago
How to use the Unity Asset Store (and get FREE Assets!)
I can't install python-3.10.12, help?
thanks for sharing!
Great video! Extremely helpful to get started with ML-Agents! Hope more content regarding this topic is coming sooner rather than later :)
Better than most paid courses
I will be installing it after some time, hope it works 🙂🥲
I had been trying to install ML-Agents for 2 days, finally found a new video
Great tutorial. I have a question: can we read the bounding box coordinates of all the objects that were specified during scene setup?
Thank you! 🙏 I would expect all the items of furniture to have a bounding box at least - I have not tested every item however.
thank you for going in depth on this
Hi many thanks to the tutorial, can you also give a tutorial about the new meshing feature?
thx!!!!
Will the XR simulator work too?
Hi, good introduction video. I was thinking maybe you can create a tutorial on how to use Unity Sentis to create AI NPCs that generate dialogue based on a knowledge database? I saw a video for Inworld AI th-cam.com/video/gUCahPSAut4/w-d-xo.html but right now it doesn't exist in the asset store, so I was wondering if I can create an AI with similar abilities using Unity Sentis instead. What do you think about it? Also, is there a tutorial on how to create models for Unity Sentis?
Thanks for the suggestion! I've just checked the 'Inworld' website, and it seems they've changed their licensing - no more free tier! :( I do, in fact, plan to expand upon this tutorial: I would like to feed the results from 'Whisper' into a locally running LLM. It probably would not be feasible to run this on a standalone headset, but it may be OK for PC-based VR.
I followed the tutorial but my AR is not working. It only shows the solid background color. The AR Session is in the scene, along with the AR Camera Manager too
Great video and very much to the point. I want to use hand interaction instead of controllers. What do I need to do to enable it?
Does this work with the Pico 4?
I don't own a Pico 4, unfortunately, so I can't advise on what modifications would be needed to make this series compatible with the headset.
I'm curious, are the models that can be run on Unity Sentis limited?
AFAIK, the models you can run on Unity Sentis are somewhat limited. It's designed to work on-device within your Unity app, so it's best suited for smaller, optimized models. You need to consider the capabilities of your target platform. You can use models exported to the ONNX format, but more complex models might not run efficiently.
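To make the ONNX workflow concrete, here is a minimal sketch of loading a model with Sentis (based on the Sentis 1.x API; field and class names are illustrative, and the API may differ in newer versions):

```csharp
using UnityEngine;
using Unity.Sentis;

// Sketch only: load an ONNX model asset and create a worker to run it on-device.
public class SentisModelRunner : MonoBehaviour
{
    [SerializeField] ModelAsset modelAsset; // your .onnx file, imported into the project
    IWorker worker;

    void Start()
    {
        // Convert the serialized asset into a runtime model...
        Model runtimeModel = ModelLoader.Load(modelAsset);

        // ...then pick a backend suited to the target platform.
        // GPUCompute is a common choice; CPU may suit standalone headsets better.
        worker = WorkerFactory.CreateWorker(BackendType.GPUCompute, runtimeModel);
    }

    void OnDestroy() => worker?.Dispose(); // workers hold native resources
}
```

The backend choice matters on mobile chipsets like the Quest's: profile both CPU and GPU backends before committing to one.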
@@LudicWorlds I wanted to put in a TTS model specialized for Korean and wondered if it was possible! thank you!
thanks for the video
Next video: how to upgrade to version 7?
What about running the model in a background thread? Is that possible or is this bound to the main thread somehow?
Good point! However, I've not looked into that yet - this video represents a first step in getting Whisper working. I do intend to improve upon this (I'd also like it to detect and process audio without the need to press any buttons). Will add to my todo list!
this is amazing!
Thanks! :) working on a video explaining how this is achieved (I will also make the code available).
looking forward to seeing this
@@andru5054 I've made a video explaining how this works here: th-cam.com/video/mOosvi54we0/w-d-xo.html You can download the Unity project from here: www.patreon.com/posts/project-source-107788463
Another thing I would add to this would be using the Meta Quest Developer Hub to disable the proximity sensor so that build and run works correctly when you're not wearing it.
I take it that Vulkan is finally ready for prime time in Unity?
Yes, I've not encountered any issues with Vulkan for some time, and it tends to perform better than OpenGL ES.
Can't wait for your lessons about VR and AI. Thank you !
Great content, thanks for sharing.
Thanks for another great video tutorial! I executed all the steps in your video and managed to build and run the app on the first try. Only one issue... why in my app are the options Simple AR, Anchor, Plane Detection Mode and Bounding Boxes all greyed out in the menu? I can only select the three remaining buttons/options from the UI (Scene Capture, Display Refresh Rate, Normal Meshes)...
looking forward to those new videos
Me too
Hi, could you show us how to load a room into Unity so we can place virtual objects in a specific place? Or could you at least provide a source where we can find information about this? I want to be able to build an app for a specific environment.
bump this ^
You saved me man. Thanks for the videos
Your subscriber count doesn't reflect the quality and value you provide. Please keep going!
Part 6 please!
So many fiddly bits... you've saved me so much time. Thank you!
Thanks for the video. I'm conducting a virtual reality (VR) experiment with Unity 3D. In one scene, I need to move a person from a bed, lifting them with both hands from behind and carrying them to a nearby chair. What do you think is the best approach for handling this scene?
If it's important to have the player gradually move to the chair (rather than teleport), make sure you move the 'XR Origin' (and not the camera directly). I would keep the movement slow and steady to minimize discomfort to the player.
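A minimal sketch of that advice, assuming the rig and target are assigned in the Inspector (the class and field names here are hypothetical, not from the video):

```csharp
using UnityEngine;

// Gradually moves the whole XR Origin rig (not the camera directly)
// toward a target position, at a slow constant speed to reduce discomfort.
public class MoveToChair : MonoBehaviour
{
    [SerializeField] Transform xrOrigin;    // the 'XR Origin' rig root
    [SerializeField] Transform chairTarget; // empty GameObject placed at the chair
    [SerializeField] float speed = 0.5f;    // metres per second; keep this low

    void Update()
    {
        // MoveTowards gives constant-speed motion with no overshoot.
        xrOrigin.position = Vector3.MoveTowards(
            xrOrigin.position, chairTarget.position, speed * Time.deltaTime);
    }
}
```

Moving the rig rather than the camera matters because head tracking continuously overwrites the camera's local transform.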
@ludicworlds I have been following you on here for a while. Your tutorials are great. Keep it going!! I'm trying to have a virtual plane with a persistent anchor that will align my XR world to a fixed point in a real room, but the XR world always ends up somewhere different. Is there something I'm overlooking?
Thank you! Sorry for the delay in replying, I have been away for the last couple of weeks. Did you manage to solve this? I would expect that aligning your scene relative to a previously saved anchor should indeed work (I should probably do a tutorial specifically on the new Persistent Anchor system).
@@LudicWorlds That would be great. Keep going with the tutorials; your sub count will start rising as this tech rolls out to the public more, with better and more affordable devices. I'm still trying to get my head around Unity and all the different packages, for just Quest 3 or cross-platform. Something I would really appreciate is a tutorial on taking parts of demo/example scenes and using them in my own development. I've been constantly hit with errors, but I realise it just takes time to understand the Unity system.
@@LudicWorlds Have you considered doing a live stream, where you could go one-to-one with users? It may be useful to gauge what your viewers need help with, to get them over some technical hurdles. Perhaps you may only want to do this with Patreon subs?
Thx for this video, crystal clear, but I'm sad you did not explain what "schemes" are...
Thank you for the feedback. I think I may need to update this tutorial series soon, as so much has changed in XR Interaction Toolkit version 3.
Why do I not have a settings folder under my assets?
The 'Settings' folder in 'Assets' typically contains various project-specific settings files. In this project, these files define Graphics Settings (i.e., configuring the Render Pipeline). However, it could also contain Audio Settings, Build Settings, etc.
@@LudicWorlds Hi, thank you for your reply. Your video tutorials are so helpful! I still am a little confused because I do not have this settings folder within my assets like you show in your video, and because of that I am unable to apply the URP balanced renderer asset to the scriptable render pipeline setting like you show at 10:17. Let me know if you know why this could be happening and thank you again!
@@LudicWorlds same here. no settings in assets folder. any ideas ?
Hi, I see that you disable post processing completely, so there is no way to use post processing bloom effect on a quest 2 project?
The Quest 2 uses a tile-based rendering architecture, which makes full-screen render passes, such as those required for post-processing effects like "Bloom", quite expensive. While you can use post-processing effects if they are crucial for your app, it's important to be mindful of the potential performance impact.
@@LudicWorlds Ok thanks, is there any difference with the Quest 3?
The Quest 3's GPU also uses tile-based rendering, so you still need to keep that in mind.
Thank you for free asset!
Hello, many thanks for the great tutorials and this video. After updating to Unity 6 (6000.0.7) with the new AR Foundation package, in both the outcome of the tutorial and this sample project the content in mixed reality freezes after some seconds (it could be 5 or 30), while with the latest Unity LTS it does not. Has this happened to you as well? Thanks in advance!
After downgrading to 6000.0.5 it works again, in case someone encounters the same issue
Strange, I upgraded to Unity 6 (6000.0.7) but wasn't able to replicate this. Was it happening in a particular scene?
Installed perfectly. I did not have to do anything extra Newtonsoft-ish... Running 6000.0.0.5f with Hub 3.8
That's great! Looks like there was a problem with the project's manifest file. Unity has recently pushed a fix to the GitHub repo: "Update manifest.json to include new Newtonsoft package dependency". github.com/Unity-Technologies/arfoundation-samples/commit/23b6685bfa42523849b4c4aa45bdaf337fdb1137
Are Unity still trying to take devs' money?
I think yes
Not anymore
You explain so well. Love your videos.
Thank you, I really love your tutorials. However, in the new version of Inworld now, there is no longer the Inworld Player in the InworldController (I think changes are made to the PlayerController). Would you know how to reconfigure Inworld PlayerControllerRPM for the XR Origin Camera for the newer version? Thanks a million!
Thank you, I am glad you are enjoying my videos! :) It's been quite some time since I've looked at Inworld, so I don't know how to fix this in the current version. However, I've noticed that Inworld are now releasing 'Getting Started' tutorials on their TH-cam channel. Maybe this Unity series can help? th-cam.com/video/HRmTQF0CW7Q/w-d-xo.html
How can we make Mixed Reality toggleable? In case we want players to be able to go from MR to VR? What components have to be disabled?
Best, clearest explanation from scratch
Could I not build, and just play in Unity with the simulator?
THANKS BRO!!