A New Virtual Reality: A 5,000 sqft VR Experience for Four Seasons Lake Austin | Unreal Fest 2023

  • Published 21 Nov 2023
  • What does it take to appropriately represent a groundbreaking real estate project in virtual reality? In this talk, we’ll look at the challenges of communicating intricate, site-specific architecture, pushing the limits of visual quality in a fully ray-traced VR experience, and providing a multi-user wireless experience inside a 5,000 sq. ft. custom showroom.
    You’ll gain insights into the challenges faced and the ultimate solutions adopted to ensure an end user experience that aligns with the caliber of the project.
    We’re excited to bring you sessions from Unreal Fest 2023, available to watch on demand: • Unreal Fest '23 Playlist
  • Gaming

Comments • 44

  • @ibrews
    @ibrews 8 months ago +19

    thank you so much everyone for checking out our talk! this project has been a tremendous effort from many many people on many many teams. The R&D continues as we all work toward pushing the boundaries of realism in virtual reality. watch this space and cheers!
    -Alex from Agile Lens

  • @se7en28
    @se7en28 8 months ago +22

    Man, Alex is like the only dude pushing VR in Unreal. Good on him for pushing the bar

    • @ibrews
      @ibrews 8 months ago +5

      there are dozens of us. DOZENS!

    • @behrampatel4872
      @behrampatel4872 8 months ago +2

      @ibrews Dozens pushing VR? Sure. Helping raise the tide for all boats (educating and sharing knowledge at the bleeding edge)? Very few. Count them on 'one' finger ;).
      Cheers
      b

    • @ibrews
      @ibrews 8 months ago

      @behrampatel4872 🫶

    • @bolayer
      @bolayer 7 months ago +2

      @ibrews Dozens of us, I tell you 😂 Need to catch up soon, Alex

    • @ibrews
      @ibrews 7 months ago +1

      @bolayer yes yes!

  • @nathie
    @nathie 1 month ago +1

    I got to experience this in-person and it absolutely blew my mind.

    • @ibrews
      @ibrews 29 days ago

      thank you for coming Nathie!!

  • @behrampatel4872
    @behrampatel4872 8 months ago +12

    Alex is a very, very talented guy. Generous with his knowledge too. Cheers

    • @ibrews
      @ibrews 8 months ago +2

      You’re kind! There’s a very large team who helped make this happen and I’m just fortunate to be one of the messengers 🎉

  • @arprint3d
    @arprint3d 8 months ago +4

    How incredible it is when you see these great works and see every tiny detail they pay attention to in order to achieve something unique. Congratulations! It gives me a lot of motivation to keep advancing in VR

  • @slot9
    @slot9 7 months ago +3

    That was amazing!

  • @manuvikraman1611
    @manuvikraman1611 8 months ago +3

    Very informative, thanks.

  • @syno3608
    @syno3608 8 months ago +5

    This is pure gold. Thanks!!

  • @ali.3d
    @ali.3d 8 months ago +5

    Such a great video and presentation, thanks heaps to everyone involved 🙌🏽

  • @dyna4studio942
    @dyna4studio942 8 months ago +2

    Good. Let's push push push!

  • @Lupin0
    @Lupin0 8 months ago +3

    wow! awesome job!!

  • @Kor3Gaming_Ghost
    @Kor3Gaming_Ghost 8 months ago +2

    This guy is really talented, for real. I love VR and I cannot wait to push it

  • @zakaria20062
    @zakaria20062 5 months ago +2

    I'm not sure why Epic took away an important feature like HTML5 support. I think we need this feature back

  • @arealvisionvideos
    @arealvisionvideos 6 months ago +2

    Hi Jose, Alex and Neil, incredible work!
    I don't quite understand the part about the mirrors: can ray tracing and sphere reflection captures work at the same time?
    I thought that if you have ray-traced reflections activated you cannot see screen space reflections.
    Thank you very much and excuse my English.

    • @ibrews
      @ibrews 5 months ago +1

      Hi! That’s correct: the project ONLY uses ray-traced reflections but changes sample count and other levels of precision depending on context
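
      (A minimal sketch of what per-context quality switching like this could look like in Unreal C++. The console variables are the stock UE ray tracing ones; the "near mirror" trigger, the values, and the function name are illustrative assumptions, not taken from the talk.)

      // Hedged sketch: raise ray-traced reflection quality when the user is near a
      // mirror, drop it elsewhere. Cvar names are stock UE 4.27; the zone logic is invented.
      #include "Kismet/KismetSystemLibrary.h"

      void SetReflectionQuality(UObject* WorldContext, bool bNearMirror)
      {
          const int32 SamplesPerPixel = bNearMirror ? 4 : 1;
          UKismetSystemLibrary::ExecuteConsoleCommand(
              WorldContext,
              FString::Printf(TEXT("r.RayTracing.Reflections.SamplesPerPixel %d"), SamplesPerPixel));
          UKismetSystemLibrary::ExecuteConsoleCommand(
              WorldContext,
              FString::Printf(TEXT("r.RayTracing.Reflections.MaxBounces %d"), bNearMirror ? 2 : 1));
      }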

    • @arealvisionvideos
      @arealvisionvideos 5 months ago

      @@ibrews Thank you so much 👍

    • @arealvisionvideos
      @arealvisionvideos 5 months ago

      The highest value that I can apply in samples is 2 with my RTX 3080 mobile

    • @ibrews
      @ibrews 5 months ago +1

      @arealvisionvideos This runs on a desktop RTX 4090

  • @uiefuh17
    @uiefuh17 8 months ago +5

    🤞

  • @brettcameratraveler
    @brettcameratraveler 8 months ago +3

    Incredible effort and attention to detail. Optitrack is very accurate for mocap, but it's expensive and not an easily portable hardware solution. If you were to do it again with the Quest 2, would Meta's Shared Spatial Anchors have been good enough to be safe for multiple users in a shared space?
    How about in the case of critically aligned IRL objects with their digital twin counterparts? Repeatable?

    • @ibrews
      @ibrews 8 months ago +1

      Been doing real-world alignment with objects for years with Vive Trackers, even controllers sometimes (if the users don’t need them).
      Shared spatial anchors are still very bad and we could not rely on them. No improvement yet as far as we can see.
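
      (A minimal sketch of that prop-alignment idea, assuming a MotionControllerComponent driven by the Vive Tracker and a one-time measured calibration offset; the names AlignTwinToTracker and CalibrationOffset are illustrative, not from the project.)

      #include "GameFramework/Actor.h"
      #include "MotionControllerComponent.h"

      // Snap a digital-twin actor onto the pose reported by a physical tracker.
      // CalibrationOffset is the measured transform from the tracker mount to the prop origin.
      void AlignTwinToTracker(AActor* TwinActor,
                              UMotionControllerComponent* Tracker,
                              const FTransform& CalibrationOffset)
      {
          if (TwinActor && Tracker && Tracker->IsTracked())
          {
              TwinActor->SetActorTransform(CalibrationOffset * Tracker->GetComponentTransform());
          }
      }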

  • @haikeye1425
    @haikeye1425 8 months ago +4

    👍

  • @r.m8146
    @r.m8146 8 months ago +3

    This kind of problem would be so much easier to solve if people were leveraging Dynamic Foveated Rendering. It's such a shame that people don't recognize its potential; I hope the Vision Pro will change that.

    • @ibrews
      @ibrews 8 months ago +8

      Alex here! We did try to use dynamic foveated rendering using the eye tracking of the Meta Quest Pro but a) in 4.27 it requires the Oculus branch of Unreal and b) the latency was perceivable
      Much much better in 5.3 now!

    • @brettcameratraveler
      @brettcameratraveler 8 months ago

      ​@ibrews Roughly what percentage gain in performance did you see after you toggled DFR on?

    • @ibrews
      @ibrews 8 months ago

      @brettcameratraveler At best 20%? Wasn’t worth it for the artifacts

  • @juanipignatta414
    @juanipignatta414 8 months ago +1

    🤩

    • @ibrews
      @ibrews 8 months ago +1

      go Juani go!!

  • @anmolsandhu3619
    @anmolsandhu3619 3 months ago

    Do you think the tracking has improved with the new Meta Quest 3? I.e., is the special equipment you made for the Meta Quest Pro headsets no longer needed?
    Also, I've been working on a VR scene myself and I'm able to get absolutely amazing quality in PC VR, but I see these thin white lines. They're very faint, but if you really look you can see them, usually on furniture like sofas or beds. Maybe it's the texture that is causing that, or maybe I'm just missing a rendering setting?
    I would really appreciate it if you could tell me which settings you used in the rendering tab to get the highest quality possible.

    • @juanipignatta414
      @juanipignatta414 3 months ago +1

      Hello there! If you are using ray tracing and the white lines you are seeing are on the edges of the furniture, it is related to ray tracing and the max roughness setting. Editing the value in the post-process volume or the roughness in the material should fix it!
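
      (A hedged sketch of that fix, assuming the UE 4.x post-process field names; the helper and the value 1.0 are illustrative. The equivalent console variable is r.RayTracing.Reflections.MaxRoughness.)

      #include "Engine/PostProcessVolume.h"

      // Raise the ray-traced reflections roughness cutoff ("max roughness") on a
      // post process volume so rougher surfaces stay in the ray-traced path.
      void RaiseReflectionMaxRoughness(APostProcessVolume* Volume)
      {
          if (Volume)
          {
              Volume->Settings.bOverride_RayTracingReflectionsMaxRoughness = true;
              Volume->Settings.RayTracingReflectionsMaxRoughness = 1.0f; // 0..1
          }
      }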

    • @ibrews
      @ibrews 2 months ago

      @juanipignatta414 Seconding Juani! :D

  • @JBBost
    @JBBost 6 months ago

    I like how they can't sell rich people houses unless every single one of the houses in that ugly clump of development has the same paper-thin veneer of reality as the people trying to buy property there.

  • @mugabsyll5155
    @mugabsyll5155 2 months ago

    Lol, you guys are not the first. I've built a capital just for VR. I'm one developer with no billions

  • @Danuxsy
    @Danuxsy 8 months ago +1

    Couldn't you use neural nets to add detail to the images without using more polygons? In fact, why use polygons and traditional rendering at all? In theory a neurally driven image could have infinite resolution: the closer you get to something, the more detail is seen as the neural net keeps generating what should be there.

    • @roquecepeda2932
      @roquecepeda2932 8 months ago

      Probably, but that project started about 2 years ago, when the tech was pretty much unknown in the archviz industry. And you would probably need lots and lots of renderings of the original scenes.

    • @junkaccount7449
      @junkaccount7449 8 months ago +1

      Nanite works great, and NeRFs aren’t mature enough for production applications like this. The super-resolution model you’re describing doesn’t exist yet for rendering real-time 3D scenes... it might in 5-10 years, but right now it’s sci-fi lol

    • @joseuribe7415
      @joseuribe7415 8 months ago +3

      Good suggestion. We did explore it at the time, but we could not get the results we wanted to achieve. Remember that in VR with full 6DOF, we needed to make sure that even if you get super close to any object, you would still get a very realistic view. Neural nets are promising and we will continue our R&D on them, but they were just not the right fit for this project.