Radiance Cascades Rendered Directly

  • Published Aug 21, 2024
  • In this video we explore data stored in radiance cascades by observing it directly. This is equivalent to precalculating a scene, storing a cross-section of its radiance field and then rendering it from any viewpoint and any angle in O(1).

Comments • 211

  • @Alexander_Sannikov • 1 month ago • +113

    How did y'all find this video? I'm seeing a lot of Half-Life-themed visitors coming from somewhere, and I've no idea where from.

    • @porttek0oficial • 1 month ago • +47

      This was on my recommended page and had an interesting thumbnail. I watched it until the end and I still don't know what I am looking at.

    • @Alexander_Sannikov • 1 month ago • +14

      @@porttek0oficial sorry for that..

    • @valshaped • 1 month ago • +27

      Don't be sorry for it! This was a wonderful demo, and you don't need to be a graphics programmer to find it interesting, fun, enjoyable, and visually appealing!

    • @Deathbynature89 • 1 month ago • +39

      Radiance Cascade sounds like Resonance Cascade; combined with a cool thumbnail, it is intriguing.
      "What is a Radiance Cascade? I wanna know."
      This was recommended to me on my homepage. I have watched videos on radiance fields and tutorials on Gaussian Splatting. The thumbnail also looks like something that could be rendered in Source 2.

    • @SuboptimalEng • 1 month ago • +25

      Probably getting recommended because of SimonDev’s recent video on Radiance Cascades heh.

  • @OtterCynical • 1 month ago • +522

    Gordon doesn't need to hear all of this, he's a highly trained professional.

    • @minecraftermad • 1 month ago • +15

      No no that's resonance cascade.

    • @Etka06 • 1 month ago • +1

      @@minecraftermad same thing

    • @Gelatinocyte2 • 1 month ago • +8

      Gordon doesn't need to *see* all of this, he's a highly trained *game dev.*

  • @astr0_th3_man84 • 1 month ago • +400

    My dumbass thinking this was a half life video

    • @Lexie_T • 1 month ago • +2

      There's no chance of a half life video.

    • @elpanatv2537 • 1 month ago • +15

      resonance cascade🔥🔥🔥🔥🔥🔥🗣🗣🗣🗣

    • @noisetide • 1 month ago • +4

      @@elpanatv2537 It's time to choose Mr. Freeman...

    • @makeandbreakgames1791 • 1 month ago • +1

      same lol i thought it was a high quality render or something

    • @freakmoler5563 • 1 month ago

      It was the weird thumbnail that tricked me

  • @K4leidos • 1 month ago • +158

    Never change your mic. It's somehow almost perfect

    • @atlas_19 • 1 month ago • +16

      It's like a voice recording/voice report from games and movies.

    • @liegon • 1 month ago • +4

      It has a lofi feel, like a cassette recording.

    • @antonio_carvalho • 1 month ago • +3

      The slightly nasal voice and matter-of-fact intonation also contribute to the effect

    • @Jam_Axo • 17 days ago

      it's a mad scientist mic in the making

  • @Klayperson • 1 month ago • +33

    this is a sickass way to render cosmic shadow people

  • @DilettanteProjects • 1 month ago • +147

    I never thought I'd live to see a radiance cascade

    • @Kavukamari • 1 month ago • +35

      let alone create one...

    • @Masonova1 • 1 month ago

      ☝️🤓 actually it was a resonance cascade

    • @DilettanteProjects • 1 month ago • +3

      @@Masonova1 I like to do this thing sometimes where I notice that one word sounds a bit like another and then I make a joke out of that

    • @Synthonym • 13 days ago

      @@Masonova1 whoooosh

  • @Alexander_Sannikov • 1 year ago • +212

    One thing that I forgot to mention in the video is the, um, sparkles? These are path tracing fireflies that made their way into the radiance fields -- they go away the more time you give path tracing to converge, but I did not bother waiting more than half an hour, and I thought they looked kind of cool anyway. They don't exist when using this data structure for calculating actual global illumination, because that needs much lower resolution to be resolved and so it converges much faster.

    • @johnsherfey3675 • 1 month ago

      Would denoising help?

    • @Mr.LeoNov • 1 month ago • +1

      @@johnsherfey3675 I don't think so, given how little resolution there is

    • @ThePrimaFacie • 1 month ago • +2

      You should pin this as the top comment

    • @monx • 1 month ago • +5

      thanks. the video is missing a brief assurance that this "ghost skull" asset is presented exactly as it would appear in a game. It is not a human head in debug mode.

    • @draco6349 • 1 month ago • +2

      it's a fascinating effect, actually- a volume that sparkles isn't something i've seen much before. i wonder if it has any actual use.

  • @valshaped • 1 month ago • +45

    The cascades look like the raw output from a light field camera! Very cool!

    • @Alexander_Sannikov • 1 month ago • +21

      There's only so many ways you can encode a light field..

    • @johndawson6057 • 1 month ago • +2

      @@Alexander_Sannikov so it is used in light field cameras then?

  • @longjohn7992 • 1 month ago • +29

    “Carmack doesn’t need to hear all this he’s a highly trained professional”

  • @Vulcorio • 1 month ago • +36

    No idea what I've just listened to but the imagery is very fascinating.

  • @ThePrimaFacie • 1 month ago • +21

    My brain, watching this and hearing how it's done, is doing the sparkly bits of the model. Thanks for the vid

  • @euphemia3092 • 1 year ago • +23

    Absolutely blown away by all of your work. Thank you for sharing!!

  • @footspring94 • 1 month ago • +3

    like a digital hologram. crazy that this has been doable for such a long time and only now has been found. And just by someone on yt.

  • @4.0.4 • 1 month ago • +15

    As Wave Function Collapse terrain generation has proven, cool names inspired by physics are generally better for game development.

    • @fritt_wastaken • 1 month ago • +2

      But wave function collapse is an abysmally terrible name. It has nothing to do with waves or functions. It gives a wrong impression of both the essence and the complexity of the algorithm

  • @bebroid420 • 1 year ago • +11

    2 years ago I was experimenting with directional lightmaps, trying to achieve both diffuse and sharp specular lighting. I'd been messing with "plenoptic textures" that look very similar to this demo. It's really interesting how one can come up with a similar concept while trying to achieve a different goal. The whole idea with the cascades and the usage of this technique to calculate screen space global illumination... Just wow!

  • @codymitchell4114 • 23 days ago

    Absolutely gorgeous rendering

  • @aburak621 • 1 year ago • +11

    Thanks Alexander for your ExileCon presentation! It was a joy to watch and learn.

  • @melody3741 • 23 days ago

    This is the first time I have ever comprehended these, because everyone else basically just called it a black box. Thank you!!!

  • @jeremyashcraft2053 • 11 days ago

    Really clever stuff! I know PoE 2 will be shipping with this lighting technique, and I can't wait to see it in action!

  • @CharlesVanNoland • 1 month ago • +4

    This is very reminiscent of lightfield rendering (originally "image based rendering" 20 years ago) of the sort that OTOY and Lytro were working on a decade ago, except here you have multiple resolutions for multiple depths? I'll have to look at your paper to understand the cascade aspect.

    • @Alexander_Sannikov • 1 month ago • +2

      each cascade is encoded in a way that's similar to the good ol' image based rendering. but the most powerful property of RC is how information is distributed across multiple cascades.

  • @perkele1989 • 20 days ago

    The halflife refs are clearly due to your phenomenal HL voice and production quality

  • @Vegan_Kebab_In_My_Hand • 1 month ago • +2

    Nice video, and great explanation of the cross-sections and their relationship to the spatial and angular resolutions!

  • @DabidarZ • 1 month ago • +3

    im so confuseddddd BUT THAT LOOKS cool and i love seeing new stuff!!!

  • @TheNSJaws • 1 year ago • +36

    would you mind doing more of these? Not just for Cascade rendering, but in general.
    I quite appreciate your PoE presentations, and every time I rewatch them, I wish they gave you more time.

  • @footspring94 • 1 month ago • +1

    just think how this could replace individual props like tables in corners or show massive events like large cut scenes or animated backgrounds. I also think it will work very well as a way to manage surface texturing.

  • @f7029 • 1 month ago

    This is amazing!!!!! Really excited to see what comes out of this

  • @Merthalophor • 7 days ago

    What I gather:
    We're "storing", in some way, what the 3d model looks like when viewed through a plane from any angle, without keeping the 3d model. The most obvious way to do that is to store, for each pixel of the plane, what the pixel would look like when viewed from every angle. We could store, say, 200 different angles, which would mean that we have to store 200 different colors for _every_ pixel. Then, when rendering, we could check at what angle we're looking at a pixel, then linearly interpolate between the colors associated with similar angles.
    What this paper shows is that this is not necessary, and we can make the image still look decent while storing much less information. The key idea is that while we want to store a _few_ angles for _every_ pixel, we only need a _few_ pixels that store a _lot_ of angles. So for example, we could store 4 angles for _every_ pixel, then have a separate map that stores 8 angles for every 16th pixel, then another map that stores 16 angles but only for every 36th pixel, and so on. By cleverly interpolating this information, we get a really life-like image while only storing a fraction of the information (and, conversely, while only having to _calculate_ a fraction of the information, if running in real time).

    • @Alexander_Sannikov • 7 days ago

      the really important part is that the tradeoff of spatial:angular density is only possible for a given distance range. that's why RC stores radiance intervals, because they capture light coming from a certain distance range.

    • @Merthalophor • 4 days ago

      @@Alexander_Sannikov Distance... from the object being rendered? If you moved away from the plane the quality would decrease?
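
To make the spatial/angular tradeoff described in @Merthalophor's comment above concrete, here is a small worked example in Python. The resolution, the direction counts, and the per-cascade scaling used here (a quarter of the probes, four times the directions per level) are illustrative assumptions for this sketch, not the exact scheme from the video or the paper:

```python
# Compare brute-force "many directions at every pixel" storage against a few
# cascades that trade spatial density for angular density level by level.

WIDTH, HEIGHT = 1024, 1024      # assumed resolution of the cross-section plane
BRUTE_FORCE_ANGLES = 200        # "store 200 different angles for every pixel"

brute_force = WIDTH * HEIGHT * BRUTE_FORCE_ANGLES

cascade_total = 0
probes_x, probes_y, angles = WIDTH, HEIGHT, 4   # cascade 0: dense probes, few angles
for level in range(4):
    samples = probes_x * probes_y * angles
    print(f"cascade {level}: {probes_x}x{probes_y} probes, {angles} directions -> {samples:,} samples")
    cascade_total += samples
    # next cascade: half the probes per axis, four times the directions
    probes_x, probes_y, angles = probes_x // 2, probes_y // 2, angles * 4

print(f"all cascades: {cascade_total:,} samples")
print(f"brute force : {brute_force:,} samples")
```

With this scaling every cascade costs about the same, so total storage grows with the number of cascades rather than with the full angular resolution.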

  • @TheEagleFace • 1 month ago • +3

    babe wake up they just dropped virtual holograms

  • @LukeGeaney • 1 month ago

    Subbed - found your channel via the SimonDev video on radiance cascades then your ExileCon video :)

  • @chris-hayes • 1 month ago

    How.. illuminating. Very cool.

  • @user-mx7mc7sv2q • 1 month ago

    It came from recommendations and I have no idea what this is. But it's damn cool! Now I'll go deeper down that rabbit hole to find out what it is and how you managed to make it work

  • @photometricman • 9 months ago • +4

    A lot of similarities in the atlas to lightfield photography. Highly interesting work. Thank you!

    • @Alexander_Sannikov • 9 months ago • +2

      it is capturing the same data, so yeah

    • @Tr33fiddy • 7 months ago • +1

      Good thought. Like the Lytro light field camera.

  • @LoraHannike • 1 month ago

    digital hologram, finally
    great work! keep it up

    • @Alexander_Sannikov • 1 month ago • +1

      i have a video on actual simulated holograms. check my community page to see what it looks like.

    • @LoraHannike • 1 month ago

      @@Alexander_Sannikov I will

  • @Visigoth_ • 22 days ago

    Cool video (Game Dev)... YT sees my interest in "digital volumes" and recommended this.

  • @DJDDstrich • 1 year ago • +11

    Hey Alex thanks for the video - Really cool to see a little more insight into this Radiance Cascades tech you've developed! Must be fun to play around with various little projects like this now that the tech has proven itself! How are you going with potentially open sourcing some form of it in the near future? Is it still a possibility? It seems like the only viable GI option for real-time procedural games like ARPGs or sim/management games, and, well I want to get my greedy little mitts on it! :D

    • @Alexander_Sannikov • 1 year ago • +26

      I'm close to publishing an article first, then I'm going to open-source it.

    • @DJDDstrich • 1 year ago • +4

      @@Alexander_Sannikov amazing mate! I'm super excited and hope you get the recognition from it! Where might the article land and how do I get notified?

    • @Alexander_Sannikov • 1 year ago • +12

      @@DJDDstrich I'm probably going to make a short announcement video on this channel with a link to something like google drive pdf before sending it to a journal.

  • @blakes8901 • 1 month ago

    this genuinely might change everything. good luck. I hope this actually pays off for you monetarily.

  • @pravkdey • 1 month ago

    Looks pretty! If it's too memory intensive i can imagine it being used sparingly, like for smoke grenades or effects on weapons

  • @UncoveredTruths • 1 year ago • +1

    such a cool technique

  • @Deathend • 1 month ago

    Totally a coincidence that it looks like an interdimensional ghost celebrating Cinco de Mayo.

  • @sublucid • 1 year ago • +1

    Super cool! Show us what this same model looks like illuminated by the final technique!

  • @chapterone1162 • 1 month ago

    Really cool stuff man good work

  • @ellescer • 1 month ago • +1

    no idea what you're talking about or what any of this means, but it looks wicked cool dude

  • @satibel • 11 months ago • +3

    have you tried that with a billboard explosion? it may allow high speed rendering with some semi-3d effect.

  • @pravkdey • 1 month ago

    The title sounds like a sleeper agent phrase haha

  • @kabu8341 • 1 month ago

    If you ever happen to have enough free time on your hands, I would love to have a video explanation of your radiance cascades for people who don't know much about graphics programming :D SimonDev's video was a good start for it tho, now I understand the things you do a bit more. Very fascinating. I still can't really imagine what those probes look like. As far as I understand now, it sounds like you have those probes with different properties in the whole... err.. view? volume? of the camera, and they are scanning everything and sending the averaged data to the camera, and this is what you see in the end?

  • @manapotion1594 • 1 year ago • +3

    Cool research. So, does that mean the angle of view is quite limited? Does the quality degrade with the "depth" of the object from the camera?

    • @Alexander_Sannikov • 1 year ago

      the angle is not limited, but the quality does degrade with depth with this encoding

  • @firstclass3223 • 1 year ago

    Awesome stuff, well explained.

  • @iestynne • 1 month ago

    Thanks for showing the atlas view.
    I'm curious why you chose parabolic encoding?

  • @seccentral • 1 month ago

    the profiler is legit. 👍

  • @JamesAidanKing • 21 days ago

    yea i don't know what any of this is but it looks cool

  • @skicreature • 28 days ago

    Your mention of the accuracy of penumbras makes me wonder if this same technique could be adapted for radiation dose calculation in radiation therapy. Basically we do a bunch of complex ray tracing or kernel convolutions to perform dose calculations in radiation therapy and calculate the interactions based on fluence maps from particular angles. For us, however, reflectivity isn't so important. Mostly we just have attenuation (a simple calculation) and a couple of different types of scattering interactions (much more complex, requiring Monte Carlo methods). However, our accuracy in dose calculation tends to decrease at the edge of beams (at the penumbra), as the calculation becomes most difficult there, with scattering interactions beginning to dominate over the attenuation interactions.

    • @Alexander_Sannikov • 28 days ago

      @@skicreature people are already applying RC to a bunch of non-light radiative transfer processes. any process where energy is propagated in rays is suitable.

  • @attashemk8985 • 1 year ago

    Looks cool, it will be interesting to see if NeRFs could be baked into this cascade technique

  • @grimtin10 • 1 month ago

    yo this is really cool

  • @RPG_Guy-fx8ns • 1 month ago

    so basically you bake point clouds to an atlas of cube maps and use it to render imposter pixels

  • @Jianju69 • 1 month ago

    Interesting effect, somewhat like glitter immersed in plastic. How much memory does this demo use? Seems too resource intensive for complex real-time scenes.

  • @simonl1938 • 1 month ago

    this is so cool

  • @thewaysh • 1 year ago • +10

    Great video, thank you. Please upgrade your microphone!

    • @iamvfx • 1 year ago • +9

      I like it, it sounds like a recording from 1960s 😁

    • @Alexander_Sannikov • 1 year ago • +16

      Sorry for that. That's why I usually don't voice my videos. I should get an actual mic, but I normally much prefer to program something than to waste time setting it up.
      UPD: ok, anyway, ordered a mic. Again, sorry for the quality on this one.

  • @harry1010 • 1 month ago

    Never seen radiance fields rendered so fast - on the CPU, too! Quick q - do you have angular resolution inconsistencies at different layers due to how the resolution changes at each depth layer?

  • @perialis2970 • 1 month ago

    cool, this is a super cool video (not understanding one thing)

  • @addmix • 1 month ago

    Sparkles

  • @eucenor4171 • 1 month ago

    0:07 global elimination technique 💀

  • @nates9778 • 1 year ago

    This is cool!

  • @iamvfx • 1 year ago

    Cool stuff! Needs more high quality examples

  • @AtomicBl453 • 1 month ago

    i'm curious, would this help with more realistic sun sparkles on a body of water?

  • @PanzerschrekCN • 1 year ago • +1

    Looks great!
    But it is hard to understand how it works.
    As I understand it, you just render the scene into a tiny cubemap for each point on a regular grid. You have 4 variants of this grid (cascades), from less spatial detail to more: the less detailed grid contains more detailed cubemaps, and the more detailed grid contains less detailed cubemaps. It is still unclear to me how fetching and mixing from these arrays of cubemaps works.

    • @Alexander_Sannikov • 1 year ago • +4

      That's right, the blending part is kind of hard to explain in the video, which is why I'm writing an article with a proper explanation. However, the idea is actually very simple: each cubemap has an alpha channel, and since each cascade encodes its own depth range, the cascades are always sorted front-to-back, so you just blend them using their alpha channels.

    • @JannikVogel • 1 year ago

      @@Alexander_Sannikov How do you choose these depth ranges? Are they manually chosen or do they "emerge" from the data (due to information not being representable at some angular resolution for example)?

    • @Alexander_Sannikov • 1 year ago

      @@JannikVogel the discretization scheme is completely scene-independent. That means the exact same radiance cascades can be used to capture the radiance of an arbitrary scene. In the paper I explain in great detail why depth ranges need to increase exponentially for subsequent cascades. If you're on the "graphics programming" discord, you can have a look at the draft of my paper that people are reviewing right now.
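
A minimal sketch of the merge step described in @Alexander_Sannikov's first reply above: each cascade covers its own depth range, so the per-pixel samples arrive sorted front-to-back and are composited using their alpha channels. This is illustrative Python written against that description, not code from the demo; the per-cascade lookups are replaced by hard-coded values.

```python
def blend_cascades_front_to_back(samples):
    """samples: list of ((r, g, b), alpha) tuples, nearest depth range first."""
    out_rgb = [0.0, 0.0, 0.0]
    transmittance = 1.0                    # how much of the farther cascades is still visible
    for (r, g, b), alpha in samples:
        out_rgb[0] += transmittance * alpha * r
        out_rgb[1] += transmittance * alpha * g
        out_rgb[2] += transmittance * alpha * b
        transmittance *= (1.0 - alpha)     # nearer content occludes farther cascades
    return tuple(out_rgb)

# Example: a mostly opaque near cascade hides most of the farther ones.
print(blend_cascades_front_to_back([
    ((1.0, 0.2, 0.2), 0.8),   # cascade 0: nearest depth range
    ((0.2, 1.0, 0.2), 0.5),   # cascade 1
    ((0.2, 0.2, 1.0), 0.3),   # cascade 2: farthest depth range
]))
```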

  • @lemonjumpsofficial • 1 month ago

    IT'S A HOLOGRAM!!!!!

  • @gaia35 • 1 month ago

    This is why computers were built.

  • @TavishMcEwen • 1 month ago

    so cool!!

  • @rapideye101 • 1 month ago • +1

    how does the sampling work then with the cascades? can you also make a video on that? or is there a paper?

  • @Lucsueus • 1 month ago

    Saw this video on my homepage recommendations, no relation to Half-life at all (haven't looked at source/half-life content in forever).
    To be honest I have no clue what this even means, but I'm very intrigued and I often enjoy watching people passionately explain something that to me is very niche.
    Thank you for sharing!

  • @simplegamer6660 • 1 month ago

    Damn, what mic are You using? It sounds rad af. I seriously wanna know
    P.S. Why do i have a feeling that i'm close to being the only one who came here *not* because i thought the video is somehow involved with half-life?:D

    • @Alexander_Sannikov • 1 month ago • +1

      Boys, looks like we have a non-half-life person here. I repeat, a person who didn't come to joke about resonance cascades.

  • @AlexDicy • 1 month ago

    How do you style ImGui like that? Is there a theme or did you manually change backgrounds and border radius?

    • @Alexander_Sannikov • 1 month ago • +2

      it's open source you can check. legitengine by raikiri

    • @AlexDicy • 1 month ago

      @@Alexander_Sannikov Thanks!

  • @anipodat394 • 1 month ago

    Half-Life + Hollow Knight = Radiance Cascade

  • @oBdurate • 1 month ago

    Bro sounds like Posy

  • @Terszel • 1 month ago

    yeah

  • @oldlifeorig5028 • 1 month ago

    Does it mean that we can look into 3d models and see them more realistically from inside? Not like backfaces, but the volume of its mesh, idk

  • @Skythedragon • 1 year ago

    So from what I understand from this and the presentation, you take an array of textures, and for each pixel in that texture, look at its 8 neighbors, and do some short-distance (1px) raytracing, and repeat that for the number of cascades you have?
    Then to get the final value, you read the pixels in the final cascade for the direction you want to trace in?
    Wouldn't this make it O(n) for the number of cascades instead of having some fixed upper bound?

    • @Alexander_Sannikov • 1 year ago • +6

      Nope, after the precalculation is done, there's no raycasting in this demo at all. Rendering the radiance field of one cascade in this case is equivalent to just reading a cubemap texture for every pixel, which is just a hardware bilinear interpolation. So, each of the 4 cascades is looked up this way and then merged. See at 4:23: each cascade literally stores tiny cubemaps; you don't raymarch them, you interpolate them.
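
To illustrate the point in the reply above (after precalculation, shading a pixel is a filtered lookup into a tiny directional map, not a raymarch), here is a small Python sketch. The latitude/longitude parameterization and the 4x8 map size are stand-ins chosen for the example; the demo itself uses parabolic maps per probe:

```python
import math

def sample_direction(tex, direction):
    """Bilinearly interpolate a tiny lat/long radiance map for a unit direction (x, y, z)."""
    x, y, z = direction
    h, w = len(tex), len(tex[0])
    u = (math.atan2(y, x) / (2.0 * math.pi) + 0.5) * (w - 1)      # longitude -> texel coordinate
    v = (math.acos(max(-1.0, min(1.0, z))) / math.pi) * (h - 1)   # latitude  -> texel coordinate
    u0, v0 = int(u), int(v)
    u1, v1 = min(u0 + 1, w - 1), min(v0 + 1, h - 1)
    fu, fv = u - u0, v - v0

    def lerp(a, b, t):
        return tuple(a[i] + (b[i] - a[i]) * t for i in range(3))

    top = lerp(tex[v0][u0], tex[v0][u1], fu)
    bottom = lerp(tex[v1][u0], tex[v1][u1], fu)
    return lerp(top, bottom, fv)

# A 4x8 directional map with one bright texel: any view direction resolves with a
# constant number of array reads, independent of scene complexity.
tiny_map = [[(0.1, 0.1, 0.1) for _ in range(8)] for _ in range(4)]
tiny_map[1][2] = (5.0, 5.0, 5.0)
print(sample_direction(tiny_map, (0.0, 1.0, 0.0)))
```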

  • @hnlo • 10 days ago

    Great work! I'm really interested to find out how to encode the radiance cascade data into a texture, can you give me some pointers?

  • @vanillagorilla8696 • 1 month ago

    A digital hologram.

    • @Alexander_Sannikov • 1 month ago

      @@vanillagorilla8696 i have a video about actual digital wavefield holograms if you're interested in that

    • @vanillagorilla8696 • 1 month ago

      @@Alexander_Sannikov I'd love that.

  • @wonkaytry • 1 month ago

    mirrors

  • @gadirom • 1 year ago

    Ah! So there are no spherical harmonics, you used cube maps. Still seems like trickery. I'm looking forward to the paper. Really, O(1) looks like magic.

    • @Alexander_Sannikov • 1 year ago • +4

      I use spherical harmonics to gather diffuse GI (not in this demo). This demo does not need gathering irradiance, so it only uses parabolic mapping (basically, cubemaps).

    • @gadirom • 1 year ago

      @@Alexander_Sannikov I see. Thank you.

  • @aladorn • 1 year ago

    thank you for the video... veeery impressive... please upgrade your mic

  • @ABWABWABWABWABWABWABWABWABWA • 1 year ago

    hell yeah

  • 1 month ago

    is that gauss splattering?

  • @themerpheus • 1 month ago

    so its like deep shadow maps but for irradiance?

  • @neon_Nomad • 1 month ago

    Ghost skeletons

  • @perkele1989 • 20 days ago

    you didn't even touch the roughness slider :( please go into more technical detail about how this works! the graphics programmer in me is amazed

    • @Alexander_Sannikov • 20 days ago • +1

      now I also wonder what that slider did, because it makes no sense :D

  • @stimpyfeelinit • 1 year ago

    neat!!!!!

  • @NeoShameMan • 1 year ago

    It's funny, i do something like that to render gi on a mali 400 gpu in real time lol, but instead of storing the results, i store the uv of the points, which allows real time updates by simple texture feedback, ie the texture samples itself to resolve gi. I had the cascade idea but didn't implement it, i didn't know it would be that efficient. It's on unity's forum under the name "exploration of custom diffuse rtgi"; the technique is called MAGIC, for mapping approximation of gi compute 😂. I resolve slowly because i haven't tested how much i can render per frame, so it's one ray per pixel per frame; my computer is dead now lol, it hasn't finished.

  • @computerghost596 • 13 days ago

    I don't think this is half life guys

  • @user-nm4mi2sq1o • 7 months ago

    Oh, so it turns out this was a teaser for Affliction in PoE :)

    • @Alexander_Sannikov • 7 months ago

      How did you even manage to connect those two?

    • @user-nm4mi2sq1o • 6 months ago

      @@Alexander_Sannikov Answering this question will embarrass me. I myself understand little about engine development or object design/modeling. But I watch your ExileCon panels and these videos purely out of curiosity. The presentation of the material is excellent.
      Here I saw a similar model of particles layering onto an object, if I can put it that way, to the current league in the game... even the colors match :)
      I came here after your podcast with CARDIFF. By the way, it would be great if you could periodically organize such podcasts with him or other streamers, at least once every six months.
      The international audience has Chris and Jonathan, and we will have you.

  • @JannikVogel • 1 year ago

    Can you share those 8k images you have used in this demo for people who want to reproduce this (without having to capture their own scene first), or could you even upload the entire demo code somewhere?

    • @Alexander_Sannikov • 1 year ago • +2

      If you want to try this, i really recommend replacing the volume rendering part with some really simple SDF fractal raymarcher or a sphere. The only reason why i used volume data is because it's obvious that i'm not rendering it in realtime (that'd be much slower).
      That being said, at some point I will publish the sources.

  • @Danuxsy • 1 month ago • +1

    So what would be the usecase of this?

    • @NightmareCourtPictures • 1 month ago

      Use case of global illumination? With O(1) time complexity? Practically all applications for every graphical representation for the foreseeable future.

    • @Danuxsy • 1 month ago

      Well this is different because it is NOT real-time like the path tracing we see in games; this is computed offline and so cannot change afterwards. It's similar to splatting: looks like real life but not very usable in games (as of now anyway), me sleep.

    • @NightmareCourtPictures • 1 month ago

      @@Danuxsy it's a stretch to say path tracing is real-time.
      i also don't know where or why you have the impression this isn't real-time, nor why it wouldn't be real-time, given how efficient it is.

    • @Danuxsy • 1 month ago

      @@NightmareCourtPictures i watched the ExileCon2023 talk about their GI implementation using radiance cascades, so I see that it does have good use cases, yess, it's cool!

    • @caffiend81 • 1 month ago

      @@Danuxsy it's usable in real time. Path of Exile 2 is using Radiance Cascades. IIRC their dev team published a white paper on the technique.

  • @homematvej • 1 year ago • +1

    Is there a paper or something?

  • @k-vandan4289 • 1 year ago

    cool

  • @paranoidPhantom • 1 month ago • +2

    Didn't understand a damn thing, but very interesting :)

  • @icaroamorim3123 • 3 months ago

    I'm still a bit too stupid to understand but I will keep trying until I implement it

    • @icaroamorim3123 • 3 months ago

      Just read the full paper again, I can understand it much better now.

  • @user-yx5wd5yy6h • 1 year ago

    didn't understand anything, but very interesting =)

  • @macratak • 1 year ago

    where's the siggraph paper my man. i'm tryna read that!!!

  • @dougbeard7624 • 1 month ago

    Audio is pretty bad. You need to compress.