Once the API is taken full advantage of, 3D rendering software is going to see a massive jump in rendering performance. I would think game developers may want to use the technology only to enhance specific parts of the game for now, while leaving the actual gameplay rasterized for performance. Only time will tell.
Things not touched on here: path tracing, ray-object intersection methods, acceleration structures for ray-object intersection checks, multiple bounces/samples and other effects besides shadows, noise issues encountered while ray tracing, and Nvidia's fancy-pants filtering that turns noisy raytraced output into nice clean usable output. I'm a nerd boi and I've been thinkin' about makin' my own darn video about this that doesn't gloss over the fundamentals
Isn't that "noise" caused by the sparsity of rays used to cut the ray tracing to a more manageable task (something that a GPU can handle in a fraction of a second instead of days) rather than any consequence of Ray Tracing itself? And nvidia's fancy pants technique is to blur and approximate the missing data.
Raytracing in its simplest form doesn't introduce any noise, as you're only tracing as if all surfaces either don't reflect light in any direction but towards the camera or are only perfectly reflective. As you can imagine though, in the real world light scatters off of surfaces in all directions (usually). So ray tracing in the simple form isn't noisy and also isn't super taxing, but that also won't give you soft shadows, global illumination, things like caustics, etc., which Nvidia's stuff definitely allows for.

To get, for this example, soft shadows: in real life soft shadows are caused by the fact that there's no such thing as a point source light (an infinitely small point light); all lights have volume. So in order to get soft shadows with ray tracing, when you sample shadow rays you need to sample randomly within the volume of the light source. And the keen among you may realize that once you start randomizing the direction of things, whambamshapow, you've introduced noise into the picture (literally and figuratively).

The same goes for things like global illumination, where you need to take into account the radiance of objects and smartly do multiple bounces off the initial object to simulate how light bounces around a scene, so that light from a green wall may reflect off it and onto other objects. There's randomness there that introduces noise as well.

Usually the way to deal with this kind of noise is just to render the same thing over and over and do a rolling average to converge the frames into a progressively less noisy image. But that's why this is hard to do in real time: you can't sample enough randomized paths to converge enough to the ground truth to look good in real time. At least, not without a good denoiser. And Nvidia's denoiser is really good, way more than just blurring/approximating.
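The randomized shadow-ray sampling described above can be sketched in a few lines. This is a toy scene made up purely for illustration (a 2x2 area light on the plane z=5, one sphere occluder, all names hypothetical), not anyone's production code:

```cpp
#include <cmath>
#include <random>

struct Vec3 { double x, y, z; };

// Does the segment from the origin towards 'target' pass through a sphere
// of radius 0.5 centred at (0, 0, 2.5)? (closest-approach occlusion test)
bool occluded(Vec3 target) {
    double len = std::sqrt(target.x*target.x + target.y*target.y + target.z*target.z);
    Vec3 d{target.x/len, target.y/len, target.z/len};  // ray direction
    const double cz = 2.5, r = 0.5;                    // occluder centre (0,0,cz), radius r
    double t = d.z * cz;                               // dot(centre, d): closest approach
    if (t < 0 || t > len) return false;
    double dx = d.x*t, dy = d.y*t, dz = d.z*t - cz;    // closest point minus centre
    return dx*dx + dy*dy + dz*dz < r*r;
}

// Estimate how much of the 2x2 area light (on the plane z=5) the origin can see
// by casting shadow rays to random points on the light. Fewer samples = noisier.
double softShadow(int samples) {
    std::mt19937 rng(42);                              // fixed seed, repeatable
    std::uniform_real_distribution<double> u(-1.0, 1.0);
    int visible = 0;
    for (int i = 0; i < samples; ++i)
        if (!occluded({u(rng), u(rng), 5.0})) ++visible;
    return double(visible) / samples;                  // partial visibility = soft shadow
}
```

With only a handful of samples the estimate jumps around (that's the noise); averaging more samples, or more frames, converges it, which is exactly what a denoiser shortcuts.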
If it's anything like the paper I read on a similar denoising technique, it uses machine learning combined with a rasterized render of the scene and some other wizardry to reconstruct the image. You can easily find examples of it being demonstrated with the denoiser on and off; it can basically turn an unrecognizable noisy mess into a clean, crisp image, with the occasional artifacting of course.

Hopefully this clears it up a little. So in one way, yeah, you could say the noise is caused by making the task more manageable, but I do think it is a fundamental issue with ray/path tracing. Though the more I think about it, I guess if you just had an insanely powerful computer that could crunch enough paths in real time it wouldn't be an issue. It's hard to word how I really feel about it, but the noise is a fundamental property of light, really. You get noise in the real world for similar reasons, like low photon count. Things can look noisy just in your normal vision at night, and especially with cameras. Just the fundamental way of basically accumulating photons looks noisy when you don't have enough photons, and the same when you can't trace enough paths in ray tracing. So, eh, I guess ¯\_(ツ)_/¯
@@Bolt6265 this channel is called "techquickie". Of course they'll gloss over fundamentals. Make your own video. I'm serious I'll watch it. Just don't make it a half-hour long thesis.
Bolt Man Whoa there, we've got a know it all. lol kidding around but remember this is 'Techquickie' not Tech-hour-to-explain-every-single-detail-fully, dont get me wrong I would watch a longer vid with more detail but this channel is for quickies (I think around 7 mins or less fit that bill)
no matter how much you think you know, there is always more to learn. Even from simplified sources brains love to trick themselves into thinking they are perfect and all knowing
The first of this tech in games was called high dynamic range, or HDR. It is light that changes color when reflecting off, say, a wall: yellow light onto a red wall creates an orange orb. Raytracing is the same principle but with images: ever see a shadow in an alleyway of a cat? Now that cat's shadow is controlled by the cat's low-light vision, hunting for food (ominous, as to scare off bigger predators). Cypress Hill said it best: I'd watch your back, but it's best to watch your front. Another example of a reflected image: our eyes are round and the world oval; it's why in low light we see images reflecting off shadow from straight shots. Humans lost night vision in place of color, to spot poisonous fruits and berries. But our nocturnal vision never really went away. A child in the womb has blue eyes and then gets pigment from its parents' surroundings. Blue round eyes take the low light into the iris. So raytracing happens in everyday life. Now if the world was round, not oval, you would not get it; the reflective world would not exist. The saying "when you stare into the abyss sometimes something looks back" means one thing: you may see yourself.
Right, at about 2:05 in the video I fully understood that ray tracing is kind of the shading and lighting of the scene you'll be seeing while playing. So, for example, a scene where two people are talking in a room will have them affected by the lighting in the room, and then the shading on their faces, making it even more realistic while you watch those two people talk to each other.
Love how everyone memes/pokes fun at ray tracing, when it made many movies possible with realistic CGI (depending on who does it) and is used in Pixar movies. Nvidia didn't invent ray tracing; it takes time for this to get better. CGI and computer graphics wouldn't look so realistic without ray tracing.
But I think part of the criticism is that the video card market is suffocating, and the demand for affordable, significant upgrades has been met with immature technology that's costly in every way and might not be important until their next upgrade.
There is more to rendering games than just throwing more cores at the problem. It's only logical someone HAD to move the technology forward; just as programmable shaders replaced fixed-function shaders, raytracing will slowly replace programmable shaders. Clinging to a limited technology is never a good thing. Even if it upsets people in the short term due to the cost, it will improve game visuals and make developing those games a bit easier in the long term.
The original ray tracing algorithm was created by Appel back in the 60s. It just tells you if there is an obstacle between two points, which can then be used to compute shadows. The method explained in this video is actually called recursive ray tracing or Whitted ray tracing, which enables the computation of reflections and refractions. Photorealism can't be achieved with ray tracing alone. For that, you need more advanced methods such as path tracing, photon mapping, subsurface scattering, etc.
This made me somewhat excited for this technology. It could be really cool. I hope in 3 years games can run well using this and I can enjoy those reflections on my Freesync IPS panel.
2:14 Well, you can get some pretty amazing raytraced graphics with a regular gaming PC too, even without that much money (or probably even a regular non-gaming PC). The bigger problem is the time, I would say. There is a big difference between giving your rendering software minutes or even hours to render a single frame of an animation and requiring your computer to render multiple images per second like in a computer game. I am pretty sure most people have computers capable of rendering the image at 2:30, just not at all in real time. On the other hand, if you have a lot of money you can save a lot of time: you just buy many computers, and each computer can render a different frame of your animation simultaneously.
'minutes or even hours' lol... I would be glad for that! Depending on the scene, minutes for a complete and somewhat noise free frame is pretty damn amazing.
You just convinced me to become a subscriber on the very first video that I watched from here. I found his explanation of a very complicated technology very user friendly! Good stuff!!
I think any and every game company would gladly embrace active ray tracing. It really is the "holy grail": rather than approximating shadows in textures, faking reflections (or just outright duplicating objects), or skipping out on reflections and shadows to save time, it would all be computed in real time, creating a better result more easily.

The problem is that we are not at that point yet. If the low amount of ray tracing paths and shortcuts used to make something renderable in real time yields a worse result than just faking it, then that will make companies less likely to adopt it. If the demand is too high on the current systems and it tanks the frame rate to unplayable rates, companies are less likely to adopt it. And both of those hard limitations are on top of the natural one of consumer adoption rate, which is an even tougher nut than most new graphic technologies. To cover people on old and new technologies, this would mean needing to do double the textures, and if the demand for any graphical improvement to the limited consumer pool doesn't outweigh the cost, most companies won't bother... and that cost is higher than most new technologies.

I've seen it done before when 3D was becoming a thing - games that had both 2D and 3D objects created so older and newer systems could handle it. I also expect it to be the same type of result: the 2D looked much better than the 3D, as the simple 2D image was able to use every pixel for detail, and losing object depth and angles was a small sacrifice compared to the detail loss required to make the 3D object manageable for the hardware of the time. Which means people with RTX cards may be playing even ray-traceable games without ray tracing until the results can beat the current bag of tricks used to fake it.

TL;DR - I think we're in the boat of ray tracing in real time not effectively being reached.
Not only does the content not exist yet, but the results it provides will likely fall short in many cases - I see low amounts of ray tracing paths the same as low amounts of polygons in early 3D. Though I still see it as the future, and this is the first real step. I'm hoping the line can be measured in years rather than decades, but we'll see.
One of the first papers (if not the first) about ray tracing has formulas for it: citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.156.1534 . But there might be improved formulas or already existing libraries/frameworks out there.
So basically, it renders the environment in a 3D simulation, then simulates where light would travel from said light sources. And whether you're in those lines determines the shades that are rendered.
3:08 "consumer grade ray tracing". Production renderers do the same thing (trace from the camera into the scene); it'd make no sense to just trace a shitload of rays that aren't going to randomly actually hit the camera lens. So that isn't how ray tracing for games differs. It differs from production rendering in that it only renders a few rays per pixel, and then the AI tensor cores use a deep neural network to denoise the result. Rasterizing is still the main rendering tech, and takes care of anti-aliasing.

In a production renderer, thousands of camera rays may be shot out per pixel for good anti-aliasing, and they may be distributed across time (for motion blur) or space (across the area of a lens for bokeh effects). From each camera ray, thousands of shading rays may be cast too (reflections, refractions, both of which could be rough, requiring way more rays than glossy), and then finally directed towards a light. All the different types of shading rays, multiplied by thousands of camera rays, exponentially increases the number of rays cast. So it's really the same thing, just with far fewer rays.
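The "thousands of camera rays per pixel for anti-aliasing" point above can be sketched with a 1D toy (function names and numbers are made up for illustration): one pixel straddling a hard black/white edge, estimated with jittered samples:

```cpp
#include <random>

// A 1D "scene": everything left of x = 0.5 is white (1.0), the rest is black (0.0).
double shade(double x) { return x < 0.5 ? 1.0 : 0.0; }

// Cover one pixel spanning [0, 1] with n jittered camera rays and average them.
// One ray per pixel lands on 0 or 1 (a jagged edge); many rays converge to ~0.5 (grey).
double pixel(int n) {
    std::mt19937 rng(7);  // fixed seed, repeatable
    std::uniform_real_distribution<double> u(0.0, 1.0);
    double sum = 0.0;
    for (int i = 0; i < n; ++i) sum += shade(u(rng));
    return sum / n;
}
```

The same idea extends to distributing the samples over time (motion blur) or over the lens area (bokeh), as described above.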
Was thinking this exact thing when I got to that part of the video, legit had to pause and have a "hang on, what did he say?" moment. Also, fun fact, Turing also accelerates the actual process of tracing individual rays, as the RT cores accelerate structure traversal and intersection tests. That, combined with the AI denoiser cleaning up the image, is what allows Turing to do "real time" raytracing.
Description of rasterization was all kinds of funky as well. Most Nvidia demos used standard bilateral filtering and so on; it will be very interesting to see what techniques we get in future.
Ray tracing will be the future of lighting in games; real time always follows production as hardware allows (PBR being one of the most recent huge jumps in quality brought in from production rendering), but I don't think it is anything more than an enthusiast feature right now. Just imagine not needing to adjust shadow map quality, or enable some shadow-softening options, or even decide which AO you want, because those are all tricks and techniques to try and simulate things ray tracing just does. It doesn't even necessarily help production, as we've had real-time previewers in things like Octane and even the free Blender for years, so personally I'm going to hold off a few generations. Very exciting advancement though, and a great explanation.
it does help production IMO, at least for individuals. I mean, with RTX integration into many raytracing render engines, you see quite a decent speedup, which is great, especially for previewing a scene (for finals, you might have a renderfarm anyway).
As a KiCad user, it can take minutes to hours to trace out a moderately complicated PCB design with component models, even on a dual Xeon E5-2680 machine and the consumer-grade reverse tracing technique employed by KiCad. However the tracing result of KiCad is often a superb visual experience.
No idea why I watched this when I know more about ray tracing than Linus knows about how to hold his PC hardware firmly in his hands. Anyways, almost every game developer has dreamed of being able to use ray tracing in their games some day. I'd say that there is 0 chance that developers won't develop games with ray tracing support. It's more likely that they'll try to do anything to help the ray tracing technology be available to as many people as possible so that they had a chance to rely entirely on it. Ray tracing can save so much work for developers, you can't even imagine.
Not only that, but it's actually less work to develop for a ray tracing API, and for the most part rasterized work will only be improved by ray tracing techniques with zero input from the developer (unless they want to aim for photorealism, of course). I can see engines like Unreal and Unity working in both raytracing and rasterization really soon.
It feels like when we went from 2D graphics to 3D graphics to 3D photorealistic graphics, and people said it "probably wasn't worth the power required since it was going to perform so poorly." To anyone *not* excited about it: just remember, things iterate. And this is one of those things normal people don't consciously notice or think about.
I really hope this feature becomes the new norm; it'll be a giant leap in graphics. And yes, a huge cost in performance, but as people have said, this is just the first generation. You gotta start somewhere.
Indeed, even with better drivers and software to work with, I don't think you'll be seeing much outside of mainly triple-A games that have ray tracing built in, and you'll be lucky to see 60fps at 1080p with a 2080 Ti or 2080. It requires a lot of power to render per second, and the fact that Nvidia and AMD have been able to do so in real time at 30-60 frames per second is goddamn impressive.
Why is nobody considering that using RTX actually reduces the load on the shaders? If developers allow people to choose the RTX complexity in the scene you could easily have a gorgeous 60fps game while still allowing much higher frame rates at the cost of it not looking as good, but likely STILL better than shaders alone.
Someone correct me if I'm wrong... ray tracing determines how an object in a game would look when light interacts with that object, which gives it a shadow in certain areas.
RT seems most usable for VR (as it already uses polygon counting for rendering the environments and objects), and rasterization for regular 3D rendering on a monitor or TV.
Me right here. The first 30 seconds did nothing to explain it to me or make sense. How in the world am I going to follow another line if my eyes can only see to that point? That's what I don't understand.
On a simple note... I hope games begin to use it. It does seem like the next step in graphics, and I would appreciate it if we evolved a little bit on that. It does not seem to be a complicated tool for developers to use either.
Hi. This has been in 3D software like Maya for 20 years or so, but done in software, using the CPU (slow, not in real time, unless it's a renderfarm, like mentioned). God bless, Proverbs 31
A renderfarm is usually used for animation rendering though, to keep render times below centuries, so I wouldn't really call it "real time". But I mean, "real time" is pretty vague. If you have a good PC, you can often preview your scene somewhat interactively (if it doesn't contain really computationally expensive things like detailed volumes, etc.).
"Ray tracing has made headlines lately as the rendering method of the future for games" Didn't we use ray tracing for a really long time, we just baked lights onto texture in order to save CPU power.
@@alexbaker1569 That's what you do in, for example, Counter-Strike 1.6 when you create maps. Light gets ray traced and baked onto textures. I'm certain that the whole Quake engine uses this method.
@@IAMDIMITRI Yes, we used ray tracing to create maps, but once the maps were created it was baked in. We have never before used real-time ray tracing, the kind that is now starting to be used in games.
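The baking idea in this thread can be sketched like this: run the lighting once at map-compile time and store the result per texel, so at runtime the game just reads a texture. A minimal made-up example (single point light, Lambert term, no occlusion rays), not the actual Quake tooling:

```cpp
#include <cmath>
#include <vector>

// Bake direct lighting into an N x N lightmap for a floor patch spanning [0,1] x [0,1],
// lit by a point light hovering at (lx, ly), height h above the floor.
// A real baker would also trace an occlusion ray per texel (that's the ray-traced part).
std::vector<double> bakeLightmap(int N, double lx, double ly, double h) {
    std::vector<double> lightmap(N * N);
    for (int j = 0; j < N; ++j) {
        for (int i = 0; i < N; ++i) {
            double x = (i + 0.5) / N, y = (j + 0.5) / N;  // texel centre
            double d2 = (x-lx)*(x-lx) + (y-ly)*(y-ly) + h*h;
            double cosTheta = h / std::sqrt(d2);           // Lambert term for a flat floor
            lightmap[j * N + i] = cosTheta / d2;           // inverse-square falloff
        }
    }
    return lightmap;
}
```

At runtime the renderer only samples this texture, which is why baked lighting is cheap but can never react to moving lights or objects.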
Yeah, they spend their every waking moment playing games so that they can put different clips on their videos for your personal enjoyment. No, they have more important things to do than play games during work (at least most of the time).
@@TheFoxyGamerOfficial I was being sarcastic... Look, this is my personal opinion, but picture it like this: a giant corporate chess game, and Nvidia just made their move. Just like they did with SLI, PhysX, GameWorks, GPP... and now ray tracing. The question is, does the other side have a counter move, or is it game over? I compared it to PhysX because that's another technology customers didn't really want at the time it was introduced. Bottom line... it's business, that's what it is.
What do you mean by a counter move? Ray tracing has been here for 4-5 years; it's just that almost nobody is using it. That's why for gamers there's no point in using Windows 10, because ray tracing is only used in DirectX 12.
I enjoy ray tracing very much and prefer it. The one thing I do have against it is that sometimes I feel as though the reflectiveness is a little strong. I'm not an expert or a coder though, is this from ray tracing or something else? For example 1:55 - 2:01 shows the images side by side with a very noticeable difference in light reflection between the two, BUT why does the tea pot have a mirrored effect and the normal one has absolutely none? Was this only introduced with ray tracing or is it to make more of eye catch to make the second image look better?
Would this give me a ton of FPS? Because I couldn't care less about ultra fancy looking tech if it won't give me a shit ton of FPS. FPS is what makes a game feel great. A good looking game that can reach 144hz is always superior to an ultra looking game but could only be spewed at 60.
We don't know yet. People keep drawing conclusions from games that had two weeks to implement RTX on alpha drivers and assuming that's what we are going to get. Logically, if RTX handles things shaders traditionally did, then in some circumstances it might increase the frame rate, depending on whether you can tweak the accuracy of RTX to get a balanced GPU load.
I'd rather play at 60Hz with ultra realism than at 144fps with crappy visuals. FPS above 60 means nothing at all unless you're into competitive gaming, and even then it isn't a must, just for the ones who want the best result possible.
What is Ray Tracing?
Me: *Expensive*
ARJ for know!! GIVE IT TIME BRO!!
It's only expensive if you want it real time
My computer's 12 yrs old and I've done quite a few ray-traced 3D renders.
Haha gpu money huhuhuhu
@@retrodragon2249 Ah that's so old- I ray trace with my CPU
oops I path trace with my CPU
One-Year Later: Sold out every where
"Just buy it" - Tom's Hardware
"NO" - Gamers Nexus
"NO" - Tom, THE Tom, formerly of Tom's Hardware.
"It JuST WorKs" - Nvidia.
"NO..speaking of which, why didn't you guys use the benchmark results for AMD graphics cards that I sent you?" - Igor of Tom's Hardware Germany.
4Head
*you get a friend named Ray* ... and then follow him without knowing
Lol 😆
That profile pic 😎👌
Not just follow him, also trace him as well.
@@W1zardRyan , i link it to you, tomorrow... ;3
@@FizzleFXheheh thx homie
Graphics keep getting better and better while the physics are still bad
Yes and no. If you, for example, walk on grass, the grass moves with you; same for high winds, or if you swim or jump into water you see the physics of the moving water. I don't think the problem is the physics. More likely the graphics are almost at the limit; now there's only one step above that, and that's real-life-footage graphics, where the graphics look so realistic that you can't see a difference between, for example, rocks in the game and rocks in the real world...
Depends on the engine the game uses
You need to see that new Unreal Engine destruction demo
Play a 15-year-old game and tell me physics aren't at least 10x better.
check out project Borealis, the physics are great!
But i am in a dark room with no lights on
The closet?
Can we get 5000 subscribers with no videos? Your phone emits light, or the monitor you're using.
Southeast Africa?
I got the other problem, the lights are on but nobody is home
Your monitor?
They said it's easy for game developers to use RT, that you can just throw it in there. If that's true I just need one thing: RT for older games. New Vegas would be a good start.
GTA 4 raytraced would be gorgeous
@@waynepayne9875 th-cam.com/video/Uo7nu4fSWpA/w-d-xo.html
even better... gta 5, it's like 4 but with gameplay and (characters?)
@@waynepayne9875 but for real, this shit is pretty mind-blowing, albeit raw and messy as a throw-in without curating the environment... still, it looks better with than without, and that says something...
@Tyler Russell I'm pretty sure they have ray tracing for Resident Evil 2 in the SweetFX mod, or excuse me, ReShade or whatever the hell it is now, the All MOD...
I guess the real question is when Adobe is going to buy ReShade and add it as a plugin for Photoshop...
@Tyler Russell What determines whether it uses the RT cores or the regular processing? I mean, theoretically you could configure it to use those cores, right? Or is that something only Nvidia does? Cuz that would be an annoying licensing issue in general.
You mean Wallet Tracing?
silverhelmet61 wallet taking*
Wallet Vaporizing*
Icon matches humour
@@dazza2350 That makes no sense
xD
how about games with no dlc and good storyline rather than having a super reflective spoon
What about both?!
Who doesn't love a super reflective spoon?
There is no spoon..
No wait guys, I found it! 🥄
NVidia doesn't create games. And no company, that's actually creating games, is making super-reflective spoons.
I'm here because of ray traced minecraft videos
Me too
Yes
Me too bro
Same
Who isn't?
2018:
Nvidia & AMD: Raytracing!!!
Game Devs: BattleF13ld V!
2019:
Nvidia: RTX 2080ti!!!!
Game Devs: Minecraft!!!!
This is why I’m here
And also Quake II!
2020-
Nvidia: RTX 3090 Ti
Devs: Minecraft RTX at 8K 60fps
Quake Quake going in to attack
Is ray tracing in Microsoft Paint possible?
th-cam.com/video/vo9ztLY34n0/w-d-xo.html
YouTube is getting really brave now with these double ads
Sort of makes you wanna fro' up, don't it?
I close the video back to back and it eventually fks off
@@yolocholo9611 thats exactly wat i do but shit it be popping up 10 times sometimes and the way u said " it eventually fucks off"😂
Yeah sucks
Thanks for making this video. I was confused on the topic
It’ll probably be one of the greatest things to happen in gaming but it’s just not ready yet and anything under a 2080ti imo will be useless
As a Blender Cyclic user, I love ray tracing
Cycles*
Cycles isn't raytracing. Classic Blender Internal is raytracing; Cycles is pathtracing. The difference is that rays bounce in path tracing, while they just end when you hit something in raytracing.
I would say path tracing is also a form of raytracing. It's just a far more sophisticated, realistic and expensive form of ray tracing than, for example, primitive Whitted-style raytracing. The main difference is that in pathtracing we trace a much higher number of rays.
Correct, path tracing counts under ray tracing.
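The "rays end vs. rays bounce" distinction in this thread can be sketched with a toy Russian-roulette loop (names and numbers are illustrative, not any engine's code): a Whitted-style ray stops at the first diffuse hit, while a path-traced ray survives each hit with some probability and bounces on:

```cpp
#include <random>

// Classic Whitted-style ray tracing at a diffuse surface: one hit, done.
int classicBounces() { return 1; }

// Path tracing: after each hit the ray survives with probability p and bounces
// again in a random direction (Russian roulette). Multi-bounce GI, and the noise,
// both come from this; the expected bounce count is 1 / (1 - p).
double meanPathBounces(double p, int trials) {
    std::mt19937 rng(123);  // fixed seed, repeatable
    std::uniform_real_distribution<double> u(0.0, 1.0);
    long total = 0;
    for (int i = 0; i < trials; ++i) {
        int bounces = 1;                  // the first hit always happens
        while (u(rng) < p) ++bounces;     // survive -> bounce again; otherwise terminate
        total += bounces;
    }
    return double(total) / trials;
}
```

So with a 50% survival chance a path averages about two bounces; raising the survival probability buys more indirect light at the cost of longer, noisier paths.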
Have you used EEvee yet?
Could've been good to also add a frame where the ray bounces from an object to another object, and then to a light source, to show that it's not just a camera -> one object bounce -> light kind of thing. But overall, a straightforward explanation.
Would you make video about NV-Link as well?
I second this
Maybe they should only once consumers can get it. They haven't even tested it yet at LMG.
Oh, please do, Linus!
The old method is not necessarily just rasterization. It is a part of the rendering pipeline (often the end). You would write a lighting model for your object. This results in bad shadows, reflections and refractions (although techniques such as shadow mapping do work to a degree).
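The "lighting model" mentioned above is just a per-fragment formula. A minimal Lambert diffuse term, as a sketch (function name and layout are mine, not from any particular engine):

```cpp
#include <algorithm>
#include <cmath>

// Lambert diffuse: brightness = max(0, N . L) for surface normal N and light
// direction L. Rasterizers evaluate something like this per pixel. Note that
// nothing here knows about other objects, which is why shadows, reflections and
// refractions need extra tricks on top (shadow maps, cubemaps, ...).
double lambert(double nx, double ny, double nz, double lx, double ly, double lz) {
    double nlen = std::sqrt(nx*nx + ny*ny + nz*nz);
    double llen = std::sqrt(lx*lx + ly*ly + lz*lz);
    double d = (nx*lx + ny*ly + nz*lz) / (nlen * llen);
    return std::max(0.0, d);
}
```

A light facing the surface head-on gives full brightness, a grazing light gives less, and a light behind the surface gives zero; everything else (occlusion, bounce light) is outside the model.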
Funny how you show Minecraft as an example of no ray tracing, but now we have ray tracing in Minecraft lol
Mathias Gaming th-cam.com/video/lg5mjKD034w/w-d-xo.html
Actually you don't, you just have path tracing
@@Anewold No it doesn't.
Gr8humanilation9TV it does now
@@Anewold path tracing is just glorified ray tracing (rays bounce off objects instead of returning immediately)
2018: Why are we Tracing Ray?
2020: PS5- I FOUND RAY!
Xbox Scarlett tho
@@samuelaurora3632 yuck
MyDbzfan9000 I know I’m ugly but f you man
@@MyDbzfan9000 guess what? xbox series x is more powerful than ps5 so hell yeah
nintendo boys
Below are two functions from my own code that showcases what raytracing can be in a basic way...the test is going to fire a ray from mouse and trace it to check if it collides with a triangle.Note: Raytracing for rendering involves firing not from the mouse but camera or other places - Millions of them.
//// TRIANGLE INTERSECTION FUNCTION (this is the test we want to run for every triangle in the scene)
//// needs <cmath>, <glm/glm.hpp> and <glm/gtx/normal.hpp> for glm::triangleNormal
bool Triangle_Intersection(const glm::vec3 &v0, const glm::vec3 &v1, const glm::vec3 &v2,
                           const glm::vec3 &RayStart, const glm::vec3 &RayEnd,
                           glm::vec3 &HitReturn)
{
    glm::vec3 PlaneN = glm::triangleNormal(v0, v1, v2); // normal follows the winding order of v0, v1, v2
    glm::vec3 RayDirection = glm::normalize(RayEnd - RayStart);

    float NdotD = glm::dot(RayDirection, PlaneN);
    if (std::fabs(NdotD) < 1e-6f) { return false; } // ray is parallel to the plane, no hit

    //// backface check (optional, skips triangles facing away from the ray):
    // if (NdotD > 0) { return false; }

    // PLANE EQUATION: distance t along the ray to the triangle's plane
    glm::vec3 PlanePointToRayStart = v0 - RayStart;
    float t = glm::dot(PlanePointToRayStart, PlaneN) / NdotD;
    if (t < 0.0f) { return false; } // plane is behind the ray origin
    HitReturn = RayStart + RayDirection * t;

    // inside-outside test: the hit point must lie on the inner side of all three edges
    glm::vec3 C;
    // edge 0
    glm::vec3 edge0 = v1 - v0;
    C = glm::cross(edge0, HitReturn - v0);
    if (glm::dot(PlaneN, C) < 0.0f) { return false; }
    // edge 1
    glm::vec3 edge1 = v2 - v1;
    C = glm::cross(edge1, HitReturn - v1);
    if (glm::dot(PlaneN, C) < 0.0f) { return false; }
    // edge 2
    glm::vec3 edge2 = v0 - v2;
    C = glm::cross(edge2, HitReturn - v2);
    if (glm::dot(PlaneN, C) < 0.0f) { return false; }
    return true;
}

///// THE LOOP: goes through all the triangles in the scene and runs the test above to see if there is a collision
///// (schematic; the function takes the triangle's vertices plus the ray):
for (auto &T : GL.TRIANGLES) { Triangle_Intersection(T.v0, T.v1, T.v2, RayStart, RayEnd, HitReturn); }
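For anyone who wants to play with the idea without glm, here's a self-contained sketch of the same plane-plus-inside/outside test (plain structs, all names mine), extended with the one thing a renderer also needs from that loop: keeping the *nearest* hit, not just "did we hit anything".

```cpp
#include <cmath>
#include <vector>

// Minimal vec3 so the sketch is self-contained (the original uses glm).
struct Vec3 { float x, y, z; };
static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 cross(Vec3 a, Vec3 b) {
    return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x};
}
static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

struct Triangle { Vec3 v0, v1, v2; };

// Same plane + inside/outside test; returns distance t along dir, or -1 on miss.
// The normal doesn't need normalizing: it cancels in t, and sign tests don't care.
float hit_triangle(const Triangle& T, Vec3 origin, Vec3 dir) {
    Vec3 n = cross(sub(T.v1, T.v0), sub(T.v2, T.v0));
    float denom = dot(dir, n);
    if (std::fabs(denom) < 1e-6f) return -1.0f;       // ray parallel to plane
    float t = dot(sub(T.v0, origin), n) / denom;
    if (t < 0.0f) return -1.0f;                        // plane behind the ray
    Vec3 p = {origin.x + dir.x*t, origin.y + dir.y*t, origin.z + dir.z*t};
    if (dot(n, cross(sub(T.v1, T.v0), sub(p, T.v0))) < 0.0f) return -1.0f;
    if (dot(n, cross(sub(T.v2, T.v1), sub(p, T.v1))) < 0.0f) return -1.0f;
    if (dot(n, cross(sub(T.v0, T.v2), sub(p, T.v2))) < 0.0f) return -1.0f;
    return t;
}

// For rendering you want the closest intersection along the ray,
// because a nearer triangle hides everything behind it.
float closest_hit(const std::vector<Triangle>& tris, Vec3 origin, Vec3 dir) {
    float best = -1.0f;
    for (const Triangle& T : tris) {
        float t = hit_triangle(T, origin, dir);
        if (t >= 0.0f && (best < 0.0f || t < best)) best = t;
    }
    return best; // -1 means the ray missed everything
}
```

A real renderer replaces this linear scan with an acceleration structure (BVH etc.), but the per-triangle test is the same.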
sorry I don't speak Portuguese
Very interesting, Linus, thanks for explaining that! I've been hearing the term for years, wondering what it meant, thinking about looking it up someday and then always forgetting or never getting around to it, and this video explains it pretty well!
I just got an RTX-capable graphics card after my old RX 580 started to die, and it's freakin' amazing.
Jiggarays. JUST BUY IT
The more you buy, the more you save
jijarays yo mean
I love how he just slides in his advertisements, gets me every time.
I finally figured out what the RTX 2080 stands for
Real Time + Ray Tracing
=RT + RT
RT X 2 X 10+80
=RTX 20 + 80
RTX 2080
I have no idea what I'm doing right now.
@ThaKos69 it should be
(( Real Time x Ray Tracing) x 1040) + ti =
(RT x 2 x 1040) + ti =
RT x 2080 +ti =
RTX 2080ti
Smh😒
R = 18th letter
T = 20th letter
X = 24th letter
20+24-18 = 26
26÷2=13
Illuminati Confirmed
Tay racing?
VeoBroLIVE lil tays?
😂😂😂😂 killer bro!
No, it's Tray Racing, which is presumably what butlers do on their free time...
Gay racing
you stupid bro meaning funny LOL
Once the API is taken full advantage of, 3D rendering software is going to see a massive jump in rendering performance. I would think game developers may want to use the technology only to enhance specific parts of the game for now, while leaving the actual gameplay rasterized for performance. Only time will tell.
Come on guys, it's Ray tracing. It just works
things not touched on here:
path tracing,
ray-object intersection methods,
acceleration structures for ray-object intersection checks,
multiple bounces/samples and other effects besides shadows
noise issues encountered while ray tracing,
nvidia's fancy pants filtering that turns noisy raytraced output into nice clean usable output.
Im nerd boi and ive been thinkin about makin my own darn video about this that doesnt gloss over the fundamentals
I'll watch it if it's shorter than 15 minutes
Isn't that "noise" caused by the sparsity of rays used to cut ray tracing down to a more manageable task (something a GPU can handle in a fraction of a second instead of days), rather than any consequence of ray tracing itself? And Nvidia's fancy-pants technique is to blur and approximate the missing data.
Ray tracing in its simplest form doesn't introduce any noise, because you're tracing as if all surfaces either reflect light only towards the camera or are perfectly reflective. As you can imagine, though, in the real world light scatters off surfaces in all directions (usually). So ray tracing in the simple form isn't noisy and also isn't super taxing, but it also won't give you soft shadows, global illumination, things like caustics, etc., which Nvidia's stuff definitely allows for.

Take soft shadows as an example. In real life they're caused by the fact that there's no such thing as a point light source (an infinitely small light); all lights have volume. So to get soft shadows with ray tracing, when you sample shadow rays you need to sample randomly within the volume of the light source. And the keen among you may realize that once you start randomizing the direction of things, wham-bam-shapow, you've introduced noise into the picture (literally and figuratively). The same goes for global illumination, where you need to account for the radiance of objects and smartly do multiple bounces off the initial object to simulate how light bounces around a scene, so light may reflect off a green wall and onto other objects. There's randomness there that introduces noise as well.

Usually the way to deal with this kind of noise is to render the same thing over and over and do a rolling average to converge the frames into a progressively less noisy image. That's why this is hard to do in real time: you can't sample enough randomized paths to converge close enough to the ground truth to look good in real time. At least, not without a good denoiser. And Nvidia's denoiser is really good, way more than just blurring/approximating.
If it's anything like the paper I read on a similar denoising technique, it uses machine learning combined with a rasterized render of the scene and some other wizardry to reconstruct the image. You can easily find examples of it being demonstrated with the denoiser on and off; it can basically turn an unrecognizably noisy mess into a clean, crisp image, with the occasional artifacting of course.
Hopefully this clears it up a little. In one way, yeah, you could say the noise is caused by making the task more manageable, but I do think it's a fundamental issue with ray/path tracing. Though the more I think about it, if you had an insanely powerful computer that could crunch enough paths in real time, it wouldn't be an issue. It's hard to word exactly how I feel about it, but the noise is really a fundamental property of light. You get noise in the real world for similar reasons, like low photon count: things can look noisy in your normal vision at night, and especially with cameras. Fundamentally, accumulating photons looks noisy when you don't have enough of them, and the same is true when you can't trace enough paths in ray tracing. So, eh, I guess ¯\_(ツ)_/¯
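To make the "randomized shadow rays = noise" point concrete, here's a toy 2D C++ sketch (the geometry and all numbers are made up): sampling random points on an area light gives a visibility fraction between 0 (umbra) and 1 (fully lit), and low sample counts give a noisy estimate of that fraction.

```cpp
#include <random>

// Toy 2D scene: shading point at the origin, a strip light spanning
// x in [-1, 1] at height y = 2, and an occluding wall along y = 1
// covering x in [0.2, 5]. A shadow ray is blocked iff it crosses the wall.
bool blocked(float light_x) {
    float x_at_wall = light_x * 0.5f; // the ray crosses y = 1 halfway up
    return x_at_wall >= 0.2f && x_at_wall <= 5.0f;
}

// Soft shadow: sample random points ON the light and average visibility.
// The exact answer for this setup is 0.7 visible; small sample counts
// give a noisy estimate that only converges as samples grow.
float soft_shadow(int samples, unsigned seed) {
    std::mt19937 rng(seed);
    std::uniform_real_distribution<float> light_x(-1.0f, 1.0f);
    int visible = 0;
    for (int i = 0; i < samples; ++i)
        if (!blocked(light_x(rng))) ++visible;
    return float(visible) / samples; // 0 = full shadow, 1 = fully lit
}
```

The standard error shrinks like 1/sqrt(samples), which is exactly the "render over and over and average" convergence described above, and why a denoiser is needed when you only have a few samples per pixel per frame.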
@@Bolt6265 this channel is called "techquickie". Of course they'll gloss over fundamentals. Make your own video. I'm serious I'll watch it. Just don't make it a half-hour long thesis.
Bolt Man Whoa there, we've got a know it all. lol kidding around but remember this is 'Techquickie' not Tech-hour-to-explain-every-single-detail-fully, dont get me wrong I would watch a longer vid with more detail but this channel is for quickies (I think around 7 mins or less fit that bill)
2019: Ray Tracing
2020: Contact Tracing.. lol
Best comment on here...dude ! Hilarious
who else knows "a lot" about ray tracing but still clicked on the video because its linus
no matter how much you think you know, there is always more to learn. Even from simplified sources
brains love to trick themselves into thinking they are perfect and all knowing
NEVER use quotation marks for *_emphasis_*
you're not my "real" mom!
@@frosty9392 100% true
me
In America, you trace ray.
In Soviet Russia, ray trace YOU!
@ARCH - FX maybe because we're not in 2011 anymore
You win
For worst use of tired old joke.
@@Riquelme4232 >2011
I guess we all know when you started browsing the internet.
Dead meme.
God this joke is so dead and bad
The first of this tech in games was called high dynamic range, or HDR. It is light that changes color when reflecting off, say, a wall: yellow light onto a red wall creates an orange glow. Ray tracing is the same principle but with images. Ever see the shadow of a cat in an alleyway? That cat's shadow is controlled by the cat's low-light vision, hunting for food (looking ominous to scare off bigger predators). Cypress Hill said it best: I'd watch your back, but it's best to watch your front. Another example of a reflected image: our eyes are round and the world is oval, which is why in low light we see images reflecting off shadow from straight shots. Humans lost night vision in favor of color, to spot poisonous fruits and berries, but our nocturnal vision never really went away. A child in the womb has blue eyes and then gets pigment from its parents' surroundings. Blue round eyes take low light into the iris. Ray tracing happens in everyday life. Now if the world were round, not oval, you would not get it; the reflective world would not exist. The saying "when you stare into the abyss, sometimes something looks back" means one thing: you may see yourself.
Linus, WHERE ARE THE CAT TIPS
His cats are died
Really? What color did he die them? Why would anyone die their cat anyway? seems weird
Some of elder cats, i'm forgot which one
@@KangJangkrik Not sure if you are pretending to be able to speak English and you can't, or you're pretending you can't speak English and you can
Pulseway is the only ad I see when I see any video of techquickie...... But ngl I love it.
I cracked nvidia's system guys
(( Real Time x Ray Tracing) x $1040) + ti =
(RT x 2 x 1040) + ti =
RT x 2080 +ti =
RTX 2080ti
Right, from about 2:05 in the video I fully understand that ray tracing is basically the shading and lighting of the scene you'll be seeing while playing. For example, in a scene where two people are talking in a room, the lighting in the room affects them, and the shading on their faces makes it even more realistic while you watch them talk.
Oh, finally linus talked about RTX cards...
Understandable. Have a nice day
you did a great job explaining it but I'm just too dumb to understand lol
I like this guy’s energy. He makes me want to listen to anything he says.
Facts
Love how everyone memes/pokes fun at ray tracing, when it made realistic CGI in many movies possible (depending on who does it) and is used in Pixar movies. Nvidia didn't invent ray tracing, and it takes time for this to get better. CGI and computer graphics wouldn't look so realistic without ray tracing.
Nvidia didn't invent ray tracing, but they made it real time.
All hail CUDA.
But I think part of the criticism is that the video card market is suffocating, and the demand for affordable, significant upgrades has been met with immature technology that's costly in every way and might not be important until their next upgrade.
The memes aren't making fun of the general technique of ray-tracing. They're making fun of RTX.
There is more to rendering games than just throwing more cores at the problem. It's only logical that someone HAD to move the technology forward; just as programmable shaders replaced fixed-function shaders, ray tracing will slowly replace programmable shaders.
Clinging to a limited technology is never a good thing. Even if it upsets people in the short term due to the cost, it will improve game visuals and make developing those games a bit easier in the long term.
The original ray tracing algorithm was created by Appel back in the 60s. It just tells you if there is an obstacle between two points, which can then be used to compute shadows. The method explained in this video is actually called recursive ray tracing or Whitted ray tracing, which enables the computation of reflections and refractions. Photorealism can't be achieved with ray tracing alone; for that, you need more advanced methods such as path tracing, photon mapping, subsurface scattering, etc.
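That Appel-style query can be sketched in a few lines. This is my own illustration (a scene with a single sphere obstacle, all names hypothetical), not his original formulation: "in shadow" just means "some obstacle intersects the segment from the point to the light".

```cpp
#include <cmath>

struct V3 { float x, y, z; };

// Does the segment from p to l pass through a sphere?
// Test: find the closest point on the segment to the sphere center
// and compare its distance to the radius.
bool segment_hits_sphere(V3 p, V3 l, V3 center, float radius) {
    V3 d = {l.x - p.x, l.y - p.y, l.z - p.z};       // segment direction
    V3 m = {center.x - p.x, center.y - p.y, center.z - p.z};
    float len2 = d.x*d.x + d.y*d.y + d.z*d.z;
    float t = (m.x*d.x + m.y*d.y + m.z*d.z) / len2; // projection parameter
    if (t < 0.0f) t = 0.0f;                          // clamp to the segment
    if (t > 1.0f) t = 1.0f;
    float cx = p.x + d.x*t - center.x;
    float cy = p.y + d.y*t - center.y;
    float cz = p.z + d.z*t - center.z;
    return cx*cx + cy*cy + cz*cz <= radius*radius;
}

// Appel's basic question: is there an obstacle between the point and the light?
bool in_shadow(V3 p, V3 light, V3 sphere_center, float sphere_radius) {
    return segment_hits_sphere(p, light, sphere_center, sphere_radius);
}
```

Everything fancier (Whitted reflections, path-traced GI) still bottoms out in occlusion queries like this one, just fired millions of times.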
I'm tracing a picture of my friend Ray right now. I'm "Ray Tracing".
Good explanation, thank-you
Its the thing that your 4th grade teacher made you do with a straight edge and an arrow.
This made me somewhat excited for this technology. It could be really cool. I hope in 3 years games can run well using this and I can enjoy those reflections on my Freesync IPS panel.
Playstation 5 details officially came and they say it has a "Ray tracing".
At the same time some guy named ray is getting paranoid because he swears he is seeing people following him
@@toasthead Shut the fuck up
Never knew that 3:22 also explains how Atom Heart Father works
2:14 Well, you can get some pretty amazing ray-traced graphics with a regular gaming PC too, even without that much money (or probably even a regular non-gaming PC). The bigger problem is the time, I would say. It makes a big difference when you give your rendering software minutes or even hours to render a single frame of an animation, instead of requiring the computer to render multiple images per second like a game does. I'm pretty sure most people have computers capable of rendering the image at 2:30, just not at all in real time.
On the other hand if you have a lot of money you can save a lot of time. You just buy many computers and each computer can render a different frame of your animation simultaneously.
'minutes or even hours' lol... I would be glad for that! Depending on the scene, minutes for a complete and somewhat noise free frame is pretty damn amazing.
Wow in 3 years the graphics of videogames have evolved so much its amazing
Minecraft is the most realistic graphics game I ever played.
Same
But it's the ONLY graphics game I've played
You just convinced me to become a subscriber on the very first video that I watched from here. I found his explanation of a very complicated technology very user friendly! Good stuff!!
"unless you're sitting in complete darkness" how did ya know?
I think any and every game company would gladly embrace active ray tracing. It really is the "holy grail" as rather than approximating shadows in textures, faking reflections (or just outright duplicating objects), or skipping out on reflections and shadows to save time -- it would all be computed in real time creating a better result more easily.
The problem is that we are not at that point yet. If the low number of ray tracing paths and the shortcuts used to make something renderable in real time yield a worse result than just faking it, that will make companies less likely to adopt it. If the demand is too high on current systems and it tanks the frame rate to unplayable levels, companies are also less likely to adopt it.

Both of those hard limitations sit on top of the natural one: consumer adoption rate, which is an even tougher nut than for most new graphics technologies. To cover people on both old and new hardware, you would need to do double the textures, and if the demand for the graphical improvement from the limited consumer pool doesn't outweigh the cost, most companies won't bother... and that cost is higher than for most new technologies.

I've seen this before when 3D was becoming a thing: games had both 2D and 3D objects created so older and newer systems could handle them. I expect the same type of result: the 2D looked much better than the 3D, as the simple 2D image could use every pixel for detail, and losing object depth and angles was a small sacrifice compared to the detail loss required to make the 3D object manageable for the hardware of the time. Which means people with RTX cards may be playing even ray-traceable games without ray tracing until the results can beat the current bag of tricks used to fake it.
TL;DR - I think we're in the boat of real-time ray tracing not yet being effectively reached. Not only does the content not exist yet, but the results will likely fall short in many cases; I see low ray counts the same way as low polygon counts in early 3D. I still see it as the future, though, and this is the first real step. I'm hoping the gap can be measured in years rather than decades, but we'll see.
Yeah but I kinda need the mathematical logic behind this to program it in a framework.
One of the first papers (if not the first) about ray tracing has formulas for it: citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.156.1534 . But there might be improved formulas or already existing libraries/frameworks out there.
Take a physics class in optics
th-cam.com/video/frLwRLS_ZR0/w-d-xo.html heres a disney video explaining the physics logic behind ray tracing :D
Is ray tracing in MicroSoft Paint possible?
th-cam.com/video/vo9ztLY34n0/w-d-xo.html
So basically, it renders the environment in a 3D simulation, then simulates where light would travel from the light sources, and whether you're in those lines determines the shades that are rendered.
3:08 "consumer grade ray tracing". Production renderers do the same thing (trace from the camera into the scene); it'd make no sense to trace a shitload of rays that are never going to randomly hit the camera lens. So that isn't how ray tracing for games differs.
It differs from production rendering in that it only renders a few rays per pixel, and then the AI tensor cores use a deep neural network to denoise the result. Rasterizing is still the main rendering tech, and takes care of anti aliasing. In a production renderer, thousands of camera rays may be shot out per pixel for good anti aliasing, and they may be distributed across time (for motion blur) or space (across the area of a lens for bokeh effects). From each camera ray, thousands of shading rays may be cast too (reflections, refractions, both of which could be rough, requiring way more rays than glossy), and then finally directed towards a light. All the different types of shading rays, multiplied by thousands of camera rays exponentially increases the number of rays cast.
So it's really the same thing, just with far fewer rays.
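The "exponentially more rays" point is easy to see with back-of-envelope numbers (the per-pixel counts below are hypothetical, just to show the scale gap between real-time and production rendering):

```cpp
#include <cstdint>

// Total rays for a frame = pixels x camera rays per pixel x shading rays
// spawned per camera ray. Every extra layer multiplies the budget.
uint64_t total_rays(uint64_t pixels, uint64_t camera_rays_per_pixel,
                    uint64_t shading_rays_per_camera_ray) {
    return pixels * camera_rays_per_pixel * shading_rays_per_camera_ray;
}

// For a 1920x1080 frame (2,073,600 pixels), with made-up but plausible counts:
//   real-time RTX-style:  ~2 rays/pixel, then denoised   -> ~4 million rays
//   production offline:   1024 camera rays x 64 shading  -> ~136 billion rays
```

Same algorithm on both sides; the real-time version just shoots a tiny fraction of the rays and leans on the denoiser to hide the difference.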
Was thinking this exact thing when I got to that part of the video, legit had to pause and have a "hang on, what did he say?" moment.
Also, fun fact, Turing also accelerates the actual process of tracing individual rays, as the RT cores accelerate structure traversal and intersection tests. That, combined with the AI denoiser cleaning up the image, is what allows Turing to do "real time" raytracing.
Description of rasterization was all kinds of funky as well..
Most nvidia demos used standard bilateral filtering and so on, will be very interesting to see what techniques we will see in future.
Well-made video. Thank you for keeping it to about 5 minutes.
Ray tracing will be the future of lighting in games, real time always follows production as hardware allows (PBR being one of the most recent huge jumps in quality brought in from production rendering) but I don't think it is anything more than enthusiast right now. Just imagine not needing to adjust shadow map quality, or enable some shadow softening options, or even decide which AO you want because those are all tricks and techniques to try and simulate those things ray tracing just does. It doesn't even necessarily help production, as we've had real time previewers in things like octane and even the free blender for years, so personally I'm going to hold off a few generations. Very exciting advancement though, and great explanation.
give it a few years..
I really hope so!
In few years, people wont even remember they complained about ray tracing.
Its not what you wanted but it IS what you need.
it does help production IMO, at least for individuals. I mean, with RTX integration into many raytracing render engines, you see quite a decent speedup, which is great, especially for previewing a scene (for finals, you might have a renderfarm anyway).
You told me something I actually needed to know in under ten minutes!
Schools should take notes
But to understand ray tracing, we must first talk about alternate universes
As a KiCad user, it can take minutes to hours to trace out a moderately complicated PCB design with component models, even on a dual Xeon E5-2680 machine with the consumer-grade reverse tracing technique employed by KiCad. However, the tracing result from KiCad is often a superb visual experience.
No idea why I watched this when I know more about ray tracing than Linus knows about how to hold his PC hardware firmly in his hands.
Anyways, almost every game developer has dreamed of being able to use ray tracing in their games some day. I'd say that there is 0 chance that developers won't develop games with ray tracing support. It's more likely that they'll try to do anything to help the ray tracing technology be available to as many people as possible so that they had a chance to rely entirely on it. Ray tracing can save so much work for developers, you can't even imagine.
Not only that, but it's actually less work to develop for a ray tracing API, and for the most part rasterized work will only be improved by ray tracing techniques with zero input from the developer (unless they want to aim for photorealism, of course). I can see engines like Unreal and Unity supporting both ray tracing and rasterization really soon.
one of the better explanations of raytracing. nice vid 💪
0:11 pause it, he looks serious
mom i want linus tech tips
we got linus tech tips at home
linus tech tips at home:
2:59
>Linus : one frame per day
>Image says 29 hours
There is 29 hours in a day.
Thank you so much for explaining Linus
We've been using ray tracing in Blender for years, yet for games it's difficult. Within the next few years we will see games use ray tracing
From deep takes to real time ray tracing, TECHKNOWLEGEDY!
It feels like when we went from 2d graphics, to 3d graphics, to 3d photo-realistic graphics and people said that it "probably wasn't worth the power required since it was going to perform so poorly"
To anyone *not* excited about it. Just remember things iterate. And this is one of those things normal people dont consciously notice or think about
I watched different videos and different channels but somehow are all him. This guy is everywhere.
I really hope this feature will become the new norm, it'll be a giant leap in graphics.
And yes a huge cost in performance, but as people have said this is just the first generation, you gotta start somewhere.
"become" is the word. There is nothing for the normal gamer to use it for.
Indeed. Even with better drivers and software to work with, I don't think you'll see ray tracing built into much beyond mainly triple-A games, and you'll be lucky to see 60 fps at 1080p with a 2080 Ti or 2080. It requires a lot of power to render each second, and the fact that Nvidia and AMD have been able to do it in real time at 30-60 frames per second is damn impressive.
Seeing as how a lot of the PC Master Race is allergic to sub-120 fps images, we probably won't see much sheeple celebration for it for a while either.
It's exciting tech but give it time. Not currently practical if you prioritise fps over visuals (like competitive gamers)
Why is nobody considering that using RTX actually reduces the load on the shaders?
If developers allow people to choose the RTX complexity in the scene you could easily have a gorgeous 60fps game while still allowing much higher frame rates at the cost of it not looking as good, but likely STILL better than shaders alone.
Someone correct me if I'm wrong... ray tracing determines how an object in a game would look when light interacts with that object, which gives it a shadow in certain areas.
something that costs you 1.2k
Because you HAVE to buy the top-end model, right?
Ey
RT seems most usable for VR (as it already uses polygon counting for rendering the environments and objects), with rasterization for regular 3D rendering on a monitor or TV.
Is anyone else in complete darkness?
Me right here. The first 30 seconds did nothing to explain it to me or make sense. How in the world am I going to follow another line if my eyes can only see to that point? That's what I don't understand.
On a simple note... I hope games begin to use it. It does seem like the next step in graphics, and I would appreciate it if we evolved a little bit in that direction. It doesn't seem to be a complicated tool for developers to use either.
Wtf is am incidentally sitting in a dark room.
Bhuvan Chandra And me
Engrrrish.
Is your screen on? Because that is a source of light.
im sitting in a room with light. but that light comes from behind my slightly open door, not giving me a visible light source to use
@@pr9101 That just means you have to switch to path tracing.
1:05 how could you even say "usually triangles" :D
I would like to see ray-traced lightning in real life, with an Nvidia-series card.
rtx: it’s on
Kidney: it’s gon
BUT I AM SITTING IN COMPLETE DARKNESS ITS A LIFESTYLE CHOICE!
I call BS, how would you watch the video? :p
I am sitting here in bright sunlight "puts on shades"
You're the funniest thing I've ever seen
love this channel so far! super useful and cool info in not a lot of time! well done
Anyone here because next gen consoles have ray tracing?
2 years behind
"Just buy it™"
~ Avram Piltch
Can't understand a thing
Such great clarity in your explanation, it just educated me about ray tracing... thanks for the video, Linus...
Hi. This has been in 3D software like Maya for 20 years or so, but done in software, using the CPU (slow, not in real time, unless it's a render farm, like mentioned).
God bless, Proverbs 31
a renderfarm is usually used for animation rendering tho, to keep rendertimes below centuries, so I wouldn't really call it 'real time'. But I mean, real time is pretty vague. If you have a good PC, you can often preview your scene somewhat interactively (if it doesn't contain really computationally expensive things like detailed volumes etc).
Thanks I was quite curious about how this works.
"Ray tracing has made headlines lately as the rendering method of the future for games." Didn't we use ray tracing for a really long time? We just baked lights onto textures to save CPU power.
No, we didn't use ray tracing before this in 3D games, before baked lights we just didn't have lighting
@@alexbaker1569 im just saying that we had baked lights for a long time.
@@IAMDIMITRI true, but that's not what you were saying, you said we have used ray tracing in games before, which is false
@@alexbaker1569 That's what you do in, for example, Counter-Strike 1.6 when you create maps: light gets ray traced and baked onto textures. I'm certain the whole Quake engine uses this method.
@@IAMDIMITRI yes, we used ray tracing to create maps, but once the maps were created it was baked in. We had never before used real-time ray tracing, the kind that is now starting to be used in games.
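A tiny sketch of the bake-then-look-up idea this thread is describing (toy 1D floor, made-up occluder positions, names mine): the expensive traced query runs once offline per texel, and runtime "shading" is just an array read.

```cpp
#include <vector>

// The expensive offline query: is this spot on the floor shadowed?
// Here a made-up slab blocks the light over x in [2, 4].
float traced_light(float x) {
    bool shadowed = (x >= 2.0f && x <= 4.0f);
    return shadowed ? 0.1f : 1.0f; // ambient-only vs fully lit
}

// BAKE (done once, like a map compiler's lighting pass):
// trace one query per texel and store the results.
std::vector<float> bake_lightmap(int texels, float floor_len) {
    std::vector<float> map(texels);
    for (int i = 0; i < texels; ++i)
        map[i] = traced_light((i + 0.5f) * floor_len / texels);
    return map;
}

// RUNTIME: just a lookup, no rays at all. This only works for static
// scenes, which is exactly why baked maps can't react to moving lights
// or moving objects the way real-time ray tracing can.
float sample_lightmap(const std::vector<float>& map, float x, float floor_len) {
    int i = (int)(x / floor_len * map.size());
    if (i < 0) i = 0;
    if (i >= (int)map.size()) i = (int)map.size() - 1;
    return map[i];
}
```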
You only need to watch until 0:30
ps5 anyone?
Yep
No
Yee
Noo
Noshit
Thank you. I hear it all the time but didn't really get it. This was super simple. 👌
0:49 wait a second, you used that same clip of Minecraft from "Intel's coolest CPU... Thanks to AMD!" video! Do you guys even play Minecraft?!
Yeah, they spend their every waking moment playing games so that they can put different clips on their videos for your personal enjoyment.
No, they have more important things to do than play games during work (at least most of the time).
@@Mismatch- lol, I was joking.
I have a question... if I use a DVI-D (dual link) to DisplayPort adapter with an RTX GPU, does ray tracing performance get affected?
PhysX 2.0
How could it be PhysX 2.0? The Ray-tracing technology has nothing to do with physics at all.
@@TheFoxyGamerOfficial I was being sarcastic... look, this is my personal opinion, but picture it like this: a giant corporate chess game, and Nvidia just made their move, just like they did with SLI, PhysX, GameWorks, GPP... and now ray tracing. The question is, does the other side have a counter move, or is it game over? I compared it to PhysX because that's another technology customers didn't really want at the time it was introduced. Bottom line... it's business, that's what it is.
What do you mean by a counter move? Ray tracing has been around for 4-5 years; almost nobody is using it. That's why for gamers there's no point in using Windows 10, because ray tracing is only used in DirectX 12.
I enjoy ray tracing very much and prefer it. The one thing I have against it is that sometimes I feel the reflectiveness is a little strong. I'm not an expert or a coder, though, so is this from ray tracing or something else? For example, 1:55 - 2:01 shows the images side by side with a very noticeable difference in light reflection between the two, BUT why does the teapot have a mirrored effect while the normal one has absolutely none? Was that only introduced with ray tracing, or is it there to make the second image more eye-catching?
the latter. You can simulate rough and diffuse materials in raytracing too. It just isn't as different from normal game engines.
*doesn't look as different
Would this give me a ton of FPS?
Because I couldn't care less about ultra-fancy-looking tech if it won't give me a shit ton of FPS. FPS is what makes a game feel great. A good-looking game that can reach 144 Hz is always superior to an ultra-looking game that can only be spewed out at 60.
>"only" 60fps
Console peasants be like :(
We don't know yet. People keep drawing conclusions from games that had two weeks to implement RTX on alpha drivers and assuming that's what we are going to get.
Logically, if RTX handles things shaders traditionally did, then in some circumstances it might increase the frame rate, depending on whether you can tweak the accuracy of RTX to get a balanced GPU load.
I'd rather play at 60 Hz with ultra realism than at 144 fps with crappy visuals. FPS above 60 means nothing at all unless you're into competitive gaming, and even then it isn't a must, just for those who want the best result possible.
Thank you for this simple and easy explanation.