Ding ding. This guy gets it. The issue is Unreal Engine having a stranglehold on the industry with its baked-in ray tracing. I have a 3090, a 4090, and a 4080 Super; they all get crushed in Unreal 5 games. I fully expect this to continue.
He is right on the money when he says it is forced. Developers are intentionally making the game look ugly as hell when you disable ray/path tracing compared to games before this crap was invented.
@@bosskey23 Yeah that is the sad truth. They will never grind like they used to in prior generations when they made games. Today everything is automated and has a plugin.
Good for you, but at a certain point you may not have an option, or it will be made unfeasible for you. Devs can always cripple the raster lighting (which they could do great, absolutely satisfactory things with if they wanted) to make people use RT, because it's less work for them. You using RT and upscaling with frame gen is a golden goose for the corpos making games. They will always use those crutches to save on development time and talent, but people will be paying for their savings.
Sorry but that game looks amazing with PT. Sorry that you feel the way you do. Not sure why we can't enjoy PT and work on optimizing it rather than acting like butthurt little children.
I feel like the lighting quality gap between rasterization and path tracing is shrinking as rasterization continues to advance. Yes, rasterization is more work for developers, since they have to manually place light sources and rely on approximations. But with advanced techniques like sophisticated shaders, they can achieve high-quality, accurate lighting, and I think that quality will be enough to satisfy nearly all gamers. Rasterized lighting can still look incredible; while it may not be quite as accurate as path tracing with lots of rays per pixel, it can look very high quality at a fraction of the computational cost. I think most gamers would rather have high-quality, accurate lighting at high FPS than photorealistic lighting that drags their GPU below 15 FPS.

The biggest problems with RT and PT are noise and artifacts, because games can't use enough rays per pixel; the hardware is nowhere near powerful enough. Denoisers have to fill in the pixels the rays never reached, which creates low-quality, inaccurate lighting as the denoiser blends the sampled rays together. The higher the ray count, the better the lighting looks: more accuracy, less noise, fewer artifacts. The thing is, hardware will never be powerful enough to run games at a ray count high enough to make denoisers irrelevant. Current-gen path-traced games such as Alan Wake 2 use about 3 rays per pixel, and even the 4090 struggles with that: at native 4K max settings with PT, it gets around 25-30 FPS. Offline renderers for movies typically use 1000+ rays per pixel, and even that's not enough to completely eliminate noise and artifacts.
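The noise argument above can be sketched numerically. This is a toy model, not any real renderer: each "ray" is a random hit-or-miss light sample, and the spread of the per-pixel estimate shrinks only as roughly 1/sqrt(rays), which is why quadrupling the ray count only halves the noise and real-time RT has to lean on denoisers.

```python
import random
import statistics

def sample_pixel(n_rays, true_brightness=0.5, seed=None):
    """Estimate a pixel's brightness by averaging n_rays random samples.

    Each 'ray' returns 1.0 if it happens to hit the light, else 0.0,
    with probability equal to the true brightness -- a crude stand-in
    for Monte Carlo path tracing."""
    rng = random.Random(seed)
    hits = sum(1.0 if rng.random() < true_brightness else 0.0
               for _ in range(n_rays))
    return hits / n_rays

def noise_at(n_rays, trials=2000):
    """Standard deviation of the brightness estimate across many pixels."""
    estimates = [sample_pixel(n_rays, seed=i) for i in range(trials)]
    return statistics.pstdev(estimates)

# Noise shrinks roughly as 1/sqrt(n_rays): quadrupling the ray count
# only halves the noise, which is why real-time RT leans on denoisers.
assert noise_at(64) < noise_at(4) < noise_at(1)
```

Going from 1 to 64 rays per pixel here only cuts the noise by a factor of 8, which gives a feel for why even 1000+ rays in offline renderers still leave residual grain.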
@@Frogboyx1gaming Handcrafted rasterization gives game designers and artists more control over aesthetics and art styles, because rasterization is completely manual. Rasterization is the king of gaming graphics and the best rendering technique for video games. It will not be completely replaced, at least to me, until ray tracing becomes the game changer they claimed it was. For that to happen, hardware needs to get substantially faster.
And I have a feeling that you are literally clueless, dude. There's no point playing Alan Wake 2 at 4K native when DLSS looks the same if not better. I had no problems playing this game with full PT on my RTX 4080S (80-110 fps), and that's very impressive considering how amazing the lighting in this game looks. Yes, I saw a little bit of noise, but it wasn't excessive like in Quake 2 RTX (PT) or Half-Life 2 RTX (PT), and most of the time it was something I could ignore. Without RT this game has a lot more artifacts, because it's still using real-time GI, ugly SSR, and noisy shadows.
At least you'll have the necessary amount of fps to run your games. I'm still stuck with a Radeon 7 and an Intel core i7 47xx. Roughly a 10 year old rig. I hate how expensive this industry has become. It's totally unnecessary.
@@Serandi1987 When I bought my Asus 4090 ROG Strix I just wanted the crème de la crème graphics-wise, and performance as well. Little did I know that in order to decently run certain triple-A games I would have to sacrifice the very thing I wanted most in the first place. Today I have to rely on mods to improve both visuals and fps. What a joke these high-end GPUs are.
@@elban73 I feel like we peaked in raw graphics presentation years ago so Nvidia had to come up with a grift and convince people they need ray tracing and path tracing. 4K native with high fps and baked in lighting and reflections is all I want and need.
I think the real issue is that the return on investment is getting lower each time. Graphics cards are much more expensive, but the graphical improvements are not that noticeable. In fact, a common trend is that new games are blurrier than before (due to TAA and upscaling fakery), higher latency (due to fake frame generation and overall lower framerates), and a stutter hell (mainly due to Unreal Engine 5). I've also noticed some microstuttering in ray-traced games.
@@Frogboyx1gaming I've had the RTX 4090 since it released, and every game I've tried that offers ray tracing (not counting Cyberpunk 2077) gives me less than an hour of play time before crashing or freezing with ray tracing enabled. I play all my games with ray tracing disabled, because in my opinion it's not worth the performance hit, let alone the headache of the game freezing and crashing after an hour or two. Ray tracing is still in its infancy, even with the RTX 40 series being Nvidia's third generation of graphics cards with ray-tracing capability. We just aren't there yet, truthfully. If someone is considering a high-end graphics card, the 7900 XTX should be their first choice, then the RTX 4080, and then the 7900 XT on down.
@kevinerbs2778 Control was released during the RTX 20 series; I haven't tried that game since my RTX 3080 Ti. A lot of the newer games that have ray tracing implemented don't do that well as far as not having issues.
@@bigdaddyt150 Played Control with RT enabled from start to finish; never once did it crash on me. Another game, Dying Light 2, also never crashed with RT enabled, though it is more demanding than Control.
@@chrisking6695 Performance hit (expected) and visual artefacts (also expected; the hardware isn't close to capable enough), which necessitate the use of DLSS, which vastly reduces image quality unless used at high resolutions. Is it worth the double-digit performance hit and the general visual quality hit? Perhaps, but not for everyone.
@@sulimanthemagnificent4893 No, it's not worth it, and ray tracing is only good, like transformative, on very few games to this day. And even then, I don't like playing those games, such as CP2077 and Indiana Jones. The games I'm going to play the shit out of are Monster Hunter Wilds, Doom: The Dark Ages, and Elden Ring Nightreign in 2025. These will probably have ray tracing, but I bet none of them will be a game where RT is so good that I'm missing out even with the performance hit. And also Subnautica 2, which I won't have a problem with at all, performance or RT lol.
@@MarcusT3ch The 5090 isn't for gamers, whether you like it or not. Why don't AMD and Intel have such a graphics card? Well, because AMD and Intel target gamers. So stop crying and buy Intel or AMD.
Stop playing day 1 games. They are released half baked anyway. Give them a year or two, or 5. Now your hardware requirements are greatly diminished. Game gets optimized, and the bugs get worked out. Get over your FOMO
Launch the game, then after a year offer a PSYCHO RT mode with 4 times as many rays. Oh, what, your 5090 can only get 30 fps at that setting with performance upscaling and frame gen on? Better buy the 6090.
So true. This is how NVIDIA is forcing generational upgrades. If it's not the VRAM shortage then it's the RT performance. At this point RT is not a bonus, it's a requirement.
It's already happening with Indiana Jones. The funny thing is, despite the game being bundled with new RTX card purchases recently, not many people are actually playing it. It's just a massive echo chamber in the forums.
@@michaelbuto305 The gaming industry literally plays into NVIDIA's hands, since NVIDIA holds a 90% market share. Game developers are going to optimize their games so that NVIDIA has a reason to further push and market ray tracing as a "standard".
Good video. Companies these days, especially NGreedia, have learned to incorporate planned obsolescence into their product lines. There will always be a way for someone in the pipeline to gimp any card, including the flagship offerings. Shit sucks.
The 1080 Ti still works for 1080p. The problem is that not enough people have upgraded to 1440p 144Hz; then they'd need to create more competition. Right now, they sell 1080p cards for $299.
Ray tracing today is just what tessellation was 15 years ago: it required specific hardware, Nvidia cards were the first to utilize it, and AMD eventually caught up. But in some early games, before AMD added hardware-controlled tessellation limits, devs would ramp up the tessellation and it would cripple performance on AMD cards; I suspect this was planned by Nvidia to make AMD cards look bad. Now we have proprietary RT tech being used in games that heavily favors Nvidia cards, but the funny thing is not even Nvidia cards can run them properly. Again, it's probably by design, so everyone continuously upgrades their PCs to support the latest RT title. However, there is still some hope, as a lot of game devs are either not using UE5 or are implementing different lighting techniques with great results.
A memory bus can be thought of literally as lanes of traffic: the more lanes, the greater the flow. The graphics processor is connected to the RAM on the card via this bus, so it's important to look at the card's memory bandwidth (GB/s) as well; in effect, that is how fast the traffic flows down the lanes. You need both a wide bus and fast bandwidth for the best overall experience, because slow bandwidth means the traffic moves more slowly no matter how many lanes there are.

What I've learned in the last few years is to also keep an eye on the amount of VRAM. Nvidia has used a lack of VRAM to build planned obsolescence into older generations of GPUs, and the hardware ray-tracing requirements now appearing in newer games are the writing on the wall for a lot of older cards. I'd also now suggest getting a GPU with a minimum of 16GB of VRAM; just my personal thoughts. A good foreshadowing of GPU hardware requirements is Indiana Jones and the Great Circle, so look at benchmarks of that game for any new GPU you are thinking of getting.

With ray tracing, I think we may eventually see a graphics-menu setting that lets you set how many times the GPU bounces each ray, say 4, 8, 12, 16, and so on, so you can dial the ray workload to a level your GPU can handle. Finally, the influence Nvidia has over game development due to its market share is massive, and this needs to end; the only thing that will change it is Intel and AMD gaining market share.
To further your analogy: interface width represents your traffic lanes. Bits or transfers per second are the speed limit. Latency and timings are like traffic light timings that can slow down traffic if they're not set right.
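To put rough numbers on the lane analogy: peak bandwidth is just lanes (bus width) times speed limit (per-pin data rate). The 384-bit / 21 Gbps figures below are the RTX 4090's published memory specs; treat the helper as a back-of-the-envelope sketch, not a full model (it ignores latency and timings).

```python
def memory_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: bus width (lanes) times the
    per-pin data rate (speed limit), converted from bits to bytes."""
    return bus_width_bits * data_rate_gbps / 8

# RTX 4090: 384-bit bus with 21 Gbps GDDR6X -> 1008 GB/s peak.
print(memory_bandwidth_gbs(384, 21))  # 1008.0
# A narrower 256-bit bus at the same data rate only manages 672 GB/s,
# which is why bus width matters as much as memory speed.
print(memory_bandwidth_gbs(256, 21))  # 672.0
```

This is exactly why two cards with the "same" memory chips can have very different bandwidth: cutting the bus from 384 to 256 bits removes a third of the lanes.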
After 30 years this guy has finally figured out Nvidia's business model of always having a new, faster graphics card to sell you. As far as ray tracing/path tracing goes, it's big-time diminishing returns after you hit 50 rays per pixel, and then 200; above around 200 it takes several thousand to look a notch better. There may also be real-time AI filters on the horizon, where you can pick through filter options for how you want your game to look. In 2-4 more generations the tensor cores may be able to do that.
We are truly in the era of diminishing returns. Games in general do look better than they did in 2015-2018, but not in proportion to the cost increase. The implementation is the issue right now: we (the gamers) are paying directly for the development-time savings of the big game publishers. CD Projekt Red's Cyberpunk 2077 really is a well-optimized game now (not so much on release :)); it runs extremely well even on AMD APU handhelds. Anything on modern Unreal Engine 5 generally runs like crap even on the top end, usually because the publisher or licensee cut the cost of developers who know how to optimize the engine and make it bespoke (there are some well-optimized UE5 games, just not many). Apple sort of used this model to drive yearly phone upgrades in an analogous way: instead of software cutting a phone's battery life every two years, it's Nvidia giving game companies and engine makers a "new spec" that leaves older cards in the dust with minimal upgrades to fidelity.
This is why Nvidia limits VRAM in all cards aside from their xx90 series. It's planned obsolescence. Either buy the $2000+ GPU, or expect your current card to be tapped out by the time the next cards roll out.
@Frogboyx1gaming and they shouldn't break. This entire industry needs a kick up the backside. The market needs a big correction and Lisa Su either needs to leave AMD to allow for another CEO to compete harder, or let the competition flourish like Intel and ARM. I miss the competitive days of Nvidia vs. ATI.
This is just the way it is dude. The issue is that modern GPUs cannot render games real time at 4k native. And I'm not talking UE5 engine. There are older games such as RDR2, which is heralded as the poster boy for how to optimize a game, that still bring a 4090 to its knees without DLSS. We are simply at a point where we need other methods to keep FPS up if we want to continue making games that look next gen for 4k gaming. But of course I could be wrong.
What's up man, I've got a PC question for you. When you built your last PC, did you re-purchase Windows or were you able to reactivate it? I ask because I'm replacing my motherboard and want to know what to expect on power-up. Does it load directly into Windows, or do I need to do something to get my Windows back? Thanks in advance.
Been saying this from the very launch of RT. Even Nvidia hardware can't run it without cheats like DLSS. And suckers line up to buy cards that can't run 4k native at 60 fps with RT enabled, no matter how much money you throw at it.
4090 plays Dragonage Veilguard at 60fps 4K native with max RT. It's a stupid way to play it though. DLSS and frame generation work perfectly in that game.
This is basically the same as when they kept increasing polygons. At some point there is diminishing returns and you just cannot go further up. I believe that raytracing will hit a ceiling, but Nvidia is already working on the next big thing which is "neural rendering". So yes, they will always have something new to add but not always more raytracing. Also, soon they will need massive NPU power to run the brains of AI for games.
I bought into the RT thing on my last GPU upgrade with a 4070 TI Super and now having experienced it, I'm over it. The problem is I don't have anywhere to go except Nvidia. Both AMD and Intel are focusing on low to mid tier offerings. Maybe a used RX 7900 XTX is in my future?
Rays scale, just like resolution and triangle density, to some point of diminishing returns. You wouldn't call resolution a never-ending money-making scam; more is generally better, so why look at ray tracing any differently? Current RT has issues with the huge performance impact and low ray counts leading to artifacting, but it is a necessary step for better graphics long term. The 4090 will do 4K raster with baked lighting at 100+ fps easily. There is plenty of space to improve, but we are clearly into diminishing returns.
True, agreed. It's just that people who want to do productivity work sadly have to go the Nvidia route. If AMD is for gamers, then their cards should be priced even lower. The RX 7600, RX 7700 XT, RX 7900 GRE, 7900 XT, and 7900 XTX are currently available at MSRP after the inflated prices. Why would I ever choose the 7900 XT when it costs the same as a 4070 Ti Super? There's never a discount.
I got that Turtle Beach wheel and pedals hooked up. I definitely think it's worth checking out for the price. I got a folding Podium One rig for it, just like I've always used with my Logitech set, but it can barely handle that direct drive, and I don't think it will for very long. Hope you're recovering from the hack BS well.
@@Frogboyx1gaming Brooo, You're so right! And I don't know why the heck I said Podium One🤣Seems like you knew what I meant though. I have the Next Level GT Lite, which is the same price as the Victory. I'll see if I can send it back to order the one you suggest.
Exactly true. We were promised ray tracing in the 20 series. It had a fleeting moment with ray-traced reflections on the 3090 and 3080 cards. Now, 40 series cards other than the 4090 can't afford the overhead; while the 4080 has the grunt, it lacks the VRAM. The 5090 should be able to do it for probably 12 months, max, at 4K.
All the upscalers look good when upscaling to 4K, but the majority of PC gamers are on 1080p and 1440p monitors, where upscaling looks horrendous: quality, balanced, and performance modes all look terrible on a 1080p display, and on a 1440p display anything below quality looks terrible. On a 4K display you can use performance upscaling and have it look better than native 1080p or 1440p. As for the endless chasing of RT, it's been the same with every new visual tech on PC since the 90s. It was the same with soft shadows, new lighting techniques, bump mapping/parallax occlusion mapping/tessellation, volumetrics, HairWorks/TressFX, etc., etc.
@@Frogboyx1gaming They did the same with previous stuff as well. Shadows used to be the big performance killer: they'd keep increasing the resolution of the shadows in new titles and you felt like you were back to square one. Tessellation actually ruined AMD/ATi for a good while too, because it was pushed by Nvidia and ran better on their cards. The difference now, I guess, is that they're still increasing the resolution and there are more ray bounces coming at the same time.
It's actually not really unlimited; the real limit is how many pixels your monitor uses. With a 4K monitor, one uncompressed 32-bit frame is about 33 MB, so at 60 FPS your system has to move roughly 2 GB/s just for the finished frames, before counting textures and geometry. Adding ray tracing easily multiplies the memory traffic on top of that, which is how the VRAM eventually gets cluttered.
Another thing game developers can do is make it so every new game released has ray tracing baked in with no off switch, requiring a level of GPU that can run it whether you can afford one or not. You want to play this game? Then you have to have this GPU, end of discussion. Ray tracing is nice, but damn, not every game needs to be photorealistic. Make a game with good visuals that any GPU can run, give it a good story, and just release it not broken. It sucks when a game ships before it's finished and needs multiple updates to play right. Let's slow down, loosen the tie, and release the game when it's finished and bug-free; deadlines be damned. I can't be the only one who feels this way.
AMD never pushed for ray tracing when Nvidia started campaigning for it... We all should have known there was some hidden agenda behind that campaign... We were just fine improving rasterization techniques, and we were getting close to visual realism without any considerable performance hit, until the advent of ray tracing and UE5 being built around it... Light geometry can easily be simulated to near the real thing in raster... Imo we should ditch ray tracing completely...
It takes 4-5 years to develop a GPU. AMD didn't jump on the bandwagon because of the amount of work that was already done on their upcoming cards. They slapped in the bare minimum and hoped for the best. After all, the current gen consoles are based on their RDNA architecture so that will be the baseline. Mark Cerny made it pretty clear that we hit a wall with how far raster lighting can go. It's obvious PS6 will be based around trying to achieve path traced lighting.
@Phil_529 Mark Cerny saying they hit a wall is not as plausible as if AMD said it... but what do I know. The present gen does not look or feel like a generational leap from last gen, and despite the hardware improvements on paper, it does not translate to what we see; present-gen hardware tanks badly under pressure, especially in UE5. I do not want to believe that all the game devs suddenly became lazy... People are blaming the devs, but no one is talking about the engine used and the render options...
Depends on the implementation, i started GOW Ragnarok after 100 percenting Indiana Jones and to me Ragnarok and Forbidden West (which I'd played earlier in the year) easily beat Indy when it comes to visuals and neither of those games have RT.
I am telling you, this is the beginning of the end of Scamvidia. And another thing: through all these GPU-scalping years, game developers and Steam kept a deafening, suspicious silence, when normally they should have been all over the place complaining about how their sales had dropped.
The new Indiana Jones game requires a RT hardware GPU. They're able to utilize RT cores for lighting while not using ray tracing. Reading their System Requirements - GPU Hardware Ray Tracing Required. I would believe it was a scam before, but if games will require these RT GPUs, then I guess it really isn't a scam.
Games will not even allow you to turn RT off XD. We're so screwed; if they keep requiring more, we'll have to upgrade every gen and resale value will be low.
From what I can see the 5090 is the only one that has been redesigned. Isn't the tell in how much they are throwing into the memory bus and quantity on that card? It looks like the others they threw in some minor core increases and faster memory and that is about it.
The 5090 looks like this generation's 1080 Ti. AMD is missing the RT and the power. I have a 7900 XTX; FSR is trash, and I don't use it if I don't have to. I do want to use RT.
You’re 100% correct everything will be streamed. There will come a day where hardware is a thing of the past and everything will be a subscription. You will own nothing and you will be happy.
Eventually you hit the point where you cannot get the image better with more work. The human eye has limits and monitors even more so. That said it's a ways off. Unless there is a breakthrough somewhere it's at least a decade out.
It's not a coincidence that Nvidia gimps their cards' VRAM in order to upsell you to the next tier, despite the GPUs being very capable in terms of performance. Good examples of this are the 3070, 3080, 4060, and the 4070 and its iterations other than the Ti Super variant. Those cards are very capable, but because of the lack of VRAM they perform worse than their counterparts from AMD.
I did the opposite: I didn't buy anything new after the 1080 Ti and saved a lot of money. Thanks to UE5 I have to upgrade now, because the 1080 Ti is too slow for it at 1440p. With the next card I'll still go for raster power alone; if it's fast in RT, that's nice to have.
The main issue with Nvidia is that even after AMD announced FSR would run on any hardware, Nvidia said DLSS would only be available for certain GPUs; maybe the newest DLSS will only be available for the 40/50 series. AMD for the win; I hope they release FSR soon in January 2025.
@Phil_529 FSR4 will probably need AI cores, but people can still use FSR 3.1, which is good enough for older hardware. Too bad Nvidia bribes developers not to implement FSR 3.1 in their games; Cyberpunk, for example.
@@HoretzYT You realize it was AMD who was blocking DLSS from being in their sponsored titles for years, right? Cyberpunk shipped with the best AMD upscaling tech available at the time (FidelityFX CAS and later upgraded to FSR 2). It wasn't until Starfield that AMD was publicly shamed into not doing that anymore.
Ray tracing is a very nice feature in gaming. I don't buy expensive cards just to get better ray tracing; $1500-2000 is too much money for this purpose. Thank you for this great video 👍
Nah, it's not just because of ray tracing. Developers have just gotten lazy; they don't really optimize that much anymore, especially when all triple-A games are on UE5. Just look at Warzone in CoD: they don't compress textures anymore, it's all texture streaming (each texture package is like 4K quality), and that's insane. Sure, ray tracing and path tracing are going to stress the fuck out of your new-gen GPU, but each studio is relying more on upscaling and frame gen, and that's why us not getting 20GB+ cards is so bad.
Problem is, shortly we are going to be in a situation where Nvidia offers 16GB (unless you spend insane amounts), and AMD also offers only 16GB. The 20GB and 24GB cards are being phased out by AMD; their new 90-series cards only go to 16GB.
Just don't buy the game; look how badly The Great Circle sells. At least in Wukong you can run it without RT. Also, yeah, Nvidia cards below the 4070 Ti Super aren't worth it even if you run RT, due to VRAM limitations.
Remember when tessellation came out with Assassin's Creed, and one vendor did really well with an open implementation while the other's was closed and performed poorly, and then they removed it altogether? Yeah, I remember.
For the record, the 4090 traces something like 8-10 rays per pixel, and to get where it needs to be is more like 1000 per pixel. We have a ways to go. Machine learning is not AI; they're different concepts, and the industry is getting all this stuff confused. But hey, if Nvidia gets neural rendering working well, great. At some point we will hit performance walls.
I don't care about RT because their prime example, Cyberpunk, is doing it poorly in my opinion. In which universe do wet floors turn into mirrors? Some surfaces turn mirror-like when wet, but with RT it's like every wet surface becomes a mirror 🤮
I've been rolling with AMD because they always had reasonable prices and extra RAM, while Nvidia GPUs at parity had less RAM and cost a few hundred dollars more. But this time I was thinking about Nvidia for RT and 1440p gaming, and now I don't know. I was planning on getting a 4070 Ti Super, around $700; now I'm thinking about getting the AMD 7900 XTX.
The next gen is literally around the corner; in what world does it make sense to buy a current-gen card when the next gen is out in a few weeks? Even if next-gen cards are a bad deal proportionally, you'll still get more bang for your buck. A 4070 was considered a dud this gen, but it was still a way better deal than a 3080 at launch; same for the 7900 XT vs the 3090 Ti. Don't ever buy a current-gen card when the next gen is about to launch; you'll be angry.
@JynxedKoma i'm starting to get sick and tired of RT. It's just too demanding for most cards. You pretty much have to buy the very highest of high end cards, and even those still have issues with some games. Nvidia is really starting to look like a bag of assholes with their prices.
We are at a point now where ray tracing looks incredible. I'll be upgrading to a 5090, and I won't see a reason to upgrade again for a long, long time.
Nice theory, but it will never happen; it would only cause stagnation in sales. A denoiser plus a certain number of rays per pixel will look so good that even doubling the ray count will change almost nothing. People would stop using high settings and keep their GPUs for longer. It's really obvious that the next thing is AI computing used for textures and geometry; BTW, we know it from leaks. Then the next step will be AI for NPCs.
I think NVIDIA makes better cards, but I'm also down to get the entire gaming community to boycott them until they make their high end GPUs more affordable. The problem with that idea is that their top of the line GPUs are just objectively better than the competition, so people are going to buy them regardless. They know this, and that's why they continue to price the way they do.
Devs are going to go with RT and DLSS. It simply costs too much money to rasterize the lighting and create highly detailed 4K textures. The cost of making a game is doubling every gen, a 100% increase, but game prices only go up 16%, by 10 dollars. It's not sustainable in the long run. AI textures and PT/RT are the future.
But the problem with ray tracing, as of now, is that we need more rays and more bounces per ray. That's why every game with RT is such a blotchy, dark-in-places, unstable mess: there is just not enough data to create a stable, well-lit image. Real-time ray tracing isn't really possible on current hardware, which is why there is so much fakery, averaging, and AI up/resampling. It's still too early to run RT on current hardware; we should have started the transition to RT much later. So, YES, game developers will keep creating more computationally expensive games with RT. If games were as simple as what Jensen showed at the first RTX 20 presentation, a rectangular room with a sphere and a cube in it, then yes, we could devote all the resources to ray tracing and have nice lighting. But games are not just a cube and a sphere in a room. RT is not like the scam with tessellation in some games, where objects were given more complex geometry that was indistinguishable from the same object without tessellation; tessellation is a good technology, but it was abused in some games.
I'm running a prebuilt I got in 2022, Ryzen 9 5950X / EVGA RTX 3090; it cost me £3100. Agreed, the 90 class brings a tonne of VRAM, so there's never any stutter or problems with loading textures. However, I hate that Ngreedia never provided frame gen for Ampere; unbelievable that it's AMD that saved the day, otherwise I'd be getting not-so-impressive framerates. I really am impressed with my 3090, and it's strange that I'm having to thank team red for it, with recent titles like Avatar: Frontiers of Pandora, Star Wars Outlaws, and Black Myth: Wukong supporting FSR3 frame gen. As long as I have frame gen, I'm not sure I'll buy a 5090; that would mean a new PC.
New games: look bad, play bad, run bad, store bad, priced bad. I can't see a single reason to upgrade my hardware. This year I challenged myself to not use a desktop and just used a handheld gaming PC; I had the most fun I've had in years, because I was restricted to older titles. It's all a sham at this point. I have an OLED monitor as well, and it really shows how blurry and washed out new titles are: everything has this fog-like white over it that I have to use hacks, ini edits, or ReShade to try to get rid of, and older titles just don't have it. I think it's just a matter of time before I nope out of gaming altogether. I'm not spending as much as a nice used car on a PC so I can turn around and be lectured by a game, have to fix its graphics with mods, and store a 300GB game on an SSD to get worse gameplay than Jak and Daxter.
I will run my 3080 and 5800x until the wheels fall off. Does absolutely everything I could want for now. I don't care about ray tracing really... half the time it doesn't look great anyway.
I have been blown away many times by how damn close my $500 console can get to my $2700 PC. Yeah, my PC looks better, but not $2200 better if I'm being honest. That's why, personally, I'm always gonna go pretty high up the stack in GPUs, so I can get a decent uplift over the console, at least in RT, and be able to run PT at playable fps to make it more worth it for me lol
RT or Lumen will be pushed more and more, and Nvidia performs better at that. Newer games will have one or the other, without the possibility of turning it off. So now not only the 4070 Ti Super but the 4070 Ti will surpass the 7900 XT, etc... I wish it wasn't so.
Just buy the card with the most VRAM, enjoy PT & RT... until you can't, then keep lowering DLSS, then enjoy pure rasterization, then lower settings, then lower resolution till fps becomes unplayable... then plan for your next GPU (& new PC) upgrade. I still prefer Nvidia coz DLSS > FSR.
Just get a PS5 Pro and problem solved. All the modern PC gaming nerds are a lol. Most of them didn't even experience 90s PC gaming when it was actually great, circa Duke Nukem 3D, Quake, Quake 2, Quake 3, etc. PC gaming got crappy about 2010, and now in 2024 it's a stutter fest. These days PC gets broken, crappy ports of games. I remember building a bleeding-edge PC for Doom 3 and Half-Life 2 in the early 2000s. That's when PC gaming peaked.
Ding ding.
This guy gets it.
Issue is unreal engine having a strangle hold on the industry with their baked in ray tracing.
I have 3090, 4090 and a 4080 super.
They all get crushed in unreal 5 games.
I fully expect this to continue.
It doesn't help that developers using Unreal 5 tend not to optimize on top of all this.
It's an endless chase for power.
He is right on the money when he says it is forced. Developers are intentionally making the game look ugly as hell when you disable ray/path tracing compared to games before this crap was invented.
@@cemsengul16 Ray tracing is becoming the standard. It saves devs a lot of time when making their games.
@@bosskey23 Yeah that is the sad truth. They will never grind like they used to in prior generations when they made games. Today everything is automated and has a plugin.
SOLVED: I don't run ray tracing!!
;)
Good for you, but at a certain point you may not have the option, or it will be made unfeasible for you. Devs can always cripple the raster lighting (which they could do great, absolutely satisfactory things with if they wanted) to make people use RT, because it's less work for them. You using RT plus upscaling with frame gen is a golden goose for the corpos making games. They will always use those crutches to save on development time and talent. But people will be paying for their savings.
Let's play Cyberpunk2077 every month for the 5th year in a row - Nvidia Ray Trace buyers.
They will be resurrecting Cyberpunk 2077 really soon LOL 😆 😂 🤣
@@Frogboyx1gaming You bet. And people will be happy paying $2500+ to play Black Myth: Wukong above 50fps at 1080p native.
My card is amd, but playing cyberpunk once a year is kinda valid 😂
So good
@@Frogboyx1gaming I'd rather see FSR 3.1 in that game after 4 years; that would be the first good thing for AMD players, but that's not gonna happen.
Sorry but that game looks amazing with PT. Sorry that you feel the way you do. Not sure why we can't enjoy PT and work on optimizing it rather than acting like butthurt little children.
I feel like the lighting quality gap between rasterization and path tracing is shrinking as rasterization continues to advance. Yes, rasterization is more work for developers, since they have to manually place light sources and rely on approximations. But with advanced techniques like sophisticated shaders, they can achieve high-quality, accurate lighting, and I think that quality will satisfy nearly all gamers. Rasterized lighting can still look incredible; while it may not be quite as accurate as path tracing with many rays per pixel, it can look very high quality at a fraction of the computational cost. I think all gamers would rather have high-quality, accurate lighting at high FPS than photorealistic lighting that brings a GPU down to under 15 FPS. The biggest problems with RT and PT are noise and artifacts, because games can't afford enough rays per pixel; the hardware is nowhere near powerful enough. Denoisers have to be used to fill in the gaps the rays didn't cover, and this creates lower-quality, less accurate lighting as the denoiser blends the sampled rays together. The higher the ray count, the better the lighting looks: more accurate, with less noise and fewer artifacts. The thing is, hardware will never be powerful enough to run games at a ray count high enough for denoisers to become irrelevant. Current-gen path-traced games such as Alan Wake 2 use about 3 rays per pixel, and even the 4090 struggles with that: at native 4K max settings with PT, it gets around 25-30 FPS. Offline renderers for movies typically use 1000+ rays per pixel, and even that's not enough to completely eliminate noise and artifacts.
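The noise argument above is just Monte Carlo statistics: averaging N random ray samples shrinks the error as 1/sqrt(N), which is why going from 3 rays per pixel to 1000+ still leaves visible noise. A toy sketch (a made-up random "scene", not a real renderer):

```python
# Toy illustration of path-tracing noise vs. rays per pixel.
# Each "ray" returns a random brightness in [0, 1]; a pixel's value is the
# average of its rays. The spread (noise) of that average falls as 1/sqrt(N).
import random
import statistics

def pixel_estimate(samples_per_pixel, rng):
    """Average `samples_per_pixel` random ray contributions for one pixel."""
    return sum(rng.random() for _ in range(samples_per_pixel)) / samples_per_pixel

def noise(samples_per_pixel, trials=2000, seed=42):
    """Standard deviation of the pixel estimate across many trials."""
    rng = random.Random(seed)
    estimates = [pixel_estimate(samples_per_pixel, rng) for _ in range(trials)]
    return statistics.stdev(estimates)

for spp in (1, 3, 100, 1000):
    print(f"{spp:>5} rays/pixel -> noise ~ {noise(spp):.4f}")
```

Going from 1 to 100 rays cuts the noise by roughly 10x, but going from 100 to 1000 only buys about another 3x, which matches the diminishing-returns complaint.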
I honestly prefer the artistic look of hand placed visuals
@@Frogboyx1gaming Handcrafted rasterization gives game designers/artists more control over aesthetics and art styles because rasterization is completely manual. Rasterization is the king of gaming graphics and the best rendering technique for video games. To me, it won't be completely replaced until ray tracing becomes the game changer they claimed it was, and for that to happen, hardware needs to get substantially faster.
Path tracing should have stayed with static images and prerendered scenes for another 20 years. It's just not ready for real-time gaming.
And I have a feeling that you are literally clueless, dude. There's no point playing Alan Wake 2 at 4K native when DLSS looks the same if not better. I had no problem playing this game with full PT on my RTX 4080S (80-110fps), and that's very impressive considering how amazing the lighting in this game looks. Yes, I saw a little bit of noise, but it wasn't excessive like in Quake 2 RTX (PT) or Half-Life 2 RTX (PT), and most of the time it was something I could ignore. Without RT this game has a lot more artifacts, because it falls back on real-time GI, ugly SSR, and noisy shadows.
Sadly, companies care about more profit and less cost. Ray tracing cuts costs, so they will continue pushing it until raster is no longer needed.
Nvidia forced me to buy an RX7900 XTX for $829. Never owned an AMD card before. Probably never own another Nvidia card. :(
Nvidia is pretty good at that
At least you'll have the necessary amount of fps to run your games.
I'm still stuck with a Radeon 7 and an Intel core i7 47xx. Roughly a 10 year old rig.
I hate how expensive this industry has become. It's totally unnecessary.
I'm a 4090 owner but I'd like to thank you for being honest, straightforward and for calling these greedy companies out! You got a new subscriber.
Thank you for your support! 🙏 I really appreciate it.
Same just subbed too.
Gg 2024!
Also a 4090 owner, but I almost never use ray/path tracing, and I always use DLSS Quality with a max undervolt on the GPU!
@@Serandi1987 When I bought my Asus 4090 ROG Strix I just wanted the crème de la crème in graphics and performance; little did I know that in order to decently run certain triple-A games I would have to sacrifice the very thing I wanted most in the first place. Today I have to rely on mods to improve both visuals and fps. What a joke these high-end GPUs are.
@@elban73 I feel like we peaked in raw graphics presentation years ago so Nvidia had to come up with a grift and convince people they need ray tracing and path tracing. 4K native with high fps and baked in lighting and reflections is all I want and need.
I think the real issue is that the return on investment gets lower each time. Graphics cards are much more expensive, but the graphical improvements are not that noticeable. In fact, a common trend is that new games are blurrier than before (due to TAA and upscaling fakery), higher latency (due to fake frame generation and overall lower framerates), and a stutter-fest (mainly due to Unreal Engine 5), though I've also noticed some microstuttering in ray-traced games.
Absolutely I notice it more.
@@Frogboyx1gaming I've had the RTX 4090 since it released, and every game I've tried that offers ray tracing (not counting Cyberpunk 2077) gives me less than an hour of play time before crashing or freezing with ray tracing enabled.
I play all my games with ray tracing disabled because, in my opinion, it's not worth the performance hit, let alone the headache of the game freezing and crashing after an hour or two of playing.
Ray tracing is still in its infancy, even with the RTX 40 series being Nvidia's third generation of graphics cards with ray tracing ability. We just aren't there yet, truthfully.
If someone is considering a high-end graphics card, the 7900 XTX should be their first choice, then the RTX 4080, and then the 7900 XT on down.
@@bigdaddyt150 Control never randomly closed for me with RT enabled?
I have two RTX 2080 Tis in SLI with a 5800X3D.
@kevinerbs2778 Control was released during the RTX 20 series. I haven't tried that game since my RTX 3080 Ti.
A lot of the newer games that implement ray tracing don't do that well as far as avoiding issues goes.
@@bigdaddyt150 I played Control with RT enabled from start to finish. Never once did it crash on me. Dying Light 2 also never crashed with RT enabled, though it is more demanding than Control.
Ray tracing is like motion blur to me.
I immediately turn it off when I can.
Creating a problem to sell you a solution is way too common these days.
Not sure how RT in games like CP2077 is a problem. Smh.
@@chrisking6695 Performance hit (expected) and visual artefacts (also expected; the hardware isn't close to capable enough), which necessitate the use of DLSS, which vastly reduces image quality unless used at high resolutions.
Is it worth the multiple double digit performance and general visual quality hit?
Perhaps, but not for everyone.
You don't know what it is then.
@@sulimanthemagnificent4893 No, it's not worth it, and ray tracing is only good, like transformative, in very few games to this day. And even then, I don't like playing those games, such as CP2077 and Indiana Jones. The games I'm gonna play the shit out of in 2025 are Monster Hunter Wilds, Doom: The Dark Ages, and Elden Ring Nightreign. These will definitely have ray tracing, but I bet it won't be one of those cases where RT looks so good that I'm missing out even with the performance hit.
And also Subnautica 2, where I won't have any problems with performance or RT at all lol.
Can’t blame Ngreedia if folks keep buying their GPUs.
What do you expect, people to buy AMD garbage? Are you a fanboy who likes to burn money? Do you live off government checks instead of hard work?
@@gamingtemplar9893 So let's pay $2500 for a GPU, right.
@@MarcusT3ch The 5090 isn't for gamers whether you like it or not. Why don't AMD and Intel have such a graphics card? Well, because AMD and Intel target gamers. So stop crying and buy Intel or AMD.
@@gamingtemplar9893
Nvidia got it in your mouth
@@xRealUzz AMD messed up on their chip for the higher range. Intel is just getting in the game.
This is exactly why I don't care about RT.
Sapphire Pulse 7900xtx owner.
A fanboy basically
@gamingtemplar9893 😂
@@gamingtemplar9893 based amd fanboy
@@gamingtemplar9893 no playboy ;)
The flipside is, if I may, the less you buy the more you get screwed!
Stop playing day 1 games. They are released half baked anyway. Give them a year or two, or 5. Now your hardware requirements are greatly diminished. Game gets optimized, and the bugs get worked out. Get over your FOMO
They gonna put 32 rays into Witcher 4 😔
On launch. Then after a year they'll offer a PSYCHO RT mode with 4 times as many rays. Oh, what, your 5090 can only get 30fps at this setting with performance upscaling and frame gen on? Better buy the 6090.
If it's not ray tracing, it will be something else you'll need to upgrade for, and AMD will find a reason for you to upgrade too. This is how it works.
Thank you for your logic, it keeps me grounded.
It keeps me grounded too. That's why I share it.
So true. This is how NVIDIA is forcing generational upgrades. If it's not the VRAM shortage then it's the RT performance. At this point RT is not a bonus, it's a requirement.
It's only going to get worse
It's not a bonus, it's a requirement. Very well said
It's already happening in Indiana Jones. The funny thing is, despite the game being bundled with new RTX cards recently, not that many people actually play it. It's just a massive echo chamber in the forums.
@@michaelbuto305 The gaming industry literally plays into NVIDIA's hands since NVIDIA holds a 90% market share. Game developers are gonna optimize their games so that NVIDIA has a reason to further push and market ray tracing as a "standard".
@@laszlozsurka8991 The cost of games is doubling every gen, so expect devs to use RT and DLSS to save dev time.
Good video. Companies these days, especially NGreedia, have learned to incorporate planned obsolescence into their product lines. There will always be a way for someone in the pipeline to gimp any card, including the flagship offerings. Shit sucks.
The 1080 Ti still works for 1080p. The problem is that not enough people have upgraded to 1440p 144Hz; then they'd need to create more competition. Right now, they sell 1080p cards for $299.
Ray tracing today is what tessellation was 15 years ago: it required specific hardware, Nvidia cards were the first to utilize it, and AMD eventually caught up. But in some early games, before AMD added hardware control over tessellation levels, devs would ramp up the tessellation and cripple performance on AMD cards; I suspect this was planned by Nvidia to make AMD look bad. Now we have proprietary RT tech being used in games that heavily favors Nvidia cards, but the funny thing is not even Nvidia cards can run them properly. Again, it's probably by design, so everyone continuously upgrades their PCs to support the latest RT title.
However, there is still some hope, as a lot of game devs are either not using UE5 or are implementing different lighting techniques with great results.
A memory bus can be thought of literally as lanes of traffic: the more lanes dedicated to traffic, the greater the flow. The graphics processor is connected to the RAM on the card via the memory bus, so it's very important to look at the card's memory bandwidth, i.e., how many GB/sec it can handle; in effect, this is how fast the traffic flows down the lanes. What I've learned in the last few years is to also keep an eye on the amount of VRAM. Nvidia has used a lack of VRAM to create planned redundancy in older generations of GPUs, and the fact that we now see hardware ray tracing requirements in newer games is the writing on the wall for a lot of older GPUs.
Over my years of gaming, I've learned to always buy cards with larger bus widths and to keep an eye on bandwidth; with slow bandwidth, the traffic in effect moves more slowly. You need both to get the best overall experience. I would also now suggest getting a GPU with a minimum of 16GB of VRAM. Just my personal thoughts: a good foreshadowing of GPU hardware requirements is Indiana Jones and the Great Circle, so look at benchmarks of that game for any new GPU you are thinking of getting.
Now with ray tracing, I think we may see a setting in the graphics menu that lets you set the number of bounces the GPU traces per ray: 4, 8, 12, 16, etc. You'd set the bounce count to a level your GPU can handle. The influence Nvidia has over game development due to its market share is massive, and this needs to end; the only thing that will change it is Intel and AMD gaining market share.
To further your analogy: interface width represents your traffic lanes. Bits or transfers per second are the speed limit. Latency and timings are like traffic light timings that can slow down traffic if they're not set right.
idk man it seems like 1 lane running at 10ghz is the same thing as 10 lanes running at 1 ghz lol
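The lanes analogy maps directly onto the usual bandwidth formula: bus width in bytes times effective data rate. A quick sketch using the commonly published specs for two current cards (the 4090 and 4060 figures below are Nvidia's published bus widths and memory speeds):

```python
# Memory bandwidth behind the "lanes of traffic" analogy:
#   bandwidth (GB/s) = (bus width in bits / 8) * effective data rate (GT/s)
def bandwidth_gbs(bus_width_bits, data_rate_gtps):
    return (bus_width_bits / 8) * data_rate_gtps

# RTX 4090: 384-bit bus, 21 Gbps GDDR6X
print(bandwidth_gbs(384, 21))  # 1008.0 GB/s
# RTX 4060: 128-bit bus, 17 Gbps GDDR6
print(bandwidth_gbs(128, 17))  # 272.0 GB/s

# And the "1 lane at 10 GHz vs 10 lanes at 1 GHz" point: same throughput,
# though latency characteristics differ in practice.
print(bandwidth_gbs(8, 10) == bandwidth_gbs(80, 1))  # True
```

This is why a narrow bus can quietly cripple a card even when the core itself is fast.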
The RT is the secret elixir to milking gamers!
Exactly 💯 my friend
Humble, honest guy! I respect it! He’s speaking facts!
Frog slowly becoming the Nvidia Dreamcastguy
bruhhh 😂 He is the king of rambling. But I can't get enough of it for some reason.
LMFAO dreamcastguy
😂😂
After 30 years this guy has finally figured out Nvidia's business model of always having a new, faster graphics card to sell you. As far as ray tracing/path tracing goes, it's big-time diminishing returns after you hit 50 rays per pixel, and then 200; above around 200 it takes several thousand to look a notch better. There may also be real-time AI filters on the horizon, where you can pick through filter options for how you want your game to look. In 2-4 more generations the tensor cores may be able to do that.
We are truly in the era of diminishing returns. Games in general do look better than they did in 2015-2018, but not in proportion to the cost increase. The implementation is the issue right now. We (the gamers) are directly paying for the big game studios' savings in development time.
CD Projekt Red's Cyberpunk 2077 really is a well-optimized game (not so much on release :)). It runs extremely well on AMD APU handhelds.
Anything on modern Unreal Engine 5 generally runs like crap even on the top end. This is usually due to the publisher or licensee cutting the cost of developers who know how to optimize the engine and make it bespoke. (There are some great optimized games using UE5, just not many.)
Apple sort of used this model to push yearly phone upgrades in an analogous way. Instead of cutting a phone's battery down with software every two years, it's Nvidia giving game companies and the makers of general licensed game engines a "new spec" that essentially leaves older cards in the dust with minimal upgrades to fidelity.
This is why Nvidia limits VRAM in all cards aside from their xx90 series.
It's planned obsolescence. Either buy the $2000+ GPU, or expect your current card to be tapped out by the time the next cards roll out.
i’m hoping intel cards start competing for way cheaper! that would be amazing. they seem to be heading that way! 🙏
The B580 is actually a darn good card! Hopefully the B750 is in the pipeline. :)
@ yes i agree!!! i have high hopes that they also go all the way to high end by 2026.
Agreed 100%. Nvidia assumes its customers like reselling every 2 years and losing 50-150 bucks doing so as well.
Finally someone is talking about AMD
You finally found me after 18 months
Someone just got the internet. Welcome.
enroll in a comedy class ..
I am a big fan of Nvidia, but we need more YouTubers like you so that Nvidia hears us; prices are very high.
I think most youtubers that are pro AMD usually break due to the overwhelming push back from Nvidia fans sadly 😥
@Frogboyx1gaming and they shouldn't break.
This entire industry needs a kick up the backside. The market needs a big correction and Lisa Su either needs to leave AMD to allow for another CEO to compete harder, or let the competition flourish like Intel and ARM.
I miss the competitive days of Nvidia vs. ATI.
People aren't smart; "common sense" isn't always as common as we think.
Solution: turn off ray tracing. It's a waste of time anyway.
This is just the way it is dude. The issue is that modern GPUs cannot render games real time at 4k native. And I'm not talking UE5 engine. There are older games such as RDR2, which is heralded as the poster boy for how to optimize a game, that still bring a 4090 to its knees without DLSS. We are simply at a point where we need other methods to keep FPS up if we want to continue making games that look next gen for 4k gaming. But of course I could be wrong.
I'm not sure where 4k popularity came from anyway. I've always thought 1440p was enough.
Apparently not.
Ray tracing is a graphical improvement, but mostly it’s a benefit for graphics cards manufacturers. There is no other reason to upgrade otherwise.
Whats up man. I got a pc question for u. When u built your last pc did u re-purchase windows or were u able to reactivate it? I ask cuz I'm replacing my motherboard and want to know what to expect upon turn on. Does it load up directly to windows or do i need to do something to get my windows back? Thanks In advance
I was able to reactivate it because I just swapped motherboards, but if you build a whole new PC you will have to buy a new key.
It just loaded right up. I stayed with the same brand of motherboard, going from an AM4 MSI Tomahawk B550 to an AM5 MSI Tomahawk B650.
I'm just swapping the motherboard and CPU, but it's like what you did: a step up within the same platform, using the same M.2 boot drive. How were you able to reactivate it?
Do games have toggle-able options to tone down RT?
yeah most do
Been saying this from the very launch of RT. Even Nvidia hardware can't run it without cheats like DLSS. And suckers line up to buy cards that can't run 4k native at 60 fps with RT enabled, no matter how much money you throw at it.
The 4090 plays Dragon Age: The Veilguard at 60fps 4K native with max RT. It's a stupid way to play it though; DLSS and frame generation work perfectly in that game.
For path tracing there has to be a point of diminishing returns.
Also people pay those prices.
This is basically the same as when they kept increasing polygon counts. At some point you hit diminishing returns and just cannot go further. I believe ray tracing will hit a ceiling, but Nvidia is already working on the next big thing, "neural rendering". So yes, they will always have something new to add, just not always more ray tracing. Also, games will soon need massive NPU power to run the brains of their AI.
I bought into the RT thing on my last GPU upgrade with a 4070 TI Super and now having experienced it, I'm over it.
The problem is I don't have anywhere to go except Nvidia.
Both AMD and Intel are focusing on low to mid tier offerings.
Maybe a used RX 7900 XTX is in my future?
It would be a great all around card for you
Rays scale, just like resolution and triangle density, to some point of diminishing returns.
You wouldn't call resolution a never ending money making scam, more is generally better, so why look at raytracing any different.
Current RT has issues with its huge performance impact and low ray counts leading to artifacting, but it is a necessary step for better graphics long term.
The 4090 will do 4k raster with baked lighting at 100+fps easy. There is plenty of space to improve but we are clearly into diminishing returns.
True, agree. It's just sad that people who want to do productivity work have to go the Nvidia route. If AMD is for gamers, then their cards should be priced even lower. The RX 7600, RX 7700 XT, RX 7900 GRE, 7900 XT, and 7900 XTX are available at MSRP now after the inflated prices, but why would I ever choose one when the 7900 XT costs the same as a 4070 Ti Super? There's never a discount.
I got that Turtle Beach wheel and pedals hooked up. I definitely think it's worth checking out for the price. I got a folding Podium One rig for it, just like I've always used with my Logitech set, but it can barely handle that direct drive, and I don't think it will for very long.
Hope you're recovering from the hack bs well.
The Turtle Beach is like 6.5Nm I believe. I suggest something a little more stable, like the Next Level Racing Victory cockpit.
@@Frogboyx1gaming Brooo, You're so right! And I don't know why the heck I said Podium One🤣Seems like you knew what I meant though. I have the Next Level GT Lite, which is the same price as the Victory. I'll see if I can send it back to order the one you suggest.
Exactly true. We were promised ray tracing with the 20 series. It had a fleeting moment with ray-traced reflections on 3090 and 3080 cards. Now, 40 series cards other than the 4090 can't afford the overhead; the 4080 has the grunt but lacks the VRAM. The 5090 should be able to do it for maybe 12 months, max, at 4K.
All the upscalers look good when upscaling to 4K, but the majority of PC gamers are on 1080p and 1440p monitors where upscaling looks horrendous: Quality, Balanced, and Performance all look terrible on a 1080p display, and on a 1440p display anything below Quality looks terrible.
On a 4k display you can use performance upscaling and have it look better than native 1080p or 1440p.
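The reason is the internal render resolution behind each mode, which follows from the commonly cited per-axis scale factors (roughly 0.667 / 0.58 / 0.5 / 0.333 for Quality / Balanced / Performance / Ultra Performance; exact values may vary by upscaler version):

```python
# Internal render resolution per upscaler mode, using the commonly cited
# per-axis scale factors for DLSS/FSR-style upscalers.
MODES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def render_resolution(out_w, out_h, mode):
    """Return the (width, height) the GPU actually renders before upscaling."""
    scale = MODES[mode]
    return round(out_w * scale), round(out_h * scale)

# 4K Performance mode still renders a full 1080p image internally...
print(render_resolution(3840, 2160, "Performance"))  # (1920, 1080)
# ...but 1080p Performance mode drops to 960x540, which is why it looks rough.
print(render_resolution(1920, 1080, "Performance"))  # (960, 540)
```

So at 4K output even aggressive modes start from a decent base image, while at 1080p there simply aren't enough input pixels left to reconstruct from.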
As for the always chasing RT, that has been the same with every new visual tech on PC since the 90s. It was the same with soft shadows, new lighting techniques, bump mapping/parallax occlusion mapping/tesselation, volumetrics, hairworks/tressFX, etc, etc.
RT is different from previous features though because they can literally just add more rays
@@Frogboyx1gaming They did the same with previous stuff as well. Shadows used to be the big performance killer; they'd keep increasing the shadow resolution in new titles and you felt like you were back to square one. Tessellation actually hurt AMD/ATi for a good while too, because it was pushed by Nvidia and ran better on their cards. The difference now, I guess, is that they're increasing the resolution and adding more ray bounces at the same time.
It's actually not unlimited; the real limit is how many pixels your monitor has.
If you use a 4K monitor, one 32-bit frame is roughly 33 MB, so at 60 FPS your system has to move about 2 GB/s just for the final image, before counting textures and everything else.
Adding ray tracing easily quadruples the memory needed, which is what eventually clutters the VRAM.
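A quick sanity check of the framebuffer math (assuming 32 bits, i.e. 4 bytes, per pixel):

```python
# bytes per frame = width * height * bytes per pixel
def frame_bytes(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel

# GB/s needed just to produce the final frames at a given FPS
def scanout_gbs(width, height, fps, bytes_per_pixel=4):
    return frame_bytes(width, height, bytes_per_pixel) * fps / 1e9

print(frame_bytes(3840, 2160) / 1e6)  # ~33.2 MB per 4K frame
print(scanout_gbs(3840, 2160, 60))    # ~1.99 GB/s at 60 FPS
```

Note that ~2 GB/s is tiny next to the hundreds of GB/s a modern GPU's VRAM provides; the real bandwidth and capacity pressure comes from textures, geometry, and RT acceleration structures, not the final framebuffer.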
Another thing game developers can do is make it so that every new game has ray tracing baked in with no off switch, requiring a certain level of GPU whether you can afford it or not. You want to play this game? Then you need this GPU, end of discussion. Ray tracing is nice, but damn, not every game needs photorealistic quality. Make a game with good visuals that any GPU can run, give it a good story, and release it not broken. It sucks when a game ships before it's finished and needs multiple updates to play right. Let's slow down, loosen the tie, and release games when they're finished and bug-free, deadlines be damned. I can't be the only one who feels this way.
AMD never pushed for ray tracing when Nvidia started campaigning for it... We all should have known there was some hidden agenda behind that campaign...
We were just fine improving rasterization techniques, and we were getting close to visual realism without any considerable performance hit, until the advent of ray tracing and UE5 being built around it...
Light geometry can easily be simulated to near the real thing in raster...
Imo we should ditch raytracing completely...
Spoken like someone who doesn't understand different lighting techniques and hardware budget in game engines
It takes 4-5 years to develop a GPU. AMD didn't jump on the bandwagon because of the amount of work that was already done on their upcoming cards. They slapped in the bare minimum and hoped for the best. After all, the current gen consoles are based on their RDNA architecture so that will be the baseline. Mark Cerny made it pretty clear that we hit a wall with how far raster lighting can go. It's obvious PS6 will be based around trying to achieve path traced lighting.
@Phil_529 Mark Cerny saying they hit a wall is not as plausible as if AMD said it...
But what do I know...
The present gen does not look or feel like a generational leap from last gen, and despite the on-paper hardware improvements, it does not translate to what we see. Present-gen hardware tanks badly under pressure, especially in UE5...
I do not want to believe that all the game devs suddenly became lazy... People are blaming the devs, but no one is talking about the engine used and the render options...
@@AlanSmithee-h6q Even at the flagship level, what do you have to say about the Alan Wake 2 situation, which can make a 4090 look bad for its price?
Ray tracing is nice though…
Depends on the implementation. I started GoW Ragnarok after 100-percenting Indiana Jones, and to me Ragnarok and Forbidden West (which I'd played earlier in the year) easily beat Indy when it comes to visuals, and neither of those games has RT.
Not really; everything looks like a mirror. It's like the dimwits who placed shiny puddles everywhere found a new toy. ANNOYING AND DISABLED!
I am telling you this is the beginning of the end of Scamvidia.
And another thing: through all these GPU-scalping years, game developers and Steam kept a deafening, suspicious silence, when normally they should be all over the place complaining about how their sales have dropped.
The new Indiana Jones game requires a hardware RT GPU. It's able to utilize RT cores for lighting even without the ray tracing settings turned on. Reading their system requirements: "GPU Hardware Ray Tracing Required". I would have believed it was a scam before, but if games are going to require these RT GPUs, then I guess it really isn't.
Games will not even allow you to turn RT off XD. We're so screwed: if they keep requiring more, we'll have to upgrade every gen and resale value will be low.
Which games? The few I played give an option to enable RT or not.
@@bombyo3634 Avatar: Frontiers of Pandora ONLY has ray tracing, and you can't turn it off.
@@AceLooneyLoz never knew that. Thanks
Raytracing doesn't make better games. Simple as that.
They said the same thing as soon as ray tracing was announced as a GPU feature.
From what I can see, the 5090 is the only one that has been redesigned. Isn't the tell in how much memory bus width and capacity they are throwing at that card? It looks like the others just got minor core increases and faster memory, and that's about it.
The 5090 looks like this generation's 1080 Ti. AMD is missing the RT and the power. I have a 7900 XTX; FSR is trash, and I don't use it if I don't have to. I do want to use RT.
@@Gen_X_Gamer_79 What does FSR have to do with the rest of your statement or the subject?
Game developers complain sales are down, but they don't build their games for the hardware most gamers actually have.
The day we have photo realistic lighting it will more than likely be on an Amazon pay to play server.
Right
You’re 100% correct everything will be streamed. There will come a day where hardware is a thing of the past and everything will be a subscription. You will own nothing and you will be happy.
I must confess... I drank the ray-aid. I rushed in without considering the long-term effects of this technology.
Eventually RT will be viable
I like ray tracing... it looks pretty... but what happens if developers stop putting lighting into rasterization and you are forced to use ray tracing?
Then thankfully we have PS5 Pro
Eventually you hit the point where you cannot get the image better with more work. The human eye has limits and monitors even more so. That said it's a ways off. Unless there is a breakthrough somewhere it's at least a decade out.
RayTracing will become more mainstream with the RTX 6000 series - watch and learn.
The more you buy the more you save. Because I looove nvideeeiia cmon sing it with me guys.
It's not a coincidence that Nvidia gimps their cards' VRAM in order to upsell you to the next tier, despite the GPUs being very capable in raw performance. Good examples are the 3070, 3080, 4060, and the 4070 and its iterations other than the Ti Super variant. Those cards are very capable, but because of the lack of VRAM they perform worse than their AMD counterparts.
You're right, but one "upside" is you can always hack a game, at least on PC, and lower the ray count. They can't block this, at least in single-player.
True but you would have to know how to do it.
@@Frogboyx1gaming Modders would do that; we already have mods doing that, even increasing the ray count, like for CP2077.
I did the opposite: I didn't buy anything new after the 1080 Ti and saved a lot of money. Thanks to UE5 I have to upgrade now, because the 1080 Ti is too slow for it at 1440p. With the next card I'll still go for raster power alone; if it's fast in RT, that's nice to have.
The main issue with Nvidia is that even after AMD announced FSR could run on any hardware, Nvidia still said DLSS would only be available for certain GPUs. The newest DLSS may be available only for 40/50-series GPUs.
AMD for the win. I hope they will release FSR soon, in January 2025.
Yeah and when FSR4 doesn't suck because it's more expensive to run they will also have to lock old hardware out.
@Phil_529 FSR4 will probably need AI cores, but people can still use FSR 3.1, which is good enough for older hardware. Too bad Nvidia bribes developers not to implement FSR 3.1 in their games; Cyberpunk, for example.
@@HoretzYT You realize it was AMD who was blocking DLSS from being in their sponsored titles for years, right? Cyberpunk shipped with the best AMD upscaling tech available at the time (FidelityFX CAS and later upgraded to FSR 2).
It wasn't until Starfield that AMD was publicly shamed into not doing that anymore.
Ray tracing is a very nice feature in gaming. I don't buy expensive cards just to get better ray tracing; 1500-2000 is too much money for that purpose. Thank you for this great video👍
Thank you my friend for watching it.
Nah, it's not just because of ray tracing. Developers have just gotten lazy. They don't really optimize much anymore, especially when all triple-A games are in UE5. Just look at Warzone in CoD: they don't compress textures anymore, it's all texture streaming (each texture package is like 4K quality), that's insane. Sure, ray tracing and path tracing are going to stress the fuck out of your new-gen GPU, but each studio is relying more on upscaling and frame gen, and that's why us not getting 20GB+ cards is so bad.
Ok, my friend give it a couple more generations for it to sink in.
I have a 4090 and I don't use ray tracing. Just turn it off and leave it off. You will be much happier.
Problem is, shortly we're going to be in a situation where Nvidia offers 16GB (unless you spend insane amounts) and AMD also offers only 16GB. The 20GB and 24GB cards are being phased out by AMD; the new 90 cards from AMD only go to 16GB.
It's going to get expensive
That's crazy. Why are they doing that? So my 7900 XTX is still gonna be the best AMD GPU for a while?
My younger colleague just bought an RTX 4070 Ti Super, and when I asked him what killer game he bought this beast for, he said Counter-Strike.
Oh wow
Just don't buy the game; look how badly Great Circle sells. At least in Wukong you can run it without RT. Also, yeah, Nvidia cards below the 4070 Ti Super aren't worth it even if you run RT, due to VRAM limitations.
Remember when tessellation came out with Assassin's Creed, and one vendor did really well at it and it was open source, while the other's was closed and not good, and then they removed it altogether? Yeah, I remember.
for the people
The 4090 traces like 8-10 rays per pixel or something; to get where it needs to be is 1000 per pixel or something.
We have a ways to go. Machine learning is not AI; different concepts. The industry is getting all this stuff confused.
But hey, if Nvidia gets neural rendering to work well and stuff, great. At some point we will hit performance walls.
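The rays-per-pixel point above is basically Monte Carlo statistics: noise in the image shrinks only with the square root of the ray count, which is why games at a handful of rays per pixel need aggressive denoisers. A toy sketch (not any real renderer; all names and the 0-to-1 "brightness" model are made up for illustration):

```python
import random
import statistics

def pixel_estimate(rays_per_pixel, seed=0):
    # Toy Monte Carlo "pixel": each ray returns a random light
    # contribution in [0, 1); the true brightness here is 0.5.
    rng = random.Random(seed)
    return sum(rng.random() for _ in range(rays_per_pixel)) / rays_per_pixel

def noise_level(rays_per_pixel, trials=500):
    # Spread of the estimate across many independent pixels: the
    # grain a denoiser has to smooth away. Shrinks like 1/sqrt(rays).
    samples = [pixel_estimate(rays_per_pixel, seed=t) for t in range(trials)]
    return statistics.pstdev(samples)

low_spp = noise_level(3)      # roughly what current path-traced games use
high_spp = noise_level(1000)  # closer to offline-render territory

print(f"3 rays/pixel noise:    {low_spp:.4f}")
print(f"1000 rays/pixel noise: {high_spp:.4f}")
```

Going from 3 to 1000 rays per pixel cuts the noise only by a factor of about sqrt(1000/3) ≈ 18, which is why the gap between real-time and offline quality is so hard to close with raw hardware alone.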
I don't care about RT because their prime example, Cyberpunk, is doing it poorly in my opinion. In which universe do wet floors turn into mirrors? Some surfaces turn mirror-like when wet, but with RT it's like every wet surface becomes a mirror 🤮
I hope 9070xt will be about 600-650 and I will buy it
I've been rolling with AMD because they always had reasonable prices and extra RAM, while Nvidia GPUs at parity had less RAM and were a few hundred dollars more. But this time I was thinking about Nvidia for RT and 1440p gaming, and now I don't know. I was planning on getting a 4070 Ti Super, around $700. Now I'm thinking about getting the AMD 7900 XTX.
7900XTX has pretty good RT my friend
@@Frogboyx1gaming "Has pretty good RT my friend" said nobody with a brain cell ever about AMD.
The next gen is literally around the corner; in what world does it make sense to buy a current-gen card when the next gen is out in a few weeks? Even if next-gen cards are a bad deal proportionally, you'll still get more bang for your buck. A 4070 was considered a dud this gen, but it was still a way better deal than a 3080 at launch; same for the 7900 XT vs the 3090 Ti. Don't ever buy a current-gen card when the next gen is about to launch; you'll be angry.
@JynxedKoma i'm starting to get sick and tired of RT. It's just too demanding for most cards. You pretty much have to buy the very highest of high end cards, and even those still have issues with some games. Nvidia is really starting to look like a bag of assholes with their prices.
We are at a point now where ray tracing looks incredible. I'll be upgrading to a 5090 and I won't see a reason to upgrade for a long, long time.
Nice theory, but it will never happen; it would only cause stagnation in sales. A denoiser plus a certain number of rays per pixel will look so good that even doubling the ray count will change nothing, or almost nothing. People would stop using high settings and keep GPUs for longer. It's really obvious that the next thing is AI compute used for textures and geometry; BTW, we know it from leaks. Then the next step will be NPC AI.
I'm REALLY looking forward to what AMD and Intel are going to show at CES because of Ngreedia turning me away with their prices
I think NVIDIA makes better cards, but I'm also down to get the entire gaming community to boycott them until they make their high end GPUs more affordable. The problem with that idea is that their top of the line GPUs are just objectively better than the competition, so people are going to buy them regardless. They know this, and that's why they continue to price the way they do.
Devs are going to go with RT and DLSS. It simply costs too much money to rasterize the lighting and create highly detailed 4K textures. The cost of making a game is doubling every gen, a 100% increase, but games go up 16%, by 10 dollars. It's not sustainable in the long run. AI textures and PT/RT are the future.
But the problem with ray tracing, as of now, is that we need more rays and more bounces per ray of light. That's why every game with RT is such a blotchy, dark-in-places, unstable mess: there is just not enough data to create a stable, well-lit image. It's not possible to do real-time ray tracing on current hardware; that's why there is so much fakery, averaging, and AI up/resampling. It's still too early to run RT on current hardware; we should have started the transition to RT much later. So, YES, game developers will create more computationally expensive games with RT. If games were as simple as what Jensen showed at the first RTX 20 presentation, that is, a rectangular room with a sphere and a cube in it, then yes, we could devote all the resources to ray tracing and have nice lighting. But games are not just a cube and a sphere in a room. RT is not like the scam with tessellation in some games, where game objects were created with more complex geometry that was indistinguishable from the same object without tessellation. While tessellation is a good technology, it was abused in some games.
Nvidia is just Apple now, we have to accept it and wait until competition arrives
I gotta get a 5090 simply for the bragging rights.
Thank you! I'll save money and stay on the 4090, maybe upgrade when the 6090 is coming. This wheel of marketing never stops.
"Das Kapital"
ROG Strix 3090 / i9-10900K. Didn't need to upgrade for the 40 series and won't need to upgrade for the 50 series!
I quit gaming due to 40 series pricing. Have fun with the 5090s $2600 tag boys
I'm running a prebuilt I got in 2022, Ryzen 9 5950X / EVGA RTX 3090; it cost me £3100. Agreed, the 90-class brings a tonne of VRAM, so there's never any stutter or problems with loading textures. However, I hate that Ngreedia never provided frame gen for Ampere; unbelievable that it's AMD that saves the day, otherwise I'd be getting not-so-impressive framerates. I really am impressed with my 3090. Strange that I'm having to thank team red for providing me this, with the most recent Avatar: Frontiers of Pandora, Star Wars Outlaws, and Black Myth: Wukong supporting FSR3/frame gen. As long as I have frame gen I'm not sure I'll buy a 5090; that would mean a new PC.
Trade that 3090 for a 7900XTX and you can have frame generation on all DX11 DX12 games
How's that Subizzle treating you? Looks pretty good.
Yes, FSR4 needs to run on all 7900-series GPUs.
New games: look bad, play bad, run bad, store bad, priced bad. I can't see a single reason to upgrade my hardware. This year I challenged myself not to use a desktop and just used a handheld gaming PC. I had the most fun I've had in years because I was restricted to older titles. It's all a sham at this point. I have an OLED monitor as well, and it really shows off how blurry and washed out new titles are. Everything has this fog-like white over it that I have to use hacks, ini edits, or ReShade to try to get rid of; older titles just don't have it.
I think it's just a matter of time before I nope out of gaming altogether. I'm not spending as much as a nice used car on a PC so I can turn around and be lectured by a game, have to try to fix its graphics with mods, and store a 300GB game on an SSD, all to get worse gameplay than Jak and Daxter.
I will run my 3080 and 5800x until the wheels fall off. Does absolutely everything I could want for now. I don't care about ray tracing really... half the time it doesn't look great anyway.
I have been blown away a lot of times by how damn close my $500 console can get to my $2700 PC. Yeah, my PC looks better, but not $2200 better if I'm being honest. That's why, for me personally, I'm always gonna go pretty high up the stack in GPUs, so I can get some decent uplift over the console, at least in RT, and be able to run PT at playable FPS to make it more worth it for me lol
We can't get away from RT. Next console generation there won't even be rasterized games anymore.
I love ray tracing, cause the realer the better. But yes, it's all a scam; in 2 years the 5090 will be like all the other cards.
RT or Lumen will be pushed more and more, and Nvidia performs better at that. Newer games will have one or the other, without the possibility to turn it off. So now not only the 4070 Ti Super but also the 4070 Ti will surpass the 7900 XT, etc. I wish it weren't so.
You do realize that FSR 4 will come to the 7900 XTX, right?
It will come for the 7900 XT, right? I hope so.
AMD and Sony have found it too, apparently.
Just buy the card with the most VRAM and enjoy PT and RT... until you can't, then keep lowering DLSS, then enjoy pure rasterization, then lower settings, then lower resolution till FPS becomes unplayable... then plan your next GPU (and new PC) upgrade. I still prefer Nvidia coz DLSS > FSR.
Just get a PS5 Pro and problem solved. All the modern PC gaming nerds are a lol. Most of them didn't even experience 90s PC gaming when it was actually great, circa Duke Nukem 3D, Quake, Quake 2, Quake 3, etc. PC gaming got crappy about 2010, and now in 2024 it's a stutter fest. Nowadays PC gets broken, crappy ports of games. I remember building a bleeding-edge PC for Doom 3 and Half-Life 2 in the early 2000s. That's when PC gaming peaked.
PC gaming went from complaining about dual-card setups stuttering to all single-card setups stuttering.