But you didn't have to rip me off Turned off my DLSS and saw my framerate plummet And I don't even need that much But you advertise it like that and I gave my trust 😔
i think this gen of gpus will be like intel's 7th gen, when a few months later a much better 8th gen dropped. the 60xx series will have less power draw cuz it will have a smaller lithography; for now the 50xx series looks like a refresh of the 40xx but with dlss 4 and gddr7
It’s more he’s talking to idiots who are going to pay him literal thousands of dollars for ‘this’ so he needs to speak their language to grift them properly.
@@DarkMatterKid Why do you suppose the UFC advertises grappling when ROK Special Forces easily and effortlessly killed those who dared to try double leg takedowns against them in war? The UFC has to speak their language too in order to get them to pay lots of money for pseudo grappling found at sambo, bjj, judo, karate, etc.
@@coopersheldon394 Because you don't get a pass to *murder* your opponent in UFC. Obviously. Shit analogy is shit and I don't even watch none of them meat heads beating on each other.
0:45 look at the pole in the background and the surrounding objects, the native one looks much sharper and clearer than the blurry mess on the right lmao.
Look at the power consumption too! Get a 2080, costs around 200 bucks. Watch out for the OC and non-OC chip variants in this generation. Undervolt + OC = 167 watts GPU power at 2105 MHz, +600 MHz memory clock. STILL an absolute beast!
The RTX 6070 or 6080 will be the affordable version of the 5090. Solved. Buy the RTX 5070 now. The motive should always be maximum value for money, not the high-end product.
@hyoroemonmeto6874 I can still run games at 1440p on a GTX 1080 from 2018. And if the game is poorly optimized, just throw in Lossless Scaling, which is $5 on Steam. Nvidia graphics cards nowadays are a scam, since neither they nor the actual games ship optimized, and they lean heavily on fake frame generation "technology" (which, I repeat, you can get on Steam) to make people spend $600-1200 on mid graphics cards
used titan X pascal runs practically identical to this. 27 FPS max rays in elden ring. IDK, i did a poll on LTT forum and people agreed. there isn't a price to performance argument for new hardware. It's scrapyard wars except the builds will last several years while competing with a stagnant games and hardware industry
Well Nvidia doesn't want to do another 1080 Ti, they have seen that it impacts future sales, they want to keep people upgrading, since few will be able to afford the top of the line gpu.
I'm not a tech guy but I think frame generation should simply be for covering a bit of uneven framerate, like 30-55 frames to 60, or 90-110 to 120. Going from sub 30 to over 200 seems like overkill and would have smudging or image breaking, since you're getting more than three times as many generated frames as real ones.
Maybe games are different... but TV manufacturers have been doing this for years. Back in the full HD days there were 240Hz LCD TVs and the result was OK...
I remember that perfectly. The 2080 Ti was approx 1200€, and the 3070 before the crypto boom, at 500/600€, was great. I still have one 3070, although I'm going to change it now.
@@Charly_dvorak 3060 Ti user here, I get 4k60. Granted, on some games I do use DLSS to reduce the load, but I've gotten native 4k60 on others. Note this is on indie games, not modern AAA trash
The games? Poorly optimized The frames? Fake The prices? Sky high Truly the worst timeline. I won't waste my time or my money on any UE5 slop that comes out next.
I have had the same PC since 2017. It's time to upgrade, but I have been thinking about going to a console because of this... you have to spend tons of money to get good performance, and on consoles at least games are optimized
Yeah, it's sad how many people don't realize that DLSS/MFG actually make gaming WORSE in competitive games. I'm never turning dlss/mfg/ray tracing on, so I'm not getting any real improvement at 1080p raster, which sucks because I have a 480hz OLED monitor and the 4090 can't hit that, and the 5090 will only give a tiny improvement for much more $, and I can't buy the 5080 because it's not even a REAL 5080 chip, it's a 5070
@@HDRcade upscalers can be very useful at extracting extra FPS for peak performance at high-ends when used correctly, but what's happening is devs are relying on them in place of optimisation. This is all while introducing downsides such as visual artifacting or fuzzy/blurry visuals and shadowing. So what happens is games run about the same as they would have if they were optimised normally except now they also gain a slight blurry haze and shadowing around light sources or when moving quickly.
@@AMDRyzen57500F I mean, I agree on a technicality but you could also make the argument it's not the devs fault for being lazy either and it's ours for not calling them out on it. That's why I instead just go to the source, to the feature itself.
This is NOT how we envisioned AI takeover to be in the 90s lol Expectation: Robotic AI that can help you with daily tasks Reality: Use AI to generate fake frame rates in games LMAO
DLSS4 will be available all the way back to the 20xx series, it has been confirmed in the presentation by nVidia. We'll see how the older cards will be able to handle it though, as it may come with a performance hit.
@@SantiagoGomezJorge It also adds a bunch of blurry noise ghosting dogshit, and gives a reason for engine devs to keep making bad unoptimized software.
The main issue with DLSS is that it doesn't look good. Games look fuzzy when you enable frame generation. What's the point in gaining performance when the image quality is sacrificed. If I wanted to play on 720p then I would just set it to 720p
@@isimsoyisim4979 bro you seriously need to verify what you're saying, you don't even know how wrong you are on the dlss part. Do some research before speaking.
Not true. Source: I've had my 4090 for over 2 years and prefer to use frame gen in every single place it's available. I game at triple 1440p (7680x1440 240hz) and at 4k. In basically every scenario I prefer to use FG and DLSS. So you are telling me that you don't use frame gen on your 4090? That's bonkers... Why even buy it?
Enough with that, there's no more friends and no brotherhood. Especially if your brother hates you. (Me... because I'm having a family problem right now.)
@@hengry2 nvidia is a trillion dollar company and intel is in trouble financially. I don't know if there's a feasible way to do it other than breaking CUDA's hold on the sector through open standards. You'd have to drag nvidia along for that.
Thank You for watching this Video! make sure to hit that sub button for more RTX 50 memes XD
lol
Why do you guys call it 'fake frames' when it does indeed work? If it was not for DLSS, man, my Battlefield 2042 would be around 40 fps..
DLSS on, I get 80-120 and I can feel that.. plain and simple.. Also I never notice any ghosting or anything bad at all with my image quality.. Unless maybe I get a fucking lens and stop every frame to compare..
Also you guys act like most games don't have DLSS or something.. Especially now that we as players will be able to implement it ourselves..
Weren't you guys shitting through your mouths about HOW EXPENSIVE it would be? And they ended up being the same prices?
can you please tell me when the Intel GPUs are available, and if they can kindly give the boot to Jensen's leather jacket club bs that would be fantastic, because it's my waking nightmare that this man hasn't been arrested for price fixing and gouging of architecture licensing agreements that I know in my bones he forces competitors at gunpoint to sign on the dotted line. Extortion, embezzlement, fraud - I mean a RICO charge is probably the best case scenario for Jensen at this point, in the manner in which he's flat-out lied to not only his shareholders but the goobers as well.
@@PuppetierMaster Lol you talk like the majority care about price NOT changing while you get MAD performance..
We can add dlss to any games bro..
imma scalp these and youre all gonna cry when you cant plug them into your buttplug mining machines
Can we just start optimizing games again please?
No
Don’t be crazy.
Don't buy unoptimized ones
Survey says!
...
No!
Ho ho, tough luck gamers. Better luck next time, ha ha ha!
Yes but teach ai to do it first. We lazy
"The more GPU you buy, the shinier my jacket gets"
-Uncle Jensen, Circa 2025
Next year, he'll show up on stage with a whole crocodile hanging from his back.
We're starting to run short of alligator here.
True
It will be a glizzy shine jacket for 6090 reveal
he just upgraded to ray tracing jackets
"and when every frame is fake... none of them will be"
😂😂
Maybe the real frames were the fake ones after all….
where's this quote from?
Look at AI minecraft
@@cyberpunkdenton9497 Ghostbusters 2
My watercooled GTX1080: "Can I go now?" Not this year my friend, not this year...
And if your 1080 dies, you could always buy a RX 7800 (XT) from AMD, it's basically the 1080 of AMD. :P
Your GTX1080 (probably): I'm tired, boss. Tired of being on the road, lonely as a sparrow in the rain.
Hey man, when intel arc's b series drops, its a solid switch option!
Don't get the point of these posts 🤣 your 1080 gets bodied by the 4080 with any dlss. The 5080 will destroy it even more. My 4080 runs games at 4k perfectly with dlss on quality, which renders the game at or above 1440p and upscales to 4k. (Which actually ends up looking a tiny bit better sometimes) + frame gen doesn't even make anything look worse. (Not that I need it. It does make things a lot smoother without any downsides). Keep crying about GPUs not improving. They are probably the best they've been in a long time.
@@Popipa182 Knowing that, I recently cleaned the waterloop, replaced thermal paste and pads + equipped 2TB nvme. Runs like a dream.
30% more performance with a 30% larger chip and 30% more power draw
what a technological marvel
Don't forget the 30% price increase as well. Totally revolutionary gpu.
Also 25% higher msrp
My 4090 already draws 600w lol.
Dodge Hellcat. Larger displacement, larger engine, more fuel consumption, more power.
And 30% more melted connectors!
"The more you buy the more we make"
-NVIDIA
As is the job of every company...No one complains when their retirement/investment fund goes up.
Nvidia is going to be richer than the rich people of the future
They are already a B2B business; we are small fry for them because they have compute cards to sell for AI stuff
This cracked me up lol
The more you buy the more we save
Graphics Processing Unit ❌
Frames Generating Unit✅
Underrated comment
FGU
Fr😅
NVIDIA released the world's first GPU in 1999, now they have FGUs.
From GPU to FGU, more like FU
30% more performance and 575W of power consumption for 2000 USD
The TDP of the RTX 5090 will have me passing out after looking at the electric bill. 575 watts is crazy!
@@akin242002 Maybe 575 KILOwatts?
@@akin242002 buy a 1000w+ psu now, if you want to play Elden Ring, GTA6 or work :D -the companies
there should be mandatory info on the average electricity bill for running those terrible gpus.
2007: Cake is a lie
2025: Frames are a lie
@@JJ_______ EVERYTHING’S CAKE!
Lossless Scaling will thrive.
Cake was true though. This, not so much.
Atleast the cake was there in the cutscene
2020: Fridge is a lie
They selling fake frames, can I pay with fake money…
DLS$
Here's the deal: 400$ for the 5090, the rest of the money is generated with DLSS 4 😂
Ai generated money 🤑🤑🤑🤑🔥🔥🔥
1 real note, 3 generated
It's a 1:1 deal, they have to accept your offer😂😂😂😂😂😂😂
You mean crypto?
"In rtx 5070 you will get smoothest lag"
"Smoothest high latency you have EVER experienced. The more you lag, the more you don't."
@@yeahbuddy300lbs At 30fps with 4x MFG, you'd get literally about 100ms of delay between the current frame and the next real frame (where input registers). This is without other latencies. At the 20fps they want you to use it at, you might as well go make a coffee and get some pastry before your next input shows up on the screen. But hey, at least the wait time is smooth!
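For anyone who wants to sanity-check that, here's a rough back-of-the-envelope sketch. The hold-back assumption and the ~65 ms baseline latency are illustrative guesses, not measured or official numbers; real pipelines (Reflex, buffering, display chain) vary a lot.

```python
# Rough model of interpolation-style frame generation, purely illustrative.
# Assumption: the newest real frame is held back while the generated in-between
# frames are shown, and a 30 fps game already carries ~65 ms of input-to-photon latency.

def frame_gen_feel(base_fps: float, multiplier: int, baseline_latency_ms: float = 65.0) -> dict:
    real_frame_time = 1000.0 / base_fps                    # 33.3 ms at 30 fps
    # k-1 generated frames are displayed before the newest real frame, so that
    # frame lands roughly (k-1)/k of a real-frame interval later than it would have.
    added_delay = real_frame_time * (multiplier - 1) / multiplier
    return {
        "counter_fps": base_fps * multiplier,              # what the fps overlay shows
        "added_delay_ms": round(added_delay, 1),
        "rough_total_latency_ms": round(baseline_latency_ms + added_delay, 1),
    }

print(frame_gen_feel(30, 4))   # smooth 120 on the counter, ~90 ms before inputs appear
print(frame_gen_feel(20, 4))   # an 80 fps counter that still feels like 20 fps, plus extra lag
```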
@@notlahaan aren't they using AI to approximate the ray tracing, when the game is a hybrid with a rasterised scene and ray-traced lighting? So wouldn't the fakeness only apply to lighting?
@@OkOkOkIMightKnowYou If you're talking about global illumination, shadows, reflections etc - then yes, it's a hybrid of already-being-used techniques with RTRT mixed in. "AI" cleans the image up, as having the proper amount of bounces, precision, rays etc is extremely taxing and we're still way more than a decade off that "end-goal" (of proper RTRT).
Rasterization is a fundamental image rendering "technique", using FP and INT ALUs to do it. RT cores are just there to accelerate ray-tracing performance. So just to make things clear, rasterization will stay and will be the main performance metric for a GPU. It's literally what makes the image. Anything besides it is just there to accelerate parts of it.
What I'm on about is Frame Generation. It's fully fake frames. There's no real rendering done in these frames, it's just prediction of the frames between the real frames that are rendered, using ML/AI (whatever you want to call it). With the 50 series and MFG (Multi Frame Generation) they're adding 3 frames in between - effectively faking their way to 4x the fps. But the latency cost is bound to be insane, and in no way will the game feel responsive going from 30 to 120.
Nah, good game code can separate the rendered frame from the game logic. So they can just skip those frames, keep processing the game logic, and leave the AI to interpolate those frames, and you won't get lag.
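That decoupling is a real pattern (the classic fixed-timestep loop); here's a minimal sketch of it, with all names invented for illustration rather than taken from any real engine. Worth noting it keeps the simulation rate independent of the render rate, but it doesn't by itself remove the delay an interpolation-based frame generator adds, since that happens after the game has already rendered.

```python
# Minimal fixed-timestep game loop: simulation ticks at a constant logic rate no
# matter how many frames get displayed. Placeholder functions, illustration only.
import time

LOGIC_DT = 1.0 / 60.0                         # game logic always ticks at 60 Hz

def poll_input():                 return {}   # placeholder: read keyboard/mouse
def update_logic(state, inp, dt): pass        # placeholder: physics, AI, game rules
def render(state, alpha):         pass        # placeholder: draw call, alpha = blend factor

def game_loop(state, run_seconds=1.0):
    previous = time.perf_counter()
    accumulator = 0.0
    deadline = previous + run_seconds
    while time.perf_counter() < deadline:
        now = time.perf_counter()
        accumulator += now - previous
        previous = now
        inputs = poll_input()
        # Run as many fixed logic ticks as the elapsed time requires...
        while accumulator >= LOGIC_DT:
            update_logic(state, inputs, LOGIC_DT)
            accumulator -= LOGIC_DT
        # ...then render once, blending between the last two logic states.
        render(state, alpha=accumulator / LOGIC_DT)

game_loop(state={})
```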
So, Nvidia's slogan "the way it's meant to be played" means with fake frames?
While DLSS does involve generating frames that are not natively rendered, it aligns with the modern interpretation of "The Way It's Meant to Be Played." It allows for enhanced performance and visual fidelity, pushing the boundaries of what's possible in gaming. Whether you consider them "fake" or "AI-enhanced" frames, they contribute to an experience that NVIDIA positions as the optimal way to play games.
@aexetanius chat-gpt ahhh reply
@@impressive006 That's the joke.
it's a typo. what they mean is "the way it's meant to be PAID"
It's because they are teaching how to play with the customers.
Lol, NVIDIA, “fake it till you make it” was not supposed to be meant as a business strategy.
who tf plays at native resolution these days if they don't have to
@@squirrelsinjacket1804 Anyone with an IQ over 60
A lot of the time it is. Snake oil salesmen are still a thing in modern times, they just don’t sell snake oil.
@@squirrelsinjacket1804 Every single competitive player in existence? Fake info + input lag, and while for an 80 IQ FIFA player that's "no difference", quite a lot of people don't even consider turning on upscaling/fake frames or whatever they come up with.
@@squirrelsinjacket1804 I waste too much energy every day in 2025 trying to fend off scams even when I'm not trying to buy something
We really have reached a point where our games are such unoptimized messes that we need an AI to do the in-between frames. We really have messed up
There's only one way to solve this: stop buying.
Cheaper for the devs, which are mostly owned by private equity firms who have a legally binding obligation to turn a profit for their shareholders. Make it quick, make it "good enough", and pump them out as often as possible. See modern Hollywood movies.
It's primarily the new engines. We need a 400% increase in fidelity to get something to look 10% better because we are so damn good already, and taking the time to bake in lighting and manually craft big cinematic shots overall doesn't really save much compared to ray tracing, which is much more computationally difficult but makes all sections of the game see minor improvements rather than just the crafted segments. It sucks, but diminishing returns are where the industry is, and gaming in particular has never really needed to sell itself on non-graphical improvements, so we see these messes of high-fidelity graphics that can barely run, using AI to fix it, because the marketing team can't work with anything less
We're living in a time where we can use a magic electrical box to play video games all day. We're living in a wonderful time
More like the devs messed it up when they stopped caring about optimizing.
meanwhile my 1060: "i didnt hear no bell"
My 1080TI is still chugging along just fine too.
The 1060 is the definition of a trooper. Hope that little guy doesn't give up o7
1660 still living the life
1080ti the greatest card to ever be made
BASED
Can I now start downloading more RAM?
More VRAM 😏
You can’t download RAM that’s not how computers work RAM has to be painstakingly grown and picked from RAM trees that only come into flower once every two years cmon
Always best to download more CACHE first. Helps speeds along the RAM DL ;)
Been doin it since 2018
Come on, download speed. You're too slow. I want my ram expansion now....
It's actually scary the amount of people on social media that are genuinely falling for this too
More weapon vids nao. :D
PS5 Pro fanboys have already been getting run through by Sony's lies, they have a whole fanbase of stupidity
Huh, weird seeing you here. And yeah, social media people are nitwits
Same way people fell for the PS5 Pro?
Look at the amount of people who voted for Trump. There's a correlation.
$549 for 4090 performance is fake as shit. They are comparing fake frames to native real frames, and passing it off as legit comparison. Super shady advertising.
I mean... what's the problem? Their customer base is convinced they can't see the difference between fake and real frames, DLSS on vs. native. Let them buy it.
You won't see the difference playing the game, so it's not fake performance! Are you some kind of scientist or a game developer? Because honestly AI is way more efficient for gaming than a high-end raw performance graphics card
yea never trust their fake marketing performance increase BS. Always wait for independent benchmarks and reviews. There are a lot of people who don't like frame gen, and I personally don't always enable it on my 4090 unless I need to. I run most games with just DLSS Quality and it gives a great experience.
@@MN-jw7mm and you can? haha ok buddy
The fact that the AMD crowd is still CLINGING to the "fake frames" line, but all of a sudden when AMD came out with their FG counterpart it was "omg amd so good amd is great!! Free frames!!!"
And now when nvidia once again is clearly in the lead, y'all are out here claiming fake frames again.
Bums
@@MN-jw7mm well it is actually hard to see a difference. Personally I have a 4070 and when using frame gen, the fps feels very real to me
PC gamers: I WANT THE TRUTH!
Jensen: YOU CAN'T HANDLE THE TRUTH!
I can
@@vave2607 pay us 2000$ for no extra raw performance but more fake fpsss🤑🤑🤑☝🏼☝🏼🗣️
@vave2607 okay as you wish, here's the truth: it's a scam. 🤯
Also Nvidia:
You will never reach the truth!
Son, we live in a world with badly optimized games and overpriced GPUs. And those issues need to be patched with AI-generated fake frames
*RTX 5070: Bringing you the future of gaming - where even your bank account will need AI upscaling.* 💸
"The more you buy the more you save"
Now thats a quote to start 2025 huh ?
Goes perfectly with "8GB is the same as 16GB"
it's not about facts...it's words...they matter the most
lol
Sl4very is freedom
The more you buy the more you save
You own nothing and you're happy
that quote isn't about consumer graphics cards though, i'm pretty sure he was talking to the datacenter audience about beefy ai processor things. As for the card we'll just have to see when youtube channels can actually benchmark it vs 30 and 40 series
The most terrifying quote of the year
Wait... the frames were all fake?
NVIDIA: Yes... always has been.
You forgot the sound of Nvidia cocking a pistol from behind.
I may have an old 1050ti, but at least, my frames are real
@@robertpaws 10 real frames.
@@MrBlueDev😂😂😂
2000$ for fake frames 🔥🔥🔥🔥
"The more you buy, the less real frames you get!" - The jacket dude
his name is jensen
@@fordonyI know. I made emphasis on the jacket on purpose, instead of his name 😂
@@foxboi6309 gotcha
His name is not Jensen, it's Jen-Hsun, Chinese don't have english names, they adopt those names
The single item that caught my eye was his jacket. Forget the 5090, I want that jacket brooo
And this will continue as game devs know they don't have to worry about optimizing shit, AI is here
Whoever made this video, many thanks. Saved me hundreds of minutes of writing arguments. Now I will just paste this link to the reddit comments.
but but they said x4 the performance and even said "trust me bro", so it must be true
@ but you don’t understand! Current cards are already more than perfect for gaming!!!
*rtx4060 runs crysis 2 (released more than 10 years ago) on 20-30 fps. (4k ultra dlss on rt off)*
But you don’t understand! We are going to use local AI on this!!!
*even base apple m4 mac mini is more than great for running 70b llama or similar models. If you are planning to train your own AI, well, you are not a datacenter. You are not amazon, google, openai, apple or meta. If you are, you will not buy 5090. You will buy H200 racks.*
just say you're broke and can't afford it little bro.
@keller8821 just say you have no soul and need to spend money on already debunked scams
@@keller8821 “It's easier to fool people than to convince them that they have been fooled.”
― Mark Twain
Are the new generation of GPUs just software upgrades instead of hardware upgrades at this point?
Yeah. Sounds a lot like it. "Here is a more expensive brick that has this new software, enjoy sucker"
Wait until they add subscription plans. 12 months of DLSS 5 for just $1000!
Which would not be that bad if it was not locked to the hardware. Truly the worst of both worlds
On Nvidia, sort of. On AMD and Intel it's 50/50
This is normal: Nvidia has always been a software company, while Intel and AMD are hardware designers
It's achieving the same results with different approaches
kinda, they build hardware around software now instead of the other way around
We're gonna go from
"Bro I lagged"
To
"Bro the Ai made up something that isn't there"
"Bro I saw nothing there who killed me?¿¿"
@@yourmomistakenby NOTHING THERE?
RAHHHHHHH
HELLO
HELLO
GOODBYE
*FUNNY AHH EMERGENCY_2*
you're joking about this, but it's the reason I had to turn off DLSS while playing Stalker 2, shit felt like a fever dream at times.
@@mardnm123 it's quite a pm brainrot thing, besides that, they really screwed up badly.. I know how it feels, but I don't know what it's like to have a good graphics card or processor. My ancient laptop (Intel HD 3000, Intel Core i5 2430M) is 13 or 14 years old as of right now and already beyond life support fr
Wasn’t that legitimately the whole damn plot of a mission impossible movie?
CPU be like: yeah now i have to do half of his work
Good. Now the game devs can keep delivering crap optimization and the new 5090 will solve it by delivering fake frames
They still have to make it play on consoles that run on 3060 and 3070
@@gameison4813 Yeah but they can still ignore PC optimization, especially for the higher graphic settings not available on console
@@Ziddel most graphics settings just tweak shader values, like shadow resolution, AO res, disabling/enabling certain texture maps that are more computationally expensive.
As for model fidelity, I'm assuming they just use lower LODs for lower quality levels, and higher LODs for higher quality levels.
PC and Console have become closer and closer in terms of "hardware" compatibility, and most games aren't made at the super low level anymore, so there isn't much to "optimize" for PC.
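To make that concrete, here's a toy sketch of what a quality preset can boil down to: a bundle of shader parameters plus an LOD bias. Every name and number here is invented for the example; it's not any real engine's API.

```python
# Toy illustration: a graphics "preset" is mostly a bundle of shader/LOD parameters.
from dataclasses import dataclass

@dataclass
class GraphicsSettings:
    shadow_map_res: int     # shadow map resolution in texels
    ao_half_res: bool       # render ambient occlusion at half resolution?
    detail_maps: bool       # enable the more expensive texture maps?
    lod_bias: int           # higher bias -> coarser meshes get picked sooner

PRESETS = {
    "low":    GraphicsSettings(1024, True,  False, 2),
    "medium": GraphicsSettings(2048, True,  True,  1),
    "high":   GraphicsSettings(4096, False, True,  0),
}

def pick_lod(distance_m: float, settings: GraphicsSettings) -> int:
    # Pretend each LOD level covers ~20 m of distance; the bias shifts to coarser models.
    return int(distance_m // 20) + settings.lod_bias

print(pick_lod(35.0, PRESETS["low"]), pick_lod(35.0, PRESETS["high"]))   # 3 vs 1
```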
They will make the game even crappier, soon Solitaire will need a 5090 to run.
@@yesyes-om1po
PC bros still can't handle the fact that PC is overpriced AF and that the reason they aren't getting better graphics is because pc gaming is a fuckn lie, as opposed to consoles holding everything back
Old McJensen had a card
AI, AI, No.
And on his card he had some tech
AI, AI, No.
And a D-L here
And an S-S there
Here a frame, there a frame
Everywhere a Gen-Frame
Old McJensen had a card
AI, AI, No.
😂😂👌
How do you think of this lol
Excellent.
LMFAO
Oh my God this is hilarious
2023: "Imagine Frame"
2025: " Imagine Frame but 4x"
2030: " Imagine GPU"
2077: " Imagine Games"
.
.
.
.
Thx for 4k likes 💖
At this point just imagine I own the game and don’t make me buy it
They already made minecraft run fully in AI
2077 is wishful thinking haha. more like 2031
2077 Wake the f up samurai.
At some point we'll just be using our imagination. Full circle back to cavemen era.
This format just keeps on giving
Soon the graphics cards will do no rendering at all and just host the neural network which will guess what the image looks like
True, but that's how they will keep making normal games from AI input for as long as possible to make the most money. You think they are just gonna give up on a billion dollar industry easily? It's like having a Star Trek food replicator so we don't need supermarkets anymore. The inventor would mysteriously disappear. It won't be soon that you can tell the PC to make a game, give it some variables, get some coffee, and play it right away.
@@Swel_Remarkable
I wonder what those billions will do for them once they're in the dirt
What are the downsides?
And imagining will still take processing power. Only the word will change from rendering to imagining.
@@All_Y Only upsides. This comment was not about the video I guess
Literally the entire presentation - AI AI AI AI AI
With such pacing AAA games will turn into an AIAIAI soon
Unfortunately the same for the AMD presentation before this one. And where the hell was Lisa?
I love the silence from the reporters in the audience, as if they finally realized that their jobs are going to be taken by AI as well. The artists were right.
OIIAOIIA AI
@tobiasgreeeen fun fact jensen and lisa are cousins
just sell us a piece of paper with "turn on dlss" with a price tag at this point
Underrated comment lmao
piece of paper $1699 per inch
Nvidia fanboys will defend it by saying "for every word you write on it, AI will generate 3 more"
I like this video because it does show the 30% performance increase, but that is on a game with path tracing, which, if you guys didn't know, is not a common form of ray tracing. And it's kinda ridiculous that the technology to actually live-render path-traced lighting even exists.
However, even if DLSS does give almost a 10x performance boost to these cards with AI, that is literally no excuse to claim that AI is the future of gaming GPUs. I understand the limitations of the hardware in terms of size and power, so that is a factor, but I do hope AI is a temporary and not a permanent fix to the issues with GPUs.
2031, Jensen in full Lederhosen: "Our 7090 Ti is capable of rendering a whole 5 key frames per minute to remind the AI to occasionally return to reality"
You’re joking, but technology will probably be like this in a few years
Meanwhile the AI has already forgotten that you were playing Skyrim 20th anniversary and has been displaying Tetris for the past 8 minutes.
And DLSS 7 is only $14.99 per month
Not just fake frames but fake resolution too. They give you 1080p30 but tell you it's 4K120 and charge accordingly.
100%
you on mad copium if you expecting 4k120 on any gpu
@@ThePizzabrothersGaming Well, it shouldn't be the problem of the customers; we give our money for better technologies, they develop them, that's how it works. Why give charity to companies that don't care about us?
Edit: sorry for my bad english, I'm still learning xd
@@ThePizzabrothersGaming 4090 doing older games at 4k120 what are you talking about
@@user-jg2qd7wg5e 4K120 would take the power of a home refrigerator; somewhere in the range of 3000W.
Games already look realistic AF. Go back ten years to Arkham Knight (which runs buttery smooth) and it's hardly improved since. Can we just stop focusing on *slightly* better graphics and go back to optimizing games, please?
but the shadows, man. every can in a cupboard needs to have its own shadow. at the proper angle. lmao jk but thats what PT is.
That involves actually using our heads, and voting with our wallets though.
Buy AMD, switch to Linux, etc. You need to make a proper, decisive display that you are NOT interested in this slop... And you need to make sure their statistics and numbers reflect that... If you keep buying their s**t, even if they're doing s**t you don't like, they're just gonna continue doing it... They have no incentive to change.
I bought an all-AMD computer, and I DID switch to Linux, near the end of last year, it's been maybe 3-4 months now... Not only was AMD a better gaming experience for me on Windows, but gaming on Linux is a better experience too, 0 stutter... Higher REAL FRAMES with AMD (used to have a 2080 Ti), 0 stutter from Linux... It blew my mind the first time I tried Linux, lmao... I didn't know gaming could be so good!
You're free to play more indie games.
I'm genuinely so tired of it, why does every game have to strive for the best graphics possible
arkham looks ugly af lol
That was a great AMD ad, thanks.
It was disturbing how blatantly obvious it was that he was talking directly to the shareholders and to nobody else. Especially the comments he made about everyone and their 10,000 dollar computers at home and how 2000 is just a small investment to upgrade that. So disgustingly out of touch
$2000 for 32GB of GDDR7 is cheap. 7 years ago, that would have cost you $15,000 USD for less performance. People are so delusional these days just because they see a price tag of $2000.
You know NVIDIA doesn't have to release these cards at all; they can just release 5070s and call them 5090s and fk it. So that developers and people who need the power of a 5080/90 don't have them available at all....
It's crazy how people with their tiny brains don't even realize how much power a 5090 has and how little $2,000 is in the real world of computing. If you can't afford them, you probably don't need them!
@@yushkovyaroslav lmao. Your tiny brain is missing the point.
@@yushkovyaroslav Do you have any credentials, any experience, any evidence to back up any claim you just made?
I built myself an all-AMD gaming rig for less than $1000, and to quote a wise man, "It just works."
@@yushkovyaroslav It's a good thing you're here to protect the defenseless billion dollar corporation from cyber bullies.
Hang in there, 3080... you've just been re-enlisted for another 5 years
😂😂😂
@@omarhamdan6552 1080 ti: "im tired boss"
My 2070s will still smoke these young bloods 💨
1080 finally showing its age, but honestly it's been a good run.
No joke, i'm still with the GT 1030
Now this is the clearest proof that Path-tracing in video games arrived way too early while regular Ray tracing hasn’t even matured and is mostly still hard to run on most GPUs. This is also proof that Nvidia no longer cares about raw performance. AI is not always a must-have and it’s much better to not be required or forced to use it.
Raytracing was always supposed to be simply a stepping stone toward path tracing because we couldn't quite get there; it's not a permanent solution
I see the picture now. They are heading towards fake performance by AI. It means they will no longer make increasingly better GPUs, but increasingly more DLSS-dependent ones
nvidia said long ago they'd rather pursue fake frames than raw performance
both are gimmicks. raytracing and pathtracing are the same thing. They are dynamic lighting with NO rays being removed for optimization. Raytracing on looks 1% better for 600% worse performance. We've had this technology for almost 30yrs now, it's called "dynamic lighting". Plus the only game developers to actually PUT LIGHT SOURCES inside their game in the last 20yrs that I can think of were the Saints Row 2022 devs.. And you guys think that's a bad game. No other developers even care about lighting, they never put it into their games.. They just select a GI value and call it done
@@scalz420 I mean Cyberpunk objectively says you're incorrect, they look way better. If you can run path tracing at a decent stable fps you're having a much better experience than a person running ray tracing or none at all. Without going down this route they're taking, you reach a point where a new generation's GPU performance gains are in the single-digit percentages, due to minute tweaks. But with this new architecture you get a lot more performance. People see 20 fps with no frame generation and think it's a bad card, but they are not considering the settings the game is running, when the most powerful card without frame generation would not even run the game at all.
At least let us see the 5090 run 2077 at its original 4K image quality and achieve twice the performance of the 4090, that is, 20fps > 40fps, at least double the raw performance, and then promote the advanced AI technology of DLSS 4. When I saw the 2077 promotional video, where the RTX 4090 gets 21fps vs the RTX 5090's 27fps, my whole head exploded and I lost any sense of expectation for the 50 series.🤯
5 Times the frames, 5 Times the input lag.
the amount of input lag due to frame gen, if you do not already have a decent base FPS without frame gen, is insane
@@NonsensGaming I have a good 60-75 in Stalker 2, yet with frame gen on at 120 FPS the input lag is massive (as a comp FPS player I can tell). It is not a solution, it is a band-aid
@@NonsensGaming yet devs start using frame gen to TARGET 60 fps in recommended specs (monster hunter wilds), I cannot believe this bs
4 times the size, 16 times the details ... wait why does that sound familiar ?!
It's like buying a car with a V8 engine and finding out only 4 cylinders make power and the rest are just there for the V8 sound 😅
That's actually a good deal, lol, I wouldn't mind saving all that gas and still sounding like a V8.
@@SrUnr3al Nah, it still uses the gas from all 8 cylinders, you just get the same performance as a 4.
Basically what you're saying is, upscaling over native is the future...
I've been a console player for years but recently thought about getting into PC, so I need to know what's a scam and what isn't
Some cars like the Corvette C7 (or C8, I can't remember exactly) have that eco feature where it turns off a few cylinders to conserve fuel. Granted, you can still turn it back to V8, but still
@@SrUnr3al You'll still need to run the other 4 cylinders to produce a V8 noise so you're not saving any gas in a meaningful way lol
"Why would I reveal my plan if you had a chance to stop it? I already did it 35 frames ago"
Yo is that Ozymandias from Watchmen? Nice one haha
Yep. For every 1 real frame you now get 3 fake frames rather than just 1 fake frame. So now only 25% of the frames are real. Totally not going to fuck over latency and cause smearing issues all over the place. Not to mention it does nothing for you if the game in question does not support the latest version of DLSS, if it supports it at all.
for what it's worth, the latency is not at the 28FPS level, they're downscaling it to like 1080P to get it to 60~ FPS and upscaling the rest
but it's still far from a "true" 200 FPS
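The arithmetic behind both of those points is simple enough to write down; a quick sketch below, with the DLSS internal-resolution scaling treated as an approximation (Performance mode at roughly half resolution per axis) rather than an exact figure.

```python
# Quick arithmetic behind the two comments above. The 0.5x-per-axis figure for
# "Performance" upscaling is an approximation used for illustration.

def real_frame_share(fg_multiplier: int) -> float:
    # With k-times frame generation, 1 out of every k displayed frames is rendered.
    return 1 / fg_multiplier

def rendered_pixel_share(scale_per_axis: float) -> float:
    # Upscaling renders scale^2 of the output pixels (width and height both shrink).
    return scale_per_axis ** 2

print(real_frame_share(4))             # 0.25 -> 25% of displayed frames are "real"
print(rendered_pixel_share(0.5))       # 0.25 -> 4K output from a ~1080p render
# Stack both and only ~1/16 of the displayed pixels came from conventional rendering:
print(real_frame_share(4) * rendered_pixel_share(0.5))   # 0.0625
```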
Well, I think me and my 3060ti are going to be spending a lot more time together.
i mean realistically 3k series cards should last you upwards of a decade. buying every 1 or 2 gens is stupid. run that baby into the ground and wait till the 7ks are out at least.
based and responsible
Just hold on to it. I used my 1070 from when it came out all the way until 2023. It was still perfectly capable of running AAA games, but with everything at low or medium. I only upgraded to the 3060 because I had the money to do it, otherwise I would still be using that 1070.
Have fun, I’ll be enjoying my new 5080 🎉
That 3060ti SHOULD last you a lot longer.
But since new game devs do not give a shit about optimization (Cities Skylines 2 for example), I think you will need to upgrade sooner
I remember when selling fake products was also called fraud.
I also remember when fraud used to be illegal. Good times.
This hits a little too hard. Just about every industry is a money grab right now.
The input lag and blur on that 5070 will be insane
God dammit you're so right.
We're being taken for fools, getting deliberately scammed out of true raw performance.
This is why I prefer AMD for everything but 3D rendering in Blender, they do us justice
Conald made fraud great again
@goldenmikytlgp3484 Even though i like AMD, they are doing the same thing. No big company is without blame. Look at the 7800x3d prices now and 6 months back, Intel was a shitshow lately so they took advantage of it.
20xx - last generation with real performance benchmark, real TFLOPS etc.
30xx - Nvidia changed the method of how they count TFLOPS to pretend that cards are 2x faster (dual issue)
40xx - frame generation 2x to pretend that cards are 2x faster
50xx - frame generation 4x to pretend that cards are 2x faster
We can assume that RTX 6070 will still use 8 GB memory and use 8x frame generation to pretend that cards are 2x faster
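Here's a small sketch of the kind of headline math being described above. The dual-issue doubling and the frame-gen multipliers are this comment's framing of how advertised numbers get inflated; the core counts and clocks below are made up for illustration, not any real SKU's specs.

```python
# Illustrative only: how the same hardware can produce very different headline numbers.

def advertised_tflops(cores: int, clock_ghz: float, dual_issue: bool = False) -> float:
    # Peak FP32 TFLOPS = cores * clock * 2 FLOPs/clock (FMA); "dual issue" counts a
    # second FP32 path per core, doubling the spec-sheet figure.
    return cores * clock_ghz * 2 * (2 if dual_issue else 1) / 1000

def advertised_fps(rendered_fps: float, frame_gen_multiplier: int = 1) -> float:
    # Frame generation multiplies the counter, not the number of rendered frames.
    return rendered_fps * frame_gen_multiplier

print(advertised_tflops(5000, 1.8))                    # 18.0  ("old" counting)
print(advertised_tflops(5000, 1.8, dual_issue=True))   # 36.0  (same silicon, 2x on the sheet)
print(advertised_fps(30, 2), advertised_fps(30, 4))    # 60 vs 120 on the marketing slide
```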
Lol 5070 matching a 4090 in performance at 549??????
I highly doubt that
What are you talking about, the 50 series has good memory already, you're beating the dead horse that is the last generation not the new one lol
It's truly crazy how Nvidia's real GPU performance at native res with no DLSS is so shit, and yet they come up with this idea of selling software, a tool that creates more fake frames on the screen, to give you the illusion that your card is so strong.
12 GB is not enough on my 3080 Ti, so it's definitely not enough on the 5070 with its imaginary "4090 performance" @@MontySlython
@@socks2441 It depends on needs, and especially on the game. Most gamers aren't trying to run 4K on a 5070, although I do understand the argument that for $549 it should be capable of it; that's ignoring prices nowadays, though. People really overemphasize the need for VRAM tbh, especially since most gamers are only now switching to 1440p gaming. Also, you picked out the only card in the lineup so far with 12 GB of VRAM, when every other card aside from the 5090 is pretty reasonably priced for today's value and has 16 GB.
This is what you get without competition
Is it their fault that AMD, try as hard as they can, is subpar?
$2000 price and 27 fps of raw performance. That is ridiculous
in a game from 2020
And people make fun of ps5 pro
agreed. Utterly Ridiculous.
@@Ssian_ It doesn't even get 10 fps in the same scenario (path tracing).
@@introvertonparty6233 tbf that 2020 game has prolly the most taxing graphical settings known to man rn
Moore's law is still alive, but for pricing instead of performance
These cards have the best possible performance on the market and the prices are actually very fair. 7 years ago, you would have had to pay $15,000 USD for one of these cards. People are just delusional now. And Moore's law was long gone anyway...
@@yushkovyaroslav best performance if you are fine with AI generating 15/16 of the pixels in your image, making everything look trashy LOL
Pricing never used to increase with each generation though, other than occasionally for inflation.
Each generation they find ways to push out more power for the same price for each performance tier, thus they can charge the same as last generation and still make the same healthy profit margins for each generation.
Naive gamers got fooled years ago that each new generation should be more expensive, and Nvidia ran with it. We've been getting massively ripped off for years.
@@yushkovyaroslav that is the most illogical comment I've read yet.
@@socks2441 He has made many of them all throughout the comments, feel free to enjoy.
It's funny because in theory a 30% improvement is not too bad. However, making it sound SO MUCH BETTER and then realizing they LIED makes it feel like you are getting an even worse deal.
Not bad for a CPU. But 30% in two years on a GPU, with UE5 games around, is just not enough.
@tonylazutto5794 Same! 30% in two years is weak, especially when the price is also 30% higher.
30% on true performance is crazy...
I'm mostly confused how they get performance down to 20 FPS native. My 4070 Super does better at full native 1440p, and 4K is only about 2.25x the pixels... Surely a 4090 is more than 2.25x the power of a 4070 Super?
DLSS carries the performance margins; it's literally "2x better than a 4090", keep coping
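For what it's worth, the pixel arithmetic in the question above is easy to check — plain arithmetic, nothing vendor-specific:

```python
# Pixel counts at the two resolutions being compared.
pixels_1440p = 2560 * 1440       # 3,686,400
pixels_4k    = 3840 * 2160       # 8,294,400
print(pixels_4k / pixels_1440p)  # 2.25 -> 4K is about 2.25x the pixels of 1440p
```

The rest of the gap in that 20 FPS number presumably comes from the maxed path-tracing preset being benchmarked, not from the resolution alone.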
I recently found your channel, and you make some great content. I always love laughing at tech jabs.
"Framerate is a myth, VRAM is a joke. We are controlled by something far greater."
"Upscaling, the DNA of Ray-tracing."
What's more fitting? Framerate as a myth, or VRAM?
@@Phoenix973 It's perfect the way it is
- Monsoon, of the Winds of Ray-Reconstruction
"How about full res gameplay, is that upscaling?"
Elite tier reference
I really feel like the 5090, THE best graphics card you can buy when it comes out, should probably be able to run Cyberpunk with full RT at like 100 fps, no DLSS. I mean, you're telling me a game that came out 2 months after the freaking 3090 did... still cannot be run by anything out today? Bonkers.
It's running RT Overdrive (path tracing), which is a preview tech they released in 2023, not the original RT that came with the game at launch back in 2020.
Literally the GPU breaker of the decade, just like Crysis 3 in the past one.
The game is very poorly optimized; it's not a GPU issue.
How much more do you think someone can “optimize” path tracing? 😂 it’s a preview tech for a reason.
@@slq-16-i-rt31 You mean Crysis 1. Crysis 2-3 ran on shit like HD 7850 which was, at its absolute best, an alright card for its day.
Amazing to see some actual truth on here, many thanks, loved it 👍
Every breath you take
And every move you make
Every frame is fake
Every step you take
I'll be watching you
How my poor wallet aches, with every bench you fake
But you didn't have to rip me off
Turned off my DLSS and saw my framerate plummet
And I don't even need that much
But you advertise it like that and I gave my trust 😔
They selling fake frames, can I pay with Monopoly money? 🤔
$1 for every $4 MSRP.
well you technically already do
By next year, Jensen will give the keynote in an all-gold jacket like that Indian guy.
Jensen copulate with reptiles
I think this GPU gen will be like Intel's 7th gen, where a few months later a much better 8th gen dropped. The 60xx series will have lower power draw because it will use a smaller lithography.
For now the 50xx series looks like a refresh of the 40xx, just with DLSS 4 and GDDR7.
wow, we're living in the dystopian future where you can download more frames for the low low price of $2000
Imma be that guy. My 1080ti is still peak and working great.
Feel that brother!
I miss my 1080
3060 feels like the bottleneck in my rig, 80% utilisation, frame drops, the works
You sure it's not thermal throttling or a CPU bottleneck?
the 1080ti was too powerful, Nvidia will never release a card of that magnitude at a consumer price like that ever again
exactly, there's not a lot of software demand for improved anything. they want another bitcoin rush. it's a solution in search of a problem
1:06 for a genius…he sounds pretty stupid
It’s more he’s talking to idiots who are going to pay him literal thousands of dollars for ‘this’ so he needs to speak their language to grift them properly.
@@DarkMatterKid literally, and it works every time lol love it
@@DarkMatterKid Why do you suppose the UFC advertises grappling when ROK Special Forces easily and effortlessly killed those who dared to try double leg takedowns against them in war? The UFC has to speak their language too in order to get them to pay lots of money for pseudo grappling found at sambo, bjj, judo, karate, etc.
That was for AI companies, not us 😂
@@coopersheldon394 Because you don't get a pass to *murder* your opponent in UFC. Obviously. Shit analogy is shit and I don't even watch none of them meat heads beating on each other.
Been like this since the GTX series. RIP FPS chasers, the journey will never be over. I am still standing firm on my GTX 1070.
maybe the real frames were the friends we dlss off along the way
o7
You won't believe it, but I don't use DLSS on my 2060; whenever I turn it on, games turn to shit, even though I'm only playing at 1080p
"What are you waiting for?"
"I don't know some real frame count improvements, I guess."
"Me too kid, me too."
Underrated comment
0:45 look at the pole in the background and the surrounding objects, the native one looks much sharper and clearer than the blurry mess on the right lmao.
I can't stand playing with DLSS; if I'm going to have anything like that on, it would be DLAA
@@hockeyking2433 unfortunately DLAA is still super blurry just like any other TAA solution, but at least it doesn't have as much ghosting, I guess...
Bro really picked out a pole lmao you dlss haters are so odd to me.
My g they look the same lmaoo
Idk if it’s just me but it looks fine
Look at the power consumption too! Get a 2080, costs around 200 bucks. Watch out for the OC and non-OC chip variants in this generation. Undervolt + OC = 167 W GPU power at 2105 MHz, with +600 MHz on the memory clock. STILL an absolute beast!
Yay more blurry messes, but more frames for a measly $2,000
Wait until scalpers get their hands on it: 5090 = $5090
@@minerockgaming4717 Scalping is not a thing anymore, we ain't in 2020 bruh
The RTX 6070 or 6080 will be the affordable version of the 5090. Solved. Buy an RTX 5070 now. The motive should always be maximum value for money, not the high-end product.
@@minerockgaming4717 lol yea
@@MathiewMay how much do you think a 5090 would cost after scalper prices, in your opinion? I wanna know because I kinda want to save for one
Remember when you could buy the top-tier graphics card of the moment for $400-500 and it really performed in games?
That's probably ignoring inflation.
And PC gamers are poor people who use parts from 5+ years ago and complain they can't run the latest thing anyway
Dude, is the boot shiny from all the licking you're doing? @@hyoroemonmeto6874
@hyoroemonmeto6874 I can still run games at 1440p on a GTX 1080 from 2018. And if the game is poorly optimized, just throw in Lossless Scaling, which is $5 on Steam. Nvidia graphics cards nowadays are a scam, since neither the cards nor the actual games ship optimized, and they rely heavily on fake frame generation "technology" (which, I repeat, you can get on Steam) to make people spend $600-1200 on mid graphics cards
A used Titan X Pascal runs practically identical to this: 27 FPS with max ray tracing in Elden Ring. IDK, I did a poll on the LTT forum and people agreed there isn't a price-to-performance argument for new hardware. It's Scrapyard Wars, except the builds will last several years while competing with a stagnant games and hardware industry.
Well Nvidia doesn't want to do another 1080 Ti, they have seen that it impacts future sales, they want to keep people upgrading, since few will be able to afford the top of the line gpu.
My comment has 4x the likes with DLSS and FG turned on 😐
LOL😂
Made my day! Here, 4 likes from me. ;)
The time will come when you'll need to buy an AI subscription to use AI frame gen... like BMW's heated-seat subscription :)
We swapping to AMD with this one 🔥🔥
🔥🔥🔥
should have done much sooner
1080ti was my last nvidia card
never regretted it
you clearly haven't seen the new AI tech in the amd chips😹😭😊
@@nxheart_xoxo if an RTX 5090 that is $2,000 only runs 25 FPS on cyberpunk with no DLSS and maxed graphics it is safe to say it is a scam
Way ahead of ya there lol
I'm not a tech guy, but I think frame generation should simply be for covering a bit of uneven framerate, like 30-55 fps up to 60, or 90-110 up to 120. Going from sub-30 to over 200 seems like overkill and would have smudging or image breakup, since you're getting more than triple the number of generated frames compared to real ones.
@@billx6545 at this point, forget generating my frame, can that gpu help me generate mmr for me
I mean you can use frame gen and lock the frame rate to achieve this.
I’ll just get the 5080 in a few months and test it myself. Not gonna buy it for the first 4-5 months cause of scalpers.
@@Hanley9000 Lock your refresh rate*, sometimes frame gen will just ignore an FPS cap
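A tiny sketch of the "top up to your refresh rate" idea from this thread (my own illustration, not a vendor tool): the output target divided by the generation factor is the frame rate the game itself still has to hold, and inputs are only sampled on those real frames.

```python
# How much real rendering a given output target still requires.
def required_base_fps(target_fps, gen_factor):
    return target_fps / gen_factor

for gen_factor in (2, 3, 4):
    base = required_base_fps(120, gen_factor)
    print(f"{gen_factor}x generation -> needs {base:.0f} real fps for a 120 fps output")
# 2x -> 60, 3x -> 40, 4x -> 30: the higher the factor, the lower the real
# frame rate underneath, which is where the sluggish feel comes from.
```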
Maybe games are different... but TV manufacturers have been doing this for years. Back in the full-HD days there were 240 Hz LCD TVs and the result was OK...
Remember when the 3070 was actually the same performance as a 2080 Ti? Good times...
Is it really? Because my 3070 already feels outdated
@edistefi the only thing bad about your 3070 is the VRAM size; other than that it's a good card
I remember that perfectly. The 2080 Ti was approx. €1200, and the 3070 before the crypto boom was €500-600, which was great.
I still have one 3070, although I'm going to change it now.
The 3060 ti had a little more performance than the 2080
@@Charly_dvorak 3060 Ti user here, I get 4K60. Granted, on some games I do use DLSS to reduce the load, but I've gotten native 4K60 on others. Note this is on indie games, not modern AAA trash.
My 3060 TI is good enough for now.
The games? Poorly optimized
The frames? Fake
The prices? Sky high
Truly the worst timeline. I won't waste my time or my money on any UE5 slop that comes out next.
Hotel? Trivago
The Witcher 4 is made on UE5.
And CP77 Orion will as well, bud.
@@SeriousDragonify Skip. EZ.
@@SeriousDragonifyOkay? And?
Time to sail the seas buddy.
2k for fake frames, what a time to be alive
You do realize that these cards are still the best in raw performance, right? Where do all these stupid people come from?
and fake resolution on top of that, don't forget the upscaler
@@arch1107 according to nvidia the upscaler makes the game better than native. lol
@@Lohsir like Apple saying their 8 GB of RAM is like 16 GB on a PC
they are full of crap, a mediocre company at best
Classic marketing strategy at its finest. "Tricking" people to think that they got a great deal.
CONVINCING them that this is the best deal they have ever made!
@@Moe_Posting_Chadthey know how to convince those fans🎉🎉🎉
I have had the same PC since 2017. It's time to upgrade, but I have been thinking about going to a console because of this... you have to spend tons of money to get good performance, and on consoles at least games are optimized
The RTX 30 series and AMD 6000 or 7000 series are great choices going forward; use an AMD CPU though, the Intel chips are trash
Input lag going to feel worse than cloud gaming lmao
Yeah, it's sad how many people don't realize that DLSS/MFG actually makes gaming WORSE in competitive games. I'm never turning DLSS/MFG/ray tracing on, so I'm not getting any real improvement in 1080p raster, which sucks because I have a 480 Hz OLED monitor and the 4090 can't hit that. The 5090 will only give a tiny improvement, but for much more money, and I can't buy the 5080 because it's not even a REAL 5080 chip, it's a 5070.
@@LordBattleSmurf what's the point of 480 Hz?
it already does with current gen
@@epsilon752 bluffing
@@epsilon752 less blur, better motion, better reaction time, more information on screen, less input lag.
What's the point of your question?
I hate FG and upscalers.
Destroying visuals and making devs lazy.
I think upscalers are pretty good/useful but I'm not sold on FG.
@@HDRcade upscalers can be very useful for extracting extra FPS at the high end when used correctly, but what's happening is devs are relying on them in place of optimisation.
This is all while introducing downsides such as visual artifacting, fuzzy/blurry visuals, and shadowing.
So what happens is games run about the same as they would have if they were optimised normally, except now they also gain a slight blurry haze and shadowing around light sources or when moving quickly.
It's not the FG or upscalers' fault; it's just that game devs use them as a lame excuse for less optimisation
@@AMDRyzen57500F I mean, I agree on a technicality but you could also make the argument it's not the devs fault for being lazy either and it's ours for not calling them out on it.
That's why I instead just go to the source, to the feature itself.
@@thomaselvidge That is some stupid logic, they aren't lazy because of us....yeah no...
You are just paying for better AI at this point lol.
It's not even better AI, it's just more AI. Nothing is better than the 4000 series; it's just more AI and new software. The actual difference is like 5 fps.
So make a cheaper variant with no AI because I'm not spending $2K for AI upscaled AI generated frames slop
@@MathiewMay Just say you don't know what you are talking about lil bro
@@AzaiaMonota for real, god, what even happened to improving native rasterization performance??
@@spoono current technology limit maybe?..
The Bloodborne soundtrack is excellent placement in this video !
This is NOT how we envisioned AI takeover to be in the 90s lol
Expectation: Robotic AI that can help you with daily tasks
Reality: Use AI to generate fake frame rates in games
LMAO
Well, they did say AI is the 5th generation of computers after microprocessors, soooooooo
You should watch their keynote. Nvidia is also investing in physical AI.
The frame rate is real, the frames are fake.
@@MathiewMay I still don't get it
don't forget fake responses from help desk
The sad part is that the 4000 series could probably run DLSS 4, but it won't get it.
not probably, definitely
it will tho
DLSS4 will be available all the way back to the 20xx series, it has been confirmed in the presentation by nVidia. We'll see how the older cards will be able to handle it though, as it may come with a performance hit.
Multi frame generation won't come to the 40 series, but the improved single frame generation is coming to all RTX GPUs
@@bundy3764 frame gen adds input lag as well. I can't imagine multi frame gen feeling any smoother.
Let’s just hope these prices stay at MSRP…
Scalpers: Hello there
Stores: Hello there
@@NikTek exactly 😭
@@MeatbagSlayerHK47 Scalpers: here boys, c'mon boys
They haven't since 3000 series (if not earlier). So no, they will not be at MSRP. 5070 at 550 dollars? It's gonna be 800+ dollars for a lot of people
How long until the RTX 40 series reaches MSRP?
thanks to all the people upgrading to RTX fakety90 i can finally afford 3090
And then we have Rashid over here still rocking a 3dFX Voodoo 3 from 1999.
Which can't run Crash Bandicoot 3
Voodoo? I loved my Voodoo.
I was playing Counter-Strike in 1999 on an ISA video card in software mode. Even OpenGL would run like ass.
SLI voodoo 2s was peak gaming at one point.
She has been a faithful petal….
28 fps to 242 fps. "Multi Frame Generation (4x Mode)". Surely it plays well too
No, it will have horrible input latency. Frame generation destroys latency when the baseline performance is bad. So this will be absolute dogshit.
Gonna be useless in any competitive game.
@@300ml_brasil they presented Reflex 2, which improves that
@@SantiagoGomezJorge It also adds a bunch of blurry noise ghosting dogshit, and gives a reason for engine devs to keep making bad unoptimized software.
"We speed up your car from 50 km/h to 250 km/h. But you will only be able to actually control the car for about 1/5 of the time".
The main issue with DLSS is that it doesn't look good. Games look fuzzy when you enable frame generation. What's the point in gaining performance when image quality is sacrificed? If I wanted to play at 720p, then I would just set it to 720p.
Well, play at 30 fps
120fps native > 300fps astigmatic blur
@@isimsoyisim4979 WHAATT????
@@isimsoyisim4979 bro you seriously need to verify what you're saying, you don't even know how wrong you are on the dlss part. Do some research before speaking.
Not true. Source: I've had my 4090 for over 2 years and prefer to use frame gen in every single place it's available. I game at triple 1440p (7680x1440, 240 Hz) and at 4K. In basically every scenario I prefer to use FG and DLSS. So you are telling me you don't use frame gen on your 4090? That's bonkers... why even buy it?
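For context on the "720p" complaint a few comments up: these are the commonly reported per-axis render-scale factors for the DLSS modes (approximate, and exact values can vary by game and DLSS version), showing what the game actually renders internally before upscaling:

```python
# Widely reported DLSS render-scale factors (per axis); treat as approximate.
DLSS_SCALE = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def internal_resolution(out_w, out_h, mode):
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALE:
    print(mode, internal_resolution(3840, 2160, mode))
# Quality ~2560x1440, Balanced ~2227x1253, Performance 1920x1080,
# Ultra Performance ~1280x720 -- so the "720p" comparison really applies to
# Ultra Performance at 4K output, not to the higher-quality modes.
```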
Why did you not include the original music that went along with the Incredibles clip? It is so good
The real frames were the friends we made along the way 😊✌️❤
Enough with that, there are no more friends and no brotherhood. Especially if your brother hates you. (Me... because I'm having a family problem right now.)
Turns out the friends you made were generated using dlss.
The real friends are the frames we made along the way, Jensen probably...
Next: the 6070 with 2 extra frames for $3000
Intel will challenge the 6k series I think, no more cakewalk for Nvidia
@@rudy1999 yknow i actually believe that
@@hengry2 Nvidia is a trillion-dollar company and Intel is in trouble financially. I don't know if there's a feasible way to do it other than breaking CUDA's hold on the sector through open standards. You'd have to drag Nvidia along for that.
Meanwhile, AMD:
Edna Mode: Do you know where the RX 9070XT is?
Elastigirl: It's in a company meeting
Edna Mode: *Do you know where it is?!?*
Finally more unoptimized games for your GPU to handle itself.