@@SD-vp5vo Bold of you to think you can get a 5070 at MSRP. Scalpers, custom coolers from other brands and nvidiots will make them go for $750 minimum.
@@SD-vp5vo You are assuming all we have to buy is the GPU. The target audience this time is users on a 2070 or below, so they might need a full platform upgrade to DDR5: CPU, motherboard, RAM, PSU, and the GPU on top. On an old platform with PCIe 3.0 and no Resizable BAR, the card won't perform well.
They could, but it would be blurry, have additional latency, and look worse than native res, with artifacts and ghosting issues... But it would shine a whole lot more thanks to the brand spanking new shiny T.U.R.D AI tech!
How many games have DLSS4 now? 0 games. How many games will have DLSS4 at launch? 2 or 3 games, maybe. How many games will have DLSS4 a year after its launch? 10 games at maximum, if you are lucky. How Nvidia sees us: 🤡💰
They're essentially reinventing video compression where you have a reference full frame and then frame deltas that just transform the original frame until you reach a full frame again
All that tech would be fine and interesting as an optional thing for full pathtracing or something. Meanwhile they want to make it *the* selling point of the new GPU gen.
@@apexxdeathgaming5791 how are keyframes and delta frames of compressed video different from framegen having rasterized frames used as a reference and then AI-generated ones in between? I'm really curious what exactly is false about that analogy.
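A minimal toy sketch of the codec analogy in the comments above, assuming numpy (hypothetical names; this is classic keyframe-plus-delta decoding, not Nvidia's actual frame-gen pipeline):

```python
# Video codecs store a full keyframe (I-frame) plus small delta frames
# (P-frames) that keep transforming it until the next full keyframe arrives.
import numpy as np

def decode_sequence(keyframe: np.ndarray, deltas: list[np.ndarray]) -> list[np.ndarray]:
    """Reconstruct a frame sequence by accumulating deltas onto the keyframe."""
    frames = [keyframe]
    current = keyframe.astype(np.int16)
    for delta in deltas:
        current = np.clip(current + delta, 0, 255)  # apply the per-pixel change
        frames.append(current.astype(np.uint8))
    return frames

# Toy 2x2 grayscale "video": one real keyframe, two cheap delta frames.
key = np.array([[100, 120], [130, 140]], dtype=np.uint8)
deltas = [np.full((2, 2), 5, dtype=np.int16), np.full((2, 2), -3, dtype=np.int16)]
print([f.tolist() for f in decode_sequence(key, deltas)])
```

The analogy only holds loosely: codec deltas are measured against the real next frame, while generated frames are predicted, but in both cases only the keyframes are "real".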
@@RupertBundem The same benchmark run with the 4090 had 20fps at raw processing power (without any AI features on). The 5090 has 27fps. Not brilliant at all
@@Linkario86 30% more MSRP for 30% more frames. Buying a high-refresh OLED, a good-quality analog keyboard, and a high-polling mouse along with a good mousepad like an Artisan probably brings you more enjoyment for the same price.
@gonozal8_962 yep. Especially considering that those 30% more frames are at really low fps. Microgains for the customer, massive gains for the company.
Brotherman, the baseline is the same. They didn't cut 3/4 of your raster performance to sell you "fake frames" instead. They are ADDING them on top of the usual performance gain over the old generation. Do some research first.
Don't forget to subscribe to the channel guys. More content like this is coming soon, stay tuned ;)
Did you forgor about the 12 GB of VRAM? It's dogshit, a "4090" with 12 GB of VRAM. And yes i am subbed
Please Nik, make a video of Taash from Dragon Age being the 5070 and identifying as a 4090 🤣
@@PedroCorreia-o9i I identify as an adolescent crocodile raised by field mice after this farce. I'll wait for 6090.
@@PedroCorreia-o9i I honestly think that switching to an AMD 9800X3D system from my 14700K will give me a much better performance uplift, since I already have the 4090.
You make too many meme videos these days
First rule of PC upgrades.......Let someone else go first
And watch the connectors burn
I sold my secondary GPU just in case ahead of the launch, but I'll be sticking with my 4090 for at least another year methinks.
Nope, I still use an i9 9900K and it runs fine in 4K. I won't upgrade my processor until GTA 6 launches on PC.
First rule of any technology
2nd rule "future proofing" doesn't exist
The 6000 series will just be a box of LSD where you sit and hallucinate the entire game in front of your TV/Monitor.
Hahahahaha, seems legit
That'd be kinda awesome if the price is fair.
me wants lol
at least the game would be optimised.
🤣🤣🤣
- Here is your brand new GPU now with 3x imaginary performance!
- Thanks here is the money.
- Wait, it's just paper.
- Just imagine it's money.
@Skeleleman that's what everyone already does.
Waittttt you actually thought that bit of paper/plastic/1s and 0s on a computer actually has real VALUE.
You've fallen for it haven't you, hard.
@@cptbluemax1234 I know that fiat is "worthless" since the gold standard was abandoned. But you can still exchange it for other goods that have "real value"; you can't really do that with an A4 piece of paper.
As long as this stays true, fiat has value.
"16X the detail" type beat
Best😂
Bro you are genius 😎❤
Meanwhile, game developers: Oh, so that means we don't have to optimize our games anymore! Just generate 75% of the frames and resolution out of thin air! No one will be able to tell!
That's the main point of using AI: instead of depending on software engineers to optimize, save the money and let the AI handle it. Nvidia wins by selling more cards, publishers save more, and gamers pay more.
For that reason Nvidia has launched the Nvidia RTX Kit, to help developers better optimize work on their GPUs.
@@microvision9 You know AMD has AI technologies too, right?
Never heard of FSR and AFMF?
Thank god, they certainly don't know how to do it themselves.
BINGO
"CEO math" "The more you buy the more you save" Jensen you are not a clown, you're the whole circus right now
Just like apple math... 999$ for a macbook air.... with base storage and 8gb ram😂😂😂
@@Boredashell666 Now you can't get 8 GB of RAM, only 16, but they just rolled the price of that upgrade into the MacBook?
I actually thought NikTek had smoothly edited that line in... wow.
@@CIinbox There is no way a human being stood on a stage with the entire tech world watching and actually said those words. That has to be an edit.
@@Boredashell666 Well, at least they put "Air" on the product, telling you what it is: like selling you air for 999 dollars XD
200 "fps" with 30 fps input lag...oh man
20 fps input lag
why do you think it will be laggy? nvidia isnt going to release something that barely works, they aren't AMD
@@bigturkey1 my brother in christ stop riding nvidia
@@bigturkey1Just never forget that corporations with shareholders are *always* capable of fucking with both themselves and their customers. The product is no exception.
@@bigturkey1 Input lag is tied to your real frames; fake frames do not make it lower. If your game runs at 20fps you can do an instant 180 flick and see the delay with your own eyes, and for anyone who has developed muscle memory through competitive FPS games this isn't just an issue, it literally makes me feel ill. Input lag makes my inner monologue scream in agony; it feels like being wheelchair-bound.
The funniest part is that as soon as he started talking about AI he stuttered; the audience reaction must have been great.
He has Unreal engine syndrome
He didn't have DLSS turned on
He needs dlss to fill in the gaps
He was thinking of his new leather jacket
Jensen stuttered because he was missing VRAM xD
At this point just Nvidia DLSS Frame Gen will use *Epitaph* to see 10 seconds into the future
Wallet: Korega Requiem Da
Omg 😂 smartest use of jojo jokes by u lot
and in that future Moore's Law Requiem will punch in its face shouting 'MUDA!'
smart jojokes
Golden wallet requiem: You will never reach FPS
0:55 don't tell me it's unedited
It's not..
They didn't even have the balls to activate RT on the left side.
I will, what you gonna do about it?
@@osama9501 So it's edited?
@@yum5372 think about it
RTX 6070 will have 4 GB VRAM and raw performance of a GTX 260 but with the power of AI you will be able to imagine double the performance of a 5090
RTX 5070 looks like it'll have the raw performance of RTX 4080.
With the power of AI you'll be able to double the performance (not texture) of a 4090 that is also using the same AI, just an outdated model.
Do some research before spitting such oompa loompa takes.
@@hiimshana Chill kid. The 50 series are pretty shitty when it comes to raw performance, just like you in school.
It's a joke lmao @@hiimshana
@@hiimshana You might want to look at those 5070 specs again. Raw performance of a 4080... What a joke. Bro, it won't even have the raw performance of a 4070 super. Literally worse rasterization specs than a previous generation's refresh.
@olebrumme6356 Objection! Technically the Xbox had better raw performance than the PS, but somehow the PS5 consistently proves to be better, both in resolution and frames.
This is downloadable memory all over again.
You made my day !
It's not, though.
"Fake frames" are generated by interpolating existing frames with an AI model trained on the games themselves, by the devs themselves.
It's not a way to sell us fake performance; it's a way to give us the usual generational gap we're used to (10-15%) while improving a technology more and more over the years to reach even higher ceilings.
@@hiimshana Interpolation is basically synthetic generation of frames, and those synthetically generated frames are used here to fill the gaps between real frames. That is by definition fake frames.
@@hiimshana So, fake frames. I don't care who's faking them to sell me something; they're fake frames.
The problem is they are dead frames as well! You can't interact with 3 out of 4 frames, so you're basically watching a movie 😂
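For what "interpolating" means in the simplest case, here is a minimal sketch assuming numpy (a plain linear blend; real frame gen uses motion vectors and a trained model, but the principle is the same: in-between frames are synthesized from frames that already exist):

```python
import numpy as np

def interpolate(frame_a: np.ndarray, frame_b: np.ndarray, t: float) -> np.ndarray:
    """Blend two *already rendered* frames; t=0 gives frame_a, t=1 gives frame_b."""
    return ((1.0 - t) * frame_a + t * frame_b).astype(np.uint8)

def multi_frame_gen(frame_a: np.ndarray, frame_b: np.ndarray, n_generated: int) -> list[np.ndarray]:
    """Insert n synthetic frames between two real ones (4x MFG would be n=3)."""
    steps = [i / (n_generated + 1) for i in range(1, n_generated + 1)]
    return [interpolate(frame_a, frame_b, t) for t in steps]
```

The sketch makes the "dead frames" point obvious: none of the generated frames can react to input, because they are derived entirely from two frames the game has already produced.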
Damn, even electricity is AI now.
AI is the future after all
AMD literally said it at their CES event, "AI is the new Electricity"
AI requires a ton of electricity and water so it ain't looking good..
@@NikTek well, it is using all of it
Hahahaha
@@AMDFan-s1y Cook the earth for fake frames. Poggers
Can't wait for the 6000 series where 100% of the frames will be fake.
If all the frames are fake, none of them are
Some guy
Imo I don't care if someone calls them fake or not. If it looks good with solid performance, I'm okay.
They are working on AI making games in real time, so that might actually exist.
😂😂
What do you mean "fake"? These are frames generated by AI, an addition meant to help you get extra frames, so you either use it or lose it. After all it costs $550, which is a mind-blowing price from a company like Nvidia. I also bet that raw performance is probably only 15% lower than the 4090, which is not bad at all tbh.
Before, you bought the hardware and the technologies came as a free gift; today you buy the technologies and the hardware comes as a free gift.
"In the future, entertainment will be randomly generated"
Indeed we just didn't fully understand the meaning at the time.
As much as you don't fully understand how AI works in GPUs.
@@hiimshana As much as you don't fully understand what that sentence might mean
Goat reference
@@hiimshana You probably saw a couple of shorts on AI and now think of yourself as some AI expert
@@hiimshana OK, stop being an Nvidia shill. You can like a company and buy their products without defending everything they do, even when it's bad, like this over-reliance on AI and fake frames. You can't interact with 75% of the frames under multi frame generation since they are just made up by AI, so it will feel terrible, and DLSS FG artifacts will only get more noticeable when literally more of your frames are AI-generated than actually rendered by your GPU.
Dude is literally a snake oil merchant.
that sells working products that do in fact provide benefits over their predecessors
@@apexxdeathgaming5791 But not enough benefit to justify the humongous price. Which he lies about, even more shamelessly than Temu.
Explains the snake skin jacket
@@apexxdeathgaming5791 too little improvements for too much money.
"Literally"? So Nvidia "literally" had it's own brand of snake oil in the super market? Do you "literally" even think before you type.
in 10 years, the RTX 9090 will actually be able to run Cyberpunk 2077 on Ultra with RT at native 4k at native 60fps! It will have a TDP of 5000W and cost $5000.
I'd rather secure a down payment for a house than buy that bulls**t
Lmao sounds about right
50000*
Impossible without 3 extra fake frames.
Impossible without AI generated fake frames.
Impossible without my shiny new jacket.
Impossible without all this jargon and obfuscation of facts.
If I'm buying fake frames, I will pay with monopoly money
$100 real money plus $1500 in AI dollars (Monopoly money) to buy a 5090
It's okay if you're broke, lil bro
@@99mage99 keep consuming, buddy.
but monopoly money is real monopoly money
@@B.M.Skyforest Keep staying in the past 😂😂
imagine if game developers optimised their games like they did back in the early days?
a quote that is stuck in my head is "We landed men on the moon and brought them back with the same computing power that is in 'Record Your Own' birthday cards. Developers used to be confined by their hardware, now the hardware is being confined by poor developers"
We didn't reinvent the circus. We repackaged it in a much more modern way.
- NVIDIA CEO Jensen Huang
I'm so close to hating the word "modern", especially when incompetent people use it as an excuse.
Low-Huanging fruit….
The more YOU BUY, The more I SAVE
CEO - Jensen Huang
The more You Buy, The more I make
- Jensen 😂
@@NikTek False. Nvidia throttles output because they like spiking prices, even if it means selling less.
@@NikTek LOL
you save his lifestyle
Stupid people will buy this anyway
no way they are actually asking TWO GRAND for the 5090
First rule of upgrading your PC. Don't ever listen to NVIDIAs marketing.
Don't listen to anyone's marketing, I would personally say.
CEO math is not accurate but it is correct...
wtf does that even mean,
"The more you buy the more you save"
that is some terryology type shit.
Enterprise bulk buy
You waste 100% of the money you dont spend
Buddy this is the most brain dead quote in history of humanity. What else did you expect from nvidia
let me help you. For example you have $1000 real dollars in your bank account. You give them to me and now you have $0. But if you enabled DLSS4(tm) with MFG(R) - you will now get $2000 imaginary dollars back into your bank account. In your imagination you now became twice as rich! The more you buy, the more you save!
That wasnt meant for gamers
The more you buy, the more shiny my jacket is.
😂😂
Mfer gonna replace the sun soon
Underrated joke 😆🤌
Im Dying pls help!!!!! 🤣🤣🤣🤣🤣🤣
And it was very shiny indeed that night. ✨
Could I have some AI sugar with that AI butter to make my AI cake?
Let's not forget the AI spoons to eat that AI cake, or maybe you're more of an AI fork user when eating AI cake.
Anyway, AI thank you... AI AI AI
Bet you enjoy playing Cyberpunk at 60fps. The difference is that imagining a fake cake gets you nothing, yet this tech gives you a tangible performance increase in a game.
Next thing you know, they charge a subscription to unlock the AI part of the gpu
Do not give them ideas, please!
You can get a Battle Pass for your graphics cards each season now also.
Well they somewhat did that with enterprise GPUS...
Please remove this comment, as I’m afraid they might take you seriously and start charging us for that option. 😂😅
It would be funny, if it wasn't tragic
1:20 If I do not buy then I save even more 😂😂😂
Yeeeeeeeeee😊
😂😂
The more you buy the more the CEO saves in his bank account!!!
@@Rafael57YT that's why it's called CEO maths!
100IQ
Gamers are now rising up against fake frames.
the AI will replace the gamers
It already did with the people on Facebook or whatever... Also happened to YouTube; how will we verify you are a human now?
And I thought buying a 4080 a couple of months ago was a mistake...
Best mistake I ever made
Actually the 5070 Ti will be better than a 4080S. It has 1,300 fewer CUDA cores, but the 4080S's 14% core advantage will be overtaken by the slightly better architecture and significantly higher memory bandwidth on the 5070 Ti, plus 4 more GB of VRAM than the standard 4080 you have, and for a bit less money than the 4080/S are currently selling for.
Don't get me wrong, I'm all for shitting on them for the new cards, but they are an improvement.
@@beckriv9854 What are you yapping about? The base model of 4080 has 16 GB of VRAM just like 4080S and 5070 Ti will have. There will be improvements obviously with the new architecture and memory but most of the improvements are from the MFG. Also let’s just wait for the benchmarks.
@Simon_Denmark The above is a direct quote from a friend of mine who is an advisor to the people who are responsible for the AI implementation in companies. He is the person who advises the supposed experts, he also has an aerospace engineering degree. That's what he's yapping about, and when it comes to computers he knows more than you. Sit down precious.
@@beckriv9854 they both have 16 gb lmao
@@beckriv9854 My dad works at Nintendo...
Can't wait for devs to mention their game runs at 60fps... but only when you enable the 10x frame gen
If that allows for greater worlds, more complex graphics and vectors and higher resolution, then I can't wait for it either :)
Basically 10 fps, math checks out. 😡
@@hiimshana and for what? for the generic 3rd person shoulder perspective with Unreal Slop 5 number 245? no thanks.
@@hiimshana What's the point of having all of that just for your game to be plagued with ghosting, artifacts and input latency?
The more complex your graphics get, the more noticeable these issues get. AI tools like frame gen were meant to be used on top of an already stable frame rate; now devs use them to brute-force past optimization, which in turn won't make your graphics look substantially better. A sad trend these days.
@@hiimshana We are NOT getting any of that. These technologies encourage developers to half-ass optimization and just rely on players using frame gen and DLSS to claw a smidge of those frames back. It's also very delusional to say these cards will bring better graphics and resolutions when DLSS murders visual quality and frame gen introduces unbearable input lag (Reflex barely reduces it). Games with DLSS these days look worse than previous gen because of all the above, and the simple fact that no one uses native resolution anymore.
for a videogame? what about production and traditional rendering?
Finally someone said it :D
"The more you buy the more you save"
Guys he said it multiple times, this means it's more true
Am I the only one that doesn't know WTF they are trying to say? Lol
the second time he said it was ai generated just like the frames
@@mattcapons Honestly though for this I don't even think the scalpers agree with him
@@ElderSnake90 Probably implying crypto mining or AI work, if I have to guess
Something no one ever seems to talk about is that companies like Nvidia have a whole team of neuroscientists, behavioural neurobiologists, different kinds of psychologists etc., all working within the PR teams to make sure profit is maximised.
Remember kids, when a big company like that seems to behave ignorant, incompetent or foolish, it's most often a very well crafted ploy, sometimes with decade-long strategies in mind. We are taught that companies these days only look for short-term profit to please their restless, power/money-hungry shareholders, and that they don't strategise much beyond a yearly profit. Believe you me, all of the biggest tech companies have multiple strategies in their books that reach multiple decades out.
ALWAYS keep in mind that companies are simply entities, often run by less/non-empathetic people, that try to please often worse people, by simply making money. Whatever they might project and make you think, they are not your ally, let alone your friend. Behind every single behaviour, manipulating you and extracting profit from you is the end goal.
they're basically just selling slightly updated hardware (if you can even call it that) that can barely beat the previous generation + AI frames
In other words, they didn’t upgrade shit, they added an AI software that’s programmed to feed you a visual load of shit.
Welcome to the limit of Silicon. Are you fanboys starting to understand now? AMD will suffer from the same evil.
@@Syntex366 So basically, they updated the graphics cores to "support" the new AI model, when they could have easily just released another card, an AI-only card, so DLSS would be 4 times better while not being fake frames…
One thing that is wrong about the 50-series is that they sell it as a gaming GPU like any other standard gaming GPU. The 50-series should have its own line that clearly distinguishes it as an AI-oriented gaming graphics processor, rather than clumping it together with their primary line of gaming GPUs and sharing the same naming scheme (of course, no gamer is stupid enough not to realize the 50-series is AI-oriented even if it's not said so).
I guess the big issue with the 50-series is that these are not actually rendered frames, and that's only acceptable if the generated frames are indistinguishable from frames that are actually rendered/calculated, and if they don't hurt gaming aspects like input latency. We have no idea how much frame generation at the scale they have advertised will affect actual gaming scenarios.
If you understand tech you know why this is necessary: traditional raster performance isn't scaling like it used to, so it's either 10-20% gains only, or using AI to maintain the kind of generational jumps people want.
Crazy 1+1=11 math logic coming from a tech guy
"RTX 7090, at this point you just had to dream the game yourself, well sit back"
7090? By the 6090 it will be a case where all your frames are rendered at low res and AI-upscaled, or something lol.
Check out “Oasis Minecraft AI”, a “game” generated from training AI on Minecraft videos
Invents 7 frames each time, but knowing Nvidia they would set the memory bus to 128 bits with 6 GB VRAM on the 7060 models.
16 times the fake frames
So this is basically an AI card, not a graphics card.
Not even… the batch sizes are still terrible because of limited VRAM. These cards let you train AIs faster, but the AIs will still also be shittier (AI slop continues). The only card that is even close to an upgrade is the 5090 with 32 GB VRAM… but at that price point most personal projects are better served by AWS, and serious AI companies probably use V100s/A100s with dedicated AI hardware anyway. The 5xxx series is a jack of all trades but master of none.
@@rytraccount4553 A jack of all trades but a master of none is still better than a master of one.
@rytraccount4553 100% this. The “you will own nothing and be happy” strategy continues, only it’s smart now. Instead of coaxing and/or forcing you into a subscription lifestyle, companies just refuse to actually sell you anything worth spending your hard-earned money on. Want an actual upgrade over existing tech? Get hooked on a subscription to access it.
@@profaa927 That's the original saying, but the meaning has changed. If you think that 12 GB of VRAM is comparable to a 4090 then you've probably never tried to train big models…
It is a 3000 series card with AI hardware unlocked. But because this graphics card would cannibalize their enterprise cards that cost 3x as much, they limit the VRAM.
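Rough numbers behind the VRAM complaints in this thread, assuming the common rule of thumb that fp32 training with Adam costs about 16 bytes per parameter (4 B weights + 4 B gradients + 8 B optimizer state), before activations:

```python
# Back-of-the-envelope VRAM estimate for training; activations come on top.
def training_vram_gb(params_billions: float, bytes_per_param: int = 16) -> float:
    return params_billions * 1e9 * bytes_per_param / 1024**3

for size in (0.5, 1, 3):
    print(f"{size}B params -> ~{training_vram_gb(size):.0f} GB for weights/grads/optimizer")
# ~1B params -> ~15 GB: already past a 12 GB card before a single
# activation or batch element is stored, which is why batch sizes suffer.
```

Mixed precision and memory-efficient optimizers lower the per-parameter cost, but the point stands: VRAM, not compute, is usually the first wall when training on consumer cards.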
This is bad. It's just software upgrades at this point, not a raw power improvement, not to mention 12 GB of VRAM on 1440p cards. Hell, even at 1080p maxed out, some games require 12 GB at this point. These cards are in no way future-proof for even 2 years. Nvidia's biggest scam. Customers need to pass on this gen tbh.
These AI-generated fake frames won't even exist in my 3D rendering workflows
WTF
Yep. Try explaining that to nvidia fanboys, lol. It's only mostly good for games and that's it...
Hmmm, same for me, also a 3D artist
@@majus1334 Oh really, a card made for gaming, unlike the A4000, is mostly good for games. Unbelievable, it's almost as if they know different tasks require different solutions.
It's still a 40% increase from 4090 to 5090. Wtf do you want, it gets harder to make these things smaller as long as we stay on the silicon road.
@@giedrius2149 I have a question about this card related to Valorant: is it good? Cuz I heard that AI adds latency and delay, and I don't want any input delay or latency in the game because I want to compete. I'm not there for the graphics; I'm 1 rank away from the top rank...
@@giedrius2149 I'm not sure if I get what you're trying to say?
You mean that it doesn't make sense to expect an RTX gpu to do anything well, besides gaming?
Also, wasn't it the 3k series that started this hardware "gimping" and trying to compensate with software (DLSS)?
the launch got destroyed so fast I didn't even have time to cash in the little stock boost it got at first
The fall had nothing to do with the event or the launch; the job openings report came out higher than expected and the markets panicked. If you think the launch was good, then you just got another opportunity to buy cheaper before they start shipping the products; if you think it sucked, then you should have sold right after the event.
@@nisimabekasis6482 "the markets panicked" Meanwhile the markets: Biggest bubble in the history of the stock market, and bubble keeps growing
@@nisimabekasis6482 Stock traders and shareholders. You're the real problem of this society.
I'll pay 1/4th of the price and tell the retailer that I'm paying for the actual raw performance, he can pretend the rest of the money is generated by AI.
How does he consistently fuck up on stage each year lmao
its what money does to a person... they start acting all fcked up and sell their souls.
He's an old man now
It's proven that lying is harder than telling the truth, because you need to really think about what you're saying. He was lying almost as well as Trump on a day ending with "y".
@@TheNefastor you can stop shilling for the Democrats now, Kamala is 10 million dollars in debt, you aren't getting your money.
Idk, ask AMD
I love how we went from games that had too much bloom, to games that had too much brown, to games that combine the two and also apply a vaseline filter as a guise to increase FPS, because game developers have no idea how to optimize their software and hardware companies are pushing nothingburger solutions that look beyond awful. And they're charging you the price of two computers to do it.
it will end the same way too, no one buys that shit like no one bought the 4000 series and the industry moves on to something else stupid
it goes deep.
GPUs are the last thing you should put AI in
There's no way this advertising is legal
It's technically not false advertising.
i can tell you why it is
big corpo
big money
In america almost any bullshit is legal...
Who's gonna stop them?
why wouldn't it be exactly?
Turns out the 1080 Ti was the best card of all time, and it looks like we will never get anything better.
I have a 4080 Super, got it at $850 last month. With DLSS and frame gen turned off it's not a bad GPU, but the performance gain compared to the 1080 Ti is not even double. We are living in dark times in every respect; only a market collapse will save us now. These companies lost their mind, soul, everything. It's all about greed and scamming now.
@@GFE. That's insane if it's not even double
I still have it... it still runs almost everything, crazy. I will never sell this card.
RX 5700 XT is better than 1080 Ti.
@@GFE. Have you ever considered that the room for raw performance gains was way bigger back when the 1080 Ti launched? Now any mid-range card beats the 1080 Ti; that's how far this market has come.
Tbh I don't think it's Nvidia's fault. Technically they have done their job, even though it's frame gen. The graph shown at 0:41 signifies poor game development and a lack of work on optimization. I am surprised that even though many people ask for optimization, no interviews have been done discussing how to use fewer resources while keeping the best game visuals.
That's my take. They can only squeeze so much out of every generation, but game developers continue to release unoptimized games. And it isn't just Nvidia; AMD and Intel have been working on the exact same technologies.
@@dregothic Well in that case, playing old games would be more enjoyable. I mean, if newer games require more resources, then instead of chasing the latest graphics we can choose to play games we've never played. I've been doing this for more than a year and I don't get any FOMO attacks. I guess if everyone did it, they'd have to address it. Either way, modders can finish the undone job.
@@stupiocity245 If only people would. It seems some are, but a lot of games still sell well enough. To top it off, UE5 has features that let developers do things without really trying. UE5 games tend to run worse on my system because game devs just add a feature without really optimizing it.
@@stupiocity245 Then modders deserve the money, not the "AAA" devs. Damn, I'm glad that I play both on PC and Switch.
@@SimoneBellomonte True. Especially the mods that optimize the game.
When you measure yourself while wearing heels.
Against a person who also wears "heels".
ya heel...
More like while on a helicopter
If it's impossible without AI then it's not worth more than $500
For that you can get a used RX 6800 XT, which performs similarly to an RX 7700 XT
@@AMDFan-s1y I was thinking about getting an AMD card when I replace my 2060
@@AMDFan-s1y I have an RX 6700 XT, and let me tell you, the RX 6800 XT is nowhere near the RTX 4090. The RTX 4090 even humiliates the shameful RX 7900 XTX 😆
@GamingIntelCoreM-y 25% more fps at double the price.
It beats the RX 7900 XTX but you also pay more for it; the big difference is the DLSS suite.
As much as we don't like it, having the ability to hit the frame rate you actually want while losing almost nothing in quality is sweet.
Still using a 570, no need to upgrade for at least 2-3 years.
For what started as a funny meme, you now have the most biting and accurate commentary on the GPU situation in the world. I love it.
Does anyone remember the times when gamers hated any dev whose game had blurry textures, even if it ran at 60+ FPS? And now we've come to a time when you can't even play your game at 30FPS without it being blurry (fake frames generated by AI + DLSS, FSR, XeSS). Nice.
Ironic, I may say.
Bro what game on earth right now do you need frame gen to hit 30 fps on with modern cards
@@turkey4041 Unreal Engine 5 games. They don't optimize that shit. They just try to make everything realistic and then upscale it.
That is not CEO Math that's CEO Meth
Last time someone lied about something on stage, they went 16 times the detail into how it would blow their socks off.
I feel like there should have been a HUGE asterisk
"AI LEATHER JACKET" 😭😭😭
Every company slaps AI in everything they make nowadays
For real
At this point.. is it even leather, bro??
🙏@@StoicWasTaken
AI leather jacket had me 💀 😂
I find it hilarious reading people say "maybe i'll wait another year before upgrading my 4090..."
Like, high-FPS bros, you don't need to upgrade all the time, spending 4 times the actual worth of the graphics card. 4090 still good for a while yet.
I'll bet most of the high-FPS bros couldn't tell the difference but convince themselves they do. Like feeding bacon to pigs at that point, but whatever lmao
Game optimization is huge. Some old games still have performance issues on 4090, because of bad optimization. Sometimes only mods fix that.
@@Wft-bu5zcoptimization is almost like a lost technology at this point it feels like
input lag will be 1 hour now
idk if this is a joke, y'all just seem serious and dumb
I mean, it will be an extra >33ms at 30 real frames,
>66ms at 15 real frames, etc.
And it's not just the regular delay, because the GPU will always have rendered one entire frame it won't show you yet (it needs that frame to generate the ones in between).
So 30 real frames without AI have ~33 ms of delay.
And 30 real frames padded to a total of 120 frames have ~66 ms of delay.
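The arithmetic in that comment, written out, assuming interpolation-based frame gen has to buffer one fully rendered frame before it can fill the gaps:

```python
# Extra display latency from holding back one real frame: one real frame-time.
def frame_time_ms(real_fps: float) -> float:
    return 1000.0 / real_fps

for fps in (30, 15):
    base = frame_time_ms(fps)              # delay a frame already takes to appear
    buffered = base + frame_time_ms(fps)   # plus one buffered real frame
    print(f"{fps} real fps: ~{base:.0f} ms native -> ~{buffered:.0f} ms with frame gen")
# 30 real fps: latency roughly doubles (~33 ms -> ~67 ms), in line with the
# comment above. Generated frames raise the displayed fps but never shorten
# this input-to-photon delay.
```

This is a simplified model (it ignores the render queue, Reflex, and display scan-out), but it shows why latency tracks the real frame rate, not the displayed one.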
@2NDBEATS Even DLSS 3 adds latency and delay; are you expecting 4x frame generation not to add delay?
You’ll see lots of nice things while you’re waiting though.
"AI Leather Jacket" 😂 1:00
What does that even mean
Honestly I want an AI jacket 😩 😫
@@FXV56 "the more money they made, the more shinier Jensen jacket"
I tried using frame-gen on 4000 series and it's literally crap. The input delay is too much.
The clicks arrive in 2-3 business days
"In 5070 you will get smoothest lag"
thank you envidia for halting the progress on graphics cards
It's so cringe how they gatekeep their new DLSS/MFG. It's "exclusive" to the 50 series because if they gave the 40 series support, there would be no reason to buy the 50 series. They did the same thing with the 40 series and FG/DLSS 3 when it launched. You could have shelled out for the 3090 Ti half a year earlier, only to find out that the 4090 is cheaper (still super expensive) and has frame gen support.
Their software exclusivity for each new gen is their current biggest upsell tactic. The 60 series will have multi-multi-frame gen, the 50 series will only have multi, the 40 will only have frame gen, and the 30 series will have none whatsoever.
Ah yes, Nvidia first creates the problem with their RT-only approach tanking performance outside of their own proprietary architecture and is then selling you the solution in the form of more upscaling and frame-gen that specifically works on the issues they themselves created, on their own platform, in games with their own backing. CDPR and Nvidia running the long con, jeez
What problem? Rtx is a problem? You dont have to use it
You fanboys must be crazy
Wait, I need to get my aluminum hat. 😂
That's the stupidest theory I've read today.
You should review how RT works: if it's done purely in software it's less effective. Software Lumen is an example of that. In hardware, RT is simply better, as the RT cores reduce the burden on the GPU cores.
@@Rysix19 Exactly. Nvidia developed an insanely efficient hardware pipeline to calculate ray tracing in real time. You've all got to remember this was unthinkable a few years ago: rendering even a few seconds of ray-traced footage took minutes if not hours back in the day. Of course they are not samaritans and don't just hand out tech for free lol.
You can see the performance of the 50xx RTX on the Nvidia website. Only Far Cry 6 doesn't have DLSS, so if I understand correctly it is the only game that shows an actual performance increase compared to 40xx? About 20%?
Correct, and the comparison is with the older non super cards, which are already 10% faster for the 70ti and 80 and 20% faster for the 70. So there's really not much of an improvement, especially on the 70 series.
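A quick illustration of why that comparison matters, with example numbers only (not benchmarks): a gain measured against the old base card shrinks once you compare against the refresh people can actually buy today.

```python
# Relative uplift of a new card vs the refresh, when both are measured
# against the same base card. Example figures, not measured data.
def uplift_vs_refresh(gain_vs_base: float, refresh_gain_vs_base: float) -> float:
    return (1 + gain_vs_base) / (1 + refresh_gain_vs_base) - 1

print(f"{uplift_vs_refresh(0.20, 0.10):+.0%}")  # +20% vs base, refresh +10% -> ~+9%
print(f"{uplift_vs_refresh(0.20, 0.20):+.0%}")  # +20% vs base, refresh +20% -> +0%
```

So a "+20% vs the 4070" bar can mean almost nothing over a 4070 Super, which is the point the comment above is making.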
@@DinoTrollerino Yo, I'm still rocking my 1070 Ti and wanted to upgrade. Now for ME personally, ion gaf about frames being fake or not, as long as they are there. My GPU and frames are not my girlfriend, so ion care if they fake or not 💀. So purely performance-wise, would I be able to play all new games at 1440p and 60+ and maybe up to 144 fps (with the $500 5070)? I mostly play FPS games and want as much fps as possible in them, but sometimes I'm playing God of War, Elden Ring or something. Ion care about 4K, I just want it to run smooth. Will it do its job?
@@stocc2g Used to have the 1070 Ti myself. Upgraded a few months ago to a 4070 Super and an OLED 1440p monitor. The 5070 should perform similarly, so you should get 100+ fps on max settings in most games at native resolution; of course, if you start turning on RT you won't hit that.
@@DinoTrollerino All right thanks a lot sir
@@stocc2g are you black?
CEO math makes the CEO richer
When the goal is frames at any cost this is what you end up with. It's like the megapixel wars in cameras with fake bumps in resolution.
fr, when will people learn, eventually fps count will just be unimportant.
It already is unimportant. FPS doesn't mean anything anymore. You don't need 500 FPS for Minecraft.
@@AMDFan-s1y For casual players, and for a good gameplay feel, 144Hz is the spot. For pro players I don't think anything past 360Hz is possible to feel, for human eyes at least.
That's for FPS games. For really casual games like God of War and the like, where you don't compete with anyone, I think 60fps is a really good and comfy spot.
@@TioBru. You don't need more than 80 fps for esports. If you think you do, first stop playing and throw away that setup bought with your parents' money.
@@TioBru. The mass market literally doesn't even care if games reach 60 fps. Everything beyond 60 fps is mostly bullshit because of fake frames, like we're seeing with Nvidia. It's the motion smoothing we used to see on TVs all over again.
2025: lazy game devs with fake-frame GPUs.
Oh and we'll have to sell our kidneys to buy one, plus an eye too.
How would that work with online games?
Just like having 66% packet loss and pretending you've won the match to keep your sanity?
Does it generate my history and stats too? 😮
Can't wait to tell my mom to buy me infinite 5090s by saying "The more you buy, the more you save"...
AI is so cringe, you pay for fake frames instead of raw power...
It's the same thing they did with the 40 series, which was still 100% better than the 30 series. Do some research and cry some more lol.
Lol, go buy AMD then. Oh wait, they're also balls deep in AI now, so much so that they didn't even announce their cards lol.
But if you can pay 600 bucks and get "fake" 4090 performance, BUT the graphics don't look worse with DLSS, then I don't see any problem with it? :(( I seem to not understand why people are so upset about it, besides the bad marketing and Nvidia not being clear that the raw power isn't better but the AI power is WAY better? :((
@@apexxdeathgaming5791 Do some research before showing how empty your brain is. They did the same thing as with the 40 series: barely 20% more performance than last gen.
You pay for raw power and a technology that is ALSO able to generate more frames, for people who are into that.
Try running Cyberpunk at 4K all maxed with RT on, let's see how pleasant the framerate will be without DLSS and FrameGen.
AI leather jacket got me 😂
What's even more sad is that Lossless Scaling has been able to do 4x frame gen for a while now. And I'm pretty sure it works on any card.
It does, though it lacks in many ways: lots of artifacts, especially at the edges of the screen. Still, for a cheap, pure-software solution, it's incredible.
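For the curious, here is the crudest possible version of software frame interpolation, just to show where the ghosting and edge artifacts come from (Lossless Scaling's actual algorithm isn't public; this 50/50 blend is purely illustrative):

```python
import numpy as np

def interpolate_midframe(prev_frame: np.ndarray, next_frame: np.ndarray) -> np.ndarray:
    """Generate the in-between frame as a naive 50/50 blend of two real frames."""
    mix = (prev_frame.astype(np.float32) + next_frame.astype(np.float32)) / 2.0
    return mix.astype(np.uint8)

# Two 4x4 grayscale "frames": a bright column moves right by two pixels.
prev = np.zeros((4, 4), dtype=np.uint8); prev[:, 0] = 255
nxt = np.zeros((4, 4), dtype=np.uint8);  nxt[:, 2] = 255

print(interpolate_midframe(prev, nxt))
# Both columns show up at half brightness: that's ghosting. And anything
# entering from the screen edge has no match in the previous frame at all,
# which is one reason the edges artifact the worst.
```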
"If something sounds too good to be true, it probably is"
The only good thing about new-gen GPUs is that they drive down the prices of the older series. I'm definitely gonna snag a 4060 as an upgrade to my 1070 soon: only $300, and its price is only going to drop further with the release of the 5000 series. Plus it's still a solid GPU even without DLSS or fake frames.
I feel like with GPU performance I've been holding my breath since the 20xx series. I'm still waiting on the next 'no strings attached' leap like the one from the 900 series to the 1080 Ti, but it seems like those days are over. From now on, every single GPU released will be labelled 50% better or 100% better or 69% better, but with the world's largest asterisk* next to it, and you can bet your cheeks that asterisk means
*running at 150 ms system latency with a blurry af image and tons of ghosting
Intel is our only hope
The 1080 ti was a mistake by Nvidia. AMD said they had a big thing coming and Nvidia overshot under pressure.
@@xenird And then what? Same story, just like with Intel CPUs before Ryzen. Our whole industry is a shitshow, because the already shitty capitalist market system is collapsing from the many factors undermining it, greedy elitists being the primary factor.
The 30 series was amazing, let's be honest. It's just that DLSS should stick to upscaling instead of trying to quantum-shit frames in from the future.
I'll need AI to calculate my debt after buying one of these.
Buying a 5090 so its AI can calculate how long I have until the electricity and water get shut off 🗿🗿
you don't have 500 bucks?
@@SD-vp5vo Bold of you to think you can get a 5070 at MSRP. Scalpers, custom coolers from other brands and nvidiots will make them go for $750 minimum.
And depending on your model, the AI may or may not even be truthful. 😅
@@SD-vp5vo You are assuming we just have to purchase a GPU. The target audience this time is users on a 2070 or below, so those users might have to do a full platform upgrade to DDR5: CPU, motherboard, RAM, PSU, and the GPU on top. Otherwise, on an old platform with PCIe 3.0 and no Resizable BAR, the card won't perform well.
I'm not understanding: what's the issue with "fake" frames? What does that mean?
Jensen should just save his money by using AI Jacket.
They could, but it would be blurry, have additional latency, and look worse than native res, with artifacts and ghosting issues... But it would shine a whole lot more thanks to the brand-spanking-new shiny T.U.R.D AI tech!
CEO meth. The more you buy, the more I save.
How many games have DLSS 4 now? 0 games.
How many games will have DLSS 4 at launch? 2 or 3 games, maybe.
How many games will have DLSS 4 a year after its launch? 10 games at maximum, if you are lucky.
How Nvidia sees us: 🤡💰
It's 75 games at launch.
You are waaay off but whatever.
They're essentially reinventing video compression, where you have a reference full frame and then frame deltas that transform the original frame until you reach a full frame again.
this is false, do research and try again
@@apexxdeathgaming5791 "ERM UR WRONG TRY AGAIN *BUDDY* " -🤡
All that tech would be fine and interesting as an optional thing for full path tracing or something. Meanwhile they want to make it *the* selling point of the new GPU gen.
@@apexxdeathgaming5791 "ERM UR WRONG TRY AGAIN *BUDDY* " - 🤡
@@apexxdeathgaming5791 How are keyframes and delta frames in compressed video different from frame gen using rasterized frames as references and then AI-generated ones in between? I'm really curious what exactly is false about that analogy.
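To make the analogy concrete, here's a tiny sketch of keyframe-plus-delta reconstruction (illustrative only; neither video codecs nor DLSS frame gen literally work this simply):

```python
import numpy as np

# One full "I-frame", then per-frame deltas that transform it until the
# next keyframe resets the chain, just like the reference-frame analogy.
keyframe = np.array([[10, 20], [30, 40]], dtype=np.int16)
deltas = [np.array([[1, 0], [0, 1]], dtype=np.int16),
          np.array([[0, 2], [2, 0]], dtype=np.int16)]

frames = [keyframe]
for d in deltas:
    frames.append(frames[-1] + d)  # each frame is derived, not stored in full

for i, f in enumerate(frames):
    print(f"frame {i}:\n{f}")
```

Frame gen flips the direction (it synthesizes the in-betweens instead of decoding them), but the reference-plus-derived-frames structure is the same idea.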
Ghost frame generator
Guys, the 5070 is not more powerful than the 4090. It's DLSS 4 that's giving it more frames; they're not lying to anyone.
The fact he linked a 4090 on sale for 3.3K gbp is hilarious
Oh wait that's on purpose 😂
it's 4.1k euro :(
He knows that not everyone is a dumbass and will buy into his shit
The 5090 is 575 watts using 12VHPWR. How long before the fires start? Maybe someone will actually die this time.
100% sure many will burn in bed
It says four 8-pins, not 12VHPWR.
@@kerkertrandov459 It's the wattage that matters.
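The headroom math behind that worry, using the 575 W figure from above plus the published connector ratings (assumed spec values: 600 W for 12VHPWR, 150 W per 8-pin, 75 W from the PCIe slot):

```python
tbp = 575                # quoted 5090 board power, watts
pcie_slot = 75           # what the slot itself can supply per spec
configs = {"12VHPWR": 600, "4x 8-pin": 4 * 150}

for name, cable in configs.items():
    total = cable + pcie_slot
    headroom = (total - tbp) / total
    print(f"{name}: {headroom:.0%} headroom")  # ~15% either way
# Transient spikes eat into that margin, which is why the connector
# discussion keeps coming back.
```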
575 watts!?!?!? Christ. Gonna need data-center cooling in my office just to handle the heat.
@@Stratus41298 Run a system with liquid nitrogen to cool it
I can't believe he said, "CEO math. It is not accurate, but it is correct."
I wonder how many people thought, "Let me go reserve one right now then!"
The more you buy the shinier his jacket
Little known fact: Jensen actually wore a Shirt. The leather jacket was added in using AI.
Bro, the 26 fps is with path tracing; the RTX 4090 doesn't even get 10, stfu.
"The more you buy, the more I deliver shittier products"
The products are great. Way better than what the competition has on offer.
@@cirescythe Thank you, monopoly enjoyer, you are truly serving a well-deserving company with your statements 👏👏
@@cirescythe snuff on that copium
@@cirescythe the products are shit, and the competition is shit as well.
You, sir, are a certified gigachad with that statement right there! xD Mad respect!
These cards aren't even an upgrade then. Just a facade. More fake frames.
@@RupertBundem Even a blind man can see it. Rich people, dumb choices.
@@RupertBundem The same benchmark run on the 4090 had 20 fps at raw processing power (without any AI features on). The 5090 has 27 fps. Not brilliant at all.
@@Linkario86 30% more MSRP for 30% more frames.
Buying a high-refresh OLED, a good-quality analog keyboard, and a high-polling MMO mouse, along with a good mousepad like an Artisan, probably brings you more enjoyment for the same price.
@gonozal8_962
😂
No
@gonozal8_962 yep. Especially considering that those 30% more frames are at really low fps. Microgains for the customer, massive gains for the company.
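Running this sub-thread's numbers (the 20 and 27 fps figures are quoted above; the $1,599 and $1,999 MSRPs are the announced launch prices, taken here as assumptions):

```python
old_fps, new_fps = 20, 27          # raw path-traced fps, 4090 vs 5090
old_price, new_price = 1599, 1999  # launch MSRPs in USD

print(f"frame uplift: {new_fps / old_fps - 1:.0%}")      # +35%
print(f"price uplift: {new_price / old_price - 1:.0%}")  # +25%
print(f"dollars per raw fps: {old_price / old_fps:.0f} vs {new_price / new_fps:.0f}")
# 80 vs 74: the cost per raw frame barely moves generation to generation.
```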
Jensen's jacket had ray tracing enabled 😂
On a Twitch livestream, people in the chat were counting the number of times he said "AI": more than a hundred. The most painful hour.
>Wait for more than 2 years for RTX 5XXX
>Same fucked up prices
>Get 3 fake frames
BRAVO, HUANG
Brotherman, the baseline is the same.
They didn't cut 3/4 of your rasterization performance to sell you "fake frames" instead.
They are ADDING them on top of the usual performance gain over the old generation.
Do some research first.
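For what it's worth, the arithmetic both sides are arguing about fits in a few lines (the 27 fps base is just the path-traced figure floated earlier in the thread):

```python
rendered_fps = 27            # frames the GPU actually rasterizes/traces
generated_per_rendered = 3   # MFG 4x: one real frame, three generated

displayed_fps = rendered_fps * (1 + generated_per_rendered)
input_cadence_ms = 1000.0 / rendered_fps  # responsiveness tracks real frames

print(f"displayed: {displayed_fps} fps")            # 108 fps on screen
print(f"input cadence: {input_cadence_ms:.0f} ms")  # ~37 ms between real frames
# The generated frames are added on top of the rendered ones, but input
# sampling still happens at the rendered rate: both claims are true at once.
```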
The AI-assisted ray tracing on that jacket is off the charts.
Goddamnit when will CEOs learn that the consumer doesn't want AI to fake a good product, they just want a good product.
“Jensen got that hell hole running so efficiently that the whole game is now run from a single frame when booting"
Kinda sorta the future, yes.
Every day of new information that gets released, I become increasingly happy that I built my pc three months ago.