@@N0N0111 I can buy it just fine, but nvidia will get no money from me, ever again. What a waste of a company. TNT2, GF3 ultra, Ti4600, 6800GT, 7900GTs, 8800GTs, then they went greedy and I was out. Then I got 4890s and they were amazing along with 5950s, 6950s, 7950s, and 290X. I did get the 980 Ti, b/c Fury couldn't overclock. it was just too poor to justify when 980 Ti OCed like an AMD 7950. But, now, I don't care what they make. Absolute trash company in every way. 5700XT was fine for midrange, now I have 6800XT, which is 100% rock solid for 3 yrs, now. I can wait till RDNA 5. Games suck, now, anyway. I'm playing 10 yr old stuff lol.
At this point is it surprising anymore? Nvidia can do that and people still continue to buy their cards instead of AMD cards with more VRAM. It's people who are just enabling Nvidia to get away with this.
That's the problem for more than a decade now. Many people did not want to buy those true pro card especially the semi pro crowd. Some companies even use geforce instead of quadro/tesla for their workstation solution. Now gamer have to face the consequence when nvidia start pricing their geforce with semi pro in mind. Cheap than true pro card but still more expensive than the "toy price" gamer used to pay for gpu.
The only real difference is that they have a driver "certification" and extra memory chips (which may or may not use ECC). Other than that they're the same garbage boards as the consumer cards.
The only true difference is that Nvidia pays devs to cut features if non scum"pro" gpu is detected. Like ambient occlusion in solidworks just do not exist with 4080, but working fine with a1000(or whatever) which is like cut down 3050
I know you joke, but if you can afford a 5090 you are very likely a professional in some capacity in your job to be able to afford this kind of expense. And even if you personally isn't most of the 5090 buyers will be.
If AMD have anything that can compete woth 5090 things will be different. But the issue is not about AMD competing. But how to keep AMD alive and at the same time give the illusion that AMD still competing
@@arenzricodexd4409 This channel's RDNA 4 leaks show AMD should be quite competitive in the upper midrange. I say let the cards come out and see where they stand. Might not get 80-series performance, but 70-series performance should be easily there, maybe more.
I love how we universally equate 'professional' to 'clueless cash cow' these days. I saved thousands buying 'not pro' hardware in a professional setting over the years. Like how some Dell 'pro' server ram kits are literally Crucial kits with the same exact part number, and 'pro' Dell SSDs that are Micron basic models with another sticker slapped on top of it.
Not compared to the Crypto and AI bros with deep pockets. The ones mostly buying the 5090 or 4090 are the rich crypto, AI or people with too much money. The 5090 or 4090 still has tons of demand and was never meant to be affordable. The Ai and Crypto bros are buying 10 to 20 at a time so there’s still a huge market trying to buy them. Some can’t afford a A100 at $30k and they are hard to resell so they will raid the 4090’s and 5090’s
Wait till you discover that the engineers that design cars go to tech extravaganza’s and get the specs of the tech and plug and play. Your bmw is a Toyota, Chevy, Dodge, HP etc. lol first generation Mini Cooper was a Dodge Neon practically.. The only thing that is unique on a car really today is the emblem!
I remember when people used to argue that the '90-class' pricing didn't matter, because we'll still have the 80 and 70-class models to fall back on... How that working out? lol Nvidia was never going to raise the ceiling without also raising the floor. Keep spending 2 grand on their GPUs, and they'll keep treating you like a mug.
I buy high end cards because I do 4k gaming. Expecting people to boycott what's in their best interest when nobody is there to compete is such a losing message. If you don't buy their cards they'll just raise the prices and call them professional anyways. They don't care about the gaming market now that they can make insane profit with AI.
Right! Except it won't be 2K for the high end GPU, it'll be pushed up to 3K, double the 3090 more money than sense price. The more money than sense crowd will just rationalise their purchases claiming they need native 4K @120+ HZ with ray tracing, rather than it's a vanity purchase.
@@RobBCactive yeah i could buy the cards out right and still have enough to vacation or buy a car or what ever..but 1.2k hell anything over 900 for a graphics card? let alone 1.5 and up! its crazy..gaming alonethat cant be justified
They can’t even make them fast enough! The professionals and the Crypto and AI bros with deep pockets. The ones mostly buying the 5090 or 4090 are the rich crypto, AI or people with too much money. The 5090 or 4090 still has tons of demand and was never meant to be affordable. The Ai and Crypto bros are buying 10 to 20 at a time so there’s still a huge market trying to buy them. Some can’t afford a A100 at $30k and they are hard to resell so they will raid the 4090’s and 5090’s
5090 and 4090 are made for crypto and Ai bros. They can’t even make them fast enough! The Crypto and AI bros have deep pockets. The ones mostly buying the 5090 or 4090 are the rich crypto, AI or people with too much money. The 5090 or 4090 still has tons of demand and was never meant to be affordable. The Ai and Crypto bros are buying 10 to 20 at a time so there’s still a huge market trying to buy them. Some can’t afford a A100 at $30k and they are hard to resell so they will raid the 4090’s and 5090’s
@@AlphaConde-qy7vi This has nothing to do with nvidia fanboyism. the simple fact is that amd is unable/unwilling to compete at the high end for the next generation, and all consumers suffer as a result.
@@mingyi456 stop complaining and just don't buy. if you keep buying, they keep making. this is the same as amd, you dont buy because its bad which is why they didn't bother to give you 90class card. you dont buy 5000series, then nvidia will be a bit generous with super or 6000series. stop blaming amd if you just wanting to buy nvidia.
That fucking sucks so bad.. If it was just a 4090 re skined with less power and cheaper like the same price as the 4080 super it would be a good release.... I have a 4080 right now and upgrading to the 5080 doesn't make sense.
@@elpato3190pipedream. The best deal from AMD for the entire RX8000 series probably something that have 7900GRE performance at $450 instead of $550 that we have now.
This is all thanks to covid shortages + scalpers and morons buying 3080 and 3090s at stupidly high prices ...Nvidia saw this and have taken full advantage of this (started with the 3080ti and 3090ti pricing).... Lovelace was an absolute dumpster fire (quite literally) with the pricing and now this shit.... But I guess we are all "professionals" now 😒
As observer168 said I would blame crypto more, but yeah some still paid that inflated price for gaming. Me I got my 3080 on release at the original RRP, thank god. I hope this time these prices bite NVidia in the butt as crypto mining not GPU bound so much.
Dont blame a virus for the mistake of consumers. People bought those cards for those prices and then started complaining when next gen went up with price. Like wtf? Start voting with your wallet if you want cheaper products. Simple supply and demand equation...
The professionals and the Crypto and AI bros with deep pockets. The ones mostly buying the 5090 or 4090 are the rich crypto, AI or people with too much money. The 5090 or 4090 still has tons of demand and was never meant to be affordable. The Ai and Crypto bros are buying 10 to 20 at a time so there’s still a huge market trying to buy them. Some can’t afford a A100 at $30k and they are hard to resell so they will raid the 4090’s and 5090’s
i want to know about the price of the battlemage gpus, instead of false propoganda about its canccelled because this guy hs been crying for few years about it being cancelled but it releases in the end
@@bmqww223He is just leaking what is going on in the company at the time he does intel wasn't sure about releasing it so he guessed he thought they probably won't that's it.
@@bmqww223 You have a poor memory or selective hearing. He said BM was cancelled for desktop outside of a small die (well, nothing intel makes is small, but still hella cutdown lol). No information currently disputes that.
@@TheGuruStud why are you editing our own comments ? pretty sure you have problem of selective hearing that youre not sure what you say lol..... don't be a fanboy we all know here what he said ... he has been crying cancelled so much that nobody believes him anymore he just said in prvious video that its not even fast enough to beat 4060 but now he says that its almost close to 7500xt or 7600xt....
So nVidia isn't charging an arm ana leg for 5080 and 5090 because they have no competition, but only because they're for professionals? That makes everything so clear.
Are red flag was probably when AMD said they won't compete in the high end this time around for gpus the 8800 XT looks to be marketing around a 4080 a little weaker but stronger than a 4070 TI and it's going to be marketing around the 500.00 mark midrange. And that's it for 2025 power Wise from AMD supposedly unless they pull a pop-up launch that was hidden for the high end.
@@Ziyoblader with these pricings, I sadly expect AMD to up the top 8800 XT to $599 with 5070ti being 10% faster and $799. I don't expect anything shaking up in the market, they're fitting together like a puzzle as always in the duopoly. AMD's complicit
Made for Ai and Crypto bros only! They can’t even make them fast enough! The Crypto and AI bros have deep pockets. The ones mostly buying the 5090 or 4090 are the rich crypto, AI or people with too much money. The 5090 or 4090 still has tons of demand and was never meant to be affordable. The Ai and Crypto bros are buying 10 to 20 at a time so there’s still a huge market trying to buy them. Some can’t afford a A100 at $30k and they are hard to resell so they will raid the 4090’s and 5090’s
Nvidia = Blackwell even more expensive than Lovelace - no surprise there AMD = RDNA 4 launching in Q1'25, but FSR4 won't be ready for launch - no surprise there Intel = Battlemage is an absolute sh*t fest - absolutely no surprise there 😫
@@liberteus Just follow standard procedure, which is to wait a month or 2 and get it at a 'discount', otherwise also known as the 'price it should have launched at to begin with'.
@@AmiableChief it takes way longer than that now for AMD to get back to the price it should have been. And I'll buy a 5090 day 0 with a complete new PC, once every 4 years, at company's expense anyways.
@@thetechrealistnot all people can afford to buy a used card. I'm not talking about the price, some people sell broken used cards in the market and if you do not have a system to test a GPU right away you can't get a refund. I got to know this the hard way...
I"d go so far as to say its actually a 5070 now. it was pushed to an 80 class spec in 4000 series and this is ANOTHER drop in class performance. The 5090 is not the full die, so its even dropped a class.
I mean I am a professional but I would like a card that doesn’t cost a kidney and an arm…. Also when you go north of 1000$ I and many other “professionals” expect the best. If the 80 class and 90 class have a massive gap in performance it’s going to feel like I’m getting ripped off… this just brings me back to NVIDIA needing to properly market cards for the professional market with their Quadro line.. at least there the cards were not masquerading as a good gaming gpu..
I understand what they're going for about the 5090 and it's price but the 5080 is giving me shit vibes, given I have the 4080 and I feel ripped off but the new 5080 is a no buy what's so ever because of the VRAM, even if it's better than the 4090 the VRAM still a huge deal breaker, if they just made it the same as the 4090's vram with a slightly lower power draw it wouldn't suck that bad and made it's price at $1k max I would have considered buying it.
All these products are a hard skip. No VRAM increase, almost no Raster performance increase. The only selling point for these dogshit cards is better Ray Tracing which no one cares about. Skip the regular 50 launch and wait for the 50 SUPER launch at the end of 2025. Thats when the new VRAM modules will roll out and you will see 50% more VRAM capacity in every SKU. That combined with full uncut dies (10-15% more shaders) and faster GDDR7 (15-20% more bandwidth) that will be the series off cards where the price to performance is what you want.
Heard something similar before about the fact there will be 2x capacity gddr7 modules but it sounds like a pipe dream to imagine them releasing their super cards with double the vram, where you got that info from ?
Made for Ai and Crypto bros only! They can’t even make them fast enough! The Crypto and AI bros have deep pockets. The ones mostly buying the 5090 or 4090 are the rich crypto, AI or people with too much money. The 5090 or 4090 still has tons of demand and was never meant to be affordable. The Ai and Crypto bros are buying 10 to 20 at a time so there’s still a huge market trying to buy them. Some can’t afford a A100 at $30k and they are hard to resell so they will raid the 4090’s and 5090’s
if you look at latest amount of failures in sales and big hitters shutting down, then it seems so yes. Only the (pro´s) @ gaming journalists that gets free copy´s and the gamedevs them self likes the games they make it seems. Maybe they should cater to the((normal)) people, and start ignoring the Pro´s. Both for nvidia and Gamedevs.
I guess AMD appreciates if Nvidia uses their superior marketing to convince people that anything above the 5070ti isn't for gamers. One less reason for them to worry about RDNA4 not competing in high end being an issue.
The problem is that the "RTX 5080" (in name only) is actually an RTX 5070 if you look at the leaked specifications. So the "RTX 5070" (in name only) won't actually be a true RTX 5070-that will be the graphics card labeled as "RTX 5080." Typically, a 70-class card has half the shaders and VRAM of the 90-class, and the "RTX 5080" matches this perfectly based on the leaked specs. Nvidia calling the "RTX 5080" a professional-grade graphics card is just a way to make you compare the "RTX 5070" (which will effectively be a 60-class card) with the "RTX 5080" (an RTX 5070 disguised as an RTX 5080). Always check the technical specifications, not just the names, to avoid being misled.
Nice quick outlook of what to expect in the next few months. Let's hope that the FSR 4 rollout is better than 3's. Also, hopefully it works or has a fallback for RDNA 3 or even 2 cards.
@@hotdogsarepropagandaThat's what makes it more laughable but based on rumors the die size is still a 70 class card disguised as an 80 class card which never belonged in the professional tier to begin with. I can't put up with any reviewers that give into such blatant anti-consumer manipulation. Hopefully all this is just rumor mill nonsense but I'm getting mad at just the possibility they are doing this.
Blows my mind how Nvidia went from being a fairly decent (no worse than average) company, to rivaling Apple with their anti consumer policies in such a short time. Sincerely hope they go under, would rather have intel vs amd to choose from for video cards than a company like nvidia fucking up the market.
Nvidia is going nowhere. They are the best of the best for a reason and that’s why they charge what they charge because no one effectively competes with them.
@@kevinerbs2778 Except Generative A.I is not a bubble, nor a fad. Generative A.I has virtually limitless applications across every line of business, and those who think otherwise are going to eventually be left behind.
@jonshaffer5793 AMD is not giving up, they are playing the long game. Look at steam usage charts for rtx 4090,it's far less than the mid to entry GPUs. AMD is trying to gain market share for Radeon while maintaining and gaining marketshare with Ryzen.
Leaks suggest the 8800 xt will be comparable to the 4080, at around half the price last I heard. And the 7900 xtx is already ahead of the 4080s in fps outside of heavy RT, at -25% the price. The only reason you'd NEED a 4080S is if you're playing in 4k with RT on ultra with DLSS. In 1440p, or 4k W/O RT, you have options.
Intel managed to fuck themselves over good and proper right as AMD was gathering all their remaining strength for the "do or die" battle. Nvidia is, unfortunately, an entirely different beast. I'd LOVE to see Radeon get to 50% market share... but remember that Ryzen has just surpassed 40% despite being overall better for several generations in a row.
@ agree. But ryzen was not able to win in games until they introduced x3d. Im just hoping they have something this cool for gpus under their sleeves instead of unsuccessfully copying nvidia
@@pashabiceps95 "Im just hoping they have something this cool for gpus under their sleeves instead of unsuccessfully copying nvidia" They don't, FYI they can't even successfully copy Nvidia. According to leaks the next gen of AMD GPUs will mainly see a RT performance improvement which will only be comparable to the 4000 series..... AMD has FSR4 coming but that is apparently being delayed, meanwhile Nvidia also has DLSS 4 in the works, AMD doesn't even have anything that will compete with the 5080.... The only way for AMD to even take some marketshare is to price their GPUs to the point they likely don't even make money from them. Both GPUs after this gen are also meant to see an architectural change, so you are hoping for a lot. Nvidia isn't like Intel, Nvidia is the one that has been pushing forward with change and AMD is the one that has been playing catchup.
@@jonathanryan9946I know right amd's 8800 XT is going to be around a 4080 but a little weaker but stronger than a 4070 TI meaning they're not trying to surpass their 7900 XTX and they're 8800 XT will be priced around 500 mid-range so this leads up to what the leaks were saying how they're not going to participate in the high-end but targeting the mid-range
@@jonathanryan9946 We don't even know what the next gen AMD stuff is yet. I can understand if you absolutely need to have a 5090. Everyone else should consider AMD or Intel. We are all fucked if the Nvidia monopoly continues unabated.
@@jonshaffer5793 "We don't even know what the next gen AMD stuff is yet" Except we more or less do know? AMD themselves have stated their will be no high end cards this gen i.e. they aren't competing with the 5080 let alone the 5090. The leaks from this very channel have practically said expect last gen 7900xtx / 4080 performance from these next AMD gpus, fyi the 4080 levels of performance is in regards to raytracing performance.... AMD will have to price their GPU extremely low to, if they do that they can likely do well.
@@aflyingcowboy31 So if you you absolutely have to spend 2k on a card for a mediocre performance gain over an 800 dollar card I say go ahead and throw your money away. The rest of us with a brain will stick with the "mid range cards". In that space I say buy anything but Nvidia because they are getting very anti consumer.
I'm all for cards above 5070ti being only for professionals. But also tell that to game developers so they don't optimize their games for 90 cards. September B580 launch would had been fire at right price. Hopefully higher sku is alive and well.
No, we used to get flagship performance for much cheaper and cut down 80 class cards that are really 70 class cards with low VRAM don't belong anywhere near "professional." They are manipulating the consumers that made this company in order to cater to AI and keep their fat margins.
@@razpaqhvh7501 Milk has to be somewhat affordable for regular consumers. NVIDIA video cards are pushing regular gamers out of the market and it doesn't even matter because they make like 90% of their money with professional stuff and the margins they get are way higher. Furthermore nobody can compete with them at the high end. There is no incentive for them to care.
That's what I'm talking about! "RTX 5070 ti 16Gb will be the best for gamers, they won't need anything more powerful", that could be true, but tell game companies to optimise their games and do not rely on shenanigans like upscaling and DLSS
I'd like to see a class action lawsuit again Jensen Huang or Nvidia. I would liike to see them answer for makig GPU with a short lifespan due to not enough vRAM. E-waste in half the time GPUs used to last for.
@@chrisblack6258 don’t buy the 80 series, it’s going to be a export safe model that they can ship to China, AKA a 4090 light and they will only sell it to gamers willing to compete with that demand and cost.
I'm good staying with my 4080 for another 4 years. Even if the new consoles release in 2027, we won't see the fruits of their power for another two years after.
I dunno... If Intel can put out a card with 4070 levels of performance + 16GB oh VRAM & sell it for $400, who cares if it's not able to directly compete with the 5070? We all know Nvidia is going to raise prices AGAIN, and I'm not sure people are going to be super excited to spend $600 - $700 on a 12GB 5070.
The problem is it'll be launching against RDNA4, which looks set to offer that kind of performance around the lower to middle end of the stack, and everything so far indicates it's going to be aggressively priced. So then you have to ask yourself: do I choose a $400 Sapphire Nitro card, or do I choose the $400 card from an unknown brand, using the GPU from the company that made a totally broken lineup last generation. It's going to take great performance and absurd pricing to win players over on Intel this generation, because AMD is already the brand budget builders trust.
The point is Intel really can not do it! Intel use expensive highend node… They should sell it at higher price than even Nvidia! Intel can ofcourse lose money… again… But that means that Battlr mage may be last Intel disgrete GPU for a while. They need profit, and alcemy, did not make that.
My initial plan was to buy an RTX 5000 card and Ryzen 9800X3D. However, I think it's better to skip another 3-5 years. I'll just play older games while I find a second job and build wealth. My only concern is probably whether or not my 8-year old GTX 1070 can survive another 5 years.
Why do you only consider Nvidia? RX 8800XT or whatever it will be suits you perfectly for not wealthy update, and you can just jump onto AM5 with like 7600 or 7500f and already have really good perfoming PC.
Just buy something used. Hell a brand new 7800xt sold for around 460 Euros during Black Friday. You dont need to spend 1000$ dollars on a GPU to have good perfomance
Good on you for showing restraint. That said, there are plenty of good options which would provide a massive upgrade over a 1070, for very reasonable prices and available right now. Working hard is great, but don't force yourself to suffer with an ancient card for 5 years 😅
Let me guess, Nvidia will release the 5070 after trump brings in tariffs but after secretly stockpiling them in the states. Then marking up the price...
Professional gaming… newer heard? 😂😂😂 Just go to the shop and say secret password; ”The more you buy, the more you save” and they will sell you professional GPU! 😅
If a 5080 is a professional card it cannot have 16gb vram. Professionals need more, period. Also the 80 clas should always be 15-30% faster than last gen flagship. If the 5080 is not that with at-least 20gbvram I will pass. I will only buy when I am getting good value.
maybe in the EU but the Orange Gorilla will just put tariffs on it and expect the price to go down. if he puts a 60% tariff on nvidia and amd (likely if he labels taiwan as part of china) then intel will be all any US citizen buys because 60-% price increase would make it not economically feasible to buy imported chips
Antitrust? lol. Even if enforcing the law was something that is going to occur for the next Four More Years, which it isn't, Trump isn't going to let his DOJ puppet wreck the value of a $3t market cap corporation and tank his beautiful stock market numbers.
Basically, Blackwell and Battlemage are a skip. Also, the RX 7600 XT undervolts extremely well. Tech Yes City basically got it perform the same as stock while getting the power consumption very close to an RTX 4060. The RTX 4060 only pulled 30 watts less at that point. If Intel consume more power than the 4060, then they lost to AMD with a node advantage. RDNA 4 doesn't seem to have any negativity coming from anywhere. In fact, alot of people seem to be optimistic about RDNA 4 with the new RT cores and FSR 4 using AI to make upscaling and frame-gen better. Personally, I am not worried about FSR 4 getting delayed becuase the minute it goes open-source, modders will get to work and we will be able to put it in every game that DLSS is currently in. Just like Anti-lag 2 can now be enabled in every game with Reflex.
A VRAM-cripple ends-up beeing a FPS-cripple sooner or later. Remember this you 6, 8, 10 and 12GB NVidia customers. RTX 2060 12GB -> RTX 3060 12GB -> RTX 4060 8GB -> RTX 5060 8GB? this is what happens when one corporation has a monopoly on the market At a point, when a behemoth company no longer cares if a particular segment purchase their products or not, basically grants them the license to do what ever they want. Nvidia’s strategy became known with the 40 series. Make the top product extremely powerful and jack the price up. They "could" have used 3GB chips, they "could" have just made each tiers memeory bus just 32 bits wider, with the extra bandwidth they "could" have used 4GB chips and cut the bus down 'up to' 50% and still been ahead. They didnt, because they know you will buy this crap regardless, they arent wrong are they? They love to screw over those who love to be screwed over
No, this is what happens when people buy into the hype of RT. Radeon cards are competitive with much higher priced nVidia cards in raster and the software since RDNA2 has been solid. The best antidote to nVidia prices is people start buying AMD. The current nVidia situation is 100% the fault of the nVidia fans themselves.
@@Osaka2407 And FSR not being quite as good is not a good enough reason to support nVidia - especially since DLSS is gimped to work only on newer hardware, whereas FSR is generic and supports GTX cards, too.
@@MetallicBlade if you mean 24GB for the 5080, I completely agree although they would have to wait until Q2 for the 3gb modules to do that. I was just saying for anyone with a 1080 Ti, 2080 or 3080, the 5070 ti might be a good upgrade if it's $799.
Apparently only "professionals" need enough fucking vram for playing kiddie video games? The truly frustrating part is as Nvidia gets greedier and greedier, and games are getting more and more demanding, AMD bailed on the high end and Intel is half-assing it like it's someone's weekend project. At this point I would spend a grand on a true Arc competitor just to throw the middle finger up at Ngreedia and Another Massive Disappointment. But apparently Intel only cares to compete in the budget class, while not actually _competing_ in the budget class. It's enough to make you say screw gaming and find another hobby.
When will all the Intel Fanboys realize their horse is FOUR YEARS BEHIND NVIDIA. It's been that way ever since the A770 was launched to challenge the 3070 TWO YEARS LATE, and it achieve 3060 performance with HIGHER TRANSISTORS AND DIE AREA THAN 3070. Intel can't design shit in VLSI any more ...
It would seem Nvidia will milk the shit out of 50 Series. Hopefully the Intel/AMD counterpart will both be priced at will perform at reasonable level so we can have options
This is what happens ladies and gentlemen when one company has a monopoly on the high end market. They can price however they damn well please because they know people will buy it.
Hmm, if B770 is roughly 4070 performance for $400, and top RDNA 4 is roughly 4080 performance for >$500, it might possibly compete well with an RDNA 4 "8070" if it's 16 GB vs. 12 GB. But then there's drivers...
As GPU prices continue to rise, from NVIDIA, where the top end is for “prefessional” use - how much of a deleterious of effect is this going to have on gaming development at large? Seems to me that there will be an untenable inflection point where hardware needed to push the industry forward will become so out of reach of the normal consumer that game development will stagnate.
You must be trolling the main thing happening with gaming is the "1 billions polygons screw" otherwise it's still the same bland healthbar and character with ridiculous hitbox fetish. Game with needlessly high polygon count on characters model and 4k texture everywhere , like are they going to make smut with them or something?The worses are the hyperfast third person game \ real time tactics where the screen is covered in particle vomit and you can't see any finer details anyway.
In my opinion this downgrades the 5080 to 1440p graphics card. Because it will age poorly for 4k. The value is simply not there to pay this amount of money.
@@ElFlipo1337 No, but people bought the gtx 600 series like crazy, they bought the 1000 series like crazy which were just as big of a scam as the rtx 4000 series, people just didn't realize it because of their raw performance. We are here, because for almost 13 years now people bought rip-off GPUs and they bought Nvidia over AMD when AMD had a superior architecture and software from 2007-2015.
Don´t worry! These will sell like hotcakes even if they would cost $3000 and $1500! Nvidia does not have to worry about income or not even market share! They did go from 86% to 88% with 4000 series while everyone did whine that it was too expensive! They most likely go up again to 90%! Nvidia has to increase prices more, so that they would not become monopoly! 😂
Hopefully, INTEL's battlemage is near NVIDIA 4070 in performance with 16 GB VRAM and priced under $400. Yea, I know. That's a lot of hopium. But if they can manage that the card will sell like crazy unless it is a power hog.
@@TheDoomerBlox how would it sell if pretty much everyone who buys these GPUs already know leaks/rumors about cancellation of ARC? I mean, this is probably a dead end for ARC, who knows if Intel will continue support it with drivers, and if you really think that people that buy Intel GPUs are not aware of these things - regular consumers just gonna pay for green card for fancy stuff like DLSS 4 and not even raw performance (which Alchemist had, at least in terms of RT, they blow out of water AMD's RT) Edit: fixed typos
5080 for professionals... vram amount 16gb 😂 rebraned 4080 super with slighter better power draw. I'm calling it now. Professionals are buying used 3090s, anything less than 24gb vram is trash, for profesionals and for 4k gaming.
someone from Nvidia already said that we should expect a shortage you gotta love a self made shortage of supply so prices and profit margins stays high all while I am sure their fanboys would sell a kidney or two just to line up on launch day.
5080 with half of CUDA Cores a Professional GPU? Lol 😂 Even the 5090 will not get the full 192SM and 32Gbps of GDDR7 ! Let's hope they won't hugely cut the L2 Cache either...
@@rattlehead999 Definitely yeah! And the 4090 is definitely bandwidth starved! The L2 Cache cut (72MB vs the original 96MB of AD102) definitely hurts the 4090's performance... and it hurts even more when you know that the L2 Cache transfer speed is about 5TB/s whereas GDDR6X is only about 1TB/s! Also the 4080 & SUPER have 64MB of L2 Cache when they have a lot less CUDA Cores! Nvidia really played it cheap with all its lineup...
@@jongamez5163 Yup the 4090 is only 30-35% faster than the 4080 while having 60% more SMs and it's precisely because of the lack of bandwidth, it should have been 512 bit.
@@rattlehead999 That's what Nvidia seem to be doing with the 5090... 512-bit @ 28Gbps GDDR7 aka 1,792GB/s. and we could probably get 2TB/s with an Overclock! Let's hope they will give the 5090 the full 128MB of L2 Cache! If it's $2000 I'm selling my 4090 to buy a 5090 😅
@@jongamez5163 I think that with 1792GB/s bandwidth it doesn't need as much cache per SM as the 4000 series did. I'm sticking to 1080p as long as games allow for it, so I have no use for a 5090 and 2000$ for a GPU is just dumb at this point. Especially when it's going to be a 600W brick.
the thing I am really hyped for is the cooler design of the 5000 series. Can't wait to see it. Also AMD could use a little innovation with their release version coolers.
I'm mostly hoping that FSR will finally make some gains on it's image quality. Would love to keep my 7900 XTX but at 1440p it's upscaling is just not good enough.
@rattlehead999 Yeah, I know I figure if the 5080 is considered professional now, it'll probably cost almost as much as the 4090, which means the 5070 Ti will cost as much as the 4080S so the 5070 will cost around $700-800. If I'm getting 5070 level performance for the price of a 4070 Ti S, I might as well go AMD for better value.
Ah well, at least 95% of video games that release every year now suck and my motivation to keep playing em lowers. Between that and the price of hardware, it looks like this will finally be the wall I hit where the only reason I upgrade is to keep playing the games I do enjoy with mods which won't need higher tiers of performance to run, I'll just upgrade after the hardware has been out a while and is heavily discounted and it'll only become cheaper as the gaming and hardware market become more stagnant. I hate to be so black pilled, but what a truly terrible time for gaming and PC to boot.
I wouldn't mind the higher classes being pushed towards professionals, WHEN there is enough performance on the lower classes for 4k 120FPS gaming with RT. But even the 4090 is not there yet...
@@haukikannel I use upscaling from 1080p to 4k and still don't have enough FPS with a 4090. A 5090 will also be not enough to hit 120 MIN-FPS with DLSS performance in UE5 titles... I don't use framegen.
Even though we know better, you can't help but think, "Well maybe, maybe the GPU market will stop blowing chunks right?" I mean good lord, it's been trash since Turing.
For the love of all that is good, NVIDIA you have a professional line called the Quadro that was thousands of dollars more expensive than the gaming cards. Please keep them separate.
Dude the ones buying up all the 4090’s and 5090’s are the Ai and Crypto bros. They can’t control who buys them. Made for Ai and Crypto bros only! They can’t even make them fast enough! The Crypto and AI bros have deep pockets. The ones mostly buying the 5090 or 4090 are the rich crypto, AI or people with too much money. The 5090 or 4090 still has tons of demand and was never meant to be affordable. The Ai and Crypto bros are buying 10 to 20 at a time so there’s still a huge market trying to buy them. Some can’t afford a A100 at $30k and they are hard to resell so they will raid the 4090’s and 5090’s
WHERE is Intels OpenVino Hardware? There IS a niche - more RAM for AI models and media encoding.The rest doesn't matter as a model which doesn't fit into RAM doesn't compute at all...
People need to not buy at these prices. Its simple really. People don't need these new cards, its the clueless muppets that insist on having new tech that are to blame for the rip off pricing.
@@Ginp- hes butthurt cuz his supposed intel “leaker” fed him BS and he thinks intel never follows thru with their plan. Most of his intel leaks are absolute laughable. Like the Arc A980 or something. The alderlake leak. Sapphire rapids. You dont even wanna open the granite rapids can of worm he fed his viewers 😂
Nvidia is now the biggest company in the world, only Nvidia can kill Nvidia, there is less than 10% chance that AMD can catch up to them completely now and less than 1% chance they can surpass them.
@@rattlehead999 It's because they have a monopoly though. If AMD could at least match them on ray tracing and the price their cards competitively nvidia would have to drop prices. There's a reason why nvidia's cards used to be cheaper and it was because the competition was closer to them.
Lisa Su is Jensen Huang's cousin (literally look it up). AMD will play the underdog for as long as they can so they can reap the benefits of NVIDIAs price gouging.
[SPON: Download Saily and use code “moore” to save 15% on Saily eSIMS: saily.com/moore ]
May I ask why you added a watermark to the text you created in Photoshop and displayed in your video?
Not OP, but it's so when people or creators use it without citing source info that the true source is obvious
"really for professionals", nice way to say that we wont be able to afford it at a reasonable price
AKA business write off. That’s why they can jack up the price.
And probably won't even have enough VRAM for the professionals anyway. Especially the 5080.
Exactly why I'm buying a 5090. It's a business expense.
"GeForce RTX 5090"- they have to drop "Geforce" as that is their gaming branding branch!
@@N0N0111 I can buy it just fine, but nvidia will get no money from me, ever again. What a waste of a company. TNT2, GF3 ultra, Ti4600, 6800GT, 7900GTs, 8800GTs, then they went greedy and I was out. Then I got 4890s and they were amazing along with 5950s, 6950s, 7950s, and 290X. I did get the 980 Ti, b/c Fury couldn't overclock. it was just too poor to justify when 980 Ti OCed like an AMD 7950. But, now, I don't care what they make. Absolute trash company in every way. 5700XT was fine for midrange, now I have 6800XT, which is 100% rock solid for 3 yrs, now.
I can wait till RDNA 5. Games suck, now, anyway. I'm playing 10 yr old stuff lol.
5070 is for gamers? Why not put some actual fucking vram thats needed in them then.
why bother when it will still sell?
Planned obsolescence again. The more you buy, the more you save. Thanks Ngreedia.
Human eyes cannot see past 12gb of vram!
The 5070, you say? The card that will likely see excellent competition from AMD? I see an opening.
At this point is it surprising anymore? Nvidia can do that and people still continue to buy their cards instead of AMD cards with more VRAM. It's people who are just enabling Nvidia to get away with this.
Nvidia has a professional line, its called Quadro, and it is usually 200% more expensive. A2000 for $600 = RTX 3050 for $150-200
ada 6000 is ~$8000, ~5x than a 4090
I bet the GPUs in the A series are binned... A2000 SFF and A4000 SFF are wonders of energy efficency.
That's the problem for more than a decade now. Many people did not want to buy those true pro card especially the semi pro crowd. Some companies even use geforce instead of quadro/tesla for their workstation solution. Now gamer have to face the consequence when nvidia start pricing their geforce with semi pro in mind. Cheap than true pro card but still more expensive than the "toy price" gamer used to pay for gpu.
The only real difference is that they have a driver "certification" and extra memory chips (which may or may not use ECC). Other than that they're the same garbage boards as the consumer cards.
The only true difference is that Nvidia pays devs to cut features if non scum"pro" gpu is detected. Like ambient occlusion in solidworks just do not exist with 4080, but working fine with a1000(or whatever) which is like cut down 3050
Interesting I didn't know I was a professional
Remember how many Vega owners were "Blockchain Creators"? :P
funny enough I bought a Vega 56 to flash as a 64 and mine eth. Probably paid for itself 10 times over. Those were the days...
I still don't know why I bought a Vega FE all those years ago when the $250 56 would have done just fine when what I really needed was a 1080 Ti.
I’m a professional Jack ass
I know you joke, but if you can afford a 5090 you are very likely a professional in some capacity in your job to be able to afford this kind of expense. And even if you personally isn't most of the 5090 buyers will be.
I hate the newer 80 series strategy, they’ve got to knock it off
they wont, people will still buy it
5080 8GB will start at $1999.
If AMD have anything that can compete woth 5090 things will be different. But the issue is not about AMD competing. But how to keep AMD alive and at the same time give the illusion that AMD still competing
@@arenzricodexd4409 Nvidia stopped competing.
@@arenzricodexd4409 This channel's RDNA 4 leaks show AMD should be quite competitive in the upper midrange. I say let the cards come out and see where they stand.
Might not get 80-series performance, but 70-series performance should be easily there, maybe more.
I love how we universally equate 'professional' to 'clueless cash cow' these days.
I saved thousands buying 'not pro' hardware in a professional setting over the years. Like how some Dell 'pro' server ram kits are literally Crucial kits with the same exact part number, and 'pro' Dell SSDs that are Micron basic models with another sticker slapped on top of it.
Professionals have cards as a biz expense, the true cash cows have been the vanity gamers buying top tier flagship models with their own money.
@@RobBCactive I don't believe this at all. Enterprise spending us way more than consumer spending
5090's are going to be rapidly scooped over H100's with use of open source ai training software
Not compared to the Crypto and AI bros with deep pockets. The ones mostly buying the 5090 or 4090 are the rich crypto, AI or people with too much money. The 5090 or 4090 still has tons of demand and was never meant to be affordable.
The Ai and Crypto bros are buying 10 to 20 at a time so there’s still a huge market trying to buy them. Some can’t afford a A100 at $30k and they are hard to resell so they will raid the 4090’s and 5090’s
Wait till you discover that the engineers that design cars go to tech extravaganza’s and get the specs of the tech and plug and play. Your bmw is a Toyota, Chevy, Dodge, HP etc. lol first generation Mini Cooper was a Dodge Neon practically.. The only thing that is unique on a car really today is the emblem!
I remember when people used to argue that the '90-class' pricing didn't matter, because we'll still have the 80 and 70-class models to fall back on...
How that working out? lol
Nvidia was never going to raise the ceiling without also raising the floor. Keep spending 2 grand on their GPUs, and they'll keep treating you like a mug.
I buy high end cards because I do 4k gaming. Expecting people to boycott what's in their best interest when nobody is there to compete is such a losing message.
If you don't buy their cards they'll just raise the prices and call them professional anyways. They don't care about the gaming market now that they can make insane profit with AI.
dude they had done that for years..what do you think titan cards were..and hten the 80 and 80tis?
@@JustADudeGamer You can buy AMD, 7900XTX is just slightly slower.
Right! Except it won't be 2K for the high end GPU, it'll be pushed up to 3K, double the 3090 more money than sense price. The more money than sense crowd will just rationalise their purchases claiming they need native 4K @120+ HZ with ray tracing, rather than it's a vanity purchase.
@@RobBCactive yeah i could buy the cards out right and still have enough to vacation or buy a car or what ever..but 1.2k hell anything over 900 for a graphics card? let alone 1.5 and up! its crazy..gaming alonethat cant be justified
"are really for professionals" == "not for the poors"
Or for people who don’t know how to handle a credit card they can’t afford.
@@legiox3719 "let the multi billion dollar company alone it's your fault if they sell overpriced products" aaaahh mentality
Classic brainwashed cons00mer mindset
They can’t even make them fast enough! The professionals and the Crypto and AI bros with deep pockets. The ones mostly buying the 5090 or 4090 are the rich crypto, AI or people with too much money. The 5090 or 4090 still has tons of demand and was never meant to be affordable.
The Ai and Crypto bros are buying 10 to 20 at a time so there’s still a huge market trying to buy them. Some can’t afford a A100 at $30k and they are hard to resell so they will raid the 4090’s and 5090’s
5090 is "for professionals", 5080 is "for professionals 16GB" (and half the cuda cores), 5070 is only 12 GB in 2025. Consumers just can't win...
*nVidia fanboys just can't win
5090 and 4090 are made for crypto and Ai bros.
They can’t even make them fast enough! The Crypto and AI bros have deep pockets. The ones mostly buying the 5090 or 4090 are the rich crypto, AI or people with too much money. The 5090 or 4090 still has tons of demand and was never meant to be affordable.
The Ai and Crypto bros are buying 10 to 20 at a time so there’s still a huge market trying to buy them. Some can’t afford a A100 at $30k and they are hard to resell so they will raid the 4090’s and 5090’s
A duopoly is almost as bad as a monopoly.
@@AlphaConde-qy7vi This has nothing to do with nvidia fanboyism. the simple fact is that amd is unable/unwilling to compete at the high end for the next generation, and all consumers suffer as a result.
@@mingyi456 stop complaining and just don't buy. if you keep buying, they keep making. this is the same as amd, you dont buy because its bad which is why they didn't bother to give you 90class card. you dont buy 5000series, then nvidia will be a bit generous with super or 6000series. stop blaming amd if you just wanting to buy nvidia.
So the 5080 is basically the 4090 with a small price-cut, but also with less RAM. Great...
That fucking sucks so bad.. If it was just a 4090 re skined with less power and cheaper like the same price as the 4080 super it would be a good release.... I have a 4080 right now and upgrading to the 5080 doesn't make sense.
@@apricreed9580 People upgrading gen-on-gen no matter what Nvidia does is why Nvidia is the way Nvidia is.
@@andersjjensen and then they blame AMD no matter what. People are incapable of taking responsibility for their actions.
Not have a chance against 4090 fs
Will be noticeably slower than 4090
"GPUs above the 5070 Ti are really for professionals."
So why is it called "GeForce" then? Hmm...
god i hope the rdna4 january card is well priced really dont want amd to screw up as usual
5070ti performance at like 600 would be great
depending on performance I will get it only if its 600 or lower
RDNA4 is the only hope left for next gen cards.
4080 Super at $500, if it's $600 it's dead on arrival against the $650 5070 with whatever marketing dlss Nvidia will come out with.
@@elpato3190pipedream. The best deal from AMD for the entire RX8000 series probably something that have 7900GRE performance at $450 instead of $550 that we have now.
This is all thanks to covid shortages + scalpers and morons buying 3080 and 3090s at stupidly high prices ...Nvidia saw this and have taken full advantage of this (started with the 3080ti and 3090ti pricing).... Lovelace was an absolute dumpster fire (quite literally) with the pricing and now this shit.... But I guess we are all "professionals" now 😒
No, you can thank the Ai and Crypto bros that buy 20 at a time
As observer168 said I would blame crypto more, but yeah some still paid that inflated price for gaming. Me I got my 3080 on release at the original RRP, thank god. I hope this time these prices bite NVidia in the butt as crypto mining not GPU bound so much.
@@Observer168 businesses in various industries have done this forever.
Stop spamming nonsense
Dont blame a virus for the mistake of consumers. People bought those cards for those prices and then started complaining when next gen went up with price. Like wtf? Start voting with your wallet if you want cheaper products. Simple supply and demand equation...
@@MaddJakd GPU cloud companies like tensordock jarvislabs lamdalabs buying up hundreds or thousands of 4090 cards. You can see them rented per hour.
nvidia is so full of shit with this "for professionals"
The more you buy the more you safe!
>look inside "professional" graphics card
>GEFORCE rtx
hmmmmmmmmm
The professionals and the Crypto and AI bros with deep pockets. The ones mostly buying the 5090 or 4090 are the rich crypto, AI or people with too much money. The 5090 or 4090 still has tons of demand and was never meant to be affordable.
The Ai and Crypto bros are buying 10 to 20 at a time so there’s still a huge market trying to buy them. Some can’t afford a A100 at $30k and they are hard to resell so they will raid the 4090’s and 5090’s
Nvidia have enoff fulls who thinks they always need the new letest s*** and don't care about price!
"professionals" = people who have $80k+ per year jobs - LOL
Intel with a delay doesnt seem to suprise me anymore at all. Its like Tesla and roadmaps.
Tesla are the worst for broken promises
i want to know about the price of the battlemage gpus, instead of false propoganda about its canccelled because this guy hs been crying for few years about it being cancelled but it releases in the end
@@bmqww223He is just leaking what is going on in the company at the time he does intel wasn't sure about releasing it so he guessed he thought they probably won't that's it.
@@bmqww223 You have a poor memory or selective hearing. He said BM was cancelled for desktop outside of a small die (well, nothing intel makes is small, but still hella cutdown lol). No information currently disputes that.
@@TheGuruStud why are you editing our own comments ? pretty sure you have problem of selective hearing that youre not sure what you say lol..... don't be a fanboy we all know here what he said ... he has been crying cancelled so much that nobody believes him anymore he just said in prvious video that its not even fast enough to beat 4060 but now he says that its almost close to 7500xt or 7600xt....
So nVidia isn't charging an arm ana leg for 5080 and 5090 because they have no competition, but only because they're for professionals? That makes everything so clear.
Are red flag was probably when AMD said they won't compete in the high end this time around for gpus the 8800 XT looks to be marketing around a 4080 a little weaker but stronger than a 4070 TI and it's going to be marketing around the 500.00 mark midrange. And that's it for 2025 power Wise from AMD supposedly unless they pull a pop-up launch that was hidden for the high end.
@@Ziyoblader with these pricings, I sadly expect AMD to up the top 8800 XT to $599 with 5070ti being 10% faster and $799. I don't expect anything shaking up in the market, they're fitting together like a puzzle as always in the duopoly. AMD's complicit
Made for Ai and Crypto bros only!
They can’t even make them fast enough! The Crypto and AI bros have deep pockets. The ones mostly buying the 5090 or 4090 are the rich crypto, AI or people with too much money. The 5090 or 4090 still has tons of demand and was never meant to be affordable.
The Ai and Crypto bros are buying 10 to 20 at a time so there’s still a huge market trying to buy them. Some can’t afford a A100 at $30k and they are hard to resell so they will raid the 4090’s and 5090’s
Nvidia = Blackwell even more expensive than Lovelace - no surprise there
AMD = RDNA 4 launching in Q1'25, but FSR4 won't be ready for launch - no surprise there
Intel = Battlemage is an absolute sh*t fest - absolutely no surprise there
😫
amd ages into fine wine, fsr 4 brings added performace to older gtx and RTX cards allowing everyone to delay an upgrade for another generation
And we don't have the prices yet for rdna4, fairly sure they'll miss the mark on pricing just like they did in the previous releases.
@@liberteus Just follow standard procedure, which is to wait a month or 2 and get it at a 'discount', otherwise also known as the 'price it should have launched at to begin with'.
@@AmiableChief it takes way longer than that now for AMD to get back to the price it should have been. And I'll buy a 5090 day 0 with a complete new PC, once every 4 years, at company's expense anyways.
Keep consuming sucker @@liberteus
The 5080 is really a 5070 sold at a x90 price 😡 its waaay slower than the x90 with 1/2 the shaders to try to get you put down $3000 for the 5090
Um skipping all the 50 series
@@thetechrealistnot all people can afford to buy a used card. I'm not talking about the price, some people sell broken used cards in the market and if you do not have a system to test a GPU right away you can't get a refund. I got to know this the hard way...
I"d go so far as to say its actually a 5070 now. it was pushed to an 80 class spec in 4000 series and this is ANOTHER drop in class performance. The 5090 is not the full die, so its even dropped a class.
5080 won't be way slower than 4090, but it will probably be slower, but cheaper.
@@thetechrealist that's the answer huh? nice strategy
I mean I am a professional but I would like a card that doesn’t cost a kidney and an arm…. Also when you go north of 1000$ I and many other “professionals” expect the best. If the 80 class and 90 class have a massive gap in performance it’s going to feel like I’m getting ripped off… this just brings me back to NVIDIA needing to properly market cards for the professional market with their Quadro line.. at least there the cards were not masquerading as a good gaming gpu..
You could probably get at least 4 5090’s for one of your kidneys
@@jessiethedudelol only 4 😂
I understand what they're going for about the 5090 and it's price but the 5080 is giving me shit vibes, given I have the 4080 and I feel ripped off but the new 5080 is a no buy what's so ever because of the VRAM, even if it's better than the 4090 the VRAM still a huge deal breaker, if they just made it the same as the 4090's vram with a slightly lower power draw it wouldn't suck that bad and made it's price at $1k max I would have considered buying it.
@@apricreed9580 💯💯💯
Good luck. Half the professional software i use won't give you tech support unless you're on a pro card. Love paying 8k for a graphics card.
All these products are a hard skip. No VRAM increase, almost no Raster performance increase. The only selling point for these dogshit cards is better Ray Tracing which no one cares about. Skip the regular 50 launch and wait for the 50 SUPER launch at the end of 2025. Thats when the new VRAM modules will roll out and you will see 50% more VRAM capacity in every SKU. That combined with full uncut dies (10-15% more shaders) and faster GDDR7 (15-20% more bandwidth) that will be the series off cards where the price to performance is what you want.
Heard something similar before about the fact there will be 2x capacity gddr7 modules but it sounds like a pipe dream to imagine them releasing their super cards with double the vram, where you got that info from ?
"really for professionals"
Does that mean latest AAA games are made only for... professionals?
professional game & hardware reviewers. I guess being a tax writeoff makes a $2000 GPU easier to afford
To be fair.. the 5070 TI 16gb is plenty for all AAA games. It will release at $799 though lmao
@@christophermullins7163 The 5070TI is gonna be $900 and will be slower than the 4090, so it will be hot trash basically
Made for Ai and Crypto bros only!
They can’t even make them fast enough! The Crypto and AI bros have deep pockets. The ones mostly buying the 5090 or 4090 are the rich crypto, AI or people with too much money. The 5090 or 4090 still has tons of demand and was never meant to be affordable.
The Ai and Crypto bros are buying 10 to 20 at a time so there’s still a huge market trying to buy them. Some can’t afford a A100 at $30k and they are hard to resell so they will raid the 4090’s and 5090’s
if you look at latest amount of failures in sales and big hitters shutting down, then it seems so yes. Only the (pro´s) @ gaming journalists that gets free copy´s and the gamedevs them self likes the games they make it seems. Maybe they should cater to the((normal)) people, and start ignoring the Pro´s. Both for nvidia and Gamedevs.
we are 3 generations into nvidias quad core era
I guess AMD appreciates if Nvidia uses their superior marketing to convince people that anything above the 5070ti isn't for gamers. One less reason for them to worry about RDNA4 not competing in high end being an issue.
So AMD didn't abandon the high end after all!
@@nipa5961 indeed! Who knows, maybe the 8800XT even beats Nvidia's highest end gaming product this time?
The problem is that the "RTX 5080" (in name only) is actually an RTX 5070 if you look at the leaked specifications. So the "RTX 5070" (in name only) won't actually be a true RTX 5070-that will be the graphics card labeled as "RTX 5080." Typically, a 70-class card has half the shaders and VRAM of the 90-class, and the "RTX 5080" matches this perfectly based on the leaked specs. Nvidia calling the "RTX 5080" a professional-grade graphics card is just a way to make you compare the "RTX 5070" (which will effectively be a 60-class card) with the "RTX 5080" (an RTX 5070 disguised as an RTX 5080). Always check the technical specifications, not just the names, to avoid being misled.
Nice quick outlook of what to expect in the next few months. Let's hope that the FSR 4 rollout is better than 3's. Also, hopefully it works or has a fallback for RDNA 3 or even 2 cards.
I doubt FSR4 will work on RDNA2. If it will, probably it won't be worth it, just like with Intel Xess (in fallback mode).
Is 16GB VRAM for professionals?
Not enough for me. I need at least 24GB to render large scenes in Blender.
I want to scoff in anyone's face that says a 5080 is for professionals.
@@switchthechannel6317Nvidia will release the 24G version later in 2025 with a $200 premium over the 5080.
@@JustADudeGamer if it didnt have gimped ass ram and bus, i'd believe them but as it sits its a rip
@@hotdogsarepropagandaThat's what makes it more laughable but based on rumors the die size is still a 70 class card disguised as an 80 class card which never belonged in the professional tier to begin with. I can't put up with any reviewers that give into such blatant anti-consumer manipulation.
Hopefully all this is just rumor mill nonsense but I'm getting mad at just the possibility they are doing this.
LOL Nvidia has been raising prices since the GTX 980Ti.
the gtx 680
Blows my mind how Nvidia went from being a fairly decent (no worse than average) company, to rivaling Apple with their anti consumer policies in such a short time. Sincerely hope they go under, would rather have intel vs amd to choose from for video cards than a company like nvidia fucking up the market.
Nvidia is going nowhere. They are the best of the best for a reason and that’s why they charge what they charge because no one effectively competes with them.
@@iLegionaire3755 hahah the A.I bubble is starting to burst already, & their earnings expectations have dropped.
@@kevinerbs2778 Except Generative A.I is not a bubble, nor a fad. Generative A.I has virtually limitless applications across every line of business, and those who think otherwise are going to eventually be left behind.
@@kevinerbs2778I hope that after the AI bubble will have bursted completely they will crawl back to us gamers
You've had your eyes shut since 2015 if you thought they were ever pro-consumer.
gtx 970 had a damn class action lawsuit for lying about VRAM then...
Nice cut, Tom. Have a happy Thanksgiving.
Thanks, you too!
I so deeply hope for a 4080+ performance from anyone other then Nvidia. I want to support competition from the bottom of my heart
The cheapest XTXes on pcpartspicker right now is $820. Granted, it's a stupid time to buy, but it was always there, and I'm having a blast with mine.
It is likely to late. Intel is all but defeated and AMD is winding down and giving up as well. We are Doomed.
@jonshaffer5793 AMD is not giving up, they are playing the long game. Look at steam usage charts for rtx 4090,it's far less than the mid to entry GPUs. AMD is trying to gain market share for Radeon while maintaining and gaining marketshare with Ryzen.
7900xtx. I got mine for 765$ from Amazon yesterday.
Leaks suggest the 8800 xt will be comparable to the 4080, at around half the price last I heard. And the 7900 xtx is already ahead of the 4080s in fps outside of heavy RT, at -25% the price.
The only reason you'd NEED a 4080S is if you're playing in 4k with RT on ultra with DLSS. In 1440p, or 4k W/O RT, you have options.
Once AMD taught Intel a lesson, it's time for Nvidia
Intel managed to fuck themselves over good and proper right as AMD was gathering all their remaining strength for the "do or die" battle. Nvidia is, unfortunately, an entirely different beast. I'd LOVE to see Radeon get to 50% market share... but remember that Ryzen has just surpassed 40% despite being overall better for several generations in a row.
@ agree. But ryzen was not able to win in games until they introduced x3d. Im just hoping they have something this cool for gpus under their sleeves instead of unsuccessfully copying nvidia
@@pashabiceps95 "Im just hoping they have something this cool for gpus under their sleeves instead of unsuccessfully copying nvidia"
They don't, FYI they can't even successfully copy Nvidia. According to leaks the next gen of AMD GPUs will mainly see a RT performance improvement which will only be comparable to the 4000 series.....
AMD has FSR4 coming but that is apparently being delayed, meanwhile Nvidia also has DLSS 4 in the works, AMD doesn't even have anything that will compete with the 5080.... The only way for AMD to even take some marketshare is to price their GPUs to the point they likely don't even make money from them. Both GPUs after this gen are also meant to see an architectural change, so you are hoping for a lot. Nvidia isn't like Intel, Nvidia is the one that has been pushing forward with change and AMD is the one that has been playing catchup.
@@aflyingcowboy31all and need is to release 4090 performance for 700 or 800
@@roshawn1111Maybe in two or three gens that might happen. Definitely ain't happening with AMD this gen though.
Stop buying Nvidia cards people. This madness has to stop.
Who is their competition though? AMD who left the high-end market, or Intel who is struggling to make a mid-tier card and has 0% of the market.
@@jonathanryan9946I know right amd's 8800 XT is going to be around a 4080 but a little weaker but stronger than a 4070 TI meaning they're not trying to surpass their 7900 XTX and they're 8800 XT will be priced around 500 mid-range so this leads up to what the leaks were saying how they're not going to participate in the high-end but targeting the mid-range
@@jonathanryan9946 We don't even know what the next gen AMD stuff is yet. I can understand if you absolutely need to have a 5090. Everyone else should consider AMD or Intel. We are all fucked if the Nvidia monopoly continues unabated.
@@jonshaffer5793 "We don't even know what the next gen AMD stuff is yet"
Except we more or less do know? AMD themselves have stated their will be no high end cards this gen i.e. they aren't competing with the 5080 let alone the 5090.
The leaks from this very channel have practically said expect last gen 7900xtx / 4080 performance from these next AMD gpus, fyi the 4080 levels of performance is in regards to raytracing performance.... AMD will have to price their GPU extremely low to, if they do that they can likely do well.
@@aflyingcowboy31 So if you you absolutely have to spend 2k on a card for a mediocre performance gain over an 800 dollar card I say go ahead and throw your money away. The rest of us with a brain will stick with the "mid range cards". In that space I say buy anything but Nvidia because they are getting very anti consumer.
I'm all for cards above 5070ti being only for professionals. But also tell that to game developers so they don't optimize their games for 90 cards.
September B580 launch would had been fire at right price. Hopefully higher sku is alive and well.
No, we used to get flagship performance for much cheaper and cut down 80 class cards that are really 70 class cards with low VRAM don't belong anywhere near "professional." They are manipulating the consumers that made this company in order to cater to AI and keep their fat margins.
@@JustADudeGamer we used to get milk for a lot cheaper too
@@razpaqhvh7501 Milk has to be somewhat affordable for regular consumers. NVIDIA video cards are pushing regular gamers out of the market and it doesn't even matter because they make like 90% of their money with professional stuff and the margins they get are way higher. Furthermore nobody can compete with them at the high end. There is no incentive for them to care.
That's what I'm talking about! "RTX 5070 ti 16Gb will be the best for gamers, they won't need anything more powerful", that could be true, but tell game companies to optimise their games and do not rely on shenanigans like upscaling and DLSS
I'd like to see a class action lawsuit again Jensen Huang or Nvidia. I would liike to see them answer for makig GPU with a short lifespan due to not enough vRAM. E-waste in half the time GPUs used to last for.
I guess they are waiting to create a 24 GB version for the 5080 ti?
As much as I hate the 5090 pricing, I remember the $3,000 Titan days and I feel like Nvidia does too.
But this ignores the 80 series
@@chrisblack6258 don’t buy the 80 series, it’s going to be a export safe model that they can ship to China, AKA a 4090 light and they will only sell it to gamers willing to compete with that demand and cost.
You'll need to live next to a power plant, with a direct line to your home - but it runs like a beast!
Intel is the king of half-assery
I'm good staying with my 4080 for another 4 years. Even if the new consoles release in 2027, we won't see the fruits of their power for another two years after.
I dunno... If Intel can put out a card with 4070 levels of performance + 16GB oh VRAM & sell it for $400, who cares if it's not able to directly compete with the 5070?
We all know Nvidia is going to raise prices AGAIN, and I'm not sure people are going to be super excited to spend $600 - $700 on a 12GB 5070.
The 7800XT is a 4070 Super with 16GB already. Intel couldn't ask more than $300 for such a card.
Intel can barely compete with 5 yr old midrage GPU.
The problem is it'll be launching against RDNA4, which looks set to offer that kind of performance around the lower to middle end of the stack, and everything so far indicates it's going to be aggressively priced. So then you have to ask yourself: do I choose a $400 Sapphire Nitro card, or do I choose the $400 card from an unknown brand, using the GPU from the company that made a totally broken lineup last generation.
It's going to take great performance and absurd pricing to win players over on Intel this generation, because AMD is already the brand budget builders trust.
The point is Intel really can not do it! Intel use expensive highend node… They should sell it at higher price than even Nvidia!
Intel can ofcourse lose money… again… But that means that Battlr mage may be last Intel disgrete GPU for a while. They need profit, and alcemy, did not make that.
Intel most likely would have tp price it close $600 to $700 to make sense… And that will not sell too well.
And once again the 80 class will be the bastard child. The 4080S will likely stay, even used at $1000
My initial plan was to buy an RTX 5000 card and Ryzen 9800X3D. However, I think it's better to skip another 3-5 years. I'll just play older games while I find a second job and build wealth.
My only concern is probably whether or not my 8-year old GTX 1070 can survive another 5 years.
Why do you only consider Nvidia? RX 8800XT or whatever it will be suits you perfectly for not wealthy update, and you can just jump onto AM5 with like 7600 or 7500f and already have really good perfoming PC.
Maybe try look for a secondhand option after new GPU launches?
If you need more FPS for your Games, go ahead and buy a new gen GPU/CPU
Just buy something used. Hell a brand new 7800xt sold for around 460 Euros during Black Friday. You dont need to spend 1000$ dollars on a GPU to have good perfomance
Good on you for showing restraint. That said, there are plenty of good options which would provide a massive upgrade over a 1070, for very reasonable prices and available right now.
Working hard is great, but don't force yourself to suffer with an ancient card for 5 years 😅
How come we don't have newer rdna 4 leaks considering it will be announced in about a month?
Either it's pretty good or pretty bad
Let me guess, Nvidia will release the 5070 after trump brings in tariffs but after secretly stockpiling them in the states. Then marking up the price...
If the 5090 and 5080 are for professionals and the 5070 is for gamers, then that means it can deliver 4090 pathtracing performance, right?
Nope
@ oh sorry, I forgot that pathtracing is only for “professionals”.
Professional gaming… newer heard?
😂😂😂
Just go to the shop and say secret password; ”The more you buy, the more you save” and they will sell you professional GPU!
😅
By that logic, since the H100 is for professionals, then the 5070 should have as many tflops as an A100.
If a 5080 is a professional card it cannot have 16gb vram. Professionals need more, period. Also the 80 clas should always be 15-30% faster than last gen flagship. If the 5080 is not that with at-least 20gbvram I will pass. I will only buy when I am getting good value.
Then you'll never own Nvidia again lol
Professionals use 4060 level GPUs… so 8gb is enough!
😂
Naah. 5090 will be 2500usd. People will pay it. NVidia should start getting worried about antitrust though.
maybe in the EU but the Orange Gorilla will just put tariffs on it and expect the price to go down. if he puts a 60% tariff on nvidia and amd (likely if he labels taiwan as part of china) then intel will be all any US citizen buys because 60-% price increase would make it not economically feasible to buy imported chips
Antitrust? lol. Even if enforcing the law was something that is going to occur for the next Four More Years, which it isn't, Trump isn't going to let his DOJ puppet wreck the value of a $3t market cap corporation and tank his beautiful stock market numbers.
At what point does the cost of antitrust lawsuits just become the cost of doing business?
With who enforcing it? The oncoming administration is very business friendly. 0 threat of Antitrust enforcement there.
@@amack308 0 threat? I don't think so. It'll happen if it serves their purposes.
Oooh, does that mean that they will shove 12GB down the customer's throat once again?
Yes the 5080 will really be a 5070 with limited ram and 1/2 the cuda cores of the 5090 for over $1000
Basically, Blackwell and Battlemage are a skip. Also, the RX 7600 XT undervolts extremely well. Tech Yes City basically got it perform the same as stock while getting the power consumption very close to an RTX 4060. The RTX 4060 only pulled 30 watts less at that point. If Intel consume more power than the 4060, then they lost to AMD with a node advantage. RDNA 4 doesn't seem to have any negativity coming from anywhere. In fact, alot of people seem to be optimistic about RDNA 4 with the new RT cores and FSR 4 using AI to make upscaling and frame-gen better. Personally, I am not worried about FSR 4 getting delayed becuase the minute it goes open-source, modders will get to work and we will be able to put it in every game that DLSS is currently in. Just like Anti-lag 2 can now be enabled in every game with Reflex.
I'm not a professional so Nvidia shouldn't mind that I'm not gonna buy their cards
Why would they care?
@@aerithgrowsflowers monies, or there lack off.
@@GreyDeathVaccine Their desktop + laptop GPUs now make 10% of their entire revenue.
Really for professionals = really for anyone with a huge wallet
A VRAM-cripple ends-up beeing a FPS-cripple sooner or later. Remember this you 6, 8, 10 and 12GB NVidia customers.
RTX 2060 12GB -> RTX 3060 12GB -> RTX 4060 8GB -> RTX 5060 8GB?
this is what happens when one corporation has a monopoly on the market
At a point, when a behemoth company no longer cares if a particular segment purchase their products or not, basically grants them the license to do what ever they want.
Nvidia’s strategy became known with the 40 series. Make the top product extremely powerful and jack the price up.
They "could" have used 3GB chips, they "could" have just made each tiers memeory bus just 32 bits wider, with the extra bandwidth they "could" have used 4GB chips and cut the bus down 'up to' 50% and still been ahead. They didnt, because they know you will buy this crap regardless, they arent wrong are they? They love to screw over those who love to be screwed over
At current trend when 10060 series release nvidia will only give us 2GB ram 😂😂
No, this is what happens when people buy into the hype of RT. Radeon cards are competitive with much higher priced nVidia cards in raster and the software since RDNA2 has been solid.
The best antidote to nVidia prices is people start buying AMD. The current nVidia situation is 100% the fault of the nVidia fans themselves.
@@animaze8043 they will keep saying they need DLSS but won't ever acknowledge it's precisely the reason we can't get good games and cheap VGAs
@@Osaka2407 And FSR not being quite as good is not a good enough reason to support nVidia - especially since DLSS is gimped to work only on newer hardware, whereas FSR is generic and supports GTX cards, too.
@@animaze8043 upscalers are dog shyt.
Just add 32 or 48 GB VRAM to B770. It doesn't need GDDR7. Plenty people will buy it for running local LLM.
I was just thinking today. Why not just erase the 4090 tag slap 5080 tag on it and then call it a day.
5090 is gonna be 1999.99 but sold for 2600 if u can find one
Bargain
the cost of these GPUs has become just ridiculous :(
Most buyers of 4090 and 5090 are AI and Crypto bros. They buy 10 - 20 at a time for GPU farms. You really think the gamers are fueling this madness?
@@Observer168 they aren't helping but i see your point
More overpriced, under-specced junk from nvidia.
Get ready for more 10 and 12 GB VRAM. 16 GB for 5070 or 5070 ti is just a dream. Nvidia will be dead expensive.
The 5070 ti is looking to be 16GB this time so it might the card to get for 80 series users.
@@frowningboat8039 I beg to differ.
80 series should get the 24GB stack from previous gen if asking MSRP will be above $1K.
@@MetallicBlade if you mean 24GB for the 5080, I completely agree although they would have to wait until Q2 for the 3gb modules to do that. I was just saying for anyone with a 1080 Ti, 2080 or 3080, the 5070 ti might be a good upgrade if it's $799.
Apparently only "professionals" need enough fucking vram for playing kiddie video games?
The truly frustrating part is as Nvidia gets greedier and greedier, and games are getting more and more demanding, AMD bailed on the high end and Intel is half-assing it like it's someone's weekend project. At this point I would spend a grand on a true Arc competitor just to throw the middle finger up at Ngreedia and Another Massive Disappointment. But apparently Intel only cares to compete in the budget class, while not actually _competing_ in the budget class.
It's enough to make you say screw gaming and find another hobby.
womp womp
When will all the Intel Fanboys realize their horse is FOUR YEARS BEHIND NVIDIA. It's been that way ever since the A770 was launched to challenge the 3070 TWO YEARS LATE, and it achieve 3060 performance with HIGHER TRANSISTORS AND DIE AREA THAN 3070. Intel can't design shit in VLSI any more ...
It would seem Nvidia will milk the shit out of 50 Series. Hopefully the Intel/AMD counterpart will both be priced at will perform at reasonable level so we can have options
This is what happens ladies and gentlemen when one company has a monopoly on the high end market. They can price however they damn well please because they know people will buy it.
The monopoly is not the problem. The idiots who buy this, are the problem.
I wonder if Jensen will bring the 5090 out of a furnace, since his oven might not have room to hold it this time, lol.
Will have to pick up a weekend job if you want a 5090
Hopefully, selling the 2 year old 4090 lighen that blow.
Time to switch to AMD
i would love to get a 5080, but if it's 16-18gig, then it's just a pure scam.
I really want Intel GPU's to be a thing. Desperately need more competition in this space.
Hmm, if B770 is roughly 4070 performance for $400, and top RDNA 4 is roughly 4080 performance for >$500, it might possibly compete well with an RDNA 4 "8070" if it's 16 GB vs. 12 GB. But then there's drivers...
The B770 needs to be no more than $350 if they wanted it to compete with the 5060/ti or 8700xt
Intel GPU launches
A is for Asaninine
B is for Bad
C is for Crappy
D is for Doomed
5090 $3300 after tax in Canada
canadian dollars are cheaper than US dollars.
9:10 - you're not guessing, you forgot to delete the part where it says your source literally confirms the efficiency
As GPU prices continue to rise, from NVIDIA, where the top end is for “prefessional” use - how much of a deleterious of effect is this going to have on gaming development at large? Seems to me that there will be an untenable inflection point where hardware needed to push the industry forward will become so out of reach of the normal consumer that game development will stagnate.
You must be trolling the main thing happening with gaming is the "1 billions polygons screw" otherwise it's still the same bland healthbar and character with ridiculous hitbox fetish. Game with needlessly high polygon count on characters model and 4k texture everywhere , like are they going to make smut with them or something?The worses are the hyperfast third person game \ real time tactics where the screen is covered in particle vomit and you can't see any finer details anyway.
1400 Bucks for 16GB VRAM ☠
GDDR7 is 6x times more expensive than GDDR6 per 1GB.
@@rattlehead999 So margins for nvidia are so bad that they have to let you eat the cost 6x?
In my opinion this downgrades the 5080 to 1440p graphics card. Because it will age poorly for 4k. The value is simply not there to pay this amount of money.
@@ElFlipo1337 No, but people bought the gtx 600 series like crazy, they bought the 1000 series like crazy which were just as big of a scam as the rtx 4000 series, people just didn't realize it because of their raw performance.
We are here, because for almost 13 years now people bought rip-off GPUs and they bought Nvidia over AMD when AMD had a superior architecture and software from 2007-2015.
Ahhhhhhh bummed about B770's delay 😫
Looks like I'll bet getting an AMD GPU in early 2025 then 🤷♂
What would he hilarious is if sales are terrible on the 5080 and 5090 cuz everybody thinks they're for professionals lol!!
They can’t even make them fast enough. The Ai and Crypto bros are buying 10 to 20 at a time. Ever heard of GPU farms?
Don´t worry! These will sell like hotcakes even if they would cost $3000 and $1500!
Nvidia does not have to worry about income or not even market share!
They did go from 86% to 88% with 4000 series while everyone did whine that it was too expensive! They most likely go up again to 90%!
Nvidia has to increase prices more, so that they would not become monopoly!
😂
Really disappointed with Intel news. Was hoping for a good 3rd competitor but Intel is tanking in all aspects right now.
Intel has officially become the new Boeing Titanic. Quite literally. I hope they can make a comeback, but it is not looking good for Intel.
Fingers crossed AMD solves their issues with true MCM design. We've surpassed the practical ceiling of monolithic.
I mean the confusion from saying it is effectively canceled, I mean what does that even mean?
Just say that the top end and laptop are canceled
Hopefully, INTEL's battlemage is near NVIDIA 4070 in performance with 16 GB VRAM and priced under $400.
Yea, I know. That's a lot of hopium. But if they can manage that the card will sell like crazy unless it is a power hog.
that's a lot of hopium, but fortunately Another Massive Disappointment will not miss another opportunity to miss an opportunity
Considering that B770 is a big die on TSMC 4nm reaching 4070 performance, $400 means Intel’s selling at a loss.
@@TheDoomerBlox how would it sell if pretty much everyone who buys these GPUs already know leaks/rumors about cancellation of ARC? I mean, this is probably a dead end for ARC, who knows if Intel will continue support it with drivers, and if you really think that people that buy Intel GPUs are not aware of these things - regular consumers just gonna pay for green card for fancy stuff like DLSS 4 and not even raw performance (which Alchemist had, at least in terms of RT, they blow out of water AMD's RT)
Edit: fixed typos
Sorry, didn't mean to reply to @TheDoomerBlox, I meant to reply to parent comment
Power hoggery is the lowest on the tier of concerns, if it's a 16Gb
The 50 series isn’t bumping the vram except for 5090 smh 😒
5080 for professionals... vram amount 16gb 😂 rebraned 4080 super with slighter better power draw. I'm calling it now.
Professionals are buying used 3090s, anything less than 24gb vram is trash, for profesionals and for 4k gaming.
Professional and 16GB don't go well together.
someone from Nvidia already said that we should expect a shortage you gotta love a self made shortage of supply so prices and profit margins stays high all while I am sure their fanboys would sell a kidney or two just to line up on launch day.
5080 with half of CUDA Cores a Professional GPU? Lol 😂 Even the 5090 will not get the full 192SM and 32Gbps of GDDR7 ! Let's hope they won't hugely cut the L2 Cache either...
Big cache is needed to compensate for lack of bandwidth, if there is enough bandwidth then there is no need for a lot of cache.
@@rattlehead999 Definitely yeah! And the 4090 is definitely bandwidth starved! The L2 Cache cut (72MB vs the original 96MB of AD102) definitely hurts the 4090's performance... and it hurts even more when you know that the L2 Cache transfer speed is about 5TB/s whereas GDDR6X is only about 1TB/s! Also the 4080 & SUPER have 64MB of L2 Cache when they have a lot less CUDA Cores! Nvidia really played it cheap with all its lineup...
@@jongamez5163 Yup the 4090 is only 30-35% faster than the 4080 while having 60% more SMs and it's precisely because of the lack of bandwidth, it should have been 512 bit.
@@rattlehead999 That's what Nvidia seem to be doing with the 5090... 512-bit @ 28Gbps GDDR7 aka 1,792GB/s. and we could probably get 2TB/s with an Overclock! Let's hope they will give the 5090 the full 128MB of L2 Cache! If it's $2000 I'm selling my 4090 to buy a 5090 😅
@@jongamez5163 I think that with 1792GB/s bandwidth it doesn't need as much cache per SM as the 4000 series did.
I'm sticking to 1080p as long as games allow for it, so I have no use for a 5090 and 2000$ for a GPU is just dumb at this point. Especially when it's going to be a 600W brick.
the thing I am really hyped for is the cooler design of the 5000 series. Can't wait to see it.
Also AMD could use a little innovation with their release version coolers.
I'm mostly hoping that FSR will finally make some gains on it's image quality. Would love to keep my 7900 XTX but at 1440p it's upscaling is just not good enough.
I love how you make stuff. keep up the good work.
NVIDIA:5090 is for profesional
Me: ok i will get AMD then.
let's be real if you wanted a 5090 you wouldn't choose AMD instead. On the other hand that is my exact logic for getting an 8800xt instead of a 5080.
They don’t care, they can’t even make them fast enough
@@chisel4164 The 8800xt will be on par with the 5070 or 5070Ti.
@rattlehead999 Yeah, I know I figure if the 5080 is considered professional now, it'll probably cost almost as much as the 4090, which means the 5070 Ti will cost as much as the 4080S so the 5070 will cost around $700-800.
If I'm getting 5070 level performance for the price of a 4070 Ti S, I might as well go AMD for better value.
@@rattlehead999no it won't. FSR is unusable and required for UE5. 8800xt will be weaker than the 7900xtx and even the 7900xtx is unusable for UE5
Ah well, at least 95% of video games that release every year now suck and my motivation to keep playing em lowers. Between that and the price of hardware, it looks like this will finally be the wall I hit where the only reason I upgrade is to keep playing the games I do enjoy with mods which won't need higher tiers of performance to run, I'll just upgrade after the hardware has been out a while and is heavily discounted and it'll only become cheaper as the gaming and hardware market become more stagnant. I hate to be so black pilled, but what a truly terrible time for gaming and PC to boot.
In the same boat sir, tuff times we live in
I wouldn't mind the higher classes being pushed towards professionals, WHEN there is enough performance on the lower classes for 4k 120FPS gaming with RT.
But even the 4090 is not there yet...
4k is for professionals!
Real games buy 1080p mpnitor and 5060 at reasonable $500 price point!
😂😂😂
.. or just use upscale from 720p to 4K if you really want that…
😂😂😂
@@haukikannel I use upscaling from 1080p to 4k and still don't have enough FPS with a 4090.
A 5090 will also be not enough to hit 120 MIN-FPS with DLSS performance in UE5 titles...
I don't use framegen.
@@gucky4717 just games play games at 1440p or 1080p then
@@utopic1312 You don't get it. DLSS performance in 4k is already 1080p.
Even though we know better, you can't help but think, "Well maybe, maybe the GPU market will stop blowing chunks right?"
I mean good lord, it's been trash since Turing.
“ReAlLy FoR pRoFfEsIoNaLs” is the biggest facepalm in hardware history. I am not buying NVidia
For the love of all that is good, NVIDIA you have a professional line called the Quadro that was thousands of dollars more expensive than the gaming cards. Please keep them separate.
Dude the ones buying up all the 4090’s and 5090’s are the Ai and Crypto bros. They can’t control who buys them.
Made for Ai and Crypto bros only!
They can’t even make them fast enough! The Crypto and AI bros have deep pockets. The ones mostly buying the 5090 or 4090 are the rich crypto, AI or people with too much money. The 5090 or 4090 still has tons of demand and was never meant to be affordable.
The Ai and Crypto bros are buying 10 to 20 at a time so there’s still a huge market trying to buy them. Some can’t afford a A100 at $30k and they are hard to resell so they will raid the 4090’s and 5090’s
stalker 2 is a psyop to make us buy amd x3d and 5090 I swear 🤣
Not if you don't feel like playing the game. XD
WHERE is Intels OpenVino Hardware? There IS a niche - more RAM for AI models and media encoding.The rest doesn't matter as a model which doesn't fit into RAM doesn't compute at all...
People need to not buy at these prices. Its simple really. People don't need these new cards, its the clueless muppets that insist on having new tech that are to blame for the rip off pricing.
Battlemage uncompetitive ??? - But you had already informed us that the project had been canceled... wth ? Battlemage exists ?
RiP Bintel, we all wanted Arc to succeed
it did succeed lol. but according to MLID it failed. this guy is always negative about intel. ARC offers great performance now at tht price
@@talha_siddiquy But still some games are not playable, e.g. Starfield
@@talha_siddiquy I actually never understood all this hatred towards ARC from him lol
@@GreyDeathVaccine imagine playing starfield
@@Ginp- hes butthurt cuz his supposed intel “leaker” fed him BS and he thinks intel never follows thru with their plan. Most of his intel leaks are absolute laughable. Like the Arc A980 or something. The alderlake leak. Sapphire rapids. You dont even wanna open the granite rapids can of worm he fed his viewers 😂
Bummer about BattleMage, was gonna build an all Intel system using Arrow Lake and a B770, I'm such a Intel shill, lol
Until AMD sort their stuff out Nvidia is gonna keep gouging the prices
Nvidia is now the biggest company in the world, only Nvidia can kill Nvidia, there is less than 10% chance that AMD can catch up to them completely now and less than 1% chance they can surpass them.
Nvidia has become too big to be beat by others, only Nvidia can kill Nvidia now.
@@rattlehead999 It's because they have a monopoly though. If AMD could at least match them on ray tracing and the price their cards competitively nvidia would have to drop prices. There's a reason why nvidia's cards used to be cheaper and it was because the competition was closer to them.
@@watchmejumpstart24 AMD was superior to Nvidia from 2007 to 2015 and people still bought Nvidia.
Lisa Su is Jensen Huang's cousin (literally look it up). AMD will play the underdog for as long as they can so they can reap the benefits of NVIDIAs price gouging.