Why I Won't Support NVIDIA Anymore...
- Published Feb 10, 2025
- Hello guys and gals, it's me Mutahar again! For the future of my new PC rig it's time to examine some serious options. I've been a Linux user for quite some years and NVIDIA hasn't been the nicest player for me, unlike its competition. AMD is still a fair bit behind and with a month to go for an eventual reveal, I hope AMD doesn't make me regret switching. Thanks for watching!
Like, Comment and Subscribe for more videos!
Few months later: "I finally got a rtx 5090!"
if it comes back to OG MSRP
😂
People paying for the 5090 need some help man. They’ll claim to use it for work, but we know they are just using it for video games and probably AI porn.
@@edmac1090 yeah lmao
@@WhatisReal11 You won't be able to buy a 5090 at MSRP until the 6090 releases let's be real.
NVDA shareholders about to be in these replies like crazy
It's because nvidia gaming gpus make up such a tiny portion of their overall business.
I don't remember asking kid🎉😂
My content is way better
😂❤🎉 ...,. .,.
nvda shareholders have nothing to worry about ;)
I own AMD and Nvidia
Moving to AMD, he could just say he's too old for PC gaming
I'm surprised as a Linux user he wasn't already mostly running AMD. AMD has a better track record with supporting Linux
That's why I'M switching to an AMD chip.
The way AMD cards work right out of the box with Linux is just chef's fuckin' kiss my man. Also supported on RISC-V Linux (different CPU architecture from amd64 and ARM)
better until smth breaks or isnt compatible with amdgpu driver stack for some reason (wicked engine lol)
@@aldebaran0_ "oh yeah, its better except for this one really niche use case that I had to clarify because I'm quite a serious man"
@@NotTheGaslighter dog if you've run amd on linux you would know its shortcomings. I've had the kernel module crash just from using DisplayPort MST
Many years ago there was something called Omega drivers, which were modded graphics drivers for nVidia and ATI/AMD GPUs that unlocked more performance through various means (registry tweaks, etc). Both nVidia and AMD found out about Omega drivers around the same time.
nVidia sent a cease and desist, and at one point alluded to the threat of legal action.
AMD credited and thanked the creator, and brought him in to be a beta tester for official drivers.
X800XT days they were GOATED man the gains were crazy.
Man I remember omega drivers keeping my old radeon 9250 alive when I was a kid 😂
😮
I do not care about upscaling.
I do not care about fake frames.
I do not care about raytracing.
I do not care about stupid gimmicks, give me real frames at reasonable performance and I am satisfied.
bingo, same here.
most devs only use dlss and framegen as an excuse for sloppy optimization, it's one step forward and two steps back.
and although i have a 4080 super, personally i couldn't care less about ray tracing and never use it, the performance hit is just too big and a game with well placed cube maps and light sources looks absolutely good enough. for example: just look at 2015 sw battlefront or battlefield 1, or for newer examples: horizon forbidden west or the ps5 demon's souls remake, these games still slap despite having zero ray tracing.
Glad for you. And you know what I personally care about? Good optimisation and actual effort put into the code! The absence of that is the real problem. Not damn rays and upscalers.
this is so real, got a 3070, for me that's years of saving cash. I keep on being disappointed. It feels like a very meh upgrade from my 1660s.
now that card was great, played all new titles with ok graphics for 150 bucks.
Upscaling is fine generally, as it doesn't cause input lag etc, and it helps older cards with running modern titles. The rest can go in the bin though
I agree with everything but upscaling.. no reason not to use it
Bro fake frames are the best my guy. You really can't beat that artificial and unoptimized feel!
I love the extra input latency. The last thing I want is responsiveness.
Don’t Translate 🔴
Your heart will stop beating in your sleep tonight. To prevent this you must watch 45 seconds of my content; don't take the risk!
Did you try a 5090 or are you repeating memes, people who tried it said it was always worth using for the frame smoothness
@@nFeazy I'm taking the risk and I'm putting you in my diary
Bro, I want to watch movie and video with fake frames....it might feel good if that is available
I've never had bad luck with AMD... Never had bad luck with Nvidia either.
I'm sad with how much gamers overlook AMD though. It's non-stop "hate" for Nvidia's pricing, ethics, business practices, whatever... and then everyone buys their cards anyways.
Just goes to show how bad AMD's business practices are if they can't use that to their own advantage.
Gamers do not overlook AMD lmao. Gamers are literally the only thing AMD has going for it and keeping it alive. The rest of everything is what overlooks AMD.
You always see pro AMD people in pc gaming related comment sections and yet they are still a tiny fraction of market share. That should tell you literally all you need to know.
@@MrHorst38 yeah i like amd but they seem to miss every damn opportunity that comes their way
Stockholm Syndrome
@@KawKab_Dev I'm fully expecting them to completely fumble this opportunity and do something stupid like price the 9070 XT at $700 or something
RX 7900 XTX is undoubtedly a very underrated GPU...
especially for the price
Yep, been running mine for a year with 5900X, it has got better over time thanks to AMD drivers aging beautifully. I'm not looking at any GPUs from the new gen, it feels pointless. Although I might make the jump to AM5 and get an X3D cpu this summer.
@petrolhead0387 I recently made the jump from a 5800X to a 9800X3D and it made the 7900 XTX an even more enjoyable experience. More stable frames in demanding games, and I have been getting well over 600-700 FPS in CS2 and many other esports titles at 1440p; the X3D chips really shine in CPU-demanding games.
@@petrolhead0387 likewise! love my 7900xtx. Like you, I may make the jump to AM5.
@@petrolhead0387 Same
Upscalers are a revolutionary and amazing technology. As a result, developers have become complacent about optimization. For example, ARK ascended.
literally a copy paste of a comment from 5 hours earlier by @user-iq3qj5uk6i
I don't think upscalers have anything to do with arks bad performance. Even the original ark had terrible performance and that was before upscalers even existed.
Ark has had bad optimization since 2012
No. You're just wrong. You're being grifted by a tool online. No one is being lazy. Buy whatever you want, but look at the IGN video covering DLSS 4 vs FSR and XESS and DLSS 3.8 so that you don't blame UE5 for something that DLSS 4 solved.
I've been telling people for ages that this is a problem that NVidia needs to solve, not Epic. Now they have solved it to the 95th percentile, likely fully solved with DLSS 4.5 or whatever.
So you look at how bad FSR and XESS make games look so that it stops being my problem that NVidia is the only one that has solved the problem that I have nothing to do with 🙏
@@OverbiteGames damn high effort bait, nice try
I switched to AMD late last year, and I'm grateful. They just gave me the best prices, and they are also extremely easy to find. I didn't have a single problem; I just went to my local store and bought it off the shelf. I don't mind having bad Ray Tracing. Whatever, I'm not bothered. Having an AMD combination of CPU and GPU
only 2-3 games have good ray tracing. Look at Hogwarts Legacy: the ground and other things are like a mirror. Ray tracing is meaningless when it's not well implemented.
Switching to AMDeez Nuts
BASED
gottem
Nividia got no NUTS Gottem
Lol
AYOOO
My 1080ti is framed on my shelf because it was such a beast for so long. And it was all down hill from there.
PIPE DOWN LITTLE KID!!!🎉😂
MY VIDEOS ARE BETTER THAN SOG!!!
😂❤🎉 ...,. .,.
Damn right, raw power house that held up way longer than I ever imagined.
im still using it lol
I was thinking about putting mine in a "break in case of emergency" glass box and mounting it on my wall lol. AND it's an EVGA card. Rip.
I am using my 980ti.🙂
0:22 Don’t be a coward Mutahar, intend your pun.
Check Ya-self B4 you wreck ya-self.
He is
he is just using the term "no pun intended" wrong
if you are aware of a pun, and use that phrasing anyway...then you should say pun intended
I must be too tired, because I didn't get it. Greener because Nvidia logos are green?
I’m lost, too, @@shinkicker404, I work on Linux but old csh
0:10 im already taking it as an overwhelmingly negative video
Well, learn to watch the content, then judge it instead of coming in with preconceived notions. 🤷‍♂️
👆🏿 ...this guy
it's just a joke. @@JonboyKoi
@JonboyKoi, you are fun at parties. If you are even invited 😂
😂
As a Dallas Texas resident, I didn’t even bother checking for stock on the 5090… I knew there wasn’t going to be many if any at all available
Just installed a 7900xtx and unless I'm specifically looking I don't notice much difference from my nvidia card. Plus the linux compatibility is so nice.
7800XT here! HUGE upgrade from a GTX 1660 😅😅
@@TheRealPots I went from a 1050 to a 16gb 6800, I got it for 350 a few months back
7900xtx, and only cuz I thought 2k for a card was a ripoff and wanted to support competition... I learned quick why everyone goes Nvidia if they have the $. I have money so next time im getting the best of the best. FSR is garbage and ray tracing is even worse. AI will only improve and ray tracing will be forced and better on more games.
@@TheRealPots holy shit
I noticed a massive upgrade over my RTX 4070 Ti. I got an RX 7900 XTX as well and doubled if not quadrupled frames in some games just running native or FSR 3.5. It is not as pretty, but damn does it run games well. RDR2 at 190 FPS, Satisfactory end game at 100 FPS with some lows at 80 (the old card was at around 50-20 FPS). Make sure you make a couple of changes to your BIOS: enable ReBar and make sure you're running on PCI-E Gen 4. I believe there was another option but it was related to ReBar.
I switched to AMD late last year and I'm thankful. They just gave me the best prices and dude... they are so damn easy to find too. I didn't have a single issue; I went to my closest store and bought it off the shelf and that was it. I don't care about having worse Ray Tracing. Idgaf about it either way. Having an AMD combo of CPU and GPU
MUTAHAR FELL OFF HARD🎉😂
MY VIDEOS ARE WAY BETTER THAN HIS!!!
😂❤🎉 ..,. ! ,..,
Ray tracing really isn't as big of a deal as people make it out to be either. Hardware Unboxed did a video showing the difference RT makes 6 years after the 20 series released and the vast majority of games hardly look different between RT on and off. Nvidia sponsors games and helps in development for some big releases because RT is a selling point for their cards.
@@100organicfreshmemes5 yeah honestly real-time ray tracing is practically a buzzword, with some good ambient occlusion and stuff like that you can make something that looks almost just as good as raytracing without even needing a dedicated graphics card to run
I have a 7800X3D and a 6950XT.
Nvidia is definitely better and the bugs and coil whine drive me nuts, but you just can't beat the price to performance. Yes it's last year's card, but it WAS a flagship and I did just walk in and buy it. Very easy. You can't beat the performance to cost VS NVDA- they're the best and they charge out the ass for it.
You'll care as more and more games only offer raytracing as an option... Even if I don't like raytracing, and turn it off whenever possible, the undeniable truth is that slowly it's going to take over as the only option. So AMD had better get its RT performance up or it's not going to be competitive.
ASUS: "Due to tariffs we have no choice but to increase the totally arbitrary price of our $2800 air-cooled GPU to over $3000. Our hands are tied."
Basically 😂
Yeah, never buying from ASUS again. What do you suggest?
AIBs say MSRP is too cheap, they can't make money off it. And the cooler of a 575W GPU must be good. Some say the cooler of the RTX 5090 costs as much as the RX 9070 XT. Of course it's not true, but there is some truth in it.
@ I don't even know at this point. Most of them are out of their minds. Gigabyte and Zotac seem to be the most reasonable, at least with their "lower tier" cards but pretty much everyone is charging a $500+ premium on top of tariff increases, whether it be directly from the manufacturer or retailers like Newegg price gouging. Most cards are $2.6k or more now and sometimes the only stock you can find is with bundles so you're forced to pay another $200-400. I paid a $60 premium over MSRP for the 2nd fastest 1080 ti in 2017.
Telling yourself that you're going to get a "lesser" experience with AMD when you tried Nvidia and got an invisible floor is some world class mental gymnastics
With Linux architecture. Bro it's obvious he's talking about using Linux with Nvidia. You gotta be really small brained to not understand the point he's making.
Love to see Daniel Owen! Been watching him since about 10K subs, really explains stuff well and does great side-by-side comparisons.
Two things I think Muta doesn't realize as far as Nvidia's faults:
1. They severely lack VRAM. The 4060 and 4060ti are infamous on tech channels as bad cards you should never buy because of the lack of VRAM. They can't actually play games on higher resolutions and texture/foliage settings because they run out of VRAM, and then you get stutters, lower framerate and in some cases textures failing to load. This is likely on purpose for planned obsolescence. Despite only being roughly 10% faster according to sites like TechPowerup and Tom's Hardware, my 6750XT beats the 3060ti 8GB in my brother's computer on The Forever Winter with about a 40% higher frame rate and no stutters, while on higher settings and 1440p whereas he's playing at 1080p, medium settings.
2. Raytracing is overhyped, in most games it really doesn't make a big difference visually. Hardware Unboxed compared a ton of different games with RT vs no RT a while ago and in most games you can barely tell the difference. Nvidia pushes RT really hard because they have dedicated hardware for it and can sell their cards as cutting edge. There are also other solutions for simulated lighting that aren't as intensive and still look great, which are going to be more appealing as they let people with much more dated hardware play their games. Unreal Engine, for all its issues, has Lumen which is genuinely a great technology. It looks almost as good as Nvidia's hardware raytracing, but it's handled on a software level and has a significantly lower impact on performance, both on Nvidia cards and AMD cards. Nvidia will keep pushing raytracing even if it's inefficient because it pressures people into buying their cards.
Underrated comment. I have bought GPUs from both sides since the late 90s, and the life-ending limit to all of them was VRAM (except for the RX480 8GB). ALWAYS go for more VRAM. Cards with more VRAM also sell for much more on the used market for the same reason: they stay viable longer.
Exactly. That's why the 4060ti 8gb version is called a 1080p card despite many calling it a 1440p card, because of its low VRAM; the card itself is still capable of 1440p though.
Lumen is notoriously known to have bad optimization.
@ Unreal 5 as a whole is known for that. Lumen still doesn't reduce your FPS as much as raytracing though.
Great comment, brother 👍🏼
Upscalers are a revolutionary technology that is amazing. Unfortunately developers have become lazy with optimization due to this. For example ARK ascended
that's the problem, they became so revolutionary that even developers leave some of the work to them...
honestly ark is a bad example because even when they made survival evolved, it somehow looks like garbage and still uses like 32 gigabytes of ram, clearly the devs have never been interested in optimization
@ fair
upscaling makes things look more artificial, adds artifacts, and adds input delay. at least it makes devs stop optimizing their games tho 😌
Hot take (not really): Gaming peaked in 2015. Arkham Knight, a game that released 10 years ago still looks better than most of the games we get right now.
Accurate
That’s true.
lol, what about Red Dead 2 😂 People play the wrong games… There are so many great games that are new
Lukewarm take at this point, but you're not wrong.
That golden era was 2015-2018, then a couple of good ones in 2019-2020 (mostly bad), but after that almost nothing looked as good and ran as well until Kingdom Come 2.
Apparently Nvidia means "envy" in Latin. It's kind of a baller and well-fitting name, I can't lie.
Invidia, yes. Specifically one of the seven deadly sins.
Upscalers should have been something that extends life of older GPUs. Unfortunately, they are now sold as a new gen GPU exclusive and devs have gotten lazier in optimizing games.
The stock needs to plummet again
permanently this time
@oneplay5570 Wait let it go back up so i can sell mine then it can crash
@@oneplay5570 why do U hate America? We need Nvidia to go to the moon so then they can give everyone FREE GPU'S
@ no we dont lol
who are we? telling people to buy so you can get your gpu free? crypto energy@@zenon3021
I literally have a fully functional PC I built last week, but have yet to get a GPU for it because NVIDIA dropped the ball so hard this generation. PLEASE don't screw this up, AMD.
I have an intel arc a580 for my brother's pc and a 7600xt for my pc. im considering getting battlemage for my brother if they can figure out a fix for the overhead issues
Why not buy something last gen then? Genuinely asking not hate or anything. If the current one didn't meet your expectations why not go with the closest that did?
@@papandbuddygirl According to Linux repositories, intel has introduced new code suggesting that they might release a higher end model and a lower end discrete model. Intel also appears to be ramping up development on Intel arc drivers for Linux, at a much faster pace than Nvidia has been for the last few years.
@@josephlh1690 ya I'm very excited for Intel's future in the gpu market they are like the only one im really taking very seriously at the moment
The problem with Nvidia's latest tech is there are more games than just Cyberpunk out there.
sssssh. Do not disturb Nvidias marketing 😅
Yeah and they tend to require this once-elementary concept of "VRAM" in gradually and predictably increasing capacities year-over-year....
man, im super thankful my 1050ti in my msi laptop still works, i wish i could do a simple upgrade, but being poor makes it impossible.
12:17 That little spiel you just said about Cyberpunk was kind of WRONG. CDPR did not bother to work with AMD and only just recently added FSR 3.0. Not 3.1... 3.0, and it is garbage; a modder did some work on it and it was way better. Meanwhile DLSS keeps getting support from them EVERY iteration; if you think they are not getting "help" in every way possible, think again. So comparing it to DLSS was BS on your part; CDPR did an entire DLSS update for them. They just tossed AMD out and said here... go get a DLSS card or "F" off; AMD ain't paying us. Oh, let's not forget that Witcher 3 and Cyberpunk are both NVIDIA-sponsored titles.
Nvidia uses them for promo, i.e. pays them to add features. Same reason why War Thunder, which never has any major changes, suddenly got 'ray tracing' added and some maps redone for it lmao. It doesn't even work properly, and the mirror reflections give users an advantage in some cases on the increasingly city-heavy maps. Due to absolute basement-level lazy, money-extracting devs: 137m in EU and you are lucky to get a new map per year, with no new features/modes or actual changes bar very minor ones. You play the same 5 maps due to premium map bans and no ability to select maps. This game is absurdly poorly managed and does not drive player retention lol.
I love the extra input latency caused by generated frames. It really makes motion blur and depth of field look/feel even better.
I'm a masochist.
Best part is cranking up the sharpness filter. Deepfried fake frames are the BEST
@@KozmoPoly oh yeah, sharpness 10000%, saturation 10000%, and 3% brightness, truly the peak of seeing
Frame gen is fine for singleplayer shit imo, but it shouldn't be carrying the entirety of a game's performance on it's back. The input latency in my experience hasn't been abhorrent, but for competitive or multiplayer any extra is bad.
Who TF uses motion blur?
We as a society need to be happy with "good enough". You don't need the best. Stop consuming
We shouldn't be happy with our programs becoming more and more demanding with less and less to show for it. The problem is that things don't stay good enough for long enough.
Comment of the day👍
We are. 4090 users were less than 1% of all people playing any steam related games and 5090 will be even less than that lol its just big news with little stock so you hear about it
@@xDbrad Yea, people gotta stop scrolling reddit because it seems like everyone is getting a 4080 / 4090 or 5090s whatever. Steam survey results tell everything. Ya'll fine running a 3060 with 8 gigs of ram if you're not anal about running brand new Unreal unoptimized slop at 30k resolution and turbo high details.
Not just that, but people also tend to focus on the wrong thing. The monitor is a lot more important to your enjoyment than the GPU. Especially if you're not on OLED yet and have never seen full HDR. It's like discovering the *concept* of gaming anew and will make you feel like a kid again. When I see someone spend $1500+ on a GPU to play on a 4K VA panel I shiver with pity.
We need more things like Intel, especially for linux gaming.
Would bring much needed diversity that is healthy for the industry.
I hope intel’s next generation can improve the driver overhead issues, then it would probably be the best mid range option
We need a company to offer good quality stuff for less that isn't one of the big 3. Won't happen because of the barrier to entry. Actually, "barrier" doesn't cut it; it's more like a 5-foot-thick solid concrete and steel wall.
The rtx 5000 series and rx 9000 killed Intel Arc. Who's going to support it when the 5060 comes out?
Intel and linux rn, from my experience, is almost flawless; the only issues being drivers and a few other quirks, but overall it's pretty good
If Nvidia spent their AI silicon budget on raw raster maybe we would not need DLSS and Framegen.
Remember that news story from five years ago? Mr. Jacket assured us that Moore's Law was no longer a problem; well, it wasn't about transistors, it was about AI.
Bottom line, Moore's Law has already knocked on the door, and it is now too expensive to produce something better (we need to take manufacturing technology to the next level)
11:09 hot take but when you're gaming the most important thing is consistency in framerates, and things like clarity and graphic fidelity are just an added bonus
I agree, I prefer a smooth experience for me and my pc rather than the graphics sliders being set to ultra with fake frames
Feel free to do some Radeon setup content! I'd love to know more tips for AI and AMD hardware.
Yeah, I'd love to see this too. I setup Flux with comfyui and zluda several months ago on my 7800xt and it was pretty rough trying to figure out. I got it done, but took some trial and error. I'd like to see if I can get a deepseek model or something running, but not looking forward to it. Hopefully it'll be easier, but I haven't really looked into it at all yet.
Setting up flux was the one time I started to regret not going Nvidia a little.
Idk about ai but when you're on Linux like muta, you literally just install it, drivers are included in the kernel
it's still rough, better if you have a 7000 series.
if you just want to run models, it's easy
training seems impossible even with the cheaper 24gb vram amd offers
@@-Boone That's what they want - to monopolize AI accessibility and gatekeep it with their hardware. Deepseek was refreshing to see.
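For what it's worth, the Flux/ComfyUI setup pain described in this thread on a 7800 XT usually traces back to ROCm's official support list not covering that chip (gfx1101). A minimal sketch of the common community workaround, assuming Linux and a ROCm build of PyTorch; the override value below is the one typically cited for RDNA3 cards, not something stated in this thread:

```shell
# Community workaround (assumption: Linux with a ROCm build of PyTorch installed).
# The RX 7800 XT identifies as gfx1101, which isn't on ROCm's official support
# list, so the usual trick is to spoof the GFX version to the supported
# gfx1100 target before launching ComfyUI or other PyTorch-based tools.
export HSA_OVERRIDE_GFX_VERSION=11.0.0

# Sanity check: confirm the override is visible to child processes.
echo "HSA_OVERRIDE_GFX_VERSION=$HSA_OVERRIDE_GFX_VERSION"
```

With the override exported, launching ComfyUI from the same shell lets ROCm treat the card as a supported target; checking `python3 -c "import torch; print(torch.cuda.is_available())"` (ROCm builds of PyTorch reuse the `cuda` namespace) is the usual next step if it still falls back to CPU.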
I literally just watched a video where you said you were team green. 😂
Edit: The video is called How I Built The "Poor-Shamed" Computer...
team red 💪
He's a constant hypocrite when talking about Windows: "I'm never using this dog shit again," then his next week's video will be like "why I run Windows virtual machines on my Linux"
Edit: @therealastralhaze I see your point
@@Evancade I’ve noticed the same thing, cheers
@@Evancade I haven't seen what you're referring to, but he might've meant he'd never actually daily drive Windows on his main drive. Therefore, I feel running a Windows VM is the next best thing
The reason NVIDIA is so expensive is because its leading in AI. The number of GPUs they are selling for these AI projects like Super Micro is insane. They don't need gamers anymore. You will be paying for the name, not the game. AMD is 100% the way to go. I typed this on a 7950X3D & 6900XT. Love it.
I'm not tryna defend Nvidia. But I don't think the GPU prices are high because they are leading in AI. Because the RTX 5070 and 5070 Ti got a lower price than last gen 4070 and 4070 ti while the RTX 5090 got a $400 increase over last generation RTX 4090. So I think the GPU prices are priced based on the demand of the GPU tiers. Because remember when the RTX 30 series was scalped like crazy and there was crazy demand for all of them? I think that's why they increased the price of the RTX 4070 from the previous generation 3070 from $500 to $600 and RTX 3070 ti to RTX 4070 ti went from $600 to $800.
I also want to mention that even if Nvidia doesn't have high GPU prices. The scalpers will buy them all up and resell them for a much higher price anyways. So I don't 100% pin the blame on Nvidia for high GPU prices. The only ones I blame are the people who still buy their GPUs, especially the people who buy from scalpers.
You can't type on a graphics card, you have to type on a keyboard (or touch keyboard), unless you're using voice to text, in which case you would be "typing" through a microphone.
Nvidia is literally leading in all high end performance though.
@@Felale Agreed. Don't let AMD fanboys get too comfortable saying AMD is better. It's not.
"Nvidia is leading in AI guys!!!"
DeepSeek: "I'm about to do what's called a pro gamer move."
Its nice to see a major youtuber promoting and using linux as their daily driver OS. I also use arch btw
Yeah, I've been thinking about going the same route. I do NOT want to be forced to use Win11 with the questionable AI screenshot snooping around my data, pics etc. So I am trying to determine a distro to use (if/when) SteamOS has a proper desktop environment. So I too will be eyeing the AMD cards within the next year. I haven't purchased a non-Nvidia card since I had an ATI Rage series board around 2000/01. Been using AMD CPU's since the Phenom 2 days (skipped bulldozer and went with Intel 4770k).
As a person who used a Steam Deck as a desktop for some time, I just want to say... it's just a pain in the ass if you try to do anything besides common stuff. I spent 5 days trying to run modded Skyrim, for example. It's usable, but it's like a choice between piss and crap. I miss the Win 7 times.
guarantee most of the 5090s and 5080s went to youtubers who will just keep them forever
massive respect for not taking the handout
They did. North America received less than 1,000 units, per Gamers Nexus.
A $2000 GPU is one of the cheapest marketing bribes any trillion dollar company could ask for
@@JimBobm A paper launch. This was intentional. Nvidia is limiting supply of the 5090 because they are more focused on manufacturing the data center versions of the gpu. The 5090 is actually nothing more than a graphics card deemed too defective to be sold as a server card. Translation: they are basically just hand-me-down cards off the assembly line.
10:00 Muta trying to convince me there's a meaningful difference between cards is like Skeletor selling me the PS5 Pro.
"mUh gWaFiX" ray tracing is just where frames go to die.
Normalize good art direction over slapping raytracing on everything
THIS. SO MUCH THIS.
pin this comment
Finally someone with a brain
If good art direction means cell shaded borderlands graphics then no TY
Been PC gaming since the early 90s.
I've used Windows since DOS, always hated Linux.
I've always had Intel processors and Nvidia graphics cards, because of the "problems" of their competition. I'd always read reviews and it'd be "doesn't run correctly on ______", "has graphical glitches on _______"... so, truth be told, I've always been biased and never just "gave them a go".
Fast forward to now, and, once my PC loses support on Windows 10, my next build is literally gonna be NONE of the above.
The funniest part is the new rtx 5000 laptops. They perform EXACTLY THE SAME as similar laptops with rtx 4000 series. With power and temperature constraints, this gen is exactly the same as the older gen, recycled.
I love my 7900 XTX, its just been getting better over time.
Love the comments criticizing the video 1 minute after it was published. Great job guys.
That’s how you know he’s doing something right
He just said that he likes rays and dlss. That's enough to ignite butthurt in some specific demographics.
Been using AMD since the early 2010s. They're pretty much just as good as their NVIDIA counterpart in every generation and typically cost a considerable amount less for the performance.
AMD's driver software is atrocious though. The one recent AMD card I owned was factory-set to overheat and shut down, only kicking fans to max when >90% of the way to the temp shutoff. Greenscreened at least once a week, and every time it did I had to remember to go fix the fan settings again. Nearly a year later that got fixed in the driver defaults, but was still looking for a version that didn't crash so often and the newest wasn't it.
I ain't paying $2000 for an Nvidia card either so I guess I'm sitting on my 3060 for many years to come. That's the real solution, refusing to engage with the crappy market.
@@jbutler8585 not my 7800xt tho
Unfortunately, that's just not true at the high-end. AMD has admitted that by announcing they're not even going to try and compete in the high-end right now. Which sucks. We need competition. AMD hardware is no comparison to Nvidia RT cores or dlss. Nvidia has hardware on their cards that allows their software to outperform AMD. I want to see AMD do to Nvidia in the gpu category what they did to Intel in the cpu category, but that will take years to happen.
@stang10189 "pretty much just as good" implies that Nvidia is better. That's all I'm saying. It's a performance to cost analysis. And for me, AMD wins that pretty much every generation.
My first ever system was with an RX 570, and I rocked that little $150 gpu for years. It's still kind of impressive how well it performs today
It's interesting to see such a take. In Europe AMD is pretty much a no-brainer right now as well as Intel due to pricing.
Nvidia tends to often just be more expensive which doesn't seem to be the case in the Americas or other regions..
12:30 always love seeing one of the math teachers at my high school getting mentioned in stuff like this, he really does put in the work!
Switched to AMD when I upgraded from a 2060 6GB to a 6800 XT and DAMN, that shit is good! It supports literally everything and has more community support. Not to mention it works great on Fedora, my preferred operating system :D Team red's got my full support after how Nvidia has been acting the last few years.
I'm done with NVIDIA as well. Just bought my 2nd AMD card and I don't regret!
Nothing better than blurred graphics to make the FPSs go zoom.
Nothing like buying a high-refresh monitor with G-Sync or FreeSync to get rid of ghosting, and then smearing and blurring it all with fake frames. Peak PC stupidity.
I feel so conflicted about NVIDIA. Love them or hate them, they pioneered so much of what we now take for granted within PC gaming, and even things like DLSS and AI upscaling were novel at first. But now, they charge super-inflated prices and their GPUs set a bad precedent for game devs who forgo optimisation, thinking: "They can upscale and use frame gen anyway." Look at the MH Wilds, Doom: The Dark Ages and Indiana Jones requirements. Yes, we want next-gen games in a gen still getting XBOX ONE ports, but if RGG Studio can put out Pirate Yakuza, which looks almost photorealistic and runs well on a 2000 series, it proves we don't need hardware that beefy yet if devs just do their jobs.
I started doing this with 40 series/RX 7000 series. I bought an RX 7900 XTX on launch instead of a 4080 or 4090, and I'll never look back! I've had such a good experience with the 7900 XTX in my personal system in my office that I went ahead and bought a second one for my TV gaming PC too, and it handles gaming on my 4k 120Hz OLED phenomenally well. I'm not sure how RX 9000 will stack up against last gen's flagship 7900 XTX, but it's not like I need to upgrade anyways
Same setup and the Nitro+ XTX, it's a beast and I'm glad I went this way. RIP 80 series running out of VRAM in 4k titles already.
*MUTA a few months ago* "I am DONE with PC gaming altogether"
Nvidia sounds like some type of antipsychotic medication
LOL WHAT
Yea, it does.
They're grimy but really good at making money. But mostly grimy and deceptive.
Welcome to team red, muta! I've been a member since 2023
Ive been one since amd started
Team red for 5 years now. On 9800x3d & 7900 XTX now.
Team red since 2016 with my beloved rx480. Today I'm running 5600xt and planning to buy a mid tier 9000 series
I just switched last year. 7900xt with a 5800x3d. I'm set for at least another 3 years. love it.
The one argument for Nvidia that everyone keeps praising is "day one patches" for games.
Dude, that just means their code is so broken that they need to change it every time a new game comes out and patch it with game-specific fixes.
That is not good.
The DLSS vs FSR comparison in Cyberpunk felt weird because I liked the FSR image more. Sure, the DLSS is more crisp but the palm tree looks hairy, like it's bristles instead of leaves, in the DLSS image :D
FINALLY!!!! I’ve been solely team red and blue for years now. It’s cozy over here
Why not switch to red all the way
@ I am mostly team red, and for a long time I was only team red. I haven’t bought an Intel cpu in close to 10 years now, I bought an A380 card on sale to support more competition in the market. I think Intel has done its fair share of slimy things in their CPU department but their GPU department seemed like fresh start. I use it for my living room emulator rig which it runs everything beautifully at 1440p
The spreadsheet at 4:04 is only for Microcenter.
4:00 Worth mentioning this is ONLY for GPUs distributed to Micro Center stores, not representative of the total units shipped to each state.
I hate to admit it, but even if I did upgrade to a 50 series I would still spend 99% of my gaming time playing the same games I played back when I had a 10 series card anyway.
I think the GPU shortage is intentional, because if there's a flaw like the 40 series being burnt, then mass producing it will be dangerous for the company. They're just watching to see whether something happens or someone reports anything bad.
39 seconds ago is preposterous
Ray tracing can be transformative for games like Cyberpunk, Silent Hill 2, and Control but honestly I see a lot of people not even using it because of how much of a resource hog it is. Stuff like frame gen also runs the risk of introducing input lag.
If you just want high frame rates and play at or around 1440p without Ray tracing, I think AMD is more than up to the task of satisfying that huge chunk of gamers especially if the price is right.
I don't care for ray tracing as much as solid performance and a crispy IQ.
I think ray tracing is still in its infancy. It's pointless to have shinier graphics at the cost of a lower frame rate. Depending on the type of game you're playing, it'll affect the experience.
@@imulan Ray tracing is already irrelevant. There are solutions that render light more realistically, that also use less resources to do it, but Nvidia drives studios to still rely on ray tracing because their GPUs went all in on it.
NVIDIA pisses me off because the fuckin industry has a hardon for DLSS and they are abusing that shit and being lazy af. FSR isn't nearly as powerful and Frame Gen definitely feels like it leans way more into NVIDIAs software than AMD. Which sucks because AMD cards have more power per price but NVIDIA still has a software lead that allows them to get away with stupid shit.
So... you want to tell me that it's Nvidia's fault that developers can't do anything besides the UE5 graphical interface? Really?
FSR Frame gen is very solid btw. It actually forced Nvidia to make a performance update for DLSS FG because FSR3 FG is faster and still looks good.
Since 2018 I've been using an i5-4570. At the end of 2020 I got my hands on an RX 570, and a year later I finally even had 16 gigs of DDR3 RAM. Monster Hunter Wilds runs like shit on my system and looks like a game from 5 years ago (Doom Eternal runs on Ultra on my system with an unstable 60 FPS). Monster Hunter Rise, on the other hand, looks like a game from 10 years ago, and it still looks very good to this day; it even runs on my system with everything turned up high, at 60 FPS. Why is art direction not the main thing? It was a good idea to make Rise run at 30 FPS on a Nintendo Switch. Better optimized game = more money.
I just made the switch from GTX 1660Ti to RX 7700 and am extremely pleased. Sparing Monster Hunter Wilds, I’ve gotten a lot of great use out of my new AMD build running at 4K with decent settings-surprising for a new build that cost under $1K USD. I’m pretty sold on AMD in the future as long as they keep making higher-end stuff like the 7800 and 7900 models. Right now my only concern is the rumours that they’re only doing two mid-range cards this gen and that the 7000 series was their last gen for high-end. But, rumours and leaks-I wanna see what they’re actually doing going forward.
Raster is king, don't let anyone tell you otherwise. Nvidia's intention long term is to sell you the same card over and over again with a software update. The 5080 is almost there in that regard.
It's like the "30 fps is the cinematic experience" all over again. But this time it's fake gen fps.
The creator of the footnote with greatest exposure deems themselves as a homosexual fellow.
Exquisite footnote.
Forty keks for you, good sir.
@@salineaddict9850
I can't make sense of this
what are they saying?
@@memegazer just a fancy way of saying top comment is gay
@@memegazer Hello fellow anglophonic commentor, let me translate English to English for you. Top comment is gay.
It's team orange, and as an AMD purist since socket 939 it's great to hear you coming to our side fully.
Man, I feel bad for you going through the Bulldozer era. I was an AMD fan as well, but even I switched to Intel's Sandy Bridge.
Socket 754 with an Athlon 64 3400+ for me. Only recently (2 years ago) did I build my first Intel + nVidia PC. It's been great so far. As much as I love AMD, they are pretty behind nVidia to the point where I feel I have no choice but to stick with nVidia for now. I've been a graphics snob ever since the Radeon 9700 Pro days, and right now nVidia is the only real player in ray tracing/path tracing.
The pain behind those soulless eyes the Sonichu medallion really is putting in work behind the scenes
You were really kind in your FSR vs DLSS comparison... movement is really where FSR shits the bed. Hopefully FSR4 will be a lot better.
AMD will absolutely blow this chance by pricing their cards at $500 or more rather than something that would get them a shit ton of market share.
$500 is a reasonable price. They don't need a ton of market share, they just need a bigger market than they have now. Unlike Nvidia, AMD's real revenue and strength are much more diverse. If AMD exits the GPU market, the company will continue to do just fine. They are not under that much pressure.
Here comes Intel to save the day /j. If they can honestly fix their issues it would be amazing
@@papandbuddygirl Intel isn't coming to save anything
$500 is pretty good, there's nothing last gen or current from the competition that comes close for decent perf
0:25 where my team blue at 😭
BlueCrew
MUTAHAR FELL OFF HARD🎉😂
MY VIDEOS ARE WAY BETTER THAN HIS!!!
😂❤🎉 ..,. ! ,..,
4000HD gang! Lmfaooo I will never let my Surface Pro 2.5 die. It holds up great for a 2013 x64 tablet
@@OfficerChaoticX9this bot's creator fell off a balcony
THANK YOU!!!
You're quite welcome.
I really hope that AMD won't sabotage themselves this time.
Same.
4:29 and remember, there’s only 2 Microcenters in all of Texas. While a few other retailers also had them, you’re looking at maybe 30 total units available in the entire state of Texas…
My location only had five 5090s. I arrived 9 hours early and there were already 80 people in line.
The remaining 70 had to get 5080s?
@ yup. Our location had 84 5080s. By the time I was up, only Zotac Solid OCs were left. From what I heard from people asking around, it seemed like at least half the crowd wanted 5090s, but most decided to settle for a 5080.
@ I mean, it is what it is, I guess lol. I myself am waiting for the 9070 XT xD. I am hoping, hoping beyond hope, that it comes in around 600 USD with close to 4080S performance :p Even if it's 10%-15% less, that would be a major win for the price lol. Fingers crossed :D
I'm probably gonna swap back to team green since my last NVIDIA gpu, the 1060 3gb. Currently on a 7800XT and the headache of the drivers bugging out and terrible streaming quality on twitch is enough to make me want to switch.
Idk what to say man, I've had a 7800xt since launch and had no driver issues. Can't say about streaming as I don't do that, I'm just a gamer. But hey, you should always get what's best for you.
@@tiberiumirica8445 Fair, my 7800XT has done me well- but in the last 6 months or so i've had to DDU and reinstall drivers multiple times. For gaming it's fine, but when it comes to hardware acceleration (browser tasks/videos) and streaming (mainly twitch's fault) it's either bad or broken for me. AMD is good value, just not entirely right for me it seems.
Muta, when are you gonna release the Just Stop Oil movement video you promised to release 7 months ago?
Wish I could do this too as a Linux user. Every time I update my NVIDIA drivers I literally set aside a Sunday afternoon to troubleshoot if necessary, though it's been better recently tbf... but ah, I need that CUDA!!! 😭
Fakes and fraud become a major concern with supplies that low.
Ngreedia really needs to be investigated....
NVIDIA drivers are horrible. They simply don't work on Linux. I can't even use the 470 drivers on the latest kernel (tesla 470 doesn't work either). In fact, Nouveau doesn't even work for me either.
which distro
But the latest driver for Linux is 550 (though for Windows it's 572). Why do you want to use 470?
@@GAMERRAZUMNO 470 are the drivers for older cards, and the latest Nvidia drivers on Linux are 570. (still classed as beta, but very usable).
NVIDIA has proven that gamers are no longer viable. In turn, NVIDIA is no longer viable to me.
Mutahar, you're prolly too young to remember, but ATI/AMD were miles ahead at one time... just look at the performance of the Radeon 9700 Pro (R300 chip) vs. the best thing Nvidia had to offer back then... those Radeons went almost 2 years undefeated by the competition, and they weren't the only cards that took the lead... The problem is that AMD always wanted to support gamers, keeping things open for all other companies to use, while Nvidia always made new tech proprietary. This ended up with Nvidia supporting AMD's tech (because they could, since it's open), but AMD lagging behind because Nvidia's tech couldn't be implemented. Nvidia always was the bad actor; in the end it'll bite its own tail, you'll see...
Owner of a RX 6800, and my previous GPU was a RX 570 4GB. I just like GPUs that have good value.
The shortage is bringing back vivid memories of the situation when Bitcoin first blew up. I'm becoming really disenchanted with the whole industry tbh. AAA games have been lacking real creativity because nobody wants to take chances because game development is so expensive. GPUs are getting more and more expensive. Yes every once in awhile there's a AAA game that I really enjoy but that happens what, once a year? The rest of my gaming is esports-like titles and retro, both of which don't require high dollar GPUs or 2 year upgrades.
Looking forward to buying the RX 9070 XT Sapphire Nitro with a 9950X3D when they release!
God bless you Mr. Mutahar, for allowing me, a dusty old millennial to be able to understand what's even going on in the gaming industry.
Welcome to team red, Muta! Been a fan for... a decade and a half now
I bought an XTX for my wife's build. I have a 3090. I have been EXTREMELY impressed by the XTX. The drivers are crazy stable and the GPU is fast when it matters. It's the 4th fastest RT card as well as the 4th fastest raster card.
As someone who is currently using an AMD Ryzen PC that's small enough to be a Minecraft Blackstone Button?
*_YOU'RE ONE OF US, NOW!_* OuO
If you ignore gimmicky shit like frame gen and path tracing and etc, my RX 6900 XT is still an insane beast. Lots of VRAM, and I can push every new triple-A to the max on my 21:9 3440x1440 monitor. If we ignore all these new gimmicks Nvidia has been putting out, cards suddenly haven't really evolved much in the last few years... your older high-end card is still very good today.
I started to experiment with Linux gaming in ‘21 and switched to AMD with the 6900XT, and have since upgraded to the 7900XTX. I was disappointed to hear about the lack of a high-end 9000 card from them this generation, but after seeing how the 7900XTX stacks up against the 5080 and 5090, I think I’m okay to sit this cycle out.
Hopefully the RX 1190XTX will be worth the wait, and have a better name than that.
I made the switch last year and won’t be going back. Very happy with products and support.
Whoo flippers! Seeing 5080/5090 price tags at what a whole rig should be going for?! Yeah, insane times we're in here.