I did notice that since around 2022 we started to see a few spicier budget cards in the top spots. (3070, and 3060ti) Before that the only on notable cards were the 1070, 1070ti, 1080, and 1080ti. (I wonder if the 'other' Radeon cards were the Vega 56, and Vega 64?)
This is a steam issue not a GPU manufacturer issue. VulkanInfo and DXdiag have no issues in identifying the cards which means that the information is there and it is accessible. You cannot program for either Direct3d or Vulkan without Identifying the GPU first.
Gtx 1060 getting the lead in jun 2017 and only being surpassed in aug 2022 (5 whole years and 2 generations later) just proves how stagnant the "budget" market has become.
@@tobeymaguire647 7900 GRE is probably what I'd pick if I had the choice now. I bought the 7800 XT when it launched and regret not waiting a little while longer, but nobody can see the future.
The mistake for them was charging “too little” for the generation uplift of the 10 series in terms of power and efficiency. Nvidia learned from the pandemic, that people are stupid enough to pay exorbitant prices for even their lower end cards.
It was back then when AMD was a serious competition in the GPU market share. After that moment AMD started going down especially when the hype over ray tracing became a thing and the crypto boom also kicked in so Nvidia had no reason to price their GPUs as good as the 10 series.
Haha, that was me for the longest time, didn't need much to run TTD and OpenRCT, wasn't much point upgrading until civ 5 and cities skylines came along.
The stats include installs, not necessarily people playing. Every laptop or work PC or garbage used to just open Steam and look at Friends or something counts to these values. In the current day probably only 2/3 of the data points are even representative, maybe less.
*I still have my 4790K 4 core/8 thread. It has been running for 10 years at 4.6GHz on a customized air cooler. 78 CFM and 3.8mm H²O 120mm fan.* *The 4600 HD iGPU in that CPU was always junk. It is a 720p Low settings graphics processor. You can't play any AAA title at 1080p whatsoever. Unless you lock the game to 10 or 15 fps.*
Maybe AMD not competing with Nvidia for the super GPU in 2025 is a good thing.... I mean this survey clearly shows a massive market for budget and entry level GPUs. Nvidia setting the price of new GPUs at $600+ has left one big gap for AMD to fill. I really hope it works out
@@Anubis1993KZintel has been in quite the shithole for the last couple months because of the instability issues (not sure if theyve been resolved yet) and amd has more of a good reputation for gaming, especially now that a ton of tech youtubers are starting to talk about the price to performance on amd gpus, im sure amd can dominate this market relatively easily
When all TH-camrs were talking about 4090 all gamers were playing on xx60 cards... especially 3060.... and in the end fastest GPU in top 20 was 4070 Ti with 1.25%... Anyway... I love the time when 8800 was on top... it was one on the best era of gaming (at least in my opinion)
it's because nobody needs those cards. those cards are total overkill and not worth the money. Plus they swallow more power / watts than the rest of the house combined
@@svingarm9283 ugb really...even 4070 struggles to run ue5, now every other game is made in ue5. You tell me if i need something that can actually acheve 1440p high on most ue5 titles. And anything 3060 is not unless you play 720p low
Bro. I still have, use and love my 6 year old 1060 6GB. 💀 She may be old (late 2018) but once I finally upgrade I'm going to put my baby in a display case close to me so I never abandon her like how she never abandoned me.
Thats a nice idea, i might steal that idea :D Still running a Gainward 1060 Golden Phoenix, not only did it serve me for so many years, the Gainward also looks sick.
i went from a gtx470 , gtx980, rtx2080. i'm wondering if it might be worth it to get some one to mod my 2080 from 8gigs to 16 gigs memory if the price is right, since there are more people out there on TH-cam that can do that sort of thing
same my 1060 is 8 years old never bothered to upgrade but imma go all out and get the 5090 next year which will hopefully get me through the next decade
Pretty sure this chart includes the laptop and desktop gpu's as one, if so it makes sense why the 4060 is exploding in popularity as the 4060 laptop is actually amazing for it's price unlike the desktop model, I would know since I have 4060 laptop lol.
@@VoornePuttenBenchmarks The mobile 4060 is good compared to other laptop GPUs like the 4070 mobile which offers very little performance gains for a lot more money, but price to performance wise it is on par with the desktop 4060 (it is just that there are better desktop options).
I got it in new PC and it was 4gen jump for me. I actually know 0 people who change their xx60 card to nextgen 60 card at all, and i think it all dudes building new PCs after 6-8 years of using 1050/1060
@@Snow.2040It can't be on par with desktop 4060 just because desktop 4060 uses 175W, and no laptop could provide such power to a chip at all. Maybe ~65% of desktop version but not on parr
@@Volkze But the crypto mining period made the GPu prices skyrocket. This cause people to hold on their old GPUs for longer and buy used ones, hence why the GTX 1060 retained its dominance for far longer than it would have been expected.
Dominance in SALES. But in nothing else. RX580 was faster back then by 5% and the AMD card supported Vulkan and DX12, which the 1060 did not. The AMD card also cost less. So the fact that 800% more people bought the 1060 just shows Nvidia how STUPID PC gamers are, which is why they continually increase their prices and lower their generational gains. Doesn't matter, people will buy their media centre 4060 for $400 regardless. (WHICH THEY DID!). Incidentally, the RX580 beats the 1060 today by over 100% in many titles. Like ... DOUBLE the framerate. Insane.
Still rocking with it even 10 years later! 🤣 and I knew for many ppl like myself it was perfect match with price and performance, which is no surpise so many ppl upgraded their GPUs to 970 back then
@@Balnazzardi the 970 was a steal in 2014, cheaper and much more powerful than a PS4 which was only a year old. I finally upgraded to a 4070 last year.
@@ROCKSTAR3291 ye I was really on edge of upgrading to 4070S this year, already was thinking of 4060Ti last yeae but then I thought perhaps I just wait till next year until Nvidia has revealed price and performance of 5000 series. I mean most likely RTX 4070 Super would last in my use as long as 970 did but should RTX 5070 be 16GB Vram GPU, it would be even more "future proof", especially if I would go from 1080p monitor to 1440p I dont play newest/most system heavy games, so I told myself "hold your horses, you can wait till 2025 and then make the decision" 😂
You also get the 1060 3GB and 1060 6GB, RTX 3050 8GB and RTX 3050 6GB, RTX 4070 TI 12GB and RTX 4070 TI super 16GB, GT 1030 GDDR5 and GT 1030 GDDR3, RTX 4070 GDDR6X and RTX 4070 GDDR6, etc.
@@BladeCrewso the reason i made this comment, is the GPU in the 1060s are different, performance is massively different between the 3GB and 6GB, because again different GPU and not the same card with only different VRAM. It was designed to make people spend more to think they are getting the performance of the 1060 when buying the 3GB model.
It's cool how it's generally centers around the 60-class cards with both higher and lower end cards tapering off around it. The 970 is an exception because I remember those selling for less than a 1060 for similar performance at the start of the 10-series.
970 and 750ti cards dominated that era. I still remember the 750ti being the best budget GPU coupled with a fx 6300 you'd have a $600 build that would play almost anything. Sad that the minimum to pay now is over a $1000 and you still don't get "good" performance on any new game.
@@sampic_ Indeed, modern AAA games are horribly optimized and they now use things like DLSS as a crutch to make them playable (but only on the latest cards of course).
Most people consider mid range, they want to invest not too much or too little, not to think too much on what to choose What is an average reasonable stuff they can get So entry and high end have their own customers
That will never happen again. They make sure there are no cheap last gen GPUs left when they launch a new series. Meanwhile, AMD kept the RX6600 in production for 3 years. I think this is why they put 16GB on the 7600XT - otherwise there would be no reason to buy a current gen budget AMD GPU.
@@Lurch-Bot Yeah, and for many people who thought extra VRAM without enough power is useless, the 16GB's unlock room for AMD Frame Generation, 8GB cards, can't do that without VRAM Spillover or crash, 10 is also risky/1080p 12GB is fine for 1080p+FG Scale that up
I got one off a buddy that "bricked" it and just reflashed the stock bios myself. Basically free, $100. That's about how much i paid for my RX570 (8gb). I have a 6700xt in my rig now but it genuinely isn't that big of a difference in raw raster performance. When you start introducing RT or just lots of shader effects, etc. You can definitely tell the difference though.
I will add to this thread that dual Gou laptops were a trend. I myself had one that switched between nVidia GPU and Intel hd graphics depending on settings (like games using the nVidia one but it being turned off for normal work)
There has been a pre-Ryzen period during which Intel has bribed laptop and prebuilt manufacturers to not use AMD CPUs, hence the reason Intel has dominated AMD in terms of integrated graphics for quite a while.
@@Josh-cw8by YT won't let me pose links in comments. Just google for "intel loses us bill on manufacturers bribe" or for "intel bribes manufacturers". You'll get a lot of results.
Im still rocking with it to this very day 😂 but granted I dont play the latest games aside from few exceptions anymore and when I do they arent the most demanding of games, like Jagged Alliance 3 for example. But ye I remember how many ppl ridiculed it for being "just 3.5GB GPU and not true 4GB"....well as we all know it offered excellent value for your money and the 3.5GB wasnt really that big of issue. Now the question for me is whether I upgrade to 12GB VRAM GPU or 16GB VRAM GPU next year. I guess I wait and see what the specs and prices are for low and mid tier RTX 5000 GPUs will be. I know its either 4070S or 5070 for me
Yes, people should stop going for "their team" and just buy the card that is best for them, no matter if it is green, red or blue. You will also still find people that think you can not buy a AMD card, because the drivers are bad and it runs to hot. They still live in 2014.
It still kinda pisses me off Nvidia has such a dominance in the market. Like sure their cards are good. This is an undeniable fact. But as a business, Nvidia is just as rancid if not more so than Apple. At least AMD tries to not fuck over it's own users every generation.
yep, im trying out the AMD cards (it comes tomorrow) for a bit with the 7700xt before the 5k series comes out. Who knows i might like it enough to stay with AMD (replacing a 2070 Super)
@@NoxNtella I hope you have a good time! I personally love my 7900xtx despite a few hiccups with Helldiver 2 and Space Marine 2, both of which had work arounds and now currently work just fine. The only major issue I can say I had was with the temperatures. I'm not sure if this is an xtx exclusive issue but I found that the factory applied paste was pushed out to the edges of the die. Personally I chose to use PTM7950 and I haven't had an issue since. Aside from that I can say that the performance is fantastic and the Adrenalin software is tough to beat.
@@NoxNtellaI much prefer AMD's adrenalin software than geforce . It's so much more user friendly. Can literally just type in the search the particular setting and bam. Also just better features in general imo. Anti-lag, enhanced sync, VSR(super resolution) which is better than Nvidia's (better performance gains) If you ever want to use scaling in a game. Don't use dynamic scaling or any of that nonsense. Use the in-house VSR from AMD. Example. 1440p monitor but just barely off 60fps (or whatever desired fps) in a demanding game? Activate VSR, target 1440p, set in game to 1080p. You will get better performance than using any other upscaling or rendering bs.
Also you barely lose any visual fidelity. I could barely tell the difference between native and VSR. There is a slight one but you only notice if your standing still and REALLY looking.
@@adlibconstitution1609 that is absolutly true and thats why they will continue to do same shit to customers like they did this generation and make overpriced flagship card and castrate the rest and sell them for some funny $/fps ratio while paying game publishers to focus on DLS while doing basicly no optimalization to their games :)
@@adlibconstitution1609 That is not true, there is not a single "laptop" named gpu in charts, because nvidia's mobile gpus is reported as same as desktop. For example, RTX 4050 and RTX 4050 mobile is same AD107, only with different frequencies. P.S. Not fanboy of anything, i have mainly intel igpu's and my last used dedicated GPU was GeForce4 Ti 4200 in pre-steam times :)
It feels like yesterday and I'm baffled by the fact that 770 was used so little. I swear to god my ASUS 770 lasted an eternity with satisfactory performance before making way to mighty 1080ti. Best buy ever.
My experience was the same! Great cards. I got the 4GB 770 in June 2013 brand new. I used it until December 2018, when I got the 1080Ti. In July, I just gave 1080Ti to my wife and upgraded to a new RTX 4080 Super by trading some GPUs to a guy on FB marketplace. Other than concerns around 16GB being good long term, I don't regret any of these transactions.
@@bjyoungblood We bought same cards in almost same time frames. I too bought gtx770 in 2023 and 1080ti in 2018. Except I went for a RX 7900XT lol. No regrets here either.
Got the gtx 770 2gb gifted to me in february 2014. 2nd best gpu at the time after gtx 780. Then witcher 3 released next year and i could barely play with low-medium settings. Never mind high or ultra, or hairworks. The gpu was basically worthless after just 1 year. Dark souls 3 released, same shit. Playing with mix of low and medium settings, yikes. Idk how much it costed cuz it was a gift, but prob 300-400.
I just love how the nVidia 10x0 series just wiped the floor with everything and managed to stay relevant for so many years. Hell, the 1080Ti still is a great card for most of todays games.
So it was with GTX 970 already and it actually managed to (rightfully) claim the number 1 spot and hold it for some time. Im still rocking with it to this day 😂. Now the matter of question is to which to upgrade finally next year...RTX 4070S or to the upcoming 5070. I guess in January Ill make my decision when Nvidia reveals the details for 5000 series
@@sanji663 Never caused any issues for me, atleast the kind of that would have made me curse about that "issue"...in my honest opinion the whole "its only 3.5GB GPU" was lot of noice for nothing. Then again I never really played really system heavy games with it to begin with, so it suited my needs all these years just fine. Anyhow I hope Nnvidia has enough sense to give 5070 better price-performance value than that, but if not then 4070S it is for me
I was riveted by the highs and lows of the Radeon (Other). Struggling to stay afloat over the years, the mysterious collective of GPUs working together to blemish the sea of green kept me rooting for them, despite knowing the outcome. Bless their souls! Don't drown yet!
Admit it It's a dead card. I have a 1050 ti since 2018 too. It can't play Starfield, Bodycam, Hogwarts Legacy. I will buy new system probably Rtx 4060 or 4070 super or 7800 xt 7900 gre I'm not sure for now but I'm sure I will buy new gaming computer
3:19, my beloved GTX 970 appears! Still rocking with it even after a decade 🤣 Next year gonna finally have to upgrade though, whether its to RTX 4070S or upcoming 5070, I dont yet know. But GTX 970 has served me so well these past 10 years, the last more recent game I played with it was Jagged Alliance 3, released last year. Ofc its starting to show its age, but still managed to play the less demanding games even 10 years later 😄
I’ve spotted one thing… current gen mid range GPU is using between 200 and 300 wats when 10-15 years ago that amount of wats was for top range GPUs… Another thing, old mid range GPUs like 760-1060 were much more powerful in compare to top range than todays GPUs who’s more like old low end… stagnation and power consumption nowadays are insane.
200 and 300? There is no midrange GPU that consumes 300 watts of power, unless you count rtx4070ti as mid range and even then that one only consumes 285watt, if anything RTX40 midrange like 4060 and 4060ti are very good with power effeciency
@@LampSteak 4060 series (Ti included) looks like it should be called 4050 / 4050 Ti. About 200-300W for current mid range, I was talking around that, not specific like for example 299W is midrange, 300W is flagship, 301W is titan class... My RX 7800 XT should consume 263W but it's overclocked version and it's around 280-290W. RTX 4070 series (in my opinion should be called 4060) is very efficient but it's from 200W and above and overclocked 4070 Ti Super can easily go above 285W co still they are in Wat range (200-300). Those are current mid range cards. They are 2-3 slots constructions, 10-15 years ago they easily could be in size and power consumption of flagship constructions. ps. Some 7900 GRE and 4070 Ti Super, overclocked versions, can consume even 300-330W in some benchmarks or for short bursts in games when needed.
@@isadora-6th no one should be shamed because of what he have in PC however companies like Nvidia should be shamed for what they are trying to do. In this gen distance in performance between top 4090 and xx60s class cards is the biggest. Even hardware unbox did video about evolution performance in different range GPUs through different generations. In their video we can saw why 1060 was so much success and 4060 is often called 4050 class cards. Even Nvidia tried selling current 4070Ti (12GB) as 4080… they are trying to make gap between different GPUs class wider so they are renaming slower cards and trying to sell them as more powerful ones.
@@RyanBGSTLI can't believe anyone would buy an SiS or an S3 graphics card or use their integrated graphics after the 1990s. Wow!!! A VooDoo Banshee or higher was a great card in those days. I still remember my first 2D graphics card. Then I bought the 3D accelerator that I had to patch in via cable. Now you just plug your card in and it's done. There's no fun to it. LoL!!
@@Twitch_Moderator these Chips were built into SiS and Via chipsets in the early to mid 2000's. They probably fell off when Steam dropped support for Windows XP.
the GTX 1060 6GB was genuinely good, it's not just because of mining. i don't use mine anymore but i gave it away and its still able to play badly optimized games like Hogwarts Legacy on a 1050p monitor (and 1080p for any other game), it was just that good. the Pascal entire generation was the biggest jump of price/performances that ever existed (i'm talking about the entire gen, not just one GPU like the 4090 is). since then the market is stagnant, NVIDIA doesn't give enough VRAM, their "budget" GPU's just recently went back to the normal price (accounting for inflation) but everything from a 70 to a 80 is severely overinflated, and the 90 is now an AI GPU that's not even meant for gamers anymore. oh and AMD still doesn't exist because instead of selling cheaper GPU's, they went the greedy way like NVIDIA by following their prices very closely but unlike them AMD doesn't have good tech behind their GPU's and now they announced they're stopping the high budget GPU's.
And what did we all learn? No 90 series card from NVIDIA in sight and the only 80 series on the board was the 3080 at a whopping 1.84% market share (7:27). Judging by the amount of hype the new generations get on TH-cam you'd think everyone would be rushing to buy them, just like with aaaaaaall the leaks going around with the 5090!
People get hyped. But that doesn't mean that everyone prioritizes $1,200 or more to just a video card. Maybe most people like to travel, smoke, drink, party, finance new vehicles, etc.
Only guy I knew using 3090 only does flight sim, most people gaming setup are middle-low specs, people aren't willy-nilly buy new GPU just because its new, most of us are broke AF.
Most of us have families to support and can not be buying 4090's even if I could afford an rtx4090 I would still not buy it. I am sticking to AMD cause I prefer using 1 brand for my pc when it comes to gpus and cpus 🥰 no fanboy here sorry hehe...am looking forward to the rx5000 series of cards then upgrade from my Sapphire nitro + special edition to an rx8700xt or whatever the cards will be called then @@Setupthemabomb
I still use it to play most game at 30-60 fps with low to medium graphic right now. Just OC it to 1823MHZ and undervolt a lil bit and see the magical happen
I alway bought AMD because its was cheaper most of the time ATI Radeon 9600 Pro 65€ ATI Radeon X1600 AGP 125€ ATI Radeon X1650 PCIe 100€ ATI Radeon HD 2900 350€ ATI Radeon HD 4870X2 400€ ATI Radeon 5850 mobile ATI Radeon 5870 320€ AMD Radeon 7970 440€ AMD RX 580 200€ AMD Vega 64 250€ AMD RX 6900XT 910€ I expect the next GTX 5090 to be more than 3k$, which is more than all my card I ever owned in my entire life, let this sink for a minute
@@arthurbesnard1536 you keep track of your computer purchases.. oooof 9600 Pro 65€ these were the days. seems surreal nowadays to pay this little... reminds me of the day when i upgraded the gpu on my parents PC for a geforce 4 card and the pc didnt start. spent 50 € or so for it back then. O.o
I upgraded the 2080 to a 7800XT for half the price of a 4070, and I'm not looking back. It uses as much power as the 2080, it's cool, it gave me 50% extra performance and it doesn't require it's own nuclear reactor to run. Although I'd have gone for the 7900GRE if 7800XT wasn't such a steal.
I am so lucky to have hd5850. I knew recently about it from tech power up. Hd4870 1g was nice and hd5850 cypress must be a goat. Double cores double performance.
Wow this just showed me so much and put so much into prospective. I'm shocked the 1080ti never made it to the top of the list or that none of the 90 series has ever made it to this list. It's amazing Intel once held the leading position!!! But what looked like Intel GPUs being popular was solely because this was divided by model types, Nvidia still at that time had most of the overall market despite Intel having the most popular model. The Radeon other you see at the end is likely all the new handhelds. Just some really interesting stuff can be learned from studying this.
note that prior to 2010 ... the top card for the longest time was nivida's high end card... around 2010 -2012 this started change when nivida really started jacking their prices regularly every generation.
It was 599 USD for the 8800 GTX (2006). It was 499 USD for the GTX 285 (2009) and a GTX 480 (2010) and a GTX 680 (2012). A GTX 980 (2014) went for 549 USD and a GTX 1080 (2016) went for 599 USD. So, an Artx 3080 went for 699 USD in 2020. That is a great price. So, the prices stayed super low and constant for MANY ages. The only time prices sky-rocketed was the 4000 series. When performance jumps were finally above 25%-36% rasterization.
@@Twitch_Moderator the only thing consistant with their prices was that their top end got 50 to 100 bucks more expensive every other generation , inflation didn't move that fast. i lived it dude i saw the prices creep sure nivida had a spike witht eh 4000's series but they were consistantly raising prices every other generation of card the top tier card of 2000 (the geforce 2 ultra) went for 270-320 bucks (about 550-650 today) got to keep in mind the 1080 came out before the inflation boom of bidenomics . and even at 599 it was well above what inflation dicated it should have been in 2017... and the titan .... that thing launched at $999 meaning in 2017 the most top tier card was already 3x more expensive than 2000's top tier card. the 1080 had nivida actually followed inflation should have been around 350-400 bucks while the titan should have been around 600-650 . so stop the bull shit . i'm not bashing nvidia hell i got a 4080 super myself . but you can't defend their pricing or make up stories about some kind of consistant pricing . the only thing they have been consistant with is steadily raising prices above what inflation would call for. the 4000 series pricing was a whole nother level of "f*ck you" to the consumer though that i xo agree with you on.
Yeah, as we can see, the vast majority of the GPUs are low to mid range, AMD strategy of stepping off the high end GPUs race is a good decision. Also the 3060Ti is probably the best GPU on the current market
Man the 1060 was my first GPU in 2018… it was a legend, sold it to my friend upgrade to a 3070, then to a 3080, and lastly a 4090. Times have changed definitely… but that 1060 got me through some really hard times and it worked super well.
@@specialk9762 linux is horrible for gaming. I spoke to a graphics developer and he told me linux is 15 years behind windows on gaming. While valve and their proton and gamescope is investing lots of money to change that, for the time being linux is very behind. For example, when u play games on linux, whether with vulkan or openGL, there's no flip model presentation on the swap chain, because that's a feature only directX supports (which is proprietary to microsoft and windows), vulkan and opengl dont have an equivalent to the flip model, which is a very big deal.
I had Radeon HD 5770 as well, the only AMD GPU I ever had actually. Before that I had PC with Geforce 3 Ti 200 and afterwards GTX 970...as you can tell, I dont upgrade my PC that often 🤣 next one is propably gonna be RTX 4070 Super or 5070.
@@potato-km6iwBetter rasterization, better price to performance, better company that's trying to put out products that meets consumers where they're at instead of making demands for more money they don't need. Unless you're getting a 4090 or 4080s, there's no reason to get an nvidia gpu
1. Nvidia absolutely rules the GPU market 2. The Halo soundtrack is so instantly recognizable and so good I could’ve kept watching this for an hour 3. Please give me more of the Steam Survey done like this!
4:00 sad how many brought a GTX 970 4GB with gimped VRAM(3.5GB) when they could have got a R9 390 with better DX12/vulkan support and 8GB VRAM for the same price. I was using a 980ti at the time with 6GB of VRAM, that card lasted me a while but even that was struggling with only 6GB of VRAM, i remember thinking surely 6GB VRAM was enough, only when i upgraded to Titan XP i saw games was using over 6GB of VRAM and then it made sense why i'd suddenly get lower FPS in certain areas or games, VRAM is very important and windows doesn't always report its usage correctly.
@@pranayarana1958 you'd of been satisfied no matter what you brought because you wouldnt have known better, i was using a 980ti back then with 6GB of VRAM and that was good until it started to hit the VRAM cap so i can only imagine how much worse it would have been with gimped vram, but thats the point with a card in mid range and especially so with that card, you'll want to upgrade faster, anything in the 1000 series would have seemed like a big upgrade.
@@tobeymaguire647 yeah it can be + I remember that radeon 6000 series were uber expensive when they were released because of covid and chip shortage. I remember I paid 1650 EUR for brand new 6900XT. If prices were normal, I believe this graph could have changed
@@tobeymaguire647a big reason for why nvidia is so dominant is also the fact that nvidia has pretty much zero competition in the laptop gpu market, I couldn't find a single laptop in my country that has an amd gpu.
@@tobeymaguire647Amd has less factories all over the world resulting in higher tax in more country/market, making it lose its strongest point, price. And honestly people won't care if they lose 5fps when they upgrade from a 2060 or 1060 to 4060 or even 4070 instead of an rx7800xt. It is just so ideal with their lower power consumption, while being of the same price if not cheaper than amd outside of the US.
Getting an amd cpu is better right now, if you have a goal in mind for your specific build, amd gpu can fill that gap. But the majority of pc are built for games and light work, how many people do you think need 24gb of vram 😅. So sure, they are better in some cases, always better in raw power, but just more hassle for what an average person needs that they can find in nvidia.
Love this VIdeo, watched from beginning to the End. SIck music choice makes it so EPIC ! My GPUS: ...4850HD -> GTX670 Windforce -> GTX 1060 Gainward Phoenix Edition -> (upgrade soon). 👍
They yap bcz you get much better bang for buck with AMD but you cant un-stupid people buying gpus so thats that... As for "other" its probably integrated amd gpus, but not sure
@@rogoznicafc9672 I got the rx 6800 and it is incredible now that they released fsr 3 into cp 2077 i play ultra rt 1440p high settings and get 60-80 fps and the drivers are flawless people are living in the past forsure, the card only cost 370 new currently.
@@tobeymaguire647 I might be completely wrong here, but I think AMD makes up a much larger proportion of NEW sales than the steam charts would lead you to believe. The nvidia stats are just boosted by older gpus being used in internet cafes in China.
All those entry level people at the top are going to be real disappointed as games continue to be increasingly more demanding the way they have been lately.
Things like cell phones, TVs, computers, etc, you are always upgrading or replacing. So, why complain? Cell phones cost $1,000+ now. You either pay off 75% of thar price over a 2-year contract, or you pay it all up front. So, why complain about paying $1,200 every 2-3 years for a graphics card? (which will being you MANY MORE hours of fun).
@@OmnianMIU Yes. "Radeon (other)" was used to represent the sum of all the AMD Radeon GPUs, including the RDNA 1-3 ones such as RX 5700XT, RX 6800XT, RX 7700XT and so on.
There just aren't many people that would rather miss out on mainstream features that Nvidia cards offer just to save a couple of bucks. It almost makes NO sense to buy an AMD/ATi card unless the hate in your heart consumes you and you despise Nvidia. Ie: gaming performance of a 7900XTX vs a 4080 SUPER. It is $50 cheaper to get a 4080S, and it is 17% more powerful in gaming than the 7900XTX. Yet, some people still hang on for dear life to the AMD card just in spite of Nvidia.
@@Twitch_Moderator Honestly what you have written here is absolute lie. Atleast in my country 7900XTX is at least 30% cheaper than 4080 super - 1000€ vs 1300€ and it's actually a little bit faster than 4080 super, I really don't know what leads people like you to lie like that, in the long run lies like this hurts ALL customers, since as we know, many people will straight up believe everything they read. Oh and in Raytracing it IS slower than 4080 super, for sure, however it's still at a 4070 Ti level, which isn't too bad. Also what mainstream feautures are you talking about? FSR and XEss is very good, FG2 is also very good. Only one thing thats missing is CUDA, but thats hardly a mainstream technology for gamers.
This is actually really insane. My lens into what cards were so popular did not reflect AT ALL what was actually being used. The 1080ti was hardly on the board. I thought that card would STILL be somewhere on the chart even today, but it was practically irrelevant compared to the renown it's gotten over the years. This chart was dominated by budget cards. Guess I can't be surprised
I’m actually surprised that Nvidia has that much more of a market share. Amd is really good. Hopefully with this next launch, they get a bit more attention. Their CPUs are doing really well though.
I bought the GTX970 in 2015. Still works fine in the spare bedroom in the system I leave in there for when friends stay 9 years later. I used it actively playing just about everything until the RTX 3060 12GB came out in 2022 which is my current board. So I guess I made the right choices :)
No, other isn't RDNA based gpus, because if you go on steam gpu charts, every single RDNA gpu has right name like rx 6700xt. I don't know what is "radeon (other)" 😅
Yes, they did. The reason they are represented as "Radeon (other)" is because their individual GPUs wouldn't even be on the chart due to low market share, so they were all summed up in one category. Just look at the market share.
@@definitelyfunatparties you must be sick in the head if you do that. last time i bought new phone was 7 years ago, and new ones arent almost any better
i'm not understand these nVidias hype... they drivers is the worst in functionality ... :( After somebody use an AMD card, will notice how mutch better the AMD cards control panel. The drivel stability is "just fine" (not worse than nvidia in these point... around a same)
It's because most people don't care about actual facts and just believe whatever their uninformed friends tell them, so they end up with a much more expensive card with features they most likely are not gonna use. It's just ignorant people echoing misinformation that they read on reddit and they have for some reason made themselves experts in Amd drivers even tho they never owned a Radeon GPU. You can easily tell by their replies, cause they don't know what they are talking about.
@@NulJernWhat you said is correct but another big reason for why nvidia is so dominant in the gpu market is the fact that they have no competition in the gaming laptop market.
Nope. Nvidia's gpus are better all around. You talking about the layout of the driver settings of nvidia driver, there's the new app now. Please the stability of the driver AMD is garbage, and not mentioning the update suppprt for the drivers! 2 or 3 drivers during the year for radeon. For nvidia gpus there's 2 driver versions per month 🤡
@@OmnianMIU 2 or 3 driver per year for a radeon? wtf ... stable version it is come in almost every month (very rare, when not come a new driver in a month, beta come even more often, but i thing "not worth it"), lol ... but.. mostly i replace driver twice a year, because not neccessary to replace.. just works. will be verry bad, if need to replace to often a driver. You definetly not used a radeon before... this "driver often driver release for nvidia excuse was around a 20-25 years ago to... it didn't make much sense then either". Of course, S3, Intel driver release schedule was bad.. but Radeon (in ATI time) it wasn't a problem back then to. And .. those different app.. just why use a different settings program in nvidia for game profiling? Just "to junk" solution.
man seeing the 580x making it up there was gr8 ive had this card for over 5 years running games in 1440p and it still kicking hard amazing gpu from amd
Video Card Manufacturers: "Millions for R&D, but not one cent for making sure the video card driver reports an accurate product name."
I did notice that since around 2022 we started to see a few spicier budget cards in the top spots. (3070, and 3060ti)
Before that the only on notable cards were the 1070, 1070ti, 1080, and 1080ti. (I wonder if the 'other' Radeon cards were the Vega 56, and Vega 64?)
2014 explains alot in gaming and stale hardware.
@@greenman8 AFAIK "other Radeon" means integrated graphics.
This is a steam issue not a GPU manufacturer issue.
VulkanInfo and DXdiag have no issues in identifying the cards which means that the information is there and it is accessible.
You cannot program for either Direct3d or Vulkan without Identifying the GPU first.
The fact you dont know the difference between an ATI card and an AMD card is kinda dumb.
Gtx 1060 getting the lead in jun 2017 and only being surpassed in aug 2022 (5 whole years and 2 generations later) just proves how stagnant the "budget" market has become.
Me who's been waiting for 7 years for a new killer budget GPU like the 1060 to pop up so, I can finally upgrade 🤡
@@tobeymaguire647 rx 6600 exists
@@tobeymaguire647 7900 GRE is probably what I'd pick if I had the choice now. I bought the 7800 XT when it launched and regret not waiting a little while longer, but nobody can see the future.
@tobeymaguire647 your choices are intel arc, last gen amd gpus, or used gpus 💀
also 10 series was just the goat. Every tier had great value gpu.
10xx was just too good price/perf and Nvidia will never make the same mistake again.
+
Pascal was so damn fast everyone overlooked the fact that NVIDIA started to gimp the VRAM that generation.
They got profit and they got tons of sales. i would not call that a mistake
The mistake for them was charging “too little” for the generation uplift of the 10 series in terms of power and efficiency.
Nvidia learned from the pandemic, that people are stupid enough to pay exorbitant prices for even their lower end cards.
It was back then when AMD was a serious competition in the GPU market share. After that moment AMD started going down especially when the hype over ray tracing became a thing and the crypto boom also kicked in so Nvidia had no reason to price their GPUs as good as the 10 series.
"Why are you so hyped?"
"I watched a statistic video about GPU uses!"
halo music thats why
+1
*with halo music
TBF, it was a pretty dynamic set of bars.
bro gtx 1060 just destroyed every thing
My first card on a $1000 pc prebuilt
@@TheChcam im still using it with r7 7800x3d XD
Wtf why@@dawiddrzewosz2792
i was still using gtx1060 3 months ago, and only reason i had to change was because it broke.
@@dawiddrzewosz2792 building my pc tomorrow and i got my beloved 1070 with a r7 7800x3d LMAO its prob the time to upgrade it aswell
gotta love the consistent mass of people playing on integrated graphics keeping 4000 intel on the chart for *decades*
Can't argue with free.
Haha, that was me for the longest time, didn't need much to run TTD and OpenRCT, wasn't much point upgrading until civ 5 and cities skylines came along.
Wouldn`t call it playing. It`s more like using.
The stats include installs, not necessarily people playing. Every laptop or work PC or garbage used to just open Steam and look at Friends or something counts to these values. In the current day probably only 2/3 of the data points are even representative, maybe less.
*I still have my 4790K 4 core/8 thread. It has been running for 10 years at 4.6GHz on a customized air cooler. 78 CFM and 3.8mm H²O 120mm fan.*
*The 4600 HD iGPU in that CPU was always junk. It is a 720p Low settings graphics processor. You can't play any AAA title at 1080p whatsoever. Unless you lock the game to 10 or 15 fps.*
Maybe AMD not competing with Nvidia for the super GPU in 2025 is a good thing.... I mean this survey clearly shows a massive market for budget and entry level GPUs. Nvidia setting the price of new GPUs at $600+ has left one big gap for AMD to fill. I really hope it works out
The only problem is that Intel is aiming for the same price category, so it won't be easy either.
@@Anubis1993KZintel has been in quite the shithole for the last couple months because of the instability issues (not sure if theyve been resolved yet) and amd has more of a good reputation for gaming, especially now that a ton of tech youtubers are starting to talk about the price to performance on amd gpus, im sure amd can dominate this market relatively easily
@@seplol If not, then surely the more competition the better?
AMD respond with 4 gpus of Radeon 8000 series priced competitively as follows: 599, 549, 499 and 399USD.
oh come on, AMD proved many times they don't care about screwing the consumer and will be just as bad as NVIDIA with prices.
When all TH-camrs were talking about 4090 all gamers were playing on xx60 cards... especially 3060.... and in the end fastest GPU in top 20 was 4070 Ti with 1.25%...
Anyway... I love the time when 8800 was on top... it was one on the best era of gaming (at least in my opinion)
0.80% PC gamers play games on RTX4090 I'm happy that I'm in this percent!
Its like all car reviews its all about high end, never what prople can actually offord or is popular
@@ttwinsturboeveryone wants to be in that present but it so expensive and thats why the slow/old ones have better sales
it's because nobody needs those cards. those cards are total overkill and not worth the money.
Plus they swallow more power / watts than the rest of the house combined
@@svingarm9283 ugb really...even 4070 struggles to run ue5, now every other game is made in ue5. You tell me if i need something that can actually acheve 1440p high on most ue5 titles. And anything 3060 is not unless you play 720p low
0:47 ATI: “I used to roll the dice”
Sweep the streets I used to own...
Bro. I still have, use and love my 6 year old 1060 6GB. 💀
She may be old (late 2018) but once I finally upgrade I'm going to put my baby in a display case close to me so I never abandon her like how she never abandoned me.
Thats a nice idea, i might steal that idea :D
Still running a Gainward 1060 Golden Phoenix, not only did it serve me for so many years, the Gainward also looks sick.
i went from a gtx470 , gtx980, rtx2080. i'm wondering if it might be worth it to get some one to mod my 2080 from 8gigs to 16 gigs memory if the price is right, since there are more people out there on TH-cam that can do that sort of thing
@@daxconnell7661If you have a 2080, just leave it as it is, that card will still be fine for a few more years.
it*
same my 1060 is 8 years old
never bothered to upgrade but imma go all out and get the 5090 next year which will hopefully get me through the next decade
3060 was a decent deal but why the 4060 is exploding in usage is mindboggling, this card is so overpriced for basically no performance uplift
it has 120 watt power consumption.. and power a lil better than a 250watt 1080ti bro
Pretty sure this chart includes the laptop and desktop gpu's as one, if so it makes sense why the 4060 is exploding in popularity as the 4060 laptop is actually amazing for it's price unlike the desktop model, I would know since I have 4060 laptop lol.
@@VoornePuttenBenchmarks The mobile 4060 is good compared to other laptop GPUs like the 4070 mobile which offers very little performance gains for a lot more money, but price to performance wise it is on par with the desktop 4060 (it is just that there are better desktop options).
I got it in new PC and it was 4gen jump for me. I actually know 0 people who change their xx60 card to nextgen 60 card at all, and i think it all dudes building new PCs after 6-8 years of using 1050/1060
@@Snow.2040It can't be on par with desktop 4060 just because desktop 4060 uses 175W, and no laptop could provide such power to a chip at all.
Maybe ~65% of desktop version but not on parr
The sheer dominance the gtx 1060 held for years was amazing
cuz of mining
@@KeizashiAcidRain This is from the steam hardware survey so its not because of mining.
@@Volkze But the crypto mining period made the GPu prices skyrocket. This cause people to hold on their old GPUs for longer and buy used ones, hence why the GTX 1060 retained its dominance for far longer than it would have been expected.
Dominance in SALES. But in nothing else. RX580 was faster back then by 5% and the AMD card supported Vulkan and DX12, which the 1060 did not. The AMD card also cost less.
So the fact that 800% more people bought the 1060 just shows Nvidia how STUPID PC gamers are, which is why they continually increase their prices and lower their generational gains. Doesn't matter, people will buy their media centre 4060 for $400 regardless. (WHICH THEY DID!).
Incidentally, the RX580 beats the 1060 today by over 100% in many titles. Like ... DOUBLE the framerate. Insane.
@@TheVanillatech The RX 580 is a 185W card. The 1060 is a 120W card.
3:19 GTX 970: Out of the damn way!
Still rocking with it even 10 years later! 🤣 and I knew for many ppl like myself it was perfect match with price and performance, which is no surpise so many ppl upgraded their GPUs to 970 back then
@@Balnazzardi the 970 was a steal in 2014, cheaper and much more powerful than a PS4 which was only a year old. I finally upgraded to a 4070 last year.
@@ROCKSTAR3291 ye I was really on edge of upgrading to 4070S this year, already was thinking of 4060Ti last yeae but then I thought perhaps I just wait till next year until Nvidia has revealed price and performance of 5000 series.
I mean most likely RTX 4070 Super would last in my use as long as 970 did but should RTX 5070 be 16GB Vram GPU, it would be even more "future proof", especially if I would go from 1080p monitor to 1440p
I dont play newest/most system heavy games, so I told myself "hold your horses, you can wait till 2025 and then make the decision" 😂
Remember gamers complaining about 3.5gb even though it was an amazing card? Funny how gamers are just as stupid today as they were back then.
Memories man, was a great GPU
Comments: wow 3060 was huge.
Someone who knows GPUs: Ah that old scam where 2 cards are labeled as the same card.
The 970 was also a huge scam with the 3.5 GB of fast memory and 0.5 GB of slow memory.
@@Maboidatboi yep, the r9 390 8GB at the time which was the same price was a much better choice.
You also get the 1060 3GB and 1060 6GB, RTX 3050 8GB and RTX 3050 6GB, RTX 4070 TI 12GB and RTX 4070 TI super 16GB, GT 1030 GDDR5 and GT 1030 GDDR3, RTX 4070 GDDR6X and RTX 4070 GDDR6, etc.
@@BladeCrewso the reason i made this comment, is the GPU in the 1060s are different, performance is massively different between the 3GB and 6GB, because again different GPU and not the same card with only different VRAM. It was designed to make people spend more to think they are getting the performance of the 1060 when buying the 3GB model.
Yeah the 3060 was basically a 2060.
It's cool how it's generally centers around the 60-class cards with both higher and lower end cards tapering off around it. The 970 is an exception because I remember those selling for less than a 1060 for similar performance at the start of the 10-series.
970 and 750ti cards dominated that era. I still remember the 750ti being the best budget GPU coupled with a fx 6300 you'd have a $600 build that would play almost anything. Sad that the minimum to pay now is over a $1000 and you still don't get "good" performance on any new game.
@@sampic_ Indeed, modern AAA games are horribly optimized and they now use things like DLSS as a crutch to make them playable (but only on the latest cards of course).
Most people consider mid range, they want to invest not too much or too little, not to think too much on what to choose
What is an average reasonable stuff they can get
So entry and high end have their own customers
That will never happen again. They make sure there are no cheap last gen GPUs left when they launch a new series. Meanwhile, AMD kept the RX6600 in production for 3 years. I think this is why they put 16GB on the 7600XT - otherwise there would be no reason to buy a current gen budget AMD GPU.
@@Lurch-Bot Yeah, and for many people who thought extra VRAM without enough power is useless, the 16GB's unlock room for AMD Frame Generation, 8GB cards, can't do that without VRAM Spillover or crash, 10 is also risky/1080p
12GB is fine for 1080p+FG
Scale that up
Almost half of the top 20 cards are a xx60 model. They also hold over a third of the total market in pure numbers.
Had a 1050Ti from 2018-2023. Never realized it was one of the most used GPUs at the time
same! it was truly a beast
It was a good balance between price and performance - especially for 1080p
It was my crush gpu from 2017 to 2019. Got a 1660 ti laptop in 2020, and built my current 3090 pc in 2021
if my 1080 didn't break on me it would've stayed till 2030
Mine still works just needs an undervolt
Finally retired my 1080ti since it kept crashing on me even with underclock.
I got one off a buddy that "bricked" it and just reflashed the stock bios myself. Basically free, $100. That's about how much i paid for my RX570 (8gb).
I have a 6700xt in my rig now but it genuinely isn't that big of a difference in raw raster performance.
When you start introducing RT or just lots of shader effects, etc. You can definitely tell the difference though.
Still nice to see the gpu i use is still up there which is the rx 580
I rocked the 480 for many years until I was old enough to work
this is a great card!!
Defenitely one of my favorites of AMD and still relevant if u use FSR in modern games
Unfortunately it is on its last breadth.
@@DragonOfTheMortalKombat I mean not really... It can still play the newest games just fine.. even call of duty can run high fps
Just upgraded my 580 to a 7700xt
Soo many unfortunate people in 2014 with intel hd 4000
did something happen at that time? like gpu price skyrocketed in 2014
@@reinn-df2ti intel hd 4000 is a notebook integrated graphic, in early 2010s a lot of people switched to laptops as their main PC
Hardware survey checks if the computer has the graphics installed, not if it's being used, so the number is probably less
In 2014 I have r9 270x lol
I will add to this thread that dual Gou laptops were a trend. I myself had one that switched between nVidia GPU and Intel hd graphics depending on settings (like games using the nVidia one but it being turned off for normal work)
Intel HD graphics really surprised me, it even dominate longer than AMD.
There has been a pre-Ryzen period during which Intel has bribed laptop and prebuilt manufacturers to not use AMD CPUs, hence the reason Intel has dominated AMD in terms of integrated graphics for quite a while.
Source? @@Hardcore_Remixer
@@Josh-cw8by YT won't let me pose links in comments.
Just google for "intel loses us bill on manufacturers bribe" or for "intel bribes manufacturers". You'll get a lot of results.
im super surprised that integrated graphics hold the crown for so long
@@Hardcore_Remixer
Here lies a shitty AMD fanboi
Gtx 970 lasted me 8 years for just 350€ it was crazy
Im still rocking with it to this very day 😂 but granted I dont play the latest games aside from few exceptions anymore and when I do they arent the most demanding of games, like Jagged Alliance 3 for example. But ye I remember how many ppl ridiculed it for being "just 3.5GB GPU and not true 4GB"....well as we all know it offered excellent value for your money and the 3.5GB wasnt really that big of issue. Now the question for me is whether I upgrade to 12GB VRAM GPU or 16GB VRAM GPU next year. I guess I wait and see what the specs and prices are for low and mid tier RTX 5000 GPUs will be. I know its either 4070S or 5070 for me
steam hardware survey is a great view into what OEM's are selling the most of when you look at the top spot.
I'm afraid the lack of competition is the reason why the market is so crappy from the customer perspective right now.
Yes, people should stop going for "their team" and just buy the card that is best for them, no matter if it is green, red or blue. You will also still find people that think you can not buy a AMD card, because the drivers are bad and it runs to hot. They still live in 2014.
Gtx 750 ti: “ look look im winning!”
Gtx 1060: “hold my beer”
It still kinda pisses me off Nvidia has such a dominance in the market. Like sure their cards are good. This is an undeniable fact. But as a business, Nvidia is just as rancid if not more so than Apple. At least AMD tries to not fuck over it's own users every generation.
yep, im trying out the AMD cards (it comes tomorrow) for a bit with the 7700xt before the 5k series comes out. Who knows i might like it enough to stay with AMD (replacing a 2070 Super)
@@NoxNtella I hope you have a good time! I personally love my 7900xtx despite a few hiccups with Helldiver 2 and Space Marine 2, both of which had work arounds and now currently work just fine. The only major issue I can say I had was with the temperatures. I'm not sure if this is an xtx exclusive issue but I found that the factory applied paste was pushed out to the edges of the die. Personally I chose to use PTM7950 and I haven't had an issue since.
Aside from that I can say that the performance is fantastic and the Adrenalin software is tough to beat.
@@notsogrand2837 the software looks really good and can't wait to try it
@@NoxNtellaI much prefer AMD's adrenalin software than geforce . It's so much more user friendly. Can literally just type in the search the particular setting and bam. Also just better features in general imo. Anti-lag, enhanced sync, VSR(super resolution) which is better than Nvidia's (better performance gains)
If you ever want to use scaling in a game. Don't use dynamic scaling or any of that nonsense. Use the in-house VSR from AMD.
Example. 1440p monitor but just barely off 60fps (or whatever desired fps) in a demanding game?
Activate VSR, target 1440p, set in game to 1080p.
You will get better performance than using any other upscaling or rendering bs.
Also you barely lose any visual fidelity. I could barely tell the difference between native and VSR. There is a slight one but you only notice if your standing still and REALLY looking.
If you are confused, just remember Nvidia GPUs own the laptop market.
But in the survey they will put rtx 3060 laptop. But if desktop they will only put rtx 3060.
Just accept it that Nvidia outsells AMD big time.
@@adlibconstitution1609 that is absolutly true and thats why they will continue to do same shit to customers like they did this generation and make overpriced flagship card and castrate the rest and sell them for some funny $/fps ratio while paying game publishers to focus on DLS while doing basicly no optimalization to their games :)
Very typical AMD fanboy-cope.
I see no point in coping... Nvidia is just the leader in GPU marketshare. That's not because of the laptop market.
@@adlibconstitution1609 That is not true, there is not a single "laptop" named gpu in charts, because nvidia's mobile gpus is reported as same as desktop. For example, RTX 4050 and RTX 4050 mobile is same AD107, only with different frequencies.
P.S. Not fanboy of anything, i have mainly intel igpu's and my last used dedicated GPU was GeForce4 Ti 4200 in pre-steam times :)
Time traveler kicks a rock :
Ryzen 4070 most popular gpu in 2024
not even close
Bruh, as a ZTT subscriber I can 4070% confirm this 🤣
It feels like yesterday and I'm baffled by the fact that 770 was used so little. I swear to god my ASUS 770 lasted an eternity with satisfactory performance before making way to mighty 1080ti. Best buy ever.
My experience was the same! Great cards. I got the 4GB 770 in June 2013 brand new. I used it until December 2018, when I got the 1080Ti. In July, I just gave 1080Ti to my wife and upgraded to a new RTX 4080 Super by trading some GPUs to a guy on FB marketplace. Other than concerns around 16GB being good long term, I don't regret any of these transactions.
@@bjyoungblood We bought same cards in almost same time frames. I too bought gtx770 in 2023 and 1080ti in 2018. Except I went for a RX 7900XT lol. No regrets here either.
@@bjyoungblood lmao concerns with 16 gb... 😆
Got the gtx 770 2gb gifted to me in february 2014. 2nd best gpu at the time after gtx 780. Then witcher 3 released next year and i could barely play with low-medium settings. Never mind high or ultra, or hairworks. The gpu was basically worthless after just 1 year. Dark souls 3 released, same shit. Playing with mix of low and medium settings, yikes. Idk how much it costed cuz it was a gift, but prob 300-400.
The GTX 970 was my 1st PC build ever. It will always be a great nostalgic memory. What great times🥲
Gtx 1060 walked so 3060 could run
the suspense was killing me with how the music was building up
The halo music was perfect for this vid 👌
I just love how the nVidia 10x0 series just wiped the floor with everything and managed to stay relevant for so many years. Hell, the 1080Ti still is a great card for most of todays games.
i'm amazed how low the 1070 in comparison is
the value of power to price was insane
So it was with GTX 970 already and it actually managed to (rightfully) claim the number 1 spot and hold it for some time. Im still rocking with it to this day 😂. Now the matter of question is to which to upgrade finally next year...RTX 4070S or to the upcoming 5070. I guess in January Ill make my decision when Nvidia reveals the details for 5000 series
@@Balnazzardi how do you manage the last 0.5 GB being a perma slow?
btw, my bet is that the 5070 will be 700$ and be as fast as the 4070ti
@@sanji663 Never caused any issues for me, atleast the kind of that would have made me curse about that "issue"...in my honest opinion the whole "its only 3.5GB GPU" was lot of noice for nothing. Then again I never really played really system heavy games with it to begin with, so it suited my needs all these years just fine. Anyhow I hope Nnvidia has enough sense to give 5070 better price-performance value than that, but if not then 4070S it is for me
I was riveted by the highs and lows of the Radeon (Other). Struggling to stay afloat over the years, the mysterious collective of GPUs working together to blemish the sea of green kept me rooting for them, despite knowing the outcome. Bless their souls! Don't drown yet!
it's me and my 1050ti since 2018, this baby is still holding on.
Admit it It's a dead card. I have a 1050 ti since 2018 too. It can't play Starfield, Bodycam, Hogwarts Legacy. I will buy new system probably Rtx 4060 or 4070 super or 7800 xt 7900 gre I'm not sure for now but I'm sure I will buy new gaming computer
@@ene4602why would you want to play those games anyway.
@@ene4602You sound like a quitter.
@@drumyogi9281 couldn't be me you toilet sitter
@@theancientone1616 You can even have gt 710 or r7 240 with this brain what a logic
3:19, my beloved GTX 970 appears! Still rocking with it even after a decade 🤣 Next year gonna finally have to upgrade though, whether its to RTX 4070S or upcoming 5070, I dont yet know. But GTX 970 has served me so well these past 10 years, the last more recent game I played with it was Jagged Alliance 3, released last year. Ofc its starting to show its age, but still managed to play the less demanding games even 10 years later 😄
I’ve spotted one thing… current gen mid range GPU is using between 200 and 300 wats when 10-15 years ago that amount of wats was for top range GPUs…
Another thing, old mid range GPUs like 760-1060 were much more powerful in compare to top range than todays GPUs who’s more like old low end… stagnation and power consumption nowadays are insane.
200 and 300? There is no midrange GPU that consumes 300 watts of power, unless you count rtx4070ti as mid range and even then that one only consumes 285watt, if anything RTX40 midrange like 4060 and 4060ti are very good with power effeciency
@@LampSteak 4060 series (Ti included) looks like it should be called 4050 / 4050 Ti. About 200-300W for current mid range, I was talking around that, not specific like for example 299W is midrange, 300W is flagship, 301W is titan class... My RX 7800 XT should consume 263W but it's overclocked version and it's around 280-290W. RTX 4070 series (in my opinion should be called 4060) is very efficient but it's from 200W and above and overclocked 4070 Ti Super can easily go above 285W co still they are in Wat range (200-300).
Those are current mid range cards. They are 2-3 slots constructions, 10-15 years ago they easily could be in size and power consumption of flagship constructions.
ps. Some 7900 GRE and 4070 Ti Super, overclocked versions, can consume even 300-330W in some benchmarks or for short bursts in games when needed.
@@WujoFeferanyway 4060 is at least "fine" and ok, and noone should be shamed for getting one of those. Still "midrange", and xx50 cards are just cut
@@isadora-6th no one should be shamed because of what he have in PC however companies like Nvidia should be shamed for what they are trying to do. In this gen distance in performance between top 4090 and xx60s class cards is the biggest. Even hardware unbox did video about evolution performance in different range GPUs through different generations. In their video we can saw why 1060 was so much success and 4060 is often called 4050 class cards. Even Nvidia tried selling current 4070Ti (12GB) as 4080… they are trying to make gap between different GPUs class wider so they are renaming slower cards and trying to sell them as more powerful ones.
@@WujoFefer my 7900xtx hit 425 watts
Me with 750ti 😎
I still have my 1060 collecting dust you did a great job bud served me well
Nice video dude. The music was on point
Makes me wonder who these "other" are. Matrox? S3 Graphics? 3DFX?
SiS Mirage and a couple S3 products had 0.2% ~ 0.5% shares that died off by 2020.
@@RyanBGSTL who
that one mf running dwarf fortress on a voodoo card
@@RyanBGSTLI can't believe anyone would buy an SiS or an S3 graphics card or use their integrated graphics after the 1990s. Wow!!!
A VooDoo Banshee or higher was a great card in those days.
I still remember my first 2D graphics card. Then I bought the 3D accelerator that I had to patch in via cable. Now you just plug your card in and it's done. There's no fun to it. LoL!!
@@Twitch_Moderator these Chips were built into SiS and Via chipsets in the early to mid 2000's. They probably fell off when Steam dropped support for Windows XP.
the GTX 1060 6GB was genuinely good, it's not just because of mining.
i don't use mine anymore but i gave it away and its still able to play badly optimized games like Hogwarts Legacy on a 1050p monitor (and 1080p for any other game), it was just that good.
the Pascal entire generation was the biggest jump of price/performances that ever existed (i'm talking about the entire gen, not just one GPU like the 4090 is).
since then the market is stagnant, NVIDIA doesn't give enough VRAM, their "budget" GPU's just recently went back to the normal price (accounting for inflation) but everything from a 70 to a 80 is severely overinflated, and the 90 is now an AI GPU that's not even meant for gamers anymore.
oh and AMD still doesn't exist because instead of selling cheaper GPU's, they went the greedy way like NVIDIA by following their prices very closely but unlike them AMD doesn't have good tech behind their GPU's and now they announced they're stopping the high budget GPU's.
Kind of puts Nvidia's dominance into perspective...
Radeon HD 4xxx oh boi, the inferno 100 degrees and Ati drivers era.
i built my first pc with a sapphire hd 6870 in 2012, so nostalgic to see it in the list
How was the life span of the Sapphire GPU ?
@@ramgopalraikwar1420 I stopped using it in 2017 when I bought a new pc, but its still working actually
And what did we all learn? No 90 series card from NVIDIA in sight and the only 80 series on the board was the 3080 at a whopping 1.84% market share (7:27).
Judging by the amount of hype the new generations get on TH-cam you'd think everyone would be rushing to buy them, just like with aaaaaaall the leaks going around with the 5090!
People get hyped. But that doesn't mean that everyone prioritizes $1,200 or more to just a video card.
Maybe most people like to travel, smoke, drink, party, finance new vehicles, etc.
5090 will be a beast 🎉
1080 got up there
What's sad is how even the rx4/5xx were never even close. I thought Polaris sold well. Lol!
Only guy I knew using 3090 only does flight sim, most people gaming setup are middle-low specs, people aren't willy-nilly buy new GPU just because its new, most of us are broke AF.
Most of us have families to support and can not be buying 4090's even if I could afford an rtx4090 I would still not buy it. I am sticking to AMD cause I prefer using 1 brand for my pc when it comes to gpus and cpus 🥰 no fanboy here sorry hehe...am looking forward to the rx5000 series of cards then upgrade from my Sapphire nitro + special edition to an rx8700xt or whatever the cards will be called then @@Setupthemabomb
good music choice
1050 ti enjoyer here, had no clue back in the days but what a solid product!
I still use it to play most game at 30-60 fps with low to medium graphic right now. Just OC it to 1823MHZ and undervolt a lil bit and see the magical happen
My GTX 1060 6GB is still running perfectly fine after 7.58 years. I paid 287 euros for it back in 2017
I rocked mine from 2016 to late 2023.
God-like card, honestly a legendary piece of hardware.
the RX 580 just refuses to die. i'm still rocking it in my rig. thinking of upgrading into rx 6600 or a 3060
I upgraded to a 6600xt
6750xt is amazing price to performance
Still rocking my 970, newer rigs have 3080ti and 4090, super cool charting.
Nice to see my 1660 Super still hanging in there. 😊 But it's probably a time upgrade soon. If I played any new games that is. 😅
I had two 7600 GS in SLI with my Core 2 Duo. Then I got the 8800 GTS LeadTek 320 MB. I paid around $320 for it back in 2007 I think.
2013 : radeon HD79xx
2023 : radeon 7900xtx
The one AMD enthusiast in the comment section that I had searched for.😅 Cheers! Let Radeon be alive in 2033! 🙏🙏
I alway bought AMD because its was cheaper most of the time
ATI Radeon 9600 Pro 65€
ATI Radeon X1600 AGP 125€
ATI Radeon X1650 PCIe 100€
ATI Radeon HD 2900 350€
ATI Radeon HD 4870X2 400€
ATI Radeon 5850 mobile
ATI Radeon 5870 320€
AMD Radeon 7970 440€
AMD RX 580 200€
AMD Vega 64 250€
AMD RX 6900XT 910€
I expect the next GTX 5090 to be more than 3k$, which is more than all my card I ever owned in my entire life, let this sink for a minute
@@arthurbesnard1536 you keep track of your computer purchases.. oooof 9600 Pro 65€ these were the days. seems surreal nowadays to pay this little... reminds me of the day when i upgraded the gpu on my parents PC for a geforce 4 card and the pc didnt start. spent 50 € or so for it back then. O.o
The difference was my HD 7950 was a _lot_ less expensive...
I upgraded the 2080 to a 7800XT for half the price of a 4070, and I'm not looking back. It uses as much power as the 2080, it's cool, it gave me 50% extra performance and it doesn't require it's own nuclear reactor to run. Although I'd have gone for the 7900GRE if 7800XT wasn't such a steal.
Enjoying games in native 6k with 7900 xtx. Love the fact you can have high end performance for under $1000 today.
april fools was a few months ago dude
Everyone had the gtx 8800 and it was the flagship card. Imagine everyone being able to buy a 4080 now
That's because the 8800 had like 10 variants at different prices
And they didn't use "GTX" they used GT, GTS after the number
I'm proud of my 3060 and 1050 ti
Radeon HD 5850 and 5870 the GOATs
I am so lucky to have hd5850. I knew recently about it from tech power up. Hd4870 1g was nice and hd5850 cypress must be a goat. Double cores double performance.
@@airmicrobe Had a Sapphire Radeon HD 5850 Toxic myself. Great overclocker and probably the most unique looking card I've ever owned.
@@beirch i had a r9 290x which is basiclay the same card it was a beast
Wow this just showed me so much and put so much into prospective. I'm shocked the 1080ti never made it to the top of the list or that none of the 90 series has ever made it to this list. It's amazing Intel once held the leading position!!! But what looked like Intel GPUs being popular was solely because this was divided by model types, Nvidia still at that time had most of the overall market despite Intel having the most popular model. The Radeon other you see at the end is likely all the new handhelds. Just some really interesting stuff can be learned from studying this.
Total leather jacket dominance.
note that prior to 2010 ... the top card for the longest time was nivida's high end card... around 2010 -2012 this started change when nivida really started jacking their prices regularly every generation.
It was 599 USD for the 8800 GTX (2006). It was 499 USD for the GTX 285 (2009) and a GTX 480 (2010) and a GTX 680 (2012).
A GTX 980 (2014) went for 549 USD and a GTX 1080 (2016) went for 599 USD.
So, an Artx 3080 went for 699 USD in 2020. That is a great price.
So, the prices stayed super low and constant for MANY ages. The only time prices sky-rocketed was the 4000 series. When performance jumps were finally above 25%-36% rasterization.
@@Twitch_Moderator the only thing consistant with their prices was that their top end got 50 to 100 bucks more expensive every other generation , inflation didn't move that fast. i lived it dude i saw the prices creep sure nivida had a spike witht eh 4000's series but they were consistantly raising prices every other generation of card
the top tier card of 2000 (the geforce 2 ultra) went for 270-320 bucks (about 550-650 today) got to keep in mind the 1080 came out before the inflation boom of bidenomics . and even at 599 it was well above what inflation dicated it should have been in 2017... and the titan .... that thing launched at $999 meaning in 2017 the most top tier card was already 3x more expensive than 2000's top tier card.
the 1080 had nivida actually followed inflation should have been around 350-400 bucks while the titan should have been around 600-650 .
so stop the bull shit . i'm not bashing nvidia hell i got a 4080 super myself . but you can't defend their pricing or make up stories about some kind of consistant pricing . the only thing they have been consistant with is steadily raising prices above what inflation would call for. the 4000 series pricing was a whole nother level of "f*ck you" to the consumer though that i xo agree with you on.
Yeah, as we can see, the vast majority of the GPUs are low to mid range, AMD strategy of stepping off the high end GPUs race is a good decision.
Also the 3060Ti is probably the best GPU on the current market
Man the 1060 was my first GPU in 2018… it was a legend, sold it to my friend upgrade to a 3070, then to a 3080, and lastly a 4090. Times have changed definitely… but that 1060 got me through some really hard times and it worked super well.
just upgraded my 1060 after 8 years to a 7900xtx. the 1060 still runs perfectly just wanted to play newer games with reasonable fps on max settings
Then u should have gotten a 4080 super, big mistake.
@@kerkertrandov459 nope. only idiots think this
@@kerkertrandov459 nope. runs exactly how i want it to. And i also use linux on my pc which nvidia is horrible for
@@specialk9762 linux is horrible for gaming. I spoke to a graphics developer and he told me linux is 15 years behind windows on gaming. While valve and their proton and gamescope is investing lots of money to change that, for the time being linux is very behind. For example, when u play games on linux, whether with vulkan or openGL, there's no flip model presentation on the swap chain, because that's a feature only directX supports (which is proprietary to microsoft and windows), vulkan and opengl dont have an equivalent to the flip model, which is a very big deal.
@@specialk9762 well youtube deleted my comment, whatever. But u should know linux has no flip model.
Radeon HD 48**, 58** and 57** vere definitely cool for their time. I had a Radeon 5770, 5830, 5850 and 5870 on different PCs at home at the time.
I had Radeon HD 5770 as well, the only AMD GPU I ever had actually. Before that I had PC with Geforce 3 Ti 200 and afterwards GTX 970...as you can tell, I dont upgrade my PC that often 🤣 next one is propably gonna be RTX 4070 Super or 5070.
I still have a 1060. Upgraded everythign else in my system, just saving up for a rx7700XT
why would you buy an amd card💀💀
@@potato-km6iwBetter rasterization, better price to performance, better company that's trying to put out products that meets consumers where they're at instead of making demands for more money they don't need. Unless you're getting a 4090 or 4080s, there's no reason to get an nvidia gpu
@@tumultoustortellini April fools was a few months ago
@@potato-km6iw Why are you dodging and taking the piss? Got anything of value to say, or nah
@@tumultoustortelliniEasy own
i still have my gtx 560ti :D when my 3090 died during the gpu crisis i had to pop it back in for like 6 months because i couldnt find anything else.
1080ti i7 3770 and 32ram here! Resistance!
Nice. You'll be able to take that GPU with you if you ever get a new system. 3770 is MAJORLY holding the 1080 ti back 👍
Long live the 3770!
I'm currently abusing a 3770K @ 4.5GHz all-core, and a 3090 Hybrid.
Only 16G ram, but it is RipJaws V @ 8-8-9-24
I7-3770k and GTX 690 here 😂
1. Nvidia absolutely rules the GPU market
2. The Halo soundtrack is so instantly recognizable and so good I could’ve kept watching this for an hour
3. Please give me more of the Steam Survey done like this!
GTX 1060 be like: 🗿
As 6700 xt i dont care to green costume just editing and playable on every software is great to me
2016 until last month i was using a 1060
4:00 sad how many brought a GTX 970 4GB with gimped VRAM(3.5GB) when they could have got a R9 390 with better DX12/vulkan support and 8GB VRAM for the same price.
I was using a 980ti at the time with 6GB of VRAM, that card lasted me a while but even that was struggling with only 6GB of VRAM, i remember thinking surely 6GB VRAM was enough, only when i upgraded to Titan XP i saw games was using over 6GB of VRAM and then it made sense why i'd suddenly get lower FPS in certain areas or games, VRAM is very important and windows doesn't always report its usage correctly.
Idk man. Many of us 970 havers were satisfied owners. Card performed admirably for the average 1080p gamer.
@@pranayarana1958 you'd of been satisfied no matter what you brought because you wouldnt have known better, i was using a 980ti back then with 6GB of VRAM and that was good until it started to hit the VRAM cap so i can only imagine how much worse it would have been with gimped vram, but thats the point with a card in mid range and especially so with that card, you'll want to upgrade faster, anything in the 1000 series would have seemed like a big upgrade.
i am wondering where is 6800 xt. one of the best cards with price/performance ratio
It's because the average person buys Nvidia by default because they don't do any research
@@tobeymaguire647 yeah it can be + I remember that radeon 6000 series were uber expensive when they were released because of covid and chip shortage. I remember I paid 1650 EUR for brand new 6900XT. If prices were normal, I believe this graph could have changed
@@tobeymaguire647a big reason for why nvidia is so dominant is also the fact that nvidia has pretty much zero competition in the laptop gpu market, I couldn't find a single laptop in my country that has an amd gpu.
@@tobeymaguire647Amd has less factories all over the world resulting in higher tax in more country/market, making it lose its strongest point, price. And honestly people won't care if they lose 5fps when they upgrade from a 2060 or 1060 to 4060 or even 4070 instead of an rx7800xt. It is just so ideal with their lower power consumption, while being of the same price if not cheaper than amd outside of the US.
Getting an amd cpu is better right now, if you have a goal in mind for your specific build, amd gpu can fill that gap. But the majority of pc are built for games and light work, how many people do you think need 24gb of vram 😅.
So sure, they are better in some cases, always better in raw power, but just more hassle for what an average person needs that they can find in nvidia.
Love this VIdeo, watched from beginning to the End.
SIck music choice makes it so EPIC !
My GPUS: ...4850HD -> GTX670 Windforce -> GTX 1060 Gainward Phoenix Edition -> (upgrade soon). 👍
Most of the 4060 gpus are likely from laptops since they have great deals
The laptop version of the 4060 is quite different from the real 4060. Grouping them messed up the statistics quite a bit
currently 4060 desktop has 4.44% 4060 laptop has 4.24%
@@tobeymaguire647Isn't the desktop 4060 like 2% faster than a laptop 4070??
@@falcon_224according to techpowerup, 4060 is almost 20% faster acually
The desktop version of the 4060 is the same as the laptop version since they both share the exact same chip.
If only people understood then gems found in used gpu market.. instead they pay more for worse performance.
With how much tech youtubers yap about AMD, i expected more, and certainly not the RDNAs 1 through 3 being grouped into "other"
They yap bcz you get much better bang for buck with AMD but you cant un-stupid people buying gpus so thats that... As for "other" its probably integrated amd gpus, but not sure
@@rogoznicafc9672 I got the rx 6800 and it is incredible now that they released fsr 3 into cp 2077 i play ultra rt 1440p high settings and get 60-80 fps and the drivers are flawless people are living in the past forsure, the card only cost 370 new currently.
AMD is for people who know what they're doing. Nvidia is for sheep 🐑
According to your stupid ass
The 3060 and 4060 is better than the 4090 because its above it in the list
@@tobeymaguire647 I might be completely wrong here, but I think AMD makes up a much larger proportion of NEW sales than the steam charts would lead you to believe. The nvidia stats are just boosted by older gpus being used in internet cafes in China.
All those entry level people at the top are going to be real disappointed as games continue to be increasingly more demanding the way they have been lately.
Halo 3 warthog run 🗿🗿
ATI with 48XX went out with a bang. Amazing GPU's.
In the end AMD is nowhere to be found. NV is absolutely dominating.
Get ready to pay monopolistic premium prices for subpar performance. 4060 style.
@@moravianlion3108 Oh, that`s why the 4060 is so popular.
@@brugj03I think it's because it's included with prebuilts and Nvidia is a buzzword that non techsavvy people know
Things like cell phones, TVs, computers, etc, you are always upgrading or replacing. So, why complain?
Cell phones cost $1,000+ now. You either pay off 75% of thar price over a 2-year contract, or you pay it all up front.
So, why complain about paying $1,200 every 2-3 years for a graphics card? (which will being you MANY MORE hours of fun).
@@Twitch_Moderator Don`t bother they are complaining because they are seeking attention.
In the end they just upgrade.
people really be feeling like winning coz their card has the biggest graph here :D
WELL DONE!
Amd really got reduced to other
Do you know what is this "other"???
@@OmnianMIU Yes. "Radeon (other)" was used to represent the sum of all the AMD Radeon GPUs, including the RDNA 1-3 ones such as RX 5700XT, RX 6800XT, RX 7700XT and so on.
There just aren't many people that would rather miss out on mainstream features that Nvidia cards offer just to save a couple of bucks. It almost makes NO sense to buy an AMD/ATi card unless the hate in your heart consumes you and you despise Nvidia.
Ie: gaming performance of a 7900XTX vs a 4080 SUPER. It is $50 cheaper to get a 4080S, and it is 17% more powerful in gaming than the 7900XTX. Yet, some people still hang on for dear life to the AMD card just in spite of Nvidia.
nope, only older ones. those gpus together have a few more percentage, but dont show up because its too little @Hardcore_Remixer
@@Twitch_Moderator Honestly what you have written here is absolute lie. Atleast in my country 7900XTX is at least 30% cheaper than 4080 super - 1000€ vs 1300€ and it's actually a little bit faster than 4080 super, I really don't know what leads people like you to lie like that, in the long run lies like this hurts ALL customers, since as we know, many people will straight up believe everything they read. Oh and in Raytracing it IS slower than 4080 super, for sure, however it's still at a 4070 Ti level, which isn't too bad.
Also what mainstream feautures are you talking about? FSR and XEss is very good, FG2 is also very good. Only one thing thats missing is CUDA, but thats hardly a mainstream technology for gamers.
This is actually really insane. My lens into what cards were so popular did not reflect AT ALL what was actually being used. The 1080ti was hardly on the board. I thought that card would STILL be somewhere on the chart even today, but it was practically irrelevant compared to the renown it's gotten over the years. This chart was dominated by budget cards. Guess I can't be surprised
I’m actually surprised that Nvidia has that much more of a market share. Amd is really good. Hopefully with this next launch, they get a bit more attention. Their CPUs are doing really well though.
Radeon gpus was, is and will be garbage as always
@@OmnianMIU proof? Price to performance is way better. I got a 6950xt for $500. Rubs as good as a 3080 ti.
@@deanguy all trashdeon are obsolete, see benchmark of new games, end
@@OmnianMIU bros just mad he spent 20% more for an Nvidia card for the same performance
@@deanguy 20% more? Same performance?🤣🤣 ignorant
I bought the GTX970 in 2015. Still works fine in the spare bedroom in the system I leave in there for when friends stay 9 years later. I used it actively playing just about everything until the RTX 3060 12GB came out in 2022 which is my current board. So I guess I made the right choices :)
im still rocking a 980ti rip
It is usually a great oc card. Try a 100 on the core and you are nearly up in a 1080 performance in some titlea
Last gpu capable or running xp, you're gonna have fun with a retro machine
video starts and shows us that in 2008 most gamers rocked the 80s cards more than the 60s . What a great decade to be a gamer that was.
AMD didn't spend billions of dollars developing RDNA just to be sorted into "other"
Its all just the infamous steam survey - probably not even an accurate representation of reality ... did you expect something more?
@@WaspMedia3D conspiracy theorist AMDemential fangirl crying detected 🚨
No, other isn't RDNA based gpus, because if you go on steam gpu charts, every single RDNA gpu has right name like rx 6700xt. I don't know what is "radeon (other)" 😅
@@OmnianMIU TF you on about? Lol ... ok then. BTW what is a therist?
Yes, they did.
The reason they are represented as "Radeon (other)" is because their individual GPUs wouldn't even be on the chart due to low market share, so they were all summed up in one category.
Just look at the market share.
Insane to see Intel HD graphics hold the top 1st and 2nd spots for all of 2014.
But then again that was also literally me during that time. XD
7:02 the music intensifying as the 3060 takes off
Glad to see both of my GPU in the race. Older GTX 1660 Ti in the race with RTX 3060 12GB winning 😎👍🏼
I was using hd graphics 4000 for 5 years!
hi, what happened in 2014? why many used intel hd at that time.....
@@reinn-df2tiit's an iGPU that came in Intel desktops and laptops, most people unknowingly used it and didn't even know what it was
The RX 580 was just hanging around quietly within the crowd. "Just happy to be here."
Guess I need to get off of Steam with my 4090
bro really spent over a grand on a graphics card?
@@M-Rayan why not, I’m not poor
Inb4 someone calls you a Nvidia cash cow and berates you for "wasting" your own money
Asking if people really spent over a grand on a gpu is the dumbest question ever… don’t we all spend over a grand on a phone every year? Lmao
@@definitelyfunatparties you must be sick in the head if you do that. last time i bought new phone was 7 years ago, and new ones arent almost any better
1050ti, rx580, 2070super, now a 3060ti and one day a 6090 :-)
and all the cards play well to this day, its been a good ride
i'm not understand these nVidias hype... they drivers is the worst in functionality ... :(
After somebody use an AMD card, will notice how mutch better the AMD cards control panel. The drivel stability is "just fine" (not worse than nvidia in these point... around a same)
Nope
It's because most people don't care about actual facts and just believe whatever their uninformed friends tell them, so they end up with a much more expensive card with features they most likely are not gonna use. It's just ignorant people echoing misinformation that they read on reddit and they have for some reason made themselves experts in Amd drivers even tho they never owned a Radeon GPU. You can easily tell by their replies, cause they don't know what they are talking about.
@@NulJernWhat you said is correct but another big reason for why nvidia is so dominant in the gpu market is the fact that they have no competition in the gaming laptop market.
Nope. Nvidia's gpus are better all around. You talking about the layout of the driver settings of nvidia driver, there's the new app now. Please the stability of the driver AMD is garbage, and not mentioning the update suppprt for the drivers! 2 or 3 drivers during the year for radeon. For nvidia gpus there's 2 driver versions per month 🤡
@@OmnianMIU 2 or 3 driver per year for a radeon? wtf ... stable version it is come in almost every month (very rare, when not come a new driver in a month, beta come even more often, but i thing "not worth it"), lol ... but.. mostly i replace driver twice a year, because not neccessary to replace.. just works. will be verry bad, if need to replace to often a driver.
You definetly not used a radeon before... this "driver often driver release for nvidia excuse was around a 20-25 years ago to... it didn't make much sense then either". Of course, S3, Intel driver release schedule was bad.. but Radeon (in ATI time) it wasn't a problem back then to.
And .. those different app.. just why use a different settings program in nvidia for game profiling? Just "to junk" solution.
man seeing the 580x making it up there was gr8 ive had this card for over 5 years running games in 1440p and it still kicking hard amazing gpu from amd