Did I miss your GPU? Let me know and make sure to Subscribe for Part 2
Oh of course, let me know who I really am by checking out...1650😏(YKWIM).
GTX 1070 TI
rtx 3060 12gb vram
Nvidia Quadro P1000/ other lp sff cards
Rtx 3060 12gb, it can't play ultra rt but it can run great in every single game at 45/60 fps
As an Integrated Intel(R) HD Graphics Card 520, Dell laptop user, I see this as an absolute win
considering his mindset....i don't know if that's the win you think it is.
It's an honor to have you guys here :)
Completely agree, this video just confirmed that selling my 6950xt was indeed a good idea
I tip my hat to you, fellow Intel integrated graphics user.
@@mastroitek dont do it we still get dem updates 😎🥺
As an AMD gpu user, I fully agree with Linus Torvalds.
The main reason I use AMD, Linux support.
@@raypol1 Same lmao
Same even if I don't use Linux. 6700XT here
As an Nvidia gpu user, I too...
Ryzen 7800x3d and 7900xtx user here - F*** NVIDIA !
The GTX 1080 Ti is one of the best components I have ever purchased. Still strong even after 5 years.
"Strong" is a very questionable word here, but I'll agree it was one of the best ever made.
how is it questionable? it literally competes with the 3060 @@selohcin, and with the 2080 EASILY, and it's a 7-year-old gpu
@@selohcin It really does go strong. As long as you aren't trying to run a 'modern' game at the highest settings, in my experience it can run most stuff above 120fps
It is not one of the best, It IS THE BEST.
I bought mine on release at a retailer in my country (back in August 2017 for a whopping 899€), got the "MSI GTX 1080 Ti Gaming X 11G", and I'm still using it to this day.
I've come to realize over the years (especially during the start and end of all the crypto shenanigans) how lucky I was to pick one up when I did (it was my first flagship GPU too).
0:15 it still perplexes me how the second slowest model of the 30 series (the 3060) has more VRAM than the 3070 and 3070 Ti. They should have all had 12 gigs.
didnt even know that... wtf
My guess is that Nvidia did that intentionally:
They knew most 3060 buyers were new to the PC community or just don't care about PCs, and the 3060 looked good enough. They wanted those people to grow to love their 3060 (assuming it was the first proper GPU they got and they were previously stuck on integrated graphics).
Nvidia knew the 3070 buyers were the real enthusiasts, cuz no ordinary person is willing to pour more than $300 into a PC component. They knew that the 8GB would have them upgrading and giving up hundreds on something more expensive so they don't run into the same issue soon
@@tristenhood3167 The first 3060 had 12GB; the 8GB version came one or two years later
@@KATCracK I must've misremembered then. Double checked. You're right, my bad.
@@tristenhood3167 np
As an Arc A770 card holder, this video is accurate, and that BSOD screen is personal!
as an arc a750 user, i agree with you
my a750 crashes in game 😮💨😑
should i go with rx 7600 or a770 ? (theyre the same price in my country thats why i compare them)
@@faileeer I use an A750 LE, and I'd say go for the RX 7600 if you don't want problems
Blame windows, this card is made to work with a good OS
A reminder that GTX 1060, 1650, and 2060 are still to this date in the top 5 of most used video cards on Steam 😅. Maybe in part 2 we'll see more RX cards!
Just upgraded from my 1060 6gb that I had for 7 years lol
@@BaldKiwi117 I did so a year ago. I loved my 1060. Served me very well.
@billted3323 The 2060S is the bare minimum, I would say. It edged out my 1080 at least, not to mention everything else like DLSS and FSR. Wait for the new Super series; AMD might even lower prices! The 7700 XT and 7800 XT are great performers, and there's the 3060 12GB edition on the cheaper side.
I'm personally avoiding the RX 6000 series since they don't handle AI as well as their newer family, which might come in handy in 2024 and onwards.
@billted3323 I went to a 7800xt and it's been great. 1440p with high/ultra settings without any issues
@@_TrueDesire_ the 1080 ti is better than a 2060S
For part 2 you should add some of the RX 6000 cards
2nd!
@@mukkah still surviving after more than 1 year, 6700 XT
@hyperix damn lmao that's me
@hyperix me too xd
@hyperix
Do you ever laugh at people with a retardation?
I am Smarter. I am Better. I AM BETTER
same
I HAVE RTX3050 😅
If you got 1080ti, you got lucky that you were able to afford a top GPU back in the day and got lucky that it was the last good GPU.
The last good GPU where price to performance wasnt completely whack.
Glad to own mine :) 11gb of vram is coming in real handy!
@@Gnarfendorf yup, Nvidia learned never to give customers a good GPU at an affordable price ever again.
it cost me 650 dollars years ago and ended up being worth more than that used a few years later.
I got a 1080ti for $800 when bitcoin miners were eating them up in 2018. It was an absolute monster that could run most games on ultra settings 1440p at the time. I just got a 4090 a month ago, I thought it was time for a new build and I'm happy so far.
As a 7900xt user, this video was very accurate.
It's one of the reasons why I got that card. A 4080 would still be 30% more expensive. And for the price of a 4090 I could build a complete gaming PC with an RX 7900 XTX lol
**Laughs in 7900XTX**
@@vaemar Just built my PC with the 7900xtx merc 310 + 7800x3d. runs AMAZINGGG
I'll be rocking my 6950xt for a while 👍
amd rocks
Remember buying the 1080ti back in the day.
"man that's more than youll ever need" and "such a waste of money" etc
Bought it thinking how overkill it was for the time. 6 years later still using it lol.
Lol same. 1080ti can't even run Cyberpunk. I tried and it looked awful.
@@WarmwaterBliss Bought mine before the pandemic. Around 2019. My PC was amazing back then.
As I said to someone else:
It's a bad GPU.
In its time, software was focused on the CPU, not the GPU.
When software started requiring more from the GPU, the 1080 Ti fell pretty far behind.
A current 4060 is around 300 dollars, uses much less energy, and has around 7-15% more performance.
Adjusted to today, the launch price of a 1080 Ti is around 950 dollars, without shipping and taxes.
So yeah, it was a bad GPU.
People just love the nostalgia.
The only good GPU in that gen was the 1060.
I gave my old 1080 Ti to a friend during COVID. I didn't hesitate even a little.
It was a terrible GPU.
@@rio-dq9sc it was an amazing GPU.
Before the 1080 Ti, the only way to get this type of performance was through multi-GPU setups like SLI or CrossFire.
It was a miracle when I went from trifire 7970s to CrossFire.
Then the 1080 Ti came, and everyone said it was expensive, but it was only 120 euros more than the 980 Ti upon release.
And the 1080 Ti was blowing away 2-GPU configurations.
Nobody liked having to use multiple GPUs; it was costly and inefficient, but for some things it was the only option at the time.
The 1080 Ti changed all that.
It's silly to compare it to a 4060, which came out many years later.
And still, "7-15% more power" is a joke when comparing PC parts that are so far apart in age.
My RTX 4090 when BeamNG: *Confused Screaming*
i think its your cpu, beamng is cpu intensive
beamng runs fine at 120-180 fps on 1080p ultra settings w 3060ti and i5-13400f so idk buddy
@@Sol4rOnYt yeah thats a 13th gen, ofc its gonna run good
im stuck here with an i5-9300H
my i7-7700 and GTX 1060 with 16GB of RAM, half of which is being used by Firefox for some reason, watching me spawn 6 traffic cars in West Coast USA:
Feeling attacked at the 3070 :(
Same bruh, same =w=
No worries bro, I attacked myself too :(
same😭
I am going to buy RTX 4070 Ti Super to solve vram issues.🙂
Not more than 3090ti🌚
As a 7900XT user, I love this video.
I genuinely started laughing seeing the "f you Nvidia" clip in this context, and I 100% agree.
I loved it because it shows just how retarded and childish the AMD fanboi shills really are!
6950xt here. Absolute same thing
I went from a 3060 ti to a 7900xt, and it sums up my thought process when switching.
Went from a 3070 ti to a 7900 xtx and never have I felt so scammed in my life by nvidia.
This thing cost less than my 3070ti did at the time, and has so much more performance it's not even funny.
Yes yes I know, it's been 3 years.
But still, what the fuck was nvidia thinking with 8gb vram?
Went from a used R9 270 to a GT 730. The R9 270 was good enough to hold GTA 5 at 1080p 60 fps on high graphics
"you have a RTX 4090"
*smells the burning plastic from its power connectors*
thats 0.01% of them
This comment smells poor
@@AndrewB23 yet you posted it anyway.
Pathetic
If you are dumb and can't connect the cable correctly... 😢
@@AndrewB23 you smell black
1:54: Zeb89 intro
What's an Italian doing here
Ahahah
He would have beaten us up if he knew there are NVIDIA cards in this video
Let's go, guys
What's the song called though?
Absolutely spot on about the RTX 3070. It could have been an anomaly from NVIDIA like the 970 or 1080 Ti if it had 12 GB VRAM. NVIDIA ensured that did not happen, so that they could sell a 4070 2 years later.
That ain't a 4070. It's at best a 4060 Ti. The real 4070 will be the 4070 Ti.
@@siyzerix yeah, we can't forget. But at least the new 4080 Super is kind of like a real 4070.
Aaand i got got
True, but that being said, I love my little 3070. More VRAM would have made her a god-tier GPU. But I'd like to quote the most devastating friendzone line one can be told, because I feel it's very relevant here.
"don't let what you want ruin what we have"
i never had issues with the 8gb ram
I think people often forget that if a driver update gives a game 120% more FPS, that probably means the game was functionally broken before the update. They didn't pull out that performance increase out of thin air.
I'd assume that most people are aware of that, but it might not hurt to be reminded of it, especially given the context of this video.^^
What do you mean, clearly the patch included some more vram
As a 1050ti user which plays only older games, I see this as an absolute win.
I have 50 hours in BG3, and then I realized it truly is time to let go and upgrade
Older games? Hmmm? I play recent games and the GTX 1050 Ti works well. If you put graphics on max, ohhh yeah, then it's only "old" games, but if you're a guy who plays games for the gameplay and not the graphics, a lot of cool games work well on the GTX 1050 Ti... The new generations of gamers are more focused on graphics; the majority of gamers who started playing in the 80s don't care about graphics. When I buy a game, I buy it for the quality of the gameplay, not the quality of the graphics.
@@TTHIAGO666 Yeah, I agree entirely with you. I've been having a blast lately playing Detroit: Become Human with my ultrawide 3440x1440, and it runs well at 30fps. The most important thing is the fact I'm enjoying the game.
@@mrlightwriter true
@@TTHIAGO666 I feel the same way with a 3050, albeit the laptop version, and it can run a ton of games at high settings. I don't really care about that though; I only want a good story and good gameplay. Currently going through a racing sim arc, and it runs better than my Xbox One, so that's a huge improvement imho. I only ever care about graphics if it seriously improves the experience, and none of the games I play do that, so it doesn't really matter
1080ti was the pinnacle of graphics card technology.
They made it so good that it took 5 years to make something remotely as good
I'm a proud high-tier 3080 owner, $1k poorer and now, 4 years later, effectively left with a mid-range GPU. Worth every penny.
well yeah that's pretty typical, every card ages after a few years
No babe, you DON'T have an effectively mid range card, what you have is a fucking beast
3080 is mid-range?? 😲😲
3080 being mid-tier? Realistically, it's lower high-end at worst.
@@jayceneal5273 Not for One bloody Grand
As a core 2 duo user with no gpu, I see this as an absolute loss
feel u bro
Just find a job bro
my E8400 had 5870, it was overkill
If you swap in a Core 2 Quad your system can maybe handle a 750TI.
I changed out my Core 2 Duo laptop after 12 years of service just a couple of years ago. Stay strong, brother!
As a 1080 ti owner, I find this a very accurate representation of what it feels like to own the card.
(honestly though, I got it for $250 and it's an amazing 10/10 graphics card)
0:50 This is me with my integrated Vega 8 graphics, YES I HAVE A F*KING LAPTOP!
AMD integrated GPUs go crazy
The G series apus are insane
@@azideiaman day 478 of wondering why I didn't just buy the 5800X3D instead of the 5800X
both do not have an integrated GPU afaik @@amiesports
A10-7870K gang rise up
athlon silver 3050u
As a proud 1080 Ti owner with a 10-year warranty (4 years to go), I totally agree. I've never had a GPU this long. The combination of awesome performance for the price, combined with upscaling technologies like FSR, makes this GPU hold on much longer than it was supposed to. Not in my wildest dreams would I have guessed this GPU would kick ass for so long... And the warranty certainly helped; I already had one RMA.
Pascal really spoiled us
@@SirSethery yes he did
Still have mine, no issues. EVGA SC2
@@Ziyoblader same here, the 1080 ti is dare I say the greatest graphics card ever produced
2080ti here but my boss still has his 1080ti. That card stands the test of time. Mine does too.
As a 3070 user, it's a tragedy that a card this good only has 8gb of VRAM. FFS, they put 12gb of VRAM on the 3060. Why couldn't they do the same with the 3070?
Some crap to do with bus widths. Also, excluding 4 gigs of RAM saves Nvidia like 10 bucks per unit. I didn't know they were that petty, but the cards were selling fine, so I don't think they mind all that much that people got screwed over.
To make you buy a new gpu 2 years later :(
Wouldn’t have done much. There’s a 16GB 3070, and the games it was tested with didn’t even try to use more than the 8GB the stock card had anyway. Only one or two games actually tried to use more, and they had to be cranked and, in one case, use custom shaders, if I remember right.
Nvidia wants people to fall for this bait. They make a good product, but always make something a little off, so you'll buy a newer one as soon as possible.
For an educated answer: bus width. It has a 256-bit bus. Each memory chip provides 32 bits. If u divide 256 by 32 u get 8. So u can put 1GB chips for a total of 8GB, or 2GB chips for 16GB. Nvidia chose the cheaper option (even if it's pennies' worth of savings). The 3060 has a 192-bit bus. Divide by 32 and u get 6. So same trick: either 1GB or 2GB chips, for a total of 6GB or 12GB.
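To sanity-check that arithmetic, here is a minimal Python sketch (a hypothetical helper, not from any vendor tool) that derives the possible VRAM configurations from a bus width, assuming one 32-bit memory chip per channel in 1GB or 2GB densities as described above:

```python
# Each GDDR6/GDDR6X chip exposes a 32-bit interface, so the bus width fixes the
# chip count; common chip densities (1GB or 2GB) then fix the capacity options.
def vram_options(bus_width_bits: int, densities_gb=(1, 2)) -> dict:
    chips = bus_width_bits // 32  # number of memory chips on the bus
    return {f"{d}GB chips": chips * d for d in densities_gb}

for name, bus in [("RTX 3070 (256-bit)", 256), ("RTX 3060 (192-bit)", 192)]:
    print(name, vram_options(bus))
# RTX 3070 (256-bit) {'1GB chips': 8, '2GB chips': 16}
# RTX 3060 (192-bit) {'1GB chips': 6, '2GB chips': 12}
```

(Clamshell designs that put two chips on each 32-bit channel can double these numbers; that is how a 16GB 4060 Ti exists on a narrow 128-bit bus.)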
I told my dad that I needed a PC, so he got me one with freaking integrated graphics, even though we had a budget of $1700. The worst part was I had already spent my time after school for a whole 2 weeks researching all the PC parts and building tips 😭😭
The fact you started off with my graphics card had me wheezing- Never would I have guessed the 3070 is getting up there in age already...
I just gave my 1080 Ti Aorus Xtreme to a friend for free. I couldn't sell this piece of art. It's priceless. The last perfect and honest GPU Nvidia made. I could still play at 2K, without RT on, with decent fps after almost 7 years. Upgraded to a used 3090 Ti from MSI for just 600 euros. I am happy with my choice. Hope this goes well!
4090 owner here, 100% true. This is how i see the World.
Same
Gt 710 will always be the graphics card I peaked on with fortnite
Long story short: I had a GTX 980 4G Founders... and it died :( My PC is quite small and only accepts blower-style GPUs, and it has around a 180W TDP limit for the GPU. I ended up with an RTX 3060 12GB. At the time I was like... oh, G, why will I ever need 12GB of VRAM? ...And now I know that it was a good choice lol.
gotta love the GTA 4 TBOGT theme for the outro, one of my favourite games and dlcs ever
As an Intel HD Graphics 520 user, I can confirm that this is exactly how I feel :')
Same as Intel UHD 600 user here (in Celeron N4020 lol)
i have the same gpu
As a uhd 620 128mb user i can feel your pain
I have intel graphics hd 520
@@akonako364 Oh, I have a 13" laptop with that thing. I got it new for 200€ (Chuwi laptop, 13" 1440p, backlit keyboard, 12GB of RAM, 256GB SSD, nice trackpad, good battery); actually quite an amazing laptop, one of the best things I've bought considering price to performance. Of course, it doesn't come close to my gaming PC, but I was actually able to get Minecraft Java 1.20.4 running at 80 fps on 6 chunks, all minimum settings, default texture pack, with the help of Sodium (at 1440x900 resolution though; that thing barely runs Google Chrome at 1440p, so of course I wouldn't use that). The laptop isn't even that bad for the price... oh, actually at 2560x1440 I got 45 fps in Minecraft Bedrock on 10 chunks, so that's a win
As a gamer with a "MIGHTY" GT 710, I see this extremely relatable...
As a 4090 owner, I agree regarding 1080 Ti. It's still the best GPU
Lmfao, high level comment my brada.
Nah fr tho even after owning the 4090 for a few months now it still doesn't feel like it's worth its price for the performance you get at all, planning on selling mine soon
i bet u bought a 4090 for 4k 60 fps gaming lmaoo yikes @@felixfam0481
@@felixfam0481 I had the same feelings, but I only kept it just to make some videos for my channel. Otherwise, it's not a great investment
@felixfam0481 4090 would have been good at 800 euros, but not 1800.
Ngl the A770 that I bought has actually been pretty bug free. Only issue I had was that the first two times using it, I needed to switch my hdmi port. Never had to do it since then though so might not even have been driver related :/
Give them till the end of this year, when the Battlemage cards come out; the drivers should be just about perfected, and if the Alchemist cards drop further in price, they will unquestionably be the kings of the budget end.
With an A750 you already get better performance than a used 1080 Ti at a similar price, and it comes new with a 3-year warranty. Plus ray tracing, which may even be usable depending on the game.
For tasks like video editing there's just no competition in this price range; for years Intel iGPUs have been better than any cheaper Nvidia and AMD GPUs, and Arc takes it to the next level.
My 980ti is still serving me wonderfully after 8-ish years.
I just repasted mine last week, still works like a treat
I hope my card ages that well, but these new releases have me doubting it will be much good in 5 years. FM2023 is sub-60 fps, Cities: Skylines 2 is barely over 30, Alan Wake 2 is barely over 60. And I'm probably already going to stick to 1080p because of it. It sucks, because I'd bought it for 4K, but it's back to 1080p or 1440p for me.
Ah screw you, that last part made me wanna get addicted to listening to The Rebel Path again after just getting over it lmao
7900 XT reporting. I game on Garuda Linux, using Proton. I also generate NSFW "art" using AI. The level of freedom I am experiencing is exquisite.
AI on amd?
TEACH ME YOUR WAYS!!
@@Nycoorias
Unfortunately I can't send you a link to the github, as Goolag will do the censorinos.
@@Nycoorias
I can't. The trillion dollar business that runs this site sennzors the link.
@@Nycoorias
It even nukes my attempts to tell anyone what it's doing. Hence why I have to type so oddly.
👻@@HaveYouTriedGuillotines
6700xt used is going for like $250-275 is also a crazy good deal
The AMD one is actually accurate, because on Linux people hate on NVIDIA (even more than deserved at times) due to its rather poor driver support, while AMD's just works and is the best for desktop and gaming, xp.
That's what I love about AMD on linux. Install the OS and it's literally working out of the box.
Yes, but you can only run games on it because you're a second class citizen. You wanna do something more productive? Maybe some AI stuff, photogrammetry, programming? AMD is useless for that, your only choice is Nvidia, even on Linux.
Also, the drivers are not that bad. They are just closed source, but they work well enough.
@@ThePortuguesePlayer Yeah, I know that; both are pretty much capable of doing roughly the same things. NVIDIA will be a lot better with CUDA than AMD is with ROCm, but for just desktop and gaming, AMD is the better option. Still, I am hopeful for NVIDIA; I hope the work on NVK comes to fruition and so on. It's not bad, but they could do more.
AMD does just fine with most of the stuff you've mentioned (unsure about AI). As far as I'm aware, AMD GPUs have no problems with photo editing, and the GPU doesn't even affect programming, so I don't know where you got that one from.
Also, when I was still using an NVIDIA GPU (RTX 2070), the drivers really were that bad. I nearly lost it with the amount of issues I had getting those god-forsaken drivers to work properly: uninstalls, reinstalls, band-aid fixes I found on forums, display server compatibility issues, etc.
I can't guarantee the opposite for my new AMD GPU though, as I haven't tested it on Ubuntu yet, just Windows, since I haven't really needed to compile any Linux-only applications recently. But I have heard that it is much, much better for AMD on Linux vs NVIDIA on Linux. Also, I haven't actually heard of any problems with AMD on Linux other than OpenGL being subpar, and that's not a Linux-exclusive thing.
@@TraceEntertains CUDA.
My 1080ti purchase 6 years back ...the best decision I ever made😊
As a gaming laptop user (4080 Mobile) I'm waiting for part 2.
Gotta give us laptop users some love!
Or some dirt 😅
Isn't the 4050 a laptop only gpu right now?
Laptop gaming... lol good oxymoron. Funny joke.
@@RektemRectums So funny that I can do so easily...
@@sonicboy678 You spend more money to game on a smaller screen, and it overheats more because it's a small plastic box, and you want to get mad about it. Keep crying. 😂
Still thrashing my Strix 1080ti to death, and probably will continue to do so for a good couple of years yet...
Same card, same plan. It will be like the Highlander movies, the last one standing 😅
As a Linux user planning to buy the 7900 XT next month, I feel that Linus Torvalds clip.
I have a friend who still uses a 1080 and holy moly is that card awesome. He is still rocking his 1080p games
I have a 980Ti, it has served me perfectly for the last nearly 8 years. Probably going to have to upgrade eventually, but today is not that day.
I remember I was free for a week last year and decided to try out Yakuza Kiwami 1 on my potato laptop with Intel HD 520. I just wanted to see what the hype was about. But the game was so good, I just couldn't put it down. I played and finished the entire game at 15-17 fps. That shit was the most fun I've had with a video game ever since Shadow Fight 2. Now I am waiting and hoping to get a decent gaming laptop to play the second game.
Maybe a bit old, but I played Mass Effect 1 in 2009 on a potato laptop, and at low settings, 320p and 15fps, that thing looked like a PS1 game. But when you're young and don't have any money, you're happy with a 15fps potato
@@alvaronavarro4890 Speaking of Mass Effect, I wanted to play it after finishing Dragon Age 1 (which I absolutely loved) and 2. But I kinda didn't want to ruin such a good game, because the first playthrough is always special. Gaming laptops are freaking expensive where I live, easily costing over 700 bucks with 8GB of RAM, about 250GB of storage, already used, with scratches and some shitty low-end GPU. This might not seem like much, but where I live the minimum wage is 150 bucks. I found good deals on Amazon, but the delivery is freaking over 500 bucks. So yeah, I suppose getting my hands on a Steam Deck might be my best option.
My 1050 probably lives the best life, being properly cooled, taken care of, and being used only in games
My 1050 had the worst life cuz I didn't know anything about taking care of a laptop
Probably half melted, screaming, barely running, always used
I remember waiting, doing extensive research and saving money for the 1080 Ti. 6 years later, it's still running strong. Worth every penny, and the delayed gratification paid off.
I used to be Chad Chadington with my 1080ti, but I evolved to Chad Thundercock with the rx 7900xtx
VRAM size matters
was waiting for the 4080 train and it never came..
POV : you have a gpu bought during the COVID19 period
(a moment of enormous gpu price inflation)
💸💸💸 good bye my money
Think that was the Tiger King reference aimin' at hehe
1:06 i use a rtx card but still this is lit🤣🤣🤣
Not sure if the RTX 3060 is popular for gaming, but it has a crazy VRAM/price ratio for AI. You can generate two 736x512 images at once or run a 13B text generation model with 5-bit weights.
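The VRAM arithmetic behind that claim is easy to sanity-check; here's a rough back-of-the-envelope sketch (weights only; KV cache, activations, and framework overhead add a few more GB on top):

```python
# Rough VRAM needed for the model weights alone: params * bits_per_weight / 8 bytes.
def weights_vram_gib(params_billion: float, bits_per_weight: float) -> float:
    total_bytes = params_billion * 1e9 * bits_per_weight / 8
    return total_bytes / 1024**3  # convert bytes to GiB

print(f"13B model @ 5-bit ≈ {weights_vram_gib(13, 5):.1f} GiB")  # ≈ 7.6 GiB
# That leaves roughly 4 GiB of the 3060's 12 GiB for context and overhead.
```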
It's the world's most popular GPU at the moment.
As a 1080ti owner, gave me a good chuckle
I got this vid right as i’m transitioning to a 4070 ti SUPER
RX 570 4GB owner here. The card's still kicking, but I guess I'll have to put it down once I upgrade. 7 years of driver support for Polaris-based GPUs, and 5 or 6 for Vega GPUs, made me realize how good these cards were (and still are, for some). Sure, I've encountered issues with drivers crashing sometimes, and some drivers just give me a blue-tinted video, but hey, it performed quite well at stock speeds. The fact that it has a low TDP (by today's standards) of 150W and is meant for ITX builds made me realize Sapphire makes good GPUs. Might consider getting modded drivers to prolong its lifespan, but who knows.
Still rocking pretty much the same model as well. So far I only really see issues with unoptimized UE4 games, but the other ones I play work pretty fine at 1080p and 1440p on medium-high settings.
I did some undervolting on mine and I run it at 90W with no performance drop. I love this card... I'll cry when I have to change it. It was my first decent GPU.
RTX 2080 TI still going strong.
Had one and sold it on and got a 4070 gtx. It was the perfect time to invest lol, and it's also a lot quieter lol 😉. Still a solid card, the 2080 Ti; it carried my older PC for years! 😅
@@onlypvpcaterina-6669 the RTX 2080 Ti is not a good card right now; even used, it's a bad buy. You can get way better cards right now that use less power and have way better features and performance for way less money.
At release it was a great card, yes, no doubt about it.
Getting an RTX 4070 was a bad idea, because if you had an RTX 2080 Ti you could easily run an RTX 4070 Ti, and it would be the better product.
But the performance gap between an RTX 2080 Ti and an RX 6800 XT was so HUGE that it was kinda strange the RTX 2080 Ti even had any right to exist after the RX 6000 release
@@onlypvpcaterina-6669 I'm still using mine, it's rock solid for 1440p still! Keeping it unless the 5090 offers an insanely efficient alternative.
what the hell is a "4070 gtx"? do u not know about PCs at all? @@onlypvpcaterina-6669
I have an RX590. To me it means I was able to afford the best Polaris GPU (not by much, but still) when I was building my first PC (which I still use to play Cyberpunk 2077 with FSR 3 FG at 3440x1440) and couldn't afford a 1070.
I wonder what having a Polaris GPU means to NikTek.
RX 580 over here, really amazing GPU that tanks everything I throw at it. Although newer games do get capped by GPU speed, not by my VRAM; I have plenty with 8GB.
As an Integrated Intel(R) UHD 620, HP laptop user, I see this as an absolute win
As an "Intel® HD Graphics for 2nd Generation Intel® Processors" with "Intel® Pentium® Processor B950" user, I agree.
I used to have a 3070 myself, and VRAM was never actually an issue for me somehow. I never played any AAA games of 2023 though, so that probably explains it
I played Cyberpunk and Forza no problem, so idk. It might be that I have another 8GB of VRAM from integrated graphics and it just overflows there.
Highest possible settings (set manually)
@@SpaceFH on high settings too?
The only time my 8GB wasn't enough was in Cyberpunk with Ultra ray tracing or path tracing. With Ultra ray tracing I get like 20fps, and with path tracing less than 1fps. If I set everything to ultra at 1440p without ray tracing or path tracing, I get 80+fps.
In Halo Infinite I have to set textures to high instead of ultra, but everything else is maxed at 1440p at 80+fps as well.
Starfield runs like shit even at medium everything, but that game isn't using the VRAM; it's just even more unoptimized than that lol.
Not sure about Hogwarts Legacy, The Last of Us, the RE4 remake, Avatar: Frontiers of Pandora, Alan Wake 2, or other modern unoptimized games. Probably need to set some of those to medium to get by, but there are still way more games that work with 8GB at max settings than there are that don't.
@@DeltaSix_YT I played through Cyberpunk with most settings on high, I'm pretty sure textures included, on my RX 5700 XT, and it also only has 8 gigs, so idk. No RT though, obviously, and I'm at 1080p.
@@astra6640 thank you, I'm definitely gonna buy an RTX 3070 on eBay. Those things go for like 250-300, it's incredible
As a Linux RTX 3060 Mobile user I can confirm having to unfuck Jensen's bs multiple times in my laptop's lifespan had severe impact on my mental health
1:40 i have a regular 3090, but i bought it when the prices were still crazy, so i felt that
3090 Ti Owners... ROFLOL!!! Ain't that the truth!!! LOL!!!🤣🤣🤣
The Joker clip is honestly a perfect fit for me. Was even wondering if my outdated card will be represented in the video lol.
I have a 980ti 😎 It could play pretty much every game at 120+ fps for like 7 years; the last 2 years were a bit extreme haha
At 480p? This card is about as fast as an RX 580 and has just 6GB of VRAM. This card can't even play games from 2019 at 60 fps on medium settings.
The last two years were extreme? No, the last 4 years were extreme, and I want to see how you do with FOV maxed out and render distance maxed out (which you need in competitive games). I bet you don't even get 120 fps in Apex Legends with this card at the lowest settings 0.o Wtf is wrong with you; even Red Dead Redemption 2 doesn't run well on this card.
Cyberpunk 2077? How about Horizon Zero Dawn? How about The Witcher 3, from the release year of the GTX 980 Ti... no 120 FPS.
@@allxtend4005 Damn somebody woke up on the wrong foot lol. Write your book somewhere else.
@@allxtend4005the RX580 was (and kind of still is at a low budget) also a very good card.
@allxtend4005 The 980 Ti is faster than a 580; a 980 Ti is around the performance of a 1070, with a 580 being closer to a 1060 and a 570 being closer to a 1650.
Not that a 580 is bad, but you are a bit wrong on some assumptions.
My 1050 Ti still ran strong till I swapped it for a 4060 I got for Christmas. Had it for almost 7 years! Holy cow!
What my 3060Ti says about me is I wanted a decent GPU before Halo Infinite came out and I wasn't willing to pay a scalper for a 3080.
Amen!
I paid 1300 dollars for a 3080 from a scalper because I'm an idiot, and also because I sold my 1080 ti for 600 dollars.
Intel HD 4000: There is no affordable GPU, there is no powerful GPU, and there is no "Queen of Gamer"!
SiS Mirage Graphics 3: You dare challenge Megaming?!
And what about the GTX 1650 users 🙂
As a GTX 1050 mobile GPU with 4 GB VRAM, I am fine 🙂
GTX 1050 SLAPS🔥🔥🔥🔥
As a GTX 1050 user, that's exactly how I feel. Gonna upgrade to GTX 1660 ti
As an rtx 3080 turbo enthusiast I am devastated that our gpu didn’t make the cut, just like our water blocks never make the cut
At least we have a 3080
I got a 3080 turbo and watercooled mine with a block from alphacool! Works superb and was fun to set up!
I have a 710 for a display out on a NAS server, and two 3090s for my "gaming" HTPCs ;)
Also, my 1080 Ti is retired; it sits in its OG box collecting occasional stares of accomplishment from onlookers. The good life.
As a 3070 owner yes that’s 1000% accurate. Luckily I play mostly older games which were optimized for normal hardware.
Honestly, some minimum specs are just insane these days.
Owners of low-spec GPUs have something that everyone else doesn't: Modding, 400% sharpness in the Amd/Nvidia panel and editing .ini files to run Cemu at 25fps.
As a 1060 owner, I see this as an absolute win
I think a longer video would have been appropriate. What is with all the AMD APU users? Or the RX 6600, 6700, 7600 and 7800 enjoyers? The 1650? What about the CHINESE 4090?
I will now proceed to jojo-pose as a proud 1080Ti owner.
Same here
I have a Ryzen 4070, a win in my book
As a 7900 XTX user, we suffer from severe indifference towards VRAM requirements
Amd integrated GPU?
1080Ti was an all time 8800 GTX level chad GPU. I went from a 1080Ti to a 6900 XT after 5 years of service. Both were priced relatively well in their era (at least when cryptomining was dead). Hopefully the 6900 XT finewines also.
I NEVER, in my life (32), have seen that RDJ clip. Ty !
good video! do rx6400 and gt 1030
Definitely will do :)
@@NikTek 6700 XT too
@@NikTek Thanks! Earned a sub :D
I actually do have a gt 710.
I have problems running any game newer than 2011.
So I just stick to source games and indie products and its surprisingly ok.
I think I'll get something better when I can finally afford it though.
I have a radeon hd 7450
i have problems running
so i just stick to linux with no desktop environment
i will never upgrade
Ha! Our family computer came with an OEM GTX 635, which is roughly twice the power. And even still, it performs worse than the integrated graphics on the i7 4770; the definition of manufactured e-waste.
Always remember: your video card defines you
As a 7900xt owner that's exactly how I feel 😂
Same bro
The reason i bought it also 😂
Cyberpunk soundtrack goes hard
Have had both a 3070 and 7900xt. 3070 was great but the vram greatly limited its capabilities. 7900xt was an expensive upgrade, but I’m loving it so far
Is the Vram for it honestly that bad? I rock a 3070Ti and don't notice much but obviously it's different
@@OnlyGrafting No, it's not that bad. This guy must be running 4k or RTX.
@@OnlyGrafting It's bad. You reach VRAM limits when using higher textures, and when you do, they don't load properly, so yeah, it's pretty bad. Look at your VRAM usage when playing; it's pegging itself.
@OnlyGrafting Nah, he probably plays at 4K ultra settings lol; people greatly overestimate the importance of VRAM just to hate on the 3070. My 3070 Ti has not once even used all 8 gigs on a 2K monitor at highest settings
@@ThunderTheBlackShadowKitty 1080p. High textures would usually set off VRAM warnings. When I played through Far Cry 6 and RDR2, the games would just outright refuse to load higher-resolution textures as the memory capped. I'm not saying it's a bad card though; in most games I could easily run the highest settings without worry, but it still had its limits
As a former Intel iGPU and GT 710 user: "Never Before Have I Been So Offended By Something I One Hundred Percent Agree With"
As former 1660 Super and current 4060 user: *cries in part 2(?)*
As someone with no gpu, I see this as an absolute win.
I feel so proud to be a Integrated Intel Graphic Card laptop user
I assume the Intel iGPU is a bit of a joke, but the new iGPUs in the new Core Ultra laptops, such as the Core Ultra 7 155H, are actually quite strong, beating the Radeon 780M in quite a few tests, gaming and non-gaming
It's not, lol. It needs drastically faster RAM to beat it, which removes it from an affordable price point; to compete, you would need to spend $400+ just to beat AMD. That's not good, especially considering you can just get faster RAM with AMD as well and still pull ahead.
This summer my 750 Ti will be 10 years old (along with my whole gaming PC), and it still got me stable 60 fps in a lot of games I didn't think it could even handle, even if I had to make """optimizations""" for some of them
learn to code and move to america you will be much richer. work here 3 years and pass an english and us history test and you are a citizen.
I felt that first part with my 3070 Ti. Just 1 tier away from 10-12GB, but for $200+ more, which is why I settled and got the 3070 Ti on sale for like $450. Thankfully, the newer games that use more than 8GB all suck anyway and are unoptimized, but still, f*** NVIDIA regardless for purposely obscuring their products at launch.
I had a 3GB GTX 1060 which was great until around 2020, when it became bottlenecked by its VRAM despite still being moderately powerful. Meanwhile, Intel Arc GPUs with 16GB can be found for less than $300. Yeah, I'm sure NVIDIA couldn't have afforded to make the 3070/Ti 10-12GB+ lol.
I don't know if you don't get it, but Nvidia has been putting less RAM on their cards since forever... it hasn't just been happening since the RTX 2000 series. The GTX 1000 series was just a mistake, because Intel slowed the progression of game visuals, and the release of Ryzen + DX12 did wonders.
@@allxtend4005 I think 8GB is plenty for a GTX 1070. Even 6GB is OK for a 1060, for what it can handle and its price. The 1080 Ti has 1GB more than the 3080 did at launch, which is crazy. The real issue is we are still getting that same amount of VRAM on much more powerful cards nearly 7 years later.
I do agree it was cheap enough for them to include more in the 10 series, but it's far more excusable than 6GB in a 2060, 8GB in a 3070 Ti, 10GB in a 3080, etc. I can't say much about the 4000 series though, since their VRAM bus width is significantly reduced, which is why the 16GB 4060 Ti vs the 8GB 4060 Ti is silly to me in some circumstances.
I don't want to brag but my first GPU was a 1060 6gb.
Sorry ladies, I'm taken....
@@-James-A it was a great card; even my 3GB was holding strong for so long. Some newer games had weird texture problems but would still run, even at 80+ fps on medium+ settings at 1440p. I was impressed.
Yeah, it's a big issue in FH5, but like you said, it is a really bad game.
Proud GTX 960ti owner; this old-ass GPU can run everything I throw at it. They were on crack or something when they made the 960 and 1080