The 1050 Ti was crap; for $25 more you could buy an RX 470 that was much more powerful than the 1050 Ti. Then in 2017 Ethereum came along and the only GPU you could get at a decent price was the 1050 Ti, which is how so many owners in certain parts of the world ended up with one.
@@takehirolol5962 That is if you only care about gaming. The RX 470 has no hardware-accelerated VP9 decoder for YouTube. All Pascal cards, even the lowly GT 1030, have VP9 support.
It's irritating because you know they didn't just pull the newer cards and tech out of their ass; they roll this stuff out at just the right rate to turn their top-end cards into middle-of-the-pack cards within a couple of years. Who wants to spend 700-1500 bucks every two years?
@@brahtrumpwonbigly7309 yeah, that's why i always think of buying cards at least 2 generations old if there aren't better contenders. What i've seen on ebay the last few days: the RX6600 is around 200$, the GTX1080ti is around 200-250$, and the RX6700XT is around 200-300$. Quite hard to decide. I would go RX6700XT if it's at 200-250$ on the used market; while it's only a bit more powerful, it has better DX11 and DX12 performance and still has years of that AMD fine wine driver support ahead. The RX580 i have seen at 50 dollars, and it seems it's only half the performance of the RX6600.
@@Splarkszter The Rx 6600 uses fuck-all power, which was my main draw. But I struggled with that exact dilemma before I bought; the 1080ti is quite nostalgic, same with the R9 295X2. Legendary cards.
I am one of those people :) These days, you need a 4070 to totally exceed the 1080ti's capabilities. Ideally, you'd want a 4070 Ti Super at minimum, for double the performance and a notable VRAM upgrade. Had Nvidia not skimped on VRAM with the 20 and 30 series, the card would've gone obsolete much faster.
We kinda have AMD to thank for the 1080Ti, since they WAY overhyped their upcoming Vega range and Nvidia panicked and released a way more powerful card than they ever would have... and they will likely never make that mistake again
I think it's just happened again tbh; the performance gap between the RTX 4090 and the RX 7900 XTX is quite large, probably larger than what Nvidia would have wanted!!
@@you2be839 No it didn't, you dummy. If you knew anything, which you don't, you would know AMD has not yet released the 7950XT and XTX. They're already listed on techpowerup if you search for them manually, as always for not-yet-released cards. And the 7950XTX beats the 4090 by nearly 10% in raster performance. Shows how little you know about the things you talk about. If only you weren't so handicapped on computer knowledge, people might take you seriously.
@@williampaabreeves And you are another commenter with zero intelligence. Again, as noted in the other 2 comments, there are a few GPU SKUs that will release in the future but have not done so yet. They didn't price it that way because of a monopoly either, since they don't have a monopoly anymore with AMD this close behind them, which is good for customers.
However, Nvidia decided to make their chip on 4nm, which is currently so expensive that the GPU cost is that high on the 4090; on top of that, their memory modules are 5nm. AMD didn't see value in that, so their chip is 5nm and the memory is 6nm. Since none of you in the comments have actual knowledge about whatever you bring up, people make stupid choices; this comment section talks in speculations based on thin air. The cost of the 4090 is so high because it's the first chip on 4nm. It's too new, production isn't high enough, and on top of that the Netherlands has not yet made enough machines for all the chipmakers to get production high enough to lower the cost. Nvidia chose an insanely costly manufacturing process, of which the Netherlands has made only 4 machines. TSMC, Intel, Samsung and Nvidia all wait for ASML, which makes ALL the high-density lithography equipment that lets you even use a computer. Without the Netherlands developing that, you would be in the stone ages; nothing in computer development moves forward unless the Netherlands improves its processes. Basically you all depend on the Netherlands, and we have not yet made enough 4nm machines for all of this to be produced in bulk for less.
If this comment section had a combined IQ of 100, you should have been able to figure this out yourselves; it shows me you are all kids who know nothing about how this world works, or how economics or production works. None of you cares to learn, and you all speak anecdotally. Now I hope this cleared things up enough for you all to comprehend what is going on. Nevertheless, AMD and Nvidia have both not released their flagships; the 7950 XT and XTX and the 4090ti are all still under development, and the AMD prices will be far superior because of this. Performance could be just below it, but then again the 4090ti will cost 1.5 to 2 times as much as a 7950XTX, so given that the 7950XTX already shows superior raster performance to the 4090, I wonder how much value the Nvidia card will have in price-performance. Probably useless for its price.
I used a 1080Ti in my main rig from 2017 all the way to mid 2022. It allowed me to skip dealing with the GPU shortage entirely, whilst still being able to play games at their highest settings. Although I am now on a 3090 and have no use for the old Pascal beast I'm never going to sell it because it reminds me of a better time, when Nvidia actually cared about the majority of its customers and didn't charge 50% more for their cards between generations.
I'm planning on doing the same with my EVGA 1070. Heck of a card that served me well along with an 8700K, especially since it was basically my first PC and saw me through college.
@@Demopans5990 i switched from a 1070 to a 3090, and the performance difference is obviously noticeable in every way, but it really shows how capable those cards still are. I think I was running borderlands 3 on basically max everything at 1080p at 40-60 fps.
Yeahhh 1000 Series crew here. I started w/ a 1070 in early 2020 bc of a tight budget and 2 1/2 yrs later got a 3070. Like you my friend, I'm not going to part w/ my dear 1000 card. I didn't have to worry one bit about scalpers and shortages. I'm gonna use it in my retro gaming room build 😁
Perhaps the RT glitches you were seeing in Jedi Survivor were related to RT denoising. One of the biggest issues with real-time RT is getting rid of the noise. RT cores and tensor cores can handle this, while Pascal has neither.
I ran Quake II RTX on my friend's 1070, and the denoising worked fine. Either the denoising problem is specific to this game (or UE4 in general), or the problem is something else, I think.
@@srnbrehman4695 I ended up getting a good deal on a 7900 XTX. The drivers are as annoying as I remember them being when I originally gave up on ATI/AMD, but it's been a solid card so far. I tossed the 1080 into my HTPC to live out the rest of its life on controller game duty on the TV. It'll probably be there for years.
i started in 03, so i can't even remember the names; a 256MB card was a beast at that point. i remember heaps of talk about 8800 GTs or something like that. don't think i ever heard more noise about a card, or heard one referred to so many times; must have been godly too.
Ah, the good old days. i was still a broke college student back then, and my friends and i would be drooling every time we passed a computer store with the 1080ti displayed on the shelves. saved up in my college days and got my first card, a GTX 960, in 2018; saved up again after I started work and got a GTX 1070 in 2020, just before pandemic prices; and now I got meself a 3070 in 2023
What infuriates me the most is that they left behind the design of the GTX 900 and 1000 series. I love that design; it looks so futuristic, and all the diamond and triangle shapes are beautiful. The GTX 980ti is a beast of a card too, even today. I have it in my other computer.
980ti is great. Yeah, and people don't seem to remember or figure out that the design is actually because of the triangles: 3D rendering builds everything out of lots of triangles, which you can actually see in older games that use a lot of transparency. It's also related to why the aesthetic was used; keep in mind the hyped game of the 2016-2017 era was Deus Ex: Mankind Divided. That was a total hog too btw; it uses post-processing to try to hide it, and the game is visually flawed, but it still runs at 60fps on a 1080ti/5700xt at ultra detail settings in 1440p, and it does look good... although not as much when you remember The Witcher 3 was the same era and how it looks vs how it performs. So anyway, hence the triangles and angular design aesthetic of those cards. I think the all-black Titan looks so good though; the only thing that would make it better and match the DE:MD aesthetic would be gold-plating the heat fins on the FE model.
The 1080ti is an absolute beast. It's an example of what Nvidia does when it thinks it has competition. The upcoming AMD card at the time was hyped to have such high performance that Nvidia brought out the 1080ti; the AMD card didn't meet the hype, leaving us with the legend that is the 1080ti. I've got its little brother, the 1080, and the ti is roughly 30% faster. That gap for a ti is ludicrous. I am currently deciding between upgrading my monitor to 1080p high refresh rate, or 1440p high refresh but needing a GPU upgrade. If I had a 1080ti I wouldn't need the GPU upgrade to play at 1440p high settings. Buying something like a 6700xt doesn't seem so appealing when the 1080ti exists.
@@tomdr93x It's one of the best cards at the moment. It's roughly 51% faster than my 1080, but it would still be a silly upgrade for anyone with a 1080ti, being only a 19% improvement.
@@omegaPhix Anyone that paid $700 for a GPU would have upgraded years ago. I was more considering the 1080Ti as a current purchase for someone looking to upgrade to it. The poor state of the brand new budget market makes buying a second hand 1080Ti a decent option.
I purchased this mistake a couple of months after launch and it was a fantastic card. Maybe the best Nvidia ever released, right up there with the 8800GTX. Used it for 5 years until I replaced it with a 6900XT.
Man the 8800gtx. That was all the rage. Wasn't that around the time of Crysis coming out? Man that's making me feel weird to think about. I'm nearing 30 now and pretty sure I was like 12 or 13 at the time.
Nothing will ever beat the 8800GTX’s performance improvement. 130% faster and the introduction of CUDA and unified shaders. Absolute insanity. Then it was followed up with the 8800GT delivering most of the performance for $250! Those days were great indeed.
@@dex6316 Oh yeah, nothing will beat such a large leap for a long time. I just typically tend to look at GPUs based on relative value and longevity. I think the 1080Ti will end up being a rare Nvidia finewine scenario. I had an HD 7970 before my 1080Ti and I probably could've gotten another couple of years from that card had I not moved to 1440p.
I got a 1080ti when the 2000 series was announced, for the "clearance discount". It was an absolutely amazing jump for its time, and I remember the performance jumps of the following generations being minuscule compared to the jump b/w the 900 and 1000 series.
Cheapest 1080 Tis were 620 € new. That would have been great value, especially to 1080p gamers. I play at 4K, but also skipped the 2080 Ti until Ampere came, when I got the new Ti for less than I got for my old Ti. Quite insane, but also expected, as I had gambled on the Turing launch disappointment happening again. It did; you could get 2080 Tis for 450-600 easily for about two weeks, then the prices skyrocketed again.
I went for a decent 3070 secondhand, but if I hadn't, then the 1080 ti would have actually been next on my radar. It's insane how a 6 year old card is still actually this valuable. Everything after the 10 series just seemed to up the price as much as the performance.
@@Boofski the thing is more that back when I was looking for a card, people were asking more for a 2080 then a 3070. If I had to step down it'd be 1080ti because of that.
@@Olav_Hansen I considered the 3070/3070ti, but ended up going with a used 2080ti, the 3070 was considered better, at the time, but the artificially low amount of vram really put me off.
same, my wife had it for years until Hogwarts Legacy came out and I wanted her to enjoy the best textures, so I upgraded it to a 3060-Ti. at the time I thought it might've been a bad investment, but looking at the 4060-Ti's performance I feel like it was a steal
Man, I remember pre-ordering the Founder's Edition as soon as it became available. Barely got my order in as it sold out in minutes. I knew it was going to be a good card, but no one could have guessed _how_ good this many years later. It's too bad Nvidia turned on its customers.
I still have mine, both of them. Nvidia managed to burn me on the SLI lie for gaming, but for Video Editing and 3D modeling applications that use it, *MY GOD*, it still blows away any single card on the market even today.
@@ThatGuyNamedRick SLI 1080 Tis here as well, First ever 4k60 GPU config. 😊 We saw _Tomb_ _Raider_ achieve ~90% SLI scaling circa 2017... That was incredible! 1:1 SLI scaling was *within* *reach* , but Nvidia killed it, because they knew it would cannibalize sales/delay upgrades. Cowards... 😆 You're right, for production workloads, the rendering difference between a 4090 and 2x 1080 Tis is
lol the house buying analogy is so true. It annoys me to no end the old folks who keep telling us "Back in my day I worked 50 hours a day and was able to buy my house" Yes mama, we know, back in your day you also walked 30 miles everyday to school too right?
@@basilman121 given what percentage of your income housing is, you can see how nobody affords a house anymore. and alongside that in the last few years food doubled in price, it's insane!
The gtx1080 to me will always be the pinnacle of graphics cards. I never owned one (I still use a 970) but I was always fond of it. Even now when I think of 'the best' my mind immediately defaults to the 1080ti before I remember we're already on the 40XX series.
the fact the 1080ti is 6 years old already is crazy. i was thinking "why is this a video?" but it's insane to think it's been that long since it released
I loved my 1080ti; it lasted me years, and it wasn't bottlenecked by a 1700X at release either. Easily the best card since my 8800 GTX. It was also the first card that I experienced 144hz with. 1080ti + ASUS PG279Q was peak 2017 gaming.
The 1080TI truly has the spirit of the 8800 series in it. sadly i couldn't go GTX back then on child wages, but my 8800GTS was a beast for years. 640mb of vram yo!
@TheRealUnconnected the GTS was indeed a great card too. The GT was ok as well, since you could get it in a single-slot solution, and it was great as a physx processor for the handful of games that supported it. I kept my GTX going even longer by getting another secondhand one and going SLI. I recall all the hype around the GTX in its day. Only way to run CoD4 properly!
I swapped my 320MB 8800 GTS for a pair of 256 8800 GT cards back in the day. Always regret getting rid of the first card that I ever fell in love with. Not making the same mistake with my 1080 TI
@@srobeck77 i didnt say there is no improvement at all, i said that top of the line gpus (rtx 4090) are mostly priced over brand more than performance. Of course there is improvement, a 4060 is better than the 1080 ti drawing half the watts
@@morettillo5487 you also "didnt say" several other technologies like ray tracing and then you proceeded to compare a 1080 to the lowest end 4060. You aint too bright fella, not even bright at all....
I had a Titan X Pascal in my PC for a long time (the 1080 ti hadn't been announced yet) and it still ran really well until I replaced it recently with the 7900 XT. I gave my Titan X to my sister who still had a 1060 at the time and she's been happy with it. The Pascal generation really was peak nVidia.
6 years. Still running 1080ti SLI. Both cards are water-cooled on 480, 360, and 120mm rads. Just now experiencing temps above 50C (on the one card that is used the most); that's probably due to lack of maintenance, loop-wise. These cards are solid. I will box these cards up and keep them when it's time to retire them.
One of the GPU fans broke on my ASUS Strix 1080 Ti card, but I kept on using it because it was honestly amazing. I just bought a Lenovo Legion Slim 7i laptop and it's only slightly better than my 5-6 year old gaming rig from 2018.
what drivers are you using? i never could figure the best combo to run most games- i always had to change the driver and support software depending on the game i was playing
I'm still rocking an EVGA 1080 Ti and won't change it until the next gen. It is an absolute beast of a card that performs on par with the RTX 4060, and will probably be on par with the RTX 5050 too.
lol I'm still using GTX 1080 ti because most of the AAA games are trash this year and last year. It's mostly competitive games and older games I put tons of hours on. Did you even see Redfall? Is it worth upgrading at this point?
The 1080 ti is much worse than any of the 40 series. The 3060 is on par with the 1080 ti; check your facts next time. P.S. you know that GPU spec leaks are rarely what they turn out to be on release?
@@Ryan-wh2xf A 352-bit bus vs 128-bit, and 11 GB vs 8 GB. Trust me, it is much better, at least for older titles (where you won't find any benchmarks online) like the HoMM series, the Gothic series, AC1, etc; this card is fast in older titles and the 4060 ti is not, trust me mate. I am getting 800 to 1k fps in CSGO; can any of the newer cards do that? The 4060 ti especially won't. I've tried an RTX 2080 and sold it because it was worse than the 1080 Ti. I wouldn't change it for anything, and even a +50% potential increase in fps in newer titles is not worth it. The only worthy upgrade where you don't feel like you are being ripped off is the 4080 and up, or the 5070 and up. No reason to upgrade now though; the 5000 series looks sexy and huge, hence waiting for it seems like the better option.
As it stands now, I play all of my older titles at 4k ultra and switch some newer titles to 1080p ultra for more fps. I get 140 fps in normal gameplay in Atomic Heart at 1080p ultra and 40 fps at 4k ultra; both choices are technically ok, and you could play either if you really wanted. People really underestimate this card's performance. They think because it's older it must be worse... get this, it is actually better. Also, there are no problems with it AT ALL. You know how when you change your card you can get problems, like it not working, artifacting, COIL WHINE, crackling sounds, flimsy coolers or a flimsy shroud, overheating, etc etc; there are so many intrinsic potential problems. On my RTX 2080, for some reason, the drivers took several minutes to recognize and install!?!?!? I don't understand how that is even possible. Not a problem on this card; the first day I plugged it in, I forgot about it. There is literally nothing bad about it: it works like a solid card, I like the way it looks, and you don't hear it even at higher loads. The only time I'm reminded it even exists is when it runs a heavy title at 4k ultra and the room gets a bit heated up. It has been serving me with no issues for so long, like a charm.
I would not trade it for anything, unless that anything is much better: at least 2x the performance in newer titles, no less, and more than 12 GB of VRAM. This gen I would prefer to skip, although if the prices on the 4080 and up halve, I might consider it. Also, Nvidia will be supporting the 1080 Ti for at least 4 more years, so there's really no rush. The only real reason to switch from an older GPU is when the drivers no longer support you; if you still get driver support, there's no real reason to switch if you love the GPU. The best GPU of all time, the GTX 1080 Ti, and I am a proud owner of it.
The only reason I replaced my 1080Ti recently was an issue that forced me to reduce the power cap to the point it annoyed me. 1440@165Hz is the setup, modern games were forcing me to make strategic choices on what to turn up. But what was frankly amazing was that the vast vast majority of modern games and all the older games played basically at 1440 Ultra, a beast of a card!
@@Atom224 4070Ti 😢. Ideally I'd have waited till at least the 5000 series before seriously pricing up a replacement, because I could still play the vast majority of my library at 60FPS and greater at High-Ultra, with maybe shadows and anti-aliasing at Med-Low. A 4080Ti or 5080Ti would have been a pretty good replacement, with me ideally stretching to a 6080Ti for the best years-of-service/price return. At the end of the day, if I can't play new games I can just keep playing my existing library 😁, or maybe switch to console emulators to tide me over
The 4070Ti is still comfortably double the performance of a 1080Ti. I still think that, since the 1080Ti, the 4070Ti is the best bang for your buck! So, good choice anyway.
@@td3132 yeah for sure, it’s still viable. Good to see the mention of the 5700XT at the end as that was a card a lot of people slept on. Both would be good secondhand options. Just replaced mine due to a slightly unplanned 4k upgrade which was just too much for it.
@@spacechannelfiver had a geforce 2 back in the day. don't know where it is now. but i still got my 8800gtx and 1080 ti. will keep these for "a while" :D
Used to have a 1080ti before I got my 3070. That card is downright legendary; it managed to perform incredibly at 1080p ultra settings for a very VERY long time.
@@mikv8 A little part of me regrets the decision today, but all in all i'm happy with my 3070, and considering I sold my 1080ti for a good price and got my 3070 for an even better price, i'm not too bothered by it. All in all the 3070 is still a beast card; yeah, the vram sucks, but for most titles it runs 4k happily.
What's funny is that, from what I heard, to some degree Nvidia's good ideas have come from believing AMD's boasts, taking them seriously, and making a more competitive product out of them. Two examples include the GTX 1080 TI (which came out of the performance boosts that AMD claimed Vega 64 would have... which turned out not to be the case at all, though nowadays Vega can supposedly approach the 1080 TI!) and the RTX 3050 having 8GB of VRAM instead of the 4GB it was supposed to, because of the VRAM article that AMD published. It probably would've aged worse than the GTX 1660 TI, which performed very closely to it at release; now the RTX 3050 pulls cleanly ahead. It's funnier since both times AMD messed up its opportunity to get ahead in the GPU market. Still, competition is good for us.
Lol the 6600 and especially the 6700 from the most recent gen are both incredible cards, what are you on about? They'll provide a tremendous mileage I can guarantee it, even the 6800 with its 16 GB of VRAM is gonna prove to be monstrously enduring in the years to come.
What a legend of a card. Got mine a few months after launch, and while it's been relegated to my Linux machine these days, it still pumps out solid framerates on anything i've thrown at it. Absolutely a card that deserves its place alongside other epic cards like the Radeon 9800 Pro, 6800 GT, and 8800GT.
@@korana6308 The GTX is legendary as it was the first DX10 GPU, but the GT is a true legend as it offered awesome price/performance while being not much slower than the GTX/Ultra.
@@RuruFIN agree, the 8800GT was just next level, it had GTX performance but was single slot and cheap as hell. An amazing leap forward tech wise at the time.
Just upgraded from a 1080Ti to a 4090 last month lol. It lasted me almost 7 years. Back in the day it allowed me to touch on 4k gaming for the first time, and 1440p through all these years with some minor compromises. It didn't have RT, so no problems or performance strain there; 60fps was perfectly achievable, and the 11GB of VRAM is still holding up. It basically only really aged in the last 1-2 years, with games like Hogwarts, Callisto, Cyberpunk, Witcher 3 Next-Gen, and Jedi Survivor. Truly a champion of a card. Sold it for 200$, RIP.
@@K11... That's pretty optimistic. I'm willing to bet any amount that iGPUs will be better than the 4090 well before "ten years". Just next year (2nd half of '24) we have the Strix Halo mega-APU coming out with 40 CUs of RDNA 3 and a 256-bit bus of LPDDR5X, with projected performance equal to the desktop 4060ti, thus usurping the legendary 1080ti in just ~7 years since its release. By the time RDNA 4 based iGPUs are out (no rumors on that yet; the earliest we will see it would be late 2025) I wouldn't be stunned if even the 4080 is already threatened. The earliest iGPU with the potential to usurp the 4090 would probably be RDNA 5 based, somewhere near ~2027, but even the most cautious prediction would have iGPUs beating the 4090 by 2030 (RDNA 6). By 2033 even a lowly Steam Deck 4 or whatever, with a much weaker iGPU (relative to the mega-APUs AMD is making, which will wipe out "low-end" GPUs), should be beating a 4090.
@@slickrounder6045 Short of a technological shift, there is no evidence to show that a 4090 will suffer any sort of short-term drop in performance. Realistically you can reasonably be assured of 5-7 years of reliable service with little to no compromises, after which, by the 10-year mark, you are likely to either need/want a new GPU, or we will see new games finally force you to start tweaking settings considerably to maintain a usable level of performance.
@@Hybris51129 There is constant "technological shift". 7 years later the legendary 1080ti is now firmly a low-end card (sub $200 used, beaten by new $200ish cards), relegated primarily to 1080p for the most demanding new releases. Zero reason to assume the 4090 will defy history and somehow maintain midrange status 7 years from now (forget about 10 years from now, when again even a lowly Steam Deck handheld with iGPU graphics should comfortably beat it. Bank on it).
I recently sold my 1080ti after 6 years of use in favor of an RX 6800. Still going strong, that 1080ti; the dude who bought it was really happy.
Same. I upgraded my PC itself in time to a Ryzen 9 3900X, but i'm still using my Gigabyte GTX1080ti, and i do notice it lacks some features of newer cards; textures in some games are a bit bland even on High/Ultra settings, yet in FPS it is still relevant. And like you, i mostly play indie games like KSP, Stationeers, V-Rising etc, and for those games it's more than perfect; i don't need a 4000 series card for them. In the few AAA games i play, i hardly notice the lack of those features (TW-WH3), and i often buy older triple-A games on sale, like ME: Shadow of Mordor, from when the card was still in its prime, having been burned too many times on new releases being a pile o' junk. Best purchase i made in GPUs.
I actually had a 1080ti that I got in 2017 for 300 bux... how in the world I hear you ask. Well it was completely dead, but some "basic" repair skills helped me find and replace a broken MOSFET that luckily got it back into working order. I sold it for 800 bux just before Christmas in 2021 because I had gotten a good deal on a 6800XT prior. Ended up selling that for profit too after realising I don't even play videogames anymore.... I now have a Vega 64 because it has impeccable Linux support and runs the few games I care about well enough at 4k
The 1080Ti still stands strong because the 2080Ti was not a big step up in pure raster. The 3080 was the first big step up from the 1080Ti, and of course the 4090 has been the next big step up, but it costs a bloody fortune, so I'm sticking with my 3080ti.
As a 1080ti owner, I have the license to say this is probably the best nvidia card still today. With an i9 9900K and a GTX 1080ti you can play any game in 2023. I use a 1080p monitor and every game passes the 60fps margin, so tbh I'm not planning on changing my system rn; probably when my gpu dies or intel 12th gen gets cheaper.
Yea, any game at 1080p i can imagine isn't a problem. when i upgraded my monitor last year from 1440p 144hz to a 4K 120hz monitor, my SLI 1080 Ti setup just wasn't cutting it no more, so i went ahead and bought a 4090 recently 😅
@Travis Clark probably not, but you can't treat 90-series cards as normal flagships either, since they basically replaced the Titans. so you should still see the 80 series as the "normal" flagship
I bought a used Titan XP for $200 a few months back to replace my 1080 which i sold for $200 (crazy GPU market we are in) and the Titan XP has really been a decent step up and the 12Gb of vram over the 8Gb has been pretty noticeable.
I am so happy I bought my 1080ti when I did. Upgraded to a Ryzen with 32gb of ram. Hoping GPU prices come back down for the next gen. Still able to run all my games at this point
I have a 5700XT Taichi and I adore that card. I also found myself crapping all over nVidia's software as a result; AMD literally had the superior GPU software in that era. Just remember, if you do get one, to set a custom fan profile. My only problem is that I have a factory OC'd version, and the OC vBIOS for some boneheaded reason doesn't have a fan curve to match, but you can't set a custom fan profile without tuning the card manually, so I had to set my custom profile alongside either an OC or an undervolt (or both lel) and set the fan profile to hit full blast, otherwise it'd overheat, especially running stuff like Cyberpunk.
I just played Cyberpunk 2077 at ultra detail levels at 1440p with this card, and that's the hardest game I ever threw at it. You may not mind upscaling, but I don't like it, so I settled for 38-42fps at ultra native over 55fps with FSR Quality. If you're at 1080p you can probably run Cyberpunk at closer to 60fps, all ultra settings, with that card. The only thing I even had a problem with was texture memory overflowing into system RAM, so the well-known Cyberpunk bug with muddy textures sometimes happens when I hit 10gb of VRAM usage, but that's at 1440p ultra (with a 1080ti that would not be a problem though).
It depends which is cheaper. Also think about how much you want to deal with the used market; bear in mind 5700XTs have probably been mined on. I mined several hundred dollars off mine and still used it to play Cyberpunk later. But if they cost the same, consider the 11gb of VRAM. If the 5700XT is cheaper, it may be the next RX 580 imo of rock-solid budget entry; it's still a 1080p ultra-settings card even in brand-new games, and it performs about the same as an RTX 3060 does.
@@pandemicneetbux2110 Thank you for such a detailed response! The 1080ti and 5700xt are around the same price used for me (the 1080ti being only £10 more). I'd opt for the 1080ti simply because of the vram difference, but I worry a little since it's quite a bit older than the latter card & I'm looking for longevity. Both these cards have downsides when buying used, as the 5700xt likely would've been used for mining & the 1080ti is pretty old, so it would've seen more abuse from a previous owner. Though I am leaning closer towards the 5700xt, and I probably wouldn't be playing *too* many brand new titles; if I was, I'd be playing at 1080p.
@@zeriiyah If both are similar prices, you can't really go wrong with either. The 11gb of vram is definitely a nice thing to have, but if the games you play will still run fine on 8gb then I would probably get a 5700 xt. They are usually cheaper than the 1080 tis, at least on ebay here in the US. Not to mention, the cheapest 1080 ti cards are going to be blower models, and they can be loud and hot
@@zeriiyah The Gtx 1080 Ti is the better option cus it allows you to play current AAA at 1080p. I have used both, still own the 5700xt as a backup, and both are strong even at 2k. It doesn't matter if it was used for mining, as long as the miner cared for the card; it could be in better condition than a manually OC-ed gaming one. In the end it comes down to which looks better to you, which software you like better (AMD's or Nvidia's), and price; one has lower power draw and the other has 3gb more Vram. Neither are RT gpu's, so they no longer get the most optimized drivers.
I just remember the status the 1080 and 1080ti had. I had no money so I got a 1060, but in covid times I finally upgraded to a 1080! Kinda a dream card, even though it might be a bit older.
Grabbed a 1080 Ti FE early this year. It’s much more capable than I expected, especially once undervolted. Also, its bigger brother, TITAN Xp, can also be picked up for very reasonable prices now. Maybe you could check that out next time?
@@beherenow1668 GPUs are wattage limited, not voltage limited. Their power system/BIOS is capped to X watts of max power; this is what the power limit in the settings configures. If you lower the voltage, you can squeeze more amps through the card under the same wattage cap. This usually means you can feed more power to the vram, for instance, and achieve higher memory overclocks. As others have said, it also reduces temps, which helps the GPU auto-overclock to higher frequencies and stay there longer.
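The arithmetic behind that comment is just P = V × I under a fixed board power cap. A toy sketch of the idea (the 250 W figure is the 1080 Ti's stock board power; the voltage points are rough, illustrative Pascal-like values, not measured telemetry):

```cpp
// Why undervolting helps under a fixed power cap: P = V * I, so at the same
// wattage limit a lower core voltage leaves headroom for more current, and
// the boost algorithm can hold higher clocks for longer before hitting the cap.
#include <cstdio>
#include <initializer_list>

int main() {
    const double power_cap_w = 250.0;  // stock 1080 Ti board power limit
    for (double volts : {1.062, 0.950, 0.900})  // stock-ish peak vs. two undervolts
        std::printf("%.3f V -> room for %.1f A within the %.0f W cap\n",
                    volts, power_cap_w / volts, power_cap_w);
    return 0;
}
```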
The 6800k is a horrible bottleneck. I had to buy a 7700k to somewhat fix that. Last year I went to a 5800X and most titles gained 50% FPS, with the same GPU.
I wouldn't be surprised, in all honesty; my 3600 holds back my 6600XT in some games (literally the same performance as the 1080Ti) at 1440p UW. Though most of the time it'll be the game engine being limited to 4 cores, even in modern games
I sold my 3080 for $500 and replaced it with a 1080 TI for $120. I'd never used one before; hard to believe, but for $380 less I still get good performance in 2K gaming. It also confirmed my thinking about not buying Nvidia's new series for a small performance difference. In short, it's true that this card is a legend. And yes, after producing this card Nvidia realized it had made a big mistake, and it never again produced such a powerful card, in terms of price to performance, that would work for many years without problems.
Hint for CP77 on the 1080Ti: drop the shadow quality to low, it's worth about 5-15 frames on high settings 1440p but obviously highly dependent on the area you're in.
Can confirm the 1080ti is a beast! Bought mine second-hand on a budget in late 2021, and it's the only part I don't regret in one way or another. Love your work man :)
I use an MSI GTX 1080 Ti Lightning Z and I won't upgrade until the RTX 5080 releases. I'm happy with my card. I also have a 4K panel and I can play games at 4K; not newer titles, but for example I was able to play Call of Duty Advanced Warfare, Doom Eternal, Wolfenstein: The New Colossus, Batman Arkham Knight, Battlefield 1 and Battlefield 5, all at ultra settings. This is an amazing card.
The entire GTX 10 series has been god tier. Honestly it's probably the greatest graphics card series that has ever launched. The 1060 and 1070 cards are still in widespread use today and rescued many a budget gamer during the pandemic and the years that followed.
you're absolutely right. my 1070 was holding up pretty well up until last year when i started trying to run some new single player games. upgraded to a 3080. it's nice but man that 1070 lasted for five years
Still using a 1070 MSI Gaming X since release. 0 issues, still going strong. It's somehow funny that its 8gb of vram is on par with modern cards. It's slowly getting to be time for an upgrade; aiming for a 4070ti Super due to the 16gb of vram... only if pricing is somehow fair.
My 1080 ti is still holding up very well after all these years. Paired with an i7 8700k and 32GB ram. The whole system is still going strong, running almost 24/7 since 2018.
I have a 9600k with a strix 1080ti. I recently bought a 12700k with a 3080 and I'm not impressed at all...my older pc seems to punch way above its weight class.
I got my hands on one of the EVGA 1080ti Hydro Copper Kingpin Editions back then. Now that EVGA doesn't make GPUs anymore, it's a collector's item. One of the fastest 1080tis you could get out there: premounted watercooler, made for LN2 cooling and overclocking competitions. I also still have the box on a shelf. One of the best hardware purchases I've made.
I had two of them back when I was still under the delusion that SLI was going to be rescued, later sold one of them to my brother and the other to my father. They're both happily pushing frames to this day, very wholesome story. Dad decided to abandon PC gaming for Xbox though, the traitor.
I do my daily ritual sacrifices to keep my evga sc2 1080ti appeased. Bought mine in 2018 and it keeps on trucking! one of these days I'll upgrade, but today is not that day!
This video popped up on my homepage front and center, and I am so happy you made this video! I do in fact run pretty much the rig you demo'd. It's a Ryzen 7 5700G and 32GB of DDR4 on an ASUS B550 board, with graphics provided by a 1080 Ti of the AORUS variety. I purchased this card back in 2017 for $789US and have used it basically every day since. Hands down one of the best purchases I've ever made, because a new GPU around the $800 mark that would be a significant upgrade just came out this year. I don't play a whole lot of fast-paced realistic looking games, which is definitely helping my experience, but overall it's been an exceptionally capable and reliable piece of hardware. Thank you again for reviewing it in 2023!
I had a feeling that extra VRAM would carry performance well into the future; it was a big mistake trying to go without it, although many games have enough options to let me compensate.
If you're wondering how the 4090 will be holding up in 2029, you first have to take into account what percentage of them will have burned power connectors by then, and if that at some point can make them die. The Northridge Fix guy says he is getting a lot of them to fix.
Take into account a very small minority? like .01%? Yeah big number to take into account. Maybe take into account the vast majority that won't even deal with that issue.
@@Mcnooblet considering they still haven't definitively determined the exact reason for them burning up, we don't know if this will become more of an issue over time or not.
Watch the Gamers Nexus video from today. The Northridge Fix batch was GPUs stockpiled over the course of months from CableMod. It appears they melted because users didn't plug them in properly, and CableMod sent them in to see if they could be salvaged. CableMod was being an absolute bro and replacing cards that failed while using their own power connector. The salvaged cards are then given away for free, because CableMod likes good PR.
@@coldheaven8007 It is! It runs my primary PC that's on basically 24/7. I will say that I did get lucky; I could tell out of the box that it had barely been used. Almost zero dust in any of those impossible-to-reach corners of the heatsink. Honestly, mining cards would be safe purchases too; the real wear is from thermal cycling the components on the card.
I remember when the 10 series cards launched. The performance jump was insane. After that, the 20 series performance increase felt minuscule. But of course the main focus of that release was to introduce RT.
@@HybOj It has to start somewhere. Of course you could say that iphone 1 is shit compared to iphone 15 pro but we wouldnt have the modern iphone without the first one. Same goes for RT and DLSS
@@HybOj I'm not debating anything. I'm just saying even though the first gen RT was weak it was necessary step to have the current and future RT. So there's no point to laugh at first generation of anything. That's how progress is made.
@@megapet777 ok so imagine I did not put ( :D ) there and we are in agreement. Ofc the 1st step is always necessary to get anywhere, I agree. And I also say that the 1st implementation of DLSS was really bad; ppl tend to forget :) It changed to a completely different system, which is no longer trained on each game, but on a training set of general situations.
Even as someone involved in the industry, I hadn't fully appreciated how much this card struggles with modern games. To only just get 60fps at 1080p on something like The Last of Us is shocking to me. Other than ray tracing, it hadn't particularly felt like games - other than outliers like cyberpunk - had meaningfully gotten so much more detailed looking and difficult to run.
Had one in my system since 2018 and i can still play anything i want. It even handled cyberpunk surprisingly well. I remember waiting for it to come out because new generation actually meant something. I had a feeling that me not buying a new one since then had a reason, thanks for showing it so clearly.
I remember this being my dream card in 2017 but I wasn't even close to affording it, so I settled for a second hand 1070 back then. Was still pretty great card I used till the end of last year when I got a 6800XT.
I have a 1080ti; it's a very good card, it's just starting to show its age. The majority of pre-2023 games will play fine at 1440p 60+ fps, but it is becoming a 1080p card with newer titles. I've only recently upgraded to an RTX 4070 to test out DLSS and FG. Also, I think the 4090 is going to last for a long time, till the newer game consoles come out around 2028. By then the 4090 will become the new 1440p/1080p old dinosaur, like the 1080ti.
Pascal is definitely struggling with some of the newest games; the 2080ti can be 50% faster. I don't know if this is drivers, or the much better async compute on Turing. When the 2080ti came out it was only 25-30% faster, but most games in 2018 were DX11.
@@KrisDee1981 definitely drivers and architecture differences. The 1080ti has the raw computer power but none of these new dx12 games are optimized for this old shit. Still using my 1080ti I bought in 2017.
@@ericmullen7582 Yeah but I remember even in 2018 people predicted that Pascal will struggle in DX12 and Vulkan. Wolfenstein Youngblood was 60% faster on 2080ti.
The 4090 will last so long because it has 24GB of GDDR6X memory. The only GPU I will be moving on to is the 4090Ti, after the 6000 series comes out, because none of the games made in 2023 use more than 13-14GB of VRAM at 1440p ultra/epic presets, none of them. The 4090Ti will last until 2030, or until they stop shipping newer driver updates. Even the 980Ti still holds up at 1080p today. I "upgraded" my old 1060 6GB to an old 1070Ti 8GB because I saw it on the market so cheap that I bought it immediately without any hesitation. It cost me only 110€, or 115-120$. Sold the 1060 for 60 bucks, and now I can tweak the details even higher; it feels like a gain of around 25-30% more power. In CSGO I had around 180fps and now I can play at 220-250fps no problem; GTA 5 online at 1080p ultra also runs at more than 60fps. For most people the 1080Ti is the best budget gaming GPU you could ever buy, but the very next year it will become obsolete and the 2000 series will take the throne for budget gaming. The 2080Ti will last for another 3-4 years. But as I mentioned, after the 6xxx series the 4090Ti will be the best option to stay safe for another 6-7 years.
At long last you’ve reviewed it! The legendary, the mythic, the incredible, the immortal, the infallible, the irreplaceable, the beloved, the revered, the timeless, GTX 1080 Ti! A card so powerful it can brute force ray tracing! To this day I still dream of owning a GTX 1080 Ti, especially a founders edition. The look of that cooler! Even though I already have a graphics card that comes close to its performance (kinda) and consumes much less power than a 1080 Ti there’s just something about the idea of owning one. It’s like dreaming of owning a classic 60s muscle car for the looks, performance, and reputation it had in its time. I’d be very tempted to get one as an upgrade for my friend if his power supply was up to the job, but realistically I know it’d be more practical and reasonable in the long run to find a newer and less power hungry card with equal or greater performance. _But still..._
Man, I remember getting my 1080 ti right before all the GPU prices crashed during the pandemic. Had to pay off 800$ for this 5 year old card but man, never regretted getting this thing. Plays everything I want to throw at it in current day even if I don't really play current day games.
The only high end card that outshone this was the RTX 3080 at MSRP, IMO. Same price but technically a lower-end model; ray tracing was now possible by using DLSS 2, which was good at that point, and the rasterisation was very solid. Just a 10/10 card before the scalpocalypse.
To the dudes saying this card was a mistake, I have one question for you. Are you really discontent, sad, feeling betrayed, and devastated that there's a company actually making a good value product for you? Would you be happy if the card cost $100k? What level of stupidity is this? That research saying people are getting dumber is actually fucking true.
I actually bought a 1060 when it had just released, for 200 EUR, and played almost all games maxed out at 1440p until the last few years. It seems that they messed up with the 10 series, as they delivered great quality for a very good price.
In my eyes this has got to be one of the GPUs of all time. Even 20 years from now, looking back at the history of pc development, it'll remain a point of interest to talk about.
This card still is a beast; it was a big performance gain compared to the 980TI. I want to get my hands on a GTX 1080TI. can someone recommend a model with a good cooler that will last as long as possible? EVGA versions seem to be good quality, but I am not sure.
my 1080 TI FE made me realise that modern Nvidia flagships are the choice if you wanna record what's on the screen, but never buy FE ones unless the model is 3 years old or younger, or has 3 fans and at most a single 8-pin
Never really looked back after I picked one up right before the 30-series launch. In the weeks before the launch, EVGA was discounting their 1080tis down to $375 and then $350, and I couldn't not jump on that. With the scalpers and everything else that happened after the launch, I honestly couldn't be happier.
ERRATA/CLARIFICATIONS
01:37 To clarify: Maxwell 1 cards (the GTX 745, 750, 750 Ti and their workstation/data centre equivalents) don’t support feature level DX12_0. This feature level is needed for several modern games, including Elden Ring and Forza Horizon 5. IMO any GPU arch which claims to be DX12 compatible, but which only supports DX11_0 or 11_1 feature levels, doesn’t truly support DX12. This also applies to Kepler (GTX 600 & 700 series) and AMD’s GCN1 (HD 7000, half of the 200 & some of the 300 series). (If you want to check what your own card reports, see the sketch after this comment.)
11:50 The Witcher 3 is clearly CPU limited at 1080P Ultra using the Ryzen 5 5600X, so - as mentioned in the voiceover - users of older/lower end CPUs will see lower FPS in DX12.
15:37 Disregard the TLOU comparison, the RX 5700 XT was tested with an earlier patch. I had dropped texture resolution and other VRAM intensive options, which A) should no longer be necessary post-patch 1.0.4, and B) may account for the higher performance. Sorry!
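For anyone curious how games and tools test this: on Windows you can ask D3D12 for a device at a minimum feature level and see whether the call succeeds. A minimal sketch (assumes the Windows 10+ SDK; link with d3d12.lib):

```cpp
// Minimal sketch: does the default adapter expose D3D12 feature level 12_0?
#include <d3d12.h>
#include <cstdio>

int main() {
    // With a null output pointer, D3D12CreateDevice only probes support:
    // it returns S_FALSE on success instead of actually creating the device.
    HRESULT hr = D3D12CreateDevice(nullptr,                 // default adapter
                                   D3D_FEATURE_LEVEL_12_0,  // minimum FL we need
                                   __uuidof(ID3D12Device), nullptr);
    std::printf("FL 12_0 %s\n", SUCCEEDED(hr) ? "supported" : "NOT supported");
    return 0;
}
```

Cards like the 750 Ti will still create a D3D12 device fine at D3D_FEATURE_LEVEL_11_0, which is why they get advertised as "DX12 compatible" while failing the 12_0 probe above.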
the GTX 760 only supports DX11_0 too
Have you considered using translation layers such as DXVK or VKD3D to test these old cards? Sometimes, old cards can have decent Vulkan support while lacking in their DX12 support, including all the features. By translating DX11/12 to Vulkan and running the games that way, it can be quite helpful and may even enable a card to run a newer game that it otherwise couldn't. Additionally, on Linux, the open-source AMD driver (amdgpu) offers optional experimental support for GCN 1.0/1.1 cards, allowing them to utilize newer Vulkan features and optimizations implemented by the driver team. This topic is frequently discussed on Linux-oriented forums and sites, yet YouTubers rarely create videos about it. It could make for an interesting video.
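To add to this: whether DXVK/VKD3D-Proton can help depends largely on the Vulkan version the old card's driver reports (current DXVK releases target Vulkan 1.3; older 1.x builds got by on less, as I understand it). A quick way to see what a card exposes, sketched here assuming the Vulkan loader/SDK is installed (link with -lvulkan, or vulkan-1.lib on Windows):

```cpp
// Minimal sketch: print each GPU and the Vulkan version its driver exposes.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    VkInstanceCreateInfo ci{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    VkInstance inst;
    if (vkCreateInstance(&ci, nullptr, &inst) != VK_SUCCESS) return 1;

    uint32_t n = 0;
    vkEnumeratePhysicalDevices(inst, &n, nullptr);
    std::vector<VkPhysicalDevice> gpus(n);
    vkEnumeratePhysicalDevices(inst, &n, gpus.data());
    for (VkPhysicalDevice gpu : gpus) {
        VkPhysicalDeviceProperties p;
        vkGetPhysicalDeviceProperties(gpu, &p);
        // Translation layers gate their features on apiVersion plus extensions.
        std::printf("%s: Vulkan %u.%u\n", p.deviceName,
                    VK_VERSION_MAJOR(p.apiVersion), VK_VERSION_MINOR(p.apiVersion));
    }
    vkDestroyInstance(inst, nullptr);
    return 0;
}
```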
Upcoming games want DX12_1, which means the old RX 500 series and older aren't supported anymore.
In A Plague Tale: Requiem, the resolution optimizer set to Ultra Quality will render the game at full resolution with no upscaling, as in the engine settings Primary Scaling is set to 1 and Secondary Scaling is set to 1
The Titan V from 2017 would be interesting to test in a few years... (since it's way too much $$$ at the moment)
Making a very good product that will be useable in the long term sure is a horrible error
It is. 1080ti has made every Nvidia GPU after it look bad in comparison…
Nvidia won't make the same mistake again!
Yeah they really fucked up this product
I'm still using my 1060 3GB despite it not having enough VRAM to run modern games. Even Samurai Warriors 5 (its devs, Koei Tecmo / Omega Force, aren't known for being a tech powerhouse) struggles due to VRAM. Hell, certain GZDoom wads run out of VRAM (GZDoom is a sourceport of classic '93 Doom).
I am still pretty happy with mine
After all, planned obsolescence is their core philosophy
If nVidia made this card today, they would have found a way to give it only 5GBs of VRAM.
ahahaha so true.
And even then, they shortchanged it on VRAM on purpose, because they had a free extra slot for 1 more GB on all of their cards; but by their own mistake they didn't cut it hard enough lol. They weren't as bold back then about screwing their customers and skimping on VRAM as they are nowadays.
But their cards all have more VRAM at every tier now...
@@CyberneticArgumentCreator difference is, back then 4gb of vram was plenty, and most cards had 6-8gb or more. now games are starting to need 12gb+, and only a few cards from the last couple generations have that; even their newest 4070 ti barely meets that level
They'd also give it an 8-bit memory bus to ensure that it can't run any resolution higher than 144p
@@CyberneticArgumentCreator no they don't... The RTX 3060 had 12GB; the RTX 4060 has been announced with only 8GB.. which is shit. 60-tier cards should have 12GB, and 70 and above should have 16GB at the minimum...
The 1080 Ti honestly has legendary status. It was one of the biggest performance-leap cards nVidia ever made, and while the price was maybe high, it was actually justified at the time, unlike what we're seeing now.
The 1080 Ti was 700$, while the 4090 is 1,600$. And who knows if they'll dare to stick a 2,000$ price on the upcoming 4090 Ti (the 4090 from infamous Asus is actually 2,000$ *shivers* ). Meanwhile, the most that 700$ can get you today is the 4070.
We're definitely living in the dark ages of gaming.
@@Sergmanny46 Not just graphics cards, look at the smartphone market. The situation is even worse; they often sell worse new models for more money.
@@Sergmanny46 This is why I just encourage people to buy consoles these days. I'll never switch, but if all you do is game, the value proposition in PC hardware is atrocious.
@@friendofp.24 Never thought I would see the day when recommending a console over a PC is actually the better choice. But then again, $500 for a brick that can play virtually anything you throw at it is something that cannot be ignored.
The 1080 Ti is 48% faster than a 980 Ti, while a 4090 is 76% faster than a 3090. That's $649 vs. $700 at launch, and $1,500 vs. $1,600 MSRP.
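Taking those figures at face value (the commenter's numbers, not verified here), a quick perf-per-dollar check:

```python
# Perf-per-dollar change between generations, using the numbers quoted above.
gens = {
    "980 Ti -> 1080 Ti": (0.48, 649, 700),    # (perf gain, old price, new price)
    "3090 -> 4090":      (0.76, 1500, 1600),
}

for name, (perf_gain, old_price, new_price) in gens.items():
    price_gain = new_price / old_price - 1
    # Relative perf/$ improvement: (1 + perf) / (1 + price) - 1
    ppd = (1 + perf_gain) / (1 + price_gain) - 1
    print(f"{name}: +{perf_gain:.0%} perf for +{price_gain:.1%} price "
          f"=> perf/$ up {ppd:.0%}")
```

By that measure both jumps improved performance per dollar; what changed is the absolute buy-in.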
Using a 1080 Ti right now still. Bought it 4 years ago as an upgrade from my 680. Probably the best card I've ever owned. It never let me down, and when the time comes to finally replace it, it will have a spot on my shelf in a display case. I'm happy this card is still getting love 6 years later.
Nvidia will never make the same mistake with the 10-series again, that's for sure
Bought mine used as an upgrade from my 1050ti after the 20 series came out. The best purchase ever! Now using a used 3080ti. The king of 1080p gaming minus the rtx I would say.
I have a 1080 (the ROG one, which is super similar to a 1080 Ti performance-wise). It holds up fine.
It's amazing it still goes so hard in 2023, I'm still rolling with mine and looking at today's prices for modern cards is just depressing, but I can keep my 1080ti for another generation or two.
Got my Titan Xp (which is basically just a special edition 1080 Ti FE with slightly more VRAM and performance), and it's gonna stay with me long after I finally retire it from service on my main desktop.
The fact the 1080 ti can run RT at all without any rt cores is pretty impressive
😂I was thinking this
Na yall just eat up anything pr says, you don’t need rt cores for rt
clown@@Soggy-Soy-Toy
@@Soggy-Soy-Toy For stuff like path tracing and how ray tracing is done nowadays they're pretty much necessary, but in light applications like older RT games they aren't needed
@@flufficornss Those Minecraft Shaders go so well with the 1080Ti at 1080p.
It's sad that an amazing value product that sells extremely well is considered a mistake by companies.
I think the Wii was considered one too
Coz 1080ti owners refused to upgrade.
It's not about how well it sells, it's about longevity. Yes it sold well, but its longevity meant that a lot of people who bought the card probably spent 5-6 years without upgrading, and once it entered the second-hand market there were still a lot of people who'd buy it rather than spend money on a new GPU. So much money not flowing into the company because the product they released was too good.
@@snark567 It's difficult to kill a GPU on a schedule: if a company wants one to last 3 years, some may die sooner than 3 years and give the company a bad reputation, or else they last longer.
Corporate greed
Pascal really was the best GPU generation from NVIDIA. Even its worst cards had some value to them.
I bought a 1060 6GB in 2017 and sold it in 2022, and it was perfectly capable of running every game I threw at it at 1080p 60fps on medium-high settings; I never had a single problem with it. Amazing card honestly, I was a bit sad after giving it away.
GTX 1060 3 GB and GT 1030 DDR4 were pretty crap tho.
They hit gold with Maxwell, and they were able to shrink it and double its clocks to create Pascal and release it only a couple of years after Maxwell's release. Rumor was that Pascal was already done when Maxwell was released.
The 1050 Ti was crap; for $25 more you could buy an RX 470 that was much more powerful than the 1050 Ti.
Then in 2017 Ethereum came along, and the only GPU still available at a decent price was the 1050 Ti, hence the large numbers of owners in certain parts of the world.
@@takehirolol5962 That is if you only care about gaming. The RX 470 has no hardware-accelerated VP9 decoder for YouTube. All Pascal cards, even the lowly GT 1030, have VP9 support.
Remembering when the 1080 Ti came out and now seeing it become somewhat of a budget card is extremely surreal.
It's irritating because you know they didn't just pull the newer cards and tech out of their ass, they roll this stuff out at just the right rate to turn their top end cards into middle of the pack cards within a couple years. Who wants to spend 700-1500 bucks every two years?
Yeah, I agree. The price has literally doubled for the same equivalent level of GPU, which is disgusting.
I'd say the 5700 XT exists, which is quite close in performance but about $120-150
@@brahtrumpwonbigly7309 yeah, that's why i always think of buying cards at least 2 generations old if there isn't better contenders.
What I've seen on eBay the last few days:
the RX 6600 is around $200, the GTX 1080 Ti is around $200-250, and the RX 6700 XT is around $200-300.
Quite hard to decide. I'd go RX 6700 XT if it's at $200-250 on the used market: it's a bit more powerful, has better DX11 and DX12 performance, and still has years of that AMD fine wine driver support ahead of it.
The RX 580 I've seen at $50, and it seems to be only half the performance of the RX 6600.
@@Splarkszter Rx 6600 uses fuck all power which was my main drive
But I struggled with that exact dilemma before I bought; the 1080 Ti is quite nostalgic, same with the R9 295X2. Legendary cards.
Whoever bought a 1080Ti back in the day has unknowingly made the best financial decision in the last decade.
After bitcoin of course
I am one of those people :)
These days, you need a 4070 to totally exceed the 1080 Ti's capabilities. Ideally, you'd want a 4070 TS at minimum, for double the performance and a notable VRAM upgrade.
Had Nvidia not skimped on VRAM with the 20 and 30 series, the 1080 Ti would've gone obsolete faster.
I'd still be happy using my 1070ti if it didn't die on me
The 1080 generation was the last generation where we saw massive performance improvements for little to no meaningful price hikes
3000 gen says hi
@@randomobserver9488 The 3000 gen was overpriced because of the scalpers though
@@randomobserver9488 there's a worm in your brain. it's making you say things you have no reason to say.
The move from 28nm to 14nm did take many years
@@randomobserver9488 Don't get me wrong, the 3000 series was generational with the 3070, 3080, and 3090, but they were ruined by insane prices.
We kinda had AMD to thank for the 1080Ti since they WAY overhyped their upcoming Vega range so Nvidia panicked and released a way more powerful card than they ever would have... and they will likely never make that mistake again
I think it's just happened again tbh, the performance gap between the RTX 4090 vs RX 7900 XTX is quite large, probably more than what Nvidia would have wanted!!
@@you2be839 they planned for it this time, which is why the 4090 costs a kidney or two
@@you2be839 It could've been even faster, though. Or maybe they plan a 4090 Ti. BTW, a possible 4080 Ti fits in the gap between the 7900 XTX (or 4080) and the 4090.
@@you2be839 No it didn't, you dummy.
If you knew anything (which you don't), you would know AMD has not yet released the 7950 XT and XTX. They're already named on TechPowerUp if you search for them manually, as always for not-yet-released cards.
And the 7950 XTX beats the 4090 by nearly 10% in raster performance.
Shows how little you know about the things you talk about. If only you weren't so handicapped in computer knowledge, people might take you seriously.
@@williampaabreeves And you're another commenter with zero intelligence. Again, as noted in the other two comments, there are a few GPU SKUs that will release in the future but haven't yet. They didn't price it that way because of a monopoly either, since they no longer have one with AMD this close behind them, which is good for customers. However, Nvidia decided to make their chip on 4nm, which is currently so expensive that the GPU cost is that high on the 4090; on top of that, their memory modules are 5nm. AMD didn't see value in that, so their chip is 5nm and the memory is on 6nm.
Since none of you in the comments have actual knowledge about whatever you bring up, this is why people make stupid choices: this comment section talks in speculation based on thin air. The cost of the 4090 is so high because it's the first chip on 4nm. It's too new, and production isn't high enough.
And on top of that, the Netherlands has not yet made enough machines for all the chip fabs to have enough production to lower the cost. Nvidia chose an insanely costly manufacturing process, of which the Netherlands has made only four machines.
TSMC and Intel and Samsung and Nvidia all wait on ASML, which creates ALL the high-density lithography equipment that lets you even use a computer. Without the Netherlands developing that, you would be in the stone age. Nothing in computing moves forward unless the Netherlands improves its processes. Basically you all depend on the Netherlands, and we have not yet made enough 4nm machines for all of this to be produced in bulk for less.
So the real reason is that Nvidia chose a wafer machine so costly, and of which so few exist, that the production price shot out of all logic.
AMD, however, decided to stay on 5nm for the chip and 6nm for the memory.
If this comment section had a combined IQ of 100, you should have been able to figure this out yourselves. It shows me you are all kids who know nothing about how this world works, or how economics or production works; none of you cares to learn, and you all speak anecdotally.
I hope this cleared things up enough for you all to comprehend what is going on.
Nevertheless, AMD and Nvidia have both not released their flagships: the 7950 XT and XTX and the 4090 Ti are all still in development. And AMD's prices will be far superior because of this. Performance could be just below it, but then again the 4090 Ti will cost 1.5 to 2 times as much as a 7950 XTX, so given that the XTX already shows better raster performance than the 4090, I wonder how much value the Nvidia card will offer with that process. Probably useless in price-to-performance terms.
I used a 1080Ti in my main rig from 2017 all the way to mid 2022. It allowed me to skip dealing with the GPU shortage entirely, whilst still being able to play games at their highest settings. Although I am now on a 3090 and have no use for the old Pascal beast I'm never going to sell it because it reminds me of a better time, when Nvidia actually cared about the majority of its customers and didn't charge 50% more for their cards between generations.
I'm planning on doing the same with my EVGA 1070. Heck of a card that served me well along with an 8700K, especially since it was basically my first PC and saw me through college
@@Demopans5990 I switched from a 1070 to a 3090, and the performance difference is obviously noticeable in every way, but it really shows how capable those cards still are. I think I was running Borderlands 3 with basically everything maxed at 1080p at 40-60 fps.
Yeahhh 1000 Series crew here. I started w/ a 1070 in early 2020 bc of a tight budget and 2 1/2 yrs later got a 3070. Like you my friend, I'm not going to part w/ my dear 1000 card. I didn't have to worry one bit about scalpers and shortages. I'm gonna use it in my retro gaming room build 😁
Also a 3090 owner, with a 1080 Ti on my shelf
Still using mine
Perhaps the RT glitches you were seeing in Jedi Survivor were related to RT denoising. One of the biggest issues with real-time RT is getting rid of the noise. RT cores and tensor cores can handle this, while Pascal has neither.
Good point, thanks 😊
RT is still noisy.
absolutely
@@mikem9536 that's why there's denoising
I ran Quake II RTX on my friend's 1070, and the denoising worked fine. Either it's a denoising problem specific to this game (or UE4 in general), or the problem is something else, I think.
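For anyone curious why denoising matters here at all, a toy sketch (illustrative only, not how any real denoiser works): a real-time ray tracer gets roughly one random sample per pixel, and the error only shrinks with the square root of the sample count, which is exactly the gap a denoiser papers over.

```python
import numpy as np

# 1000 "pixels", each averaging `spp` random samples of a constant integrand.
rng = np.random.default_rng(0)
true_radiance = 0.5  # the value a perfect render would produce

for spp in (1, 4, 16, 64):
    pixels = rng.random((1000, spp)).mean(axis=1)
    mean_error = np.abs(pixels - true_radiance).mean()
    print(f"{spp:2d} samples/pixel -> mean error {mean_error:.3f}")
```

Error falls roughly with 1/sqrt(samples), so at the ~1 sample per pixel a real-time budget allows, you either denoise or you see the grain.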
This didn't surprise me. I preordered a 1080ti and I'm still using it today. I've never had a card that was this good for this long.
Still sitting on mine too. I play most stuff in 1080 and haven't seen anything really worth the cost of the upgrade yet.
@@kaiserkreb Have you upgraded yet? And if so, what did you upgrade to? I'm new to learning PC building and parts
@@srnbrehman4695 I ended up getting a good deal on a 7900 XTX. The drivers are as annoying as I remember them being when I originally gave up on ATI/AMD, but it's been a solid card so far. I tossed the 1080 into my HTPC to live out the rest of its life on controller-game duty on the TV. It'll probably be there for years.
The GTX 1080 was the first graphics card I ever heard of 😂. There was crazy hype for the card back in the day
we call that the heyday, my friend lol
My first graphics card that I heard of as well. Everybody was talking on about how this was THE monster GPU.
I started in '03, so I can't even remember the names; a 256MB card was a beast at that point. I remember heaps of talk about 8800 GTs or something like that; don't think I ever heard more noise about a card, or simply heard one referred to so many times. Must have been godly too.
Ah, the good old days. I was still a broke college student back then, and me and my friends' mouths would water every time we passed a computer store with the 1080 Ti displayed on its shelves.
Saved up in my college days and got my first card, a GTX 960, in 2018.
Saved up again after I started work and got a GTX 1070 in 2020, just before pandemic prices.
And now I've got meself a 3070 in 2023
I still remember Voodoo
What infuriates me the most is that they left behind the design of the GTX 900 and 1000 series. I love that design; it looks so futuristic, and all the diamond and triangle shapes are beautiful.
The GTX 980 Ti is a beast of a card too, even today. I have it in my other computer.
I'm an AMD guy so normally I don't care but....I'm actually totally there with you.. The 900 and 1000 cards were beautiful. Plus amazing performance.
Gtx 700 FE had the same coolers as the Gtx 900 FE series
@@BeefLettuceAndPotato I think 2000 Fe cards are up there in terms of looks as well!
The 980 Ti is great. Yeah, and people don't seem to remember or figure out that it's actually because of the triangles: it's a rendering and computer-animation thing, using lots of triangles for the way it renders things. You can actually see it in older games that use lots of transparency sometimes. It's also related to why the aesthetic was used; keep in mind the hyped game of the 2016-2017 era was Deus Ex: Mankind Divided. That was a total hog too, btw; it uses post-processing to try and hide it, but the game is flawed visually, and it still runs at 60fps on a 1080 Ti/5700 XT using ultra detail settings at 1440p. It does still look good... although not as much when you remember The Witcher 3 was the same era, and how it looks vs. how it performs. So anyway, hence the triangles and the angular design aesthetic of those cards. I think the all-black Titan looks so good though; the only thing that would make it better and match the DE:MD aesthetic would be gold-plating the heat fins on the FE model.
The triangular design was introduced with Pascal, the 1000 series, not Maxwell the 900 series. It was also used with Volta under the Titan V.
The 1080 Ti is an absolute beast. It's an example of what Nvidia does when it thinks it has competition. The upcoming AMD card at the time was hyped to have such high performance that Nvidia brought out the 1080 Ti; the AMD card didn't meet the hype, leaving us with the legend that is the 1080 Ti. I've got its little brother, the 1080, and the Ti is roughly 30% faster. That gap for a Ti is ludicrous. I'm currently deciding between upgrading my monitor to 1080p high refresh rate, or 1440p high refresh but needing a GPU upgrade. If I had a 1080 Ti I wouldn't need the GPU upgrade to play at 1440p high settings.
Buying something like a 6700xt doesn't seem so appealing when the 1080ti exists.
6700 XT is worth it, no joke one of the best value for performance you can get rn
@@tomdr93x It's one of the best cards at the moment. It's roughly 51% faster than my 1080, but it would still be a silly upgrade for anyone with a 1080 Ti, being only a 19% improvement.
@@rmgaminguk7079 If you paid 700 bucks for a GPU back in 2017 you'll buy a 6900XT or something like that nowadays
@@omegaPhix Anyone that paid $700 for a GPU would have upgraded years ago. I was more considering the 1080Ti as a current purchase for someone looking to upgrade to it. The poor state of the brand new budget market makes buying a second hand 1080Ti a decent option.
@@rmgaminguk7079 6600XT performs about the same and doesn't cost much more
I purchased this mistake a couple of months after launch and it was a fantastic card. Maybe the best Nvidia ever released, right up there with the 8800GTX. Used it for 5 years until I replaced it with a 6900XT.
Man the 8800gtx. That was all the rage. Wasn't that around the time of Crysis coming out? Man that's making me feel weird to think about. I'm nearing 30 now and pretty sure I was like 12 or 13 at the time.
Nothing will ever beat the 8800GTX’s performance improvement. 130% faster and the introduction of CUDA and unified shaders. Absolute insanity. Then it was followed up with the 8800GT delivering most of the performance for $250! Those days were great indeed.
@@dex6316 yeah the 512meg 8800gt was phenomenal value . I had 2 in Sli back in the day and Crysis was awesome.
@@dex6316 Oh yeah, nothing will beat such a large leap for a long time. I just typically tend to look at GPUs based on relative value and longevity. I think the 1080Ti will end up being a rare Nvidia finewine scenario. I had an HD 7970 before my 1080Ti and I probably could've gotten another couple of years from that card had I not moved to 1440p.
I just ordered a 6950 XT to swap out my 1080 Ti. How happy are you with the results?
I got a 1080ti when the 2000 series was announced for the "clearance discount".
It was an absolutely amazing jump for its time and I remember the jumps in performance from following generations being miniscule compared to the jump b/w the 900 and 1000 series.
lol coz 20 series was overrated
@@kenxchoi775 20 series cards were horrible tbh 😂
Cheapest 1080 Tis were 620 € new. That would have been great value especially to 1080p gamers. I play at 4K, but also skipped the 2080 Ti until Ampere came, when I got the new Ti for less than I got for my old Ti. Quite insane, but also expected, as I had gambled on the Turing launch disappointment happen again. It did, you could get 2080 Tis for 450-600 easily for about two weeks, then the prices skyrocketed again.
Same, bought mine around that time for £400 used. It died just the other week.
I went for a decent 3070 secondhand, but if I hadn't, the 1080 Ti would have been next on my radar. It's insane that a 6-year-old card is still this valuable. Everything after the 10 series just seemed to up the price as much as the performance.
The 1080ti is like half the price of a 3070 lol, but no DLSS support is kind of annoying
@@Boofski The thing is more that back when I was looking for a card, people were asking more for a 2080 than a 3070.
If I had to step down it'd be 1080ti because of that.
my 1070 still keeps up for 1080p and my 3070 is great for 1440p gaming still. love em!
@@Olav_Hansen I considered the 3070/3070 Ti but ended up going with a used 2080 Ti. The 3070 was considered better at the time, but the artificially low amount of VRAM really put me off.
The 3070 is a terrible deal unfortunately
I was using a GTX 1060 in 2022. Pascal was amazing. Absolutely amazing.
I was using one till 2 weeks ago :') it was hard to say goodbye
Me too now i have 1080ti;)
Same; my wife had it for years until Hogwarts Legacy came out and I wanted her to enjoy the best textures, so I upgraded it to a 3060 Ti. At the time I thought it might've been a bad investment, but looking at the 4060 Ti's performance I feel like it was a steal
@@cks2020693 I switched to an RX 6700 (non-XT, 10GB); so far, so good
My wife and I are both using 1070s still. Keeping up with the market as I wait for any reason at all to upgrade
Man, I remember pre-ordering the Founder's Edition as soon as it became available. Barely got my order in as it sold out in minutes. I knew it was going to be a good card, but no one could have guessed _how_ good this many years later.
It's too bad Nvidia turned on its customers.
I still have mine, both of them. Nvidia managed to burn me on the SLI lie for gaming, but for video editing and 3D modeling applications that use it, *MY GOD*, it still blows away any single card on the market even today.
They made the Nokia 3310 of GPUs and immediately regretted it
It's just a horrible card. The 4090 can do 8K 60fps. Even the 1080 Ti could never do 4K 60fps maxed, just 1440p only.
@@789uio6y my guy over here thinking technology doesnt evolve
@@ThatGuyNamedRick SLI 1080 Tis here as well, First ever 4k60 GPU config. 😊
We saw _Tomb Raider_ achieve ~90% SLI scaling circa 2017... That was incredible! 1:1 SLI scaling was *within reach*, but Nvidia killed it, because they knew it would cannibalize sales/delay upgrades. Cowards... 😆
You're right, for production workloads, the rendering difference between a 4090 and 2x 1080 Tis is
lol the house buying analogy is so true. It annoys me to no end when the old folks keep telling us "Back in my day I worked 50 hours a day and was able to buy my house"
Yes mama, we know, back in your day you also walked 30 miles to school every day too, right?
If you look into this, real wages have increased by 25%, versus a housing cost increase of 120% over the same timeline.
@@basilman121 Given what percentage of your income housing is, you can see how nobody can afford a house anymore. And alongside that, in the last few years food doubled in price; it's insane!
@@SoulTouchMusic93 It's insane how much people still put up with it.
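Taking the wage and housing figures quoted a few comments up at face value (unverified), the relative shift works out like this:

```python
# Housing cost relative to wages, using the numbers quoted above.
wage_growth = 1.25     # real wages: +25%
housing_growth = 2.20  # housing costs: +120%

relative = housing_growth / wage_growth
print(f"Housing now costs ~{relative:.2f}x as much relative to wages "
      f"(i.e. +{relative - 1:.0%} in wage-adjusted terms)")
```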
The gtx1080 to me will always be the pinnacle of graphics cards. I never owned one (I still use a 970) but I was always fond of it. Even now when I think of 'the best' my mind immediately defaults to the 1080ti before I remember we're already on the 40XX series.
Nah GeForce 256 and 8800gtx/ultra were far more impactful
the fact the 1080ti is 6 years old already is crazy, i was thinking why is this a video but that’s insane to think it’s been that long since it released
makes you feel old dont it lol.
@@imafirenmehlazer1 fr, time is flying by
I thought hey the 1080 ti isn't that old! Then I remembered that I was still in high school when it launched... And I'm about to be 24 💀
I loved my 1080ti it lasted me years and it wasn't bottlenecked by 1700X at release either. Easily the best card since my 8800 GTX. It was also the first card that I experienced 144hz with. 1080ti + ASUS PG279Q was peak 2017 gaming.
The 1080 Ti truly has the spirit of the 8800 series in it. Sadly I couldn't go GTX back then on child wages, but my 8800 GTS was a beast for years. 640MB of VRAM, yo!
Came here to look for the 8800gtx comment and was not disappointed.
@TheRealUnconnected The GTS was indeed a great card too. The GT was OK as well, since you could get it in a single-slot solution, and it was great as a PhysX processor for the handful of games that supported it. I kept my GTX going even longer by getting another one secondhand and going SLI. I recall all the hype around the GTX in its day. It was the only way to run CoD4 properly!
1700x most definitely bottlenecks the 1080ti. I picked up quite a bit of frames when I upgraded my 1800x to the 3800x a few years ago.
I swapped my 320MB 8800 GTS for a pair of 256 8800 GT cards back in the day. Always regret getting rid of the first card that I ever fell in love with.
Not making the same mistake with my 1080 TI
The GTX 1080 Ti is truly a masterpiece, a marvel of engineering. Truly the best GPU ever made!
The new ones are way better than that masterpiece, but you're paying for it.
@@srobeck77 you aren't paying for the improvement but for the brand if you buy top of the line gpus nowadays
@@morettillo5487 So you think there's no improvement at all? That's pretty delusional
@@srobeck77 I didn't say there is no improvement at all; I said that top-of-the-line GPUs (RTX 4090) are mostly priced on brand more than performance. Of course there is improvement; a 4060 is better than the 1080 Ti while drawing half the watts
@@morettillo5487 You also "didn't say" several other technologies like ray tracing, and then you proceeded to compare a 1080 to the lowest-end 4060. You ain't too bright, fella, not even bright at all....
I had a Titan X Pascal in my PC for a long time (the 1080 ti hadn't been announced yet) and it still ran really well until I replaced it recently with the 7900 XT. I gave my Titan X to my sister who still had a 1060 at the time and she's been happy with it. The Pascal generation really was peak nVidia.
6 years. Still running 1080 Ti SLI. Both cards watercooled on 480, 360, and 120 rads. Only now experiencing temps above 50C (on the one card that's used the most); that's probably due to lack of loop maintenance. These cards are solid. I'll box them up and keep them when it's time to retire them.
One of the GPU fans broke on my ASUS Strix 1080 Ti card, but I kept on using it because it was honestly amazing.
I just bought a Lenovo Legion Slim 7i laptop and it's only slightly better than my 5-6 year old gaming rig from 2018.
what drivers are you using? i never could figure the best combo to run most games- i always had to change the driver and support software depending on the game i was playing
I'm still rocking an EVGA 1080 Ti and won't change it until the next gen. It's an absolute beast of a card that performs on par with the RTX 4060, and will be on par with the RTX 5050 too.
lol I'm still using GTX 1080 ti because most of the AAA games are trash this year and last year. It's mostly competitive games and older games I put tons of hours on. Did you even see Redfall? Is it worth upgrading at this point?
Soo, you can see in the future?
Last I checked the 4060 isn't out for quite a while
1080 ti is much worse than any of the 40 series. The 3060 is on par with the 1080 ti, check your facts next time.
PS: you know that GPU spec leaks are rarely what they end up being at release?
@@Ryan-wh2xf A 352-bit bus vs. 128-bit, and 11GB vs. 8GB. Trust me, it's much better, at least for older titles (where you won't find any benchmarks online) like the HoMM series, the Gothic series, AC1, etc.; this card is faster because it's optimized for older titles and the 4060 Ti is not, trust me, mate. I'm getting 800 to 1k fps in CS:GO; I mean, can any of the newer cards do that? Especially the 4060 Ti, it won't. I tried an RTX 2080 and sold it because it was worse than the 1080 Ti.

I wouldn't change it for anything, and even a +50% potential increase in fps in newer titles isn't worth it. The only upgrade where you don't feel like you're being ripped off is the 4080 and up, or 5070 and up. No reason to upgrade now, though; the 5000 series looks sexy and huge, hence waiting for it seems like the better option. As it stands, I play all my older titles at 4K ultra, and in some newer titles I switch to 1080p ultra for more fps. I get 140 fps in normal gameplay in Atomic Heart at 1080p ultra and 40 fps at 4K ultra; both choices are technically OK, and you can play either if you really wanted. People really underestimate this card's performance. They think that because it's older it must be worse... get this, it's actually better.

Also, there are no problems with it AT ALL. When you change your card you can run into problems: the card not working, artifacting, COIL WHINE, crackling sounds, flimsy coolers or shrouds, overheating, etc., etc.; there are so many intrinsic potential problems. On my RTX 2080, for some reason the drivers took several minutes to recognize and install?!? I don't understand how that's even possible. Not a problem on this card: the first day I plugged it in, I forgot about it. There is literally nothing bad about it, and nothing that could potentially go bad; it works like a solid card, I like the way it looks, and you don't hear it. I'm only reminded that it even exists when it runs a heavy title at 4K ultra and the room heats up a bit; other than that, I can't hear it even under higher loads. It has been serving me with no issues for so long, like a charm.

I wouldn't trade it unless the replacement is much better: at least 2x the performance in newer titles, no less, and more than 12GB of VRAM. This gen I'd prefer to skip, although if prices on the 4080 and up halve, I might consider it. Nvidia will also be supporting the 1080 Ti for at least 4 more years, so there's really no rush. The only real reason to switch from an older GPU is when the drivers stop supporting you; as long as you have driver support and you love this GPU, there's no real reason to switch.
The best GPU of all time: the GTX 1080 Ti. And I am a proud owner of one.
The only reason I replaced my 1080 Ti recently was an issue that forced me to reduce the power cap to the point it annoyed me. 1440p@165Hz is the setup; modern games were forcing me to make strategic choices about what to turn up.
But what was frankly amazing was that the vast, vast majority of modern games, and all the older games, played basically at 1440p Ultra. A beast of a card!
And for 1080p players, there's almost no need to upgrade
What did you get instead?
@@Atom224 4070Ti 😢. Ideally I'd have waited till at least 5000 series before seriously pricing up a replacement because I could still play the vast majority of my library at 60FPS and greater at High-Ultra with maybe shadows and anti-aliasing Med-Low
A 4080 Ti or 5080 Ti would have been a pretty good replacement, with me ideally stretching to a 6080 Ti for the best years-of-service per price return.
At the end of the day, if I can't play new games I can just keep playing my existing library 😁, or maybe switch to console emulators to tide me over
The 4070 Ti still richly doubles the capability of a 1080 Ti. I still think that, since the 1080 Ti, the 4070 Ti is the best bang for your buck! So, good choice anyway.
@@LauWo the 6950xt is close to the same price or less with the same performance and 4gb more vram. The prices are way off in the 40 series
this card was my wet dream for so many years. it still kind of is.
the best card nvidia has ever made
@Transistor Jump they didn't say best performing. just best. as far as legacy, the impact it made.
@@td3132 there's been a few bangers over the years; Geforce 2 and 8800GTX spring to mind
@@spacechannelfiver that's true. But in modern gaming, most of us think of the GTX 1080 Ti
@@td3132 yeah for sure, it’s still viable. Good to see the mention of the 5700XT at the end as that was a card a lot of people slept on. Both would be good secondhand options. Just replaced mine due to a slightly unplanned 4k upgrade which was just too much for it.
@@spacechannelfiver had a GeForce 2 back in the day. don't know where it is now.
but i still got my 8800 GTX and 1080 Ti. will keep these for "a while" :D
Used to have a 1080 Ti before I got my 3070. That card is downright legendary; it managed to perform incredibly at 1080p ultra settings for a very, VERY long time.
oh god i thought i read 2070 for a moment, i was like, thats a fkn downgrade lol
I wouldn't trade a high-end 1080 Ti for a mid-range 3070; even though the latter is a bit faster, it's still a mid-tier card with nothing special about it.
@@mikv8 A little part of me regrets the decision today, but all in all I'm happy with my 3070, and considering I sold my 1080 Ti for a good price and got my 3070 for an even better price, I'm not too bothered by it. All in all the 3070 is still a beast card; yeah, the VRAM sucks dick, but for most titles it runs 4K happily.
@CyPhaSaRin
That's a match, and with DLSS the 2070 actually outperforms the 1080 Ti.
imagine being able to say you had a gtx 1080 ti in 2017....damn
What's funny is that, from what I've heard, to some degree Nvidia's good ideas have come from believing AMD's boasts, taking them seriously, and making a more competitive product out of them. Two examples: the GTX 1080 Ti (which came about because of the performance boosts AMD claimed Vega 64 would have... that turned out not to be the case at all, though nowadays it can supposedly approach the 1080 Ti!), and the RTX 3050 getting 8GB of VRAM instead of the 4GB it was supposed to, because of the VRAM article that AMD published. It probably would've aged worse than the GTX 1660 Ti, which performed very close to it when it released; now the RTX 3050 pulls cleanly ahead.
It's funnier since these times AMD has messed up their opportunity to get ahead in the GPU market. Still, competition is good for us.
Lol, the 6600 and especially the 6700 from the most recent gen are both incredible cards, what are you on about? They'll provide tremendous mileage, I can guarantee it; even the 6800 with its 16GB of VRAM is going to prove monstrously enduring in the years to come.
What a legend of a card. Got mine a few months after launch, and while it's been relegated to my Linux machine these days, it still pumps out solid framerates in anything I've thrown at it. Absolutely a card that deserves its place alongside other epic cards like the Radeon 9800 Pro, 6800 GTX, and 8800 GT.
not the 8800 gt, but the 8800 gtx and ultra.
@@korana6308 The GTX is legendary as the first DX10 GPU, but the GT is a true legend as it offered awesome price/performance while being not much slower than the GTX/Ultra.
@@RuruFIN agree, the 8800GT was just next level, it had GTX performance but was single slot and cheap as hell. An amazing leap forward tech wise at the time.
Just upgraded from a 1080 Ti to a 4090 last month lol. It lasted me almost 7 years. Back in the day it let me touch 4K gaming for the first time, and 1440p through all these years with some minor compromises. It didn't have RT, so no problems or performance strain there; 60fps was perfectly achievable, and the 11GB of VRAM is still holding up. It really only aged in the last 1-2 years with games like Hogwarts, Callisto, Cyberpunk, Witcher 3 Next-gen, and Jedi: Survivor. Truly a champion of a card. Sold it for $200, RIP.
Me too, and I expect the 4090 will keep me happy for the next ten years.
@@K11... That's pretty optimistic. I'm willing to bet any amount that iGPUs will be better than the 4090 well before "ten years". Just next year (2nd half of '24) we have the Strix Halo mega-APU coming out with 40 CUs of RDNA 3 and a 256-bit bus of LPDDR5X, with projected performance equal to the desktop 4060 Ti, thus usurping the prodigal 1080 Ti just ~7 years after its release. By the time RDNA 4 based iGPUs are out (no rumors on that yet; the earliest we'd see it would be late 2025) I wouldn't be stunned if even the 4080 were already threatened. The earliest iGPU with the potential to usurp the 4090 would probably be RDNA 5 based, somewhere near ~2027, but even the most cautious prediction would have iGPUs beating the 4090 by 2030 (RDNA 6). By 2033 even a lowly Steam Deck 4 or whatever, with a much weaker iGPU (relative to the mega-APUs AMD is making, which will wipe out "low-end" GPUs), should be beating a 4090.
@@slickrounder6045 Short of technological shift there is no evidence to show that a 4090 will suffer any sort of short term drop in performance. Realistically you can reasonable be assured of 5-7 years of reliable service with little to no compromises after which by the 10 year mark you are likely to either need/want a new GPU or we will see new games finally force you to start tweaking setting considerably to maintain a useable level of performance.
@@Hybris51129 There is constant "technological shift". 7 years later the prodigal 1080ti is now firmly a low end card (sub $200 used, beaten by new $200ish cards) relegated to primarily 1080p for the most demanding new releases. Zero reason to assume the 4090 will defy history and somehow maintain midrange status 7 years from now (forget about 10 years from now, when again even lowly steam deck handheld with igpu graphics should comfortably beat it. Bank on it).
@@slickrounder6045 Sadly the number of people still gaming comfortably at 1440p@60+ fps on a 1080 ti shows otherwise.
Amazing video, I enjoyed it thoroughly, thank you. I'm a GTX 1080 Ti owner, and I will upgrade to an RTX 5080 when it comes out.
I recently sold my 1080 Ti after 6 years of use in favor of an RX 6800. Still going strong, that 1080 Ti. The dude who bought it was really happy.
I still use my 1080Ti to this day. Given that I mostly play indie games, it remains kind of overkill for my use case. Wonderful card.
Ya, playing Sid Meier's Alpha Centauri on my 3080 Ti seems a BIT of an overkill 😅😅😅😅
Me as well, and also all older free games
Same; I upgraded the PC itself over time to a Ryzen 9 3900X, but I'm still using my Gigabyte GTX 1080 Ti. I do notice it lacks some features of newer cards, and textures in some games are a bit bland even on High/Ultra settings, yet in FPS it's still relevant.
And like you, I mostly play indie games like KSP, Stationeers, V Rising, etc., and for those it's more than perfect; you don't need a 4000 series card for them.
In the few AAA games I play (TW: Warhammer 3) I hardly notice the lack of those features, and I often buy older AAA games on sale, like ME: Shadow of Mordor, from when the card was still in its prime, having been burned too many times by new releases being a pile o' junk.
Best purchase I've made in GPUs
I actually had a 1080ti that I got in 2017 for 300 bux... how in the world I hear you ask. Well it was completely dead, but some "basic" repair skills helped me find and replace a broken MOSFET that luckily got it back into working order. I sold it for 800 bux just before Christmas in 2021 because I had gotten a good deal on a 6800XT prior. Ended up selling that for profit too after realising I don't even play videogames anymore.... I now have a Vega 64 because it has impeccable Linux support and runs the few games I care about well enough at 4k
See, that's the trick. Most modern games are shit anyway so there's no need to upgrade. The most modern game I frequently play is from 2016.
1080Ti still stands strong because 2080Ti was not a big step up in pure raster. 3080 was the first big step up from 1080Ti..
And of course, the 4090 has been the next big step up, but it costs a bloody fortune, so I'm sticking with my 3080 Ti.
As a 1080 Ti owner, I have the license to say this is probably still the best Nvidia card today. With an i9 9900K and a GTX 1080 Ti you can play any game in 2023. I use a 1080p monitor and every game passes the 60fps margin, so tbh I'm not planning on changing my system rn; probably when my GPU dies or Intel 12th gen gets cheaper
Yea, any game at 1080p isn't a problem, I'd imagine. When I upgraded my monitor last year from 1440p 144Hz to a 4K 120Hz one, my SLI 1080 Ti setup just wasn't cutting it anymore, so I went ahead and bought a 4090 recently 😅
@Travis Clark Probably not, but you can't treat 90-series cards as normal flagships either, since they basically replaced the Titans; you should still see the 80 series as the "normal" flagship
I bought a used Titan XP for $200 a few months back to replace my 1080 which i sold for $200 (crazy GPU market we are in) and the Titan XP has really been a decent step up and the 12Gb of vram over the 8Gb has been pretty noticeable.
haha nice! Wonder how much better it is compared to 1080ti
@@HybOj it's slower
I am so happy I bought my 1080ti when I did. Upgraded to a Ryzen with 32gb of ram. Hoping GPU prices come back down for the next gen. Still able to run all my games at this point
the gtx 1080 ti was the best gpu yet
Thank you for making the comparison with the 5700xt at the end! I've been looking into both cards recently, trying to decide which to get
I have a 5700 XT Taichi and I adore that card. I also found myself crapping all over Nvidia's software as a result; AMD literally has the superior GPU software from that era.

Just remember, if you do get it, to set a custom fan profile. My only problem is that I have a factory-OC'd version, and the OC vBIOS, for some boneheaded reason, doesn't have a fan curve to match; and you can't set a custom fan profile without tuning the card manually, so I had to set my custom profile along with either an OC or an undervolt (or both, lel) and set the fan profile to hit full blast, otherwise it'd overheat, especially running stuff like Cyberpunk.

I just played Cyberpunk 2077 at ultra detail levels at 1440p with this card, and that's the hardest game I ever threw at it. You may not mind upscaling, but I don't like it, so I settled for 38-42fps at ultra native over 55fps with FSR Quality. If you're at 1080p you can probably run Cyberpunk at closer to 60fps with all ultra settings on that card. The only thing I had a problem with was texture memory overflowing into system RAM: the well-known Cyberpunk bug with muddy textures sometimes happens when I hit 10GB of VRAM usage, but that's at 1440p ultra (with a 1080 Ti that won't be a problem, though).

It depends on which is cheaper. Also consider how much you want to deal with the used market; bear in mind 5700 XTs have probably been mined on. I mined several hundred dollars off mine and still used it to play Cyberpunk later. If they cost the same, consider the 11GB of VRAM. If it's cheaper, the 5700 XT may be the next RX 580, imo: a rock-solid budget entry. It's still a 1080p ultra-settings card even in brand-new games, and it performs about the same as an RTX 3060.
@@pandemicneetbux2110 Thank you for such a detailed response! The 1080 Ti and 5700 XT are around the same price used for me (the 1080 Ti being only £10 more). I'd opt for the 1080 Ti simply because of the VRAM difference, but I worry a little since it's quite a bit older than the latter card and I'm looking for longevity. Both of these cards have downsides when buying used: the 5700 XT was likely used for mining, and the 1080 Ti is pretty old, so it would've seen more abuse from a previous owner. Though I am leaning closer towards the 5700 XT, and I probably wouldn't be playing *too* many brand-new titles; if I were, I'd be playing at 1080p.
I've got an EVGA GTX 1080 SC; looking at the 6700 XT as an upgrade for 1440p gaming, they're cheap new
@@zeriiyah If both are similar prices, you can't really go wrong with either. The 11gb of vram is definitely a nice thing to have, but if the games you play will still run fine on 8gb then I would probably get a 5700 xt. They are usually cheaper than the 1080 tis, at least on ebay here in the US. Not to mention, the cheapest 1080 ti cards are going to be blower models, and they can be loud and hot
@@zeriiyah The GTX 1080 Ti is the better option because it lets you play current AAA titles at 1080p. I've used both and still own the 5700 XT as a backup; both are strong even at 2K. It doesn't matter if a card was used for mining: as long as the miner cared for it, it can be in better condition than a manually OC'd gaming card. In the end it comes down to which looks better to you, whose software you prefer (AMD's or Nvidia's), and price; one has lower power draw and the other has 3GB more VRAM. Neither is an RT GPU, so they no longer get the most optimized drivers.
Bought a 1080 Ti for 400eur weeks before the RTX 2080 was released. What a deal it was, since the 1080 Ti was on par with the 2080 while the 2080 cost 800eur at retail.
if you're on 1080p you don't need to update till at least the next gen lol
@@zaks7 Upgraded to a 3080 at launch, and last month upgraded to a 4090. My 1080 Ti is long gone :)
@@thaddeus2447 ahh... I got myself a 4070 from a 2060, decent upgrade i think :D
Nvidia won't make this mistake twice.
I just remember the status the 1080 and 1080 Ti had. I had no money so I got a 1060, but in covid times I finally upgraded to a 1080! Kind of a dream card, even though it might be a bit older.
Grabbed a 1080 Ti FE early this year. It’s much more capable than I expected, especially once undervolted. Also, its bigger brother, TITAN Xp, can also be picked up for very reasonable prices now. Maybe you could check that out next time?
Does undervolting it give you more frames in games? cheers
@@beherenow1668 Nah, it lowers the power usage and slightly changes frequencies, helping the card perform more evenly
@@pvmchrisy thank you! I bought a second hand 1080ti yesterday, the Rog strix model for 180 euros
@@beherenow1668 Yes. It will allow the core clock to boost higher without being restricted by temperature or power limits.
@@beherenow1668 GPUs are wattage-limited, not voltage-limited. Their power system/BIOS is capped to X watts max power; this is what the power limit in the settings configures.
If you lower the voltage, you can squeeze more amps through the card under the same wattage cap. This usually means you can feed more power to the VRAM, for instance, and achieve higher memory overclocks.
As others have said, it also reduces temps, which helps the GPU auto-overclock to higher frequencies and stay there longer.
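A toy model of that trade-off (illustrative numbers only; this is not how GPU Boost actually computes anything): dynamic power scales roughly with voltage² × clock, so under a fixed wattage cap, less voltage buys more sustainable clock.

```python
K = 0.12  # W per (V^2 * MHz); picked so the numbers look 1080 Ti-ish

def sustainable_clock(voltage_v, power_cap_w):
    """Highest clock (MHz) whose modeled dynamic power fits under the cap."""
    return power_cap_w / (K * voltage_v ** 2)

for v in (1.062, 0.950, 0.850):
    mhz = sustainable_clock(v, 250)
    print(f"{v:.3f} V -> ~{mhz:.0f} MHz sustainable under a 250 W cap")
```

In practice real silicon hits stability limits well before the curve says so, which is why undervolting is a per-chip lottery.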
Still running this card every day at 1440p, such an incredible GPU for its time even with my i7-6800k
The 6800k is a horrible bottleneck. I had to buy a 7700k to somewhat fix that. Last year I went to a 5800X and most titles gained 50% FPS, with the same GPU.
@@Mp57navy that's wild, got me considering an upgrade a bit more seriously
@@Mp57navy ironically the i7-5960x Overclocked from the previous gen could perform about the same as the 5800x.
@@Mp57navy Same; I had an i7 7700 and frames doubled in some games when I upgraded to an i9 10850K
I run mine with a i7-4790k lol
The 5600 is actually holding it back looking at the Afterburner overlay. That is incredible.
in which game? only Last of Us has >70°C CPU and >70% CPU usage
where? all games show 98%-99% gpu usage what?
I wouldn't be surprised, in all honesty; my 3600 holds back my 6600 XT in some games (literally the same performance as the 1080 Ti) at 1440p UW.
Though most of the time it'll be the game engine being limited to 4 cores, even in modern games
I sold my 3080 for $500 and replaced it with a 1080 Ti for $120. I'd never used one before; hard to believe, but for $380 less I still get good performance in 2K gaming.
It also confirmed my thinking about not buying Nvidia's new series for a small performance difference.
In short, it's true that this card is a legend. And yes, after producing this card Nvidia realized it had made a big mistake, and never again produced such powerful cards that would work for many years without problems, in terms of price to performance.
if you never need it, sell it; big loss?
if the GTX still does the job, why not?
Average 4 fps? Nobody is playing games in 2k on a 1080ti 😂
Hint for CP77 on the 1080Ti: drop the shadow quality to low, it's worth about 5-15 frames on high settings 1440p but obviously highly dependent on the area you're in.
i see you're an owner yourself XD wouldn't know unless you had or have one of these giga chad cards lol
@@imafirenmehlazer1 ^^
Can confirm the 1080ti is a beast! Bought mine second-hand on a budget in late 2021, and it's the only part I don't regret in one way or another.
Love your work man :)
Man gtx 1080ti is still a beast
This card was part of my X99 build. Which I still use today. Still plays most games.
I am still on x99 as well. 5820k. Still a beast imo for what it cost to put together.
@@thecrackrabbit9531 Same! with my 6850k
@@ridingtherr0502 Yes, I am running the 5930K.
@@d00mch1ld gosh, modern games eat my CPU; so tempted to get a CPU upgrade.
I use an MSI GTX 1080 Ti Lightning Z and I won't upgrade until the RTX 5080 releases. I'm happy with my card. I also have a 4K panel and I can play games at 4K; not newer titles, but for example I was able to play Call of Duty: Advanced Warfare, Doom Eternal, Wolfenstein II: The New Colossus, Batman: Arkham Knight, Battlefield 1, and Battlefield 5, all at ultra settings. This is an amazing card.
Got a 1080 Ti this year, upgraded from a 1070 which I'd had for years, and it's been phenomenal so far. Crazy how an old card can still hold up today
If u play same games.. u dont need better
The entire GTX 10 series has been god tier. Honestly it's probably the greatest graphics card series that has ever launched. The 1060 and 1070 cards are still in widespread use today and rescued many a budget gamer during the pandemic and the years that followed.
you're absolutely right. my 1070 was holding up pretty well up until last year when i started trying to run some new single player games. upgraded to a 3080. it's nice but man that 1070 lasted for five years
1070 here. Still going strong
My 1070 Ti is like 7 years old and still going "I'm not that old yet, buddy"
@tregedy6075 just breaking it in really. Let's get back to this in 10 years.
Btw. Anyone link 2 together? Worth it? Prices are cheap now
Still using 1070 MSI Gaming X since release. 0 issues, still going strong.
It's somehow funny that 8GB of VRAM is on par with modern cards.
It's slowly becoming time to upgrade. Aiming for a 4070 Ti Super due to the 16GB of VRAM... only if pricing is somehow fair.
My 1080 Ti is still holding up very well after all these years. Paired with an i7 8700K and 32GB of RAM, the whole system is still going strong, running almost 24/7 since 2018.
I have a 9600k with a strix 1080ti. I recently bought a 12700k with a 3080 and I'm not impressed at all...my older pc seems to punch way above its weight class.
Similar here. 8700 non-K with 16GB of RAM. The CPU and GPU are very well matched in terms of performance.
Same here. 8700K and 32GB
I got my hands on one of the EVGA 1080 Ti Hydro Copper Kingpin Editions back then. Now that EVGA doesn't make GPUs anymore, it's a collector's item. One of the fastest 1080 Tis you could get out there: premounted watercooler, made for LN2 cooling and overclocking competitions. I've still got the box as well, on a shelf. One of the best hardware purchases I've made.
I had two of them back when I was still under the delusion that SLI was going to be rescued, later sold one of them to my brother and the other to my father. They're both happily pushing frames to this day, very wholesome story. Dad decided to abandon PC gaming for Xbox though, the traitor.
bought a very slightly used asus rog 1080 ti in 2017 for 700 dollars and it was the best investment ive ever made. Im still using it right now
The zwormz face at the end killed me lmao
Can confirm the GTX 1080 Ti is an amazing card; it's the most expensive card I've ever bought, and I'm still happily using it with absolutely no issues.
I have a 1080ti but I need to upgrade the rest of the components in order to get the full benefits of it. What's your specs?
@@jodiepalmer2404 Intel 5960x 8C 16T@ 4.2 GHz with 32G of RAM.
My 1080ti still going strong. Dont wanna upgrade. :(
I do my daily ritual sacrifices to keep my EVGA SC2 1080 Ti appeased. Bought mine in 2018 and it keeps on trucking! One of these days I'll upgrade, but today is not that day!
This video popped up on my homepage front and center, and I am so happy you made this video! I do in fact run pretty much the rig you demo'd. It's a Ryzen 7 5700G and 32GB of DDR4 on an ASUS B550 board, with graphics provided by a 1080 Ti of the AORUS variety. I purchased this card back in 2017 for $789US and have used it basically every day since. Hands down one of the best purchases I've ever made, because a new GPU around the $800 mark that would be a significant upgrade just came out this year. I don't play a whole lot of fast-paced realistic looking games, which is definitely helping my experience, but overall it's been an exceptionally capable and reliable piece of hardware. Thank you again for reviewing it in 2023!
I had a feeling that extra VRAM would carry performance well into the future; it's a big mistake trying to go without it, although many games have enough options to let me compensate.
If you're wondering how the 4090 will be holding up in 2029, you first have to take into account what percentage of them will have burned power connectors by then, and if that at some point can make them die. The Northridge Fix guy says he is getting a lot of them to fix.
Take into account a very small minority? like .01%? Yeah big number to take into account. Maybe take into account the vast majority that won't even deal with that issue.
@@Mcnooblet I don't know what the percentage is. What I know is what I said. You sound offended.
@@Mcnooblet Considering they still haven't definitively determined the exact reason for them burning up, we don't know if this will become more of an issue over time or not.
Watch the Gamers Nexus video from today. The Northridge Fix batch was GPUs stockpiled over the course of months by CableMod. It appears they melted because users didn't plug them in properly, and CableMod sent them in to see if they could be salvaged. CableMod was being an absolute bro and replacing cards that failed while using their own power connector. The salvaged cards are then given away for free, because CableMod likes good PR.
@@dex6316 Yes, I saw that. I'm glad the problem isn't as bad as it initially appeared.
It's like accidentally making a battery that doesn't run out. People will not have to buy a battery ever again.
16:25 No doubt the 4090 will hold up well, considering the 3090 is still holding up extremely well almost 3.5 years after its release.
Still on a 1080ti here, playing at 1440p gonna skip another generation
Scored a lightly used EVGA SC model for $190 yesterday. Its incredible that a 6 year old card can maintain this much relevance.
Is it still working? Kinda risky buying such an old card used
@@coldheaven8007 It is! It runs my primary PC, which is on basically 24/7. I will say that I did get lucky; I could tell out of the box that it had barely been used. Almost zero dust in any of those impossible-to-reach corners of the heatsink.
Honestly mining cards would be safe purchases too, the real wear is from thermal cycling the components on the card.
I remember when the 10 series cards were launched. The performance jump was insane. After that the 20 series performance increase felt so miniscule. But of course the main focus on that release was to introduce RT.
yes, RT in an unusable form :D and DLSS 1, which was just a shitshow
@@HybOj It has to start somewhere. Of course you could say that the iPhone 1 is shit compared to the iPhone 15 Pro, but we wouldn't have the modern iPhone without the first one. Same goes for RT and DLSS
@@megapet777 Im just stating facts, not sure what u want to debate :D
@@HybOj I'm not debating anything. I'm just saying that even though first-gen RT was weak, it was a necessary step to get to current and future RT. So there's no point laughing at the first generation of anything. That's how progress is made.
@@megapet777 OK, so imagine I didn't put the ( :D ) there and we're in agreement. Ofc the 1st step is always necessary to get anywhere, I agree. And I also say that the 1st implementation of DLSS was really bad; people tend to forget :) It changed to a completely different system, which is no longer trained on each game but on a training set of general situations.
Even as someone involved in the industry, I hadn't fully appreciated how much this card struggles with modern games. To only just get 60fps at 1080p in something like The Last of Us is shocking to me. Other than ray tracing, it hadn't particularly felt like games - outliers like Cyberpunk aside - had gotten so much more detailed-looking and difficult to run.
Had one in my system since 2018 and i can still play anything i want. It even handled cyberpunk surprisingly well. I remember waiting for it to come out because new generation actually meant something.
I had a feeling that me not buying a new one since then had a reason, thanks for showing it so clearly.
I remember this being my dream card in 2017, but I wasn't even close to affording it, so I settled for a secondhand 1070 back then. It was still a pretty great card, which I used till the end of last year, when I got a 6800 XT.
The 1080 Ti was as cheap as the 4070 is now; they only sold 1080 Ti cards for cheap after the RTX launch.
@@lucasrem Well I was piss poor back then. 😂
I have a 1080 Ti; it's a very good card, it's just starting to show its age. The majority of pre-2023 games will play fine at 1440p 60+fps, but it's becoming a 1080p card with newer titles. I've only recently upgraded to an RTX 4070 to test out DLSS and FG. Also, I think the 4090 is going to last for a long time, till the newer game consoles come out around 2028. By then the 4090 will have become the new 1440p/1080p old dinosaur, like the 1080 Ti.
Pascal is definitely struggling with some of the newest games.
2080ti can be 50% faster.
I don't know if this is drivers, or much better async compute on Turing.
When 2080ti came out it was only 25-30% faster, but most games in 2018 were DX11.
@@KrisDee1981 Definitely drivers and architecture differences. The 1080 Ti has the raw compute power, but none of these new DX12 games are optimized for this old shit.
Still using the 1080 Ti I bought in 2017.
@@ericmullen7582 Yeah, but I remember even in 2018 people predicted that Pascal would struggle in DX12 and Vulkan. Wolfenstein: Youngblood was 60% faster on the 2080 Ti.
The 4090 will last so long because it has 24GB of GDDR6X memory. The only GPU I will be moving on to is the 4090 Ti after the 6000 series comes out, because none of the games made in 2023 use more than 13-14GB of VRAM at 1440p ultra/epic presets, none of them. The 4090 Ti will last until 2030, or until they stop shipping newer driver updates. Even the 980 Ti still holds up at 1080p today. I "upgraded" my old 1060 6GB to an old 1070 Ti 8GB because I saw it on the market so cheap that I bought it immediately without any hesitation. It cost me only 110€, about 115-120$. I sold the 1060 for 60 bucks, and now I can tweak the details even higher; it feels like it gained around 25-30% more power. In CS:GO I had around 180 fps and now I can play at 220-250 fps no problem, and GTA 5 Online at 1080p ultra also runs at more than 60 fps. For most people the 1080 Ti is the best budget gaming GPU you could ever buy, but very soon it will become obsolete and the 2000 series will take the throne for budget gaming. The 2080 Ti will last for another 3-4 years. But as I mentioned, after the 6xxx series the 4090 Ti will be the best option to stay safe for another 6-7 years.
At long last you’ve reviewed it! The legendary, the mythic, the incredible, the immortal, the infallible, the irreplaceable, the beloved, the revered, the timeless, GTX 1080 Ti! A card so powerful it can brute force ray tracing!
To this day I still dream of owning a GTX 1080 Ti, especially a Founders Edition. The look of that cooler! Even though I already have a graphics card that comes close to its performance (kinda) and consumes much less power than a 1080 Ti, there's just something about the idea of owning one. It's like dreaming of owning a classic 60s muscle car for the looks, performance, and reputation it had in its time. I'd be very tempted to get one as an upgrade for my friend if his power supply were up to the job, but realistically I know it'd be more practical and reasonable in the long run to find a newer and less power-hungry card with equal or greater performance. _But still..._
Been using my 1080 Ti for years whilst upgrading the other components. She's my bae, a serious workhorse.
I bought an EVGA 1080 FTW at launch, then in later years upgraded to an EVGA 1080 Ti Hybrid, which I'm still using.
I think the entire 10 series was amazing; my 1060 carried me from 2016 to 2020 before I upgraded to a 3070.
Man, I remember getting my 1080 Ti right before all the GPU prices crashed during the pandemic. Had to pay $800 for this five-year-old card, but man, I never regretted getting this thing. It plays everything I want to throw at it today, even if I don't really play current-day games.
The only high-end card that outshined this was the RTX 3080 at MSRP, IMO. Same price but technically a lower-end model; ray tracing was now possible using DLSS 2, which was good by that point, and the rasterisation was very solid. Just a 10/10 card before the scalpocalypse.
No. The 3080 was and still is trash.
@@korana6308 Thank you for the descriptive counterargument.
My 1080 Ti has saved me tons of money! Still works like a charm, since I don't play AAA games.
I sold my 1080 Ti to a nice kid in Ottawa 1.5 years ago; I hope he's still enjoying it.
The 1080 Ti gets similar performance to the 3060, depending on the game.
Actually, it's a bit ahead unless you're doing ray tracing, which, to be honest, neither card has any business doing.
I never got my hands on a 1080 Ti because I made the poor financial decision to get a gaming laptop with a 1050 Ti mobile version instead. Me sad :(
Actually, I still run a 1080 Ti.
To the dudes saying this card was a mistake, I have one question for you.
Are you really discontent, sad, feeling betrayed, and devastated that there's a company actually making a good value product for you?
Would you be happy if the card cost $100k?
What level of stupidity is this? That research saying people are getting dumber is actually fucking true.
I actually bought a 1060 for 200 EUR when it had just been released and played almost all games maxed out at 1440p until the last few years. It seems they messed up with the 10 series, since they delivered great quality for a very good price.
Pascal is truly an incredible mistake.
In my eyes, this has got to be one of the GPUs of all time. Even 20 years from now, looking back at the history of PC development, it'll remain a point of interest to talk about.
This card is still a beast, a big performance gain compared to the 980 Ti. I want to get my hands on a GTX 1080 Ti; can someone recommend a model with a good cooler that will last as long as possible? EVGA versions seem to be good quality, but I am not sure.
I have a hand-me-down EVGA 1080 Ti FTW; my dad had it from 2018 and I've had it since 2020. Not an issue.
The GTX 1080 Ti was a really good card back in the day, but why would you buy one now when it has lots of drawbacks and there are better alternatives?
@@thewinner4x873 fair enough
Dude, get a used 3080 Ti.
@@greendaykerplunk Thank you for your answer! I will keep an eye out for an EVGA model.
My 1080 Ti FE made me realise that modern Nvidia flagships are the choice if you wanna record what's on your screen, but never buy FE ones unless the model is three years old or younger, or has three fans and at most one 8-pin connector.
Never really looked back after I picked one up right before the 30-series launch. In the weeks before the launch, EVGA was discounting their 1080 Tis down to $375 and then $350, and I couldn't not jump on that. With the scalpers and everything else that happened after the launch, I honestly couldn't be happier.
The 1080 Ti, 2070 Super, and 2080 all have basically the same exact performance.