Amazing work! I'm really impressed by the number of benchmarks run in this video.
Thanks for putting in that much work, and thanks for all these statistics!
I'm here to estimate how powerful the 3070 is gonna be.
Same here lol
And I'm here to estimate how powerful the 3060 is
Edit: Probably about as fast as the 2080 Super, for an estimated $349 launch price. Not bad.
@@jonas6841 they say it's faster than the RTX 2080 Ti
@@pimpyoda1579 no, that's the 3070, not the 3060
Keren Yu same
It's insane how well the 1080 Ti is holding up today. Four years ago, this card was soooo far ahead. Literally a 25-30% advantage over the non-Ti 1080.
And most of the biggest losses are just because Pascal had bad support for modern APIs.
I upgraded from a 1080 to a 1080 Ti just before the GPU crisis; it's night and day. It holds up perfectly. I play Assetto Corsa and MSFS 2020 in VR.
@@rehakmate I upgraded from an EVGA 1080 Classified (2025 MHz boost clock at default) to an open-box EVGA 2080 Super (Black Edition) for $619 in June of 2020.
This is why I upgraded from a 1080 Ti to a 2080 Super; when it came to 1440p gaming it made a pretty decent difference.
@x33mmm Why? Did you own both cards to see the difference? At retail price, sure, because at the time going from a 1080 Ti to a 2080 Super meant paying a pretty heavy premium. But if you can get the 2080 Super for a cheap price like I did, then I think it's worth it. You're looking at at least a 20 fps lead, which was the case for me.
Here in Canada, used 1080 Tis are still going for $600 CAD at the cheapest, and right now on the used market you can get a 2080 Super for about $750, like I did, if not a bit less. Totally worth the upgrade. It's really hard to get current-gen cards. The only reason I upgraded at the time was Red Dead Redemption 2: my Gigabyte 1080 Ti OC averaged about 55 to 65 fps, whereas my 2080 Super OC averaged about 95 to 115 fps in RDR2 at 1440p. The 1080 Ti is a great card for 1080p; I just know I struggled with 1440p gaming on it, hence the upgrade.
Edit: Don't take my word for it, just look at this video, and you will see that at 1440p the 2080 Super has a 20 fps lead over the 1080 Ti, and that's just at STOCK settings. So I wouldn't exactly call it a "stupid decision".
Went from a Vega 56 to a 2080. At first I didn't notice much, but when I OC'd it, oh boy. It runs games smooth as butter, albeit with more heat and noise.
One thing about the 2080 Super is that the memory can clock high. Got mine at +1350 memory, +30 core; adds about another 10 fps after overclocking.
Yeah, my 2080 Super FTW3 does +1100 memory and +119 core clock; that's a nice bump.
@@zoddytech4204 +85 core and +1100 memory here as well. It's almost performing like a stock 2080 Ti...
About time... someone who tests 1080p. Most people use 1080p, so why do people remove it from their bloody tests? People buy these cards to future-proof.
Yeah, 1080p tests are good as well. 2080S/3800X here; I'm still on 1080p just for future-proofing reasons.
@@2quick614 Yeah, same here, and in games like Total War and RDR2 you need these high-end specs to get over 60 FPS at max settings, so it's not even just for the future, it's for now with some games.
@@Anglo-Brit Dude, stop spending money on expensive video cards and spend it on a better monitor. 1080p to 1440p is night and day. Believe me, I was like you, and now, two good monitors later, I'm still enjoying max settings with my 1080 Ti + Predator X34P, and it games way better than 1080p. Gaming past 1080p is life changing, trust me...
@@gastonjones3950 Mate, you wouldn't get 20 FPS in the games I play at 4K. Even at 2K you're gonna be under 60 fps.
Higher resolutions only give you smoother images, which you can achieve with 8x MSAA in many games.
Many demanding games will not do well with max settings.
Try Total War: Attila and see how far you get... it would be embarrassing.
Go try to get 4K in Red Dead 2 at maxed-out settings, or in any Total War game.
It's not gonna happen. I struggle to hold 60 fps in Total War as it's so demanding on the CPU... higher resolutions would only make that worse.
It's great for light or casual gaming like GTA. But on demanding titles... no.
@@Anglo-Brit Did you read my comment? Do you understand how monitor resolution and fps work? Why would I run 4K? Why are you mentioning 4K when you were defending 1080p? I'm 38 and have been gaming since I was 12; I played the original DOOM on a 640x480 CRT monitor with a ball mouse. I'm not speaking out of my butt; I have the games and play with this setup. I play FPS games mainly. I'm not a competitive gamer, so I don't need the 144 Hz 2K monitor that I now use for my work in IT. I saw no benefit, so I upgraded to an ultrawide 3440x1440 100 Hz. Bringing up 4K is idiotic, because I don't plan to play at 4K. Now, I used to play just like you at 1080p max settings till 2016, when I bought that 144 Hz 2K monitor and saw the night-and-day difference of the resolution change. So that comment about using MSAA is BS; if you haven't played on a natively supported 1440p monitor you are speaking out of ignorance. I'm trying to help you understand that there is another world out there that you are ignoring because you think you know everything. I'm speaking from experience. Do whatever you want, I don't really care. You can only show a person how to fish...
Honestly, it feels like 4K is this weirdly important thing that's just really hard to justify. I'm a PC gamer sitting less than a meter away from my screen, and there's a point where I can scale between eight resolutions on my 16:9 1080p panel and not notice any appreciable change in visual fidelity. If I can scale high enough to remove any need for antialiasing, then why would I need to go higher than 1080p?
Consider, though, I'm asking this from the perspective of a gamer who mostly sticks to single-player stuff and who has really no need for frame-perfect input. I also say this having spent two weeks with a PS4 Pro and a 4K panel while on vacation. The graphics were *slightly* better, sure, but most of it felt like it came down to the color gamut and the overall warmth of the display.
As long as my minimum is 60 FPS, I'm golden. So, am I a dinosaur, or is 4K this twitch-gaming-specific perk that's only appreciable if you're competitive?
AKA you're blind... lol. The PS4 Pro is fake 4K at 30 fps. I would say that 99% of people would be able to tell the difference between 1080p and 4K.
Cutler Cycles
Yeah, you have to be blind not to see the difference, but honestly I prefer 2K 144 Hz over 4K 60 Hz.
@@cutty02 Eh, I still think it depends on your average distance from your screen. Are you a PC gamer or a console gamer? If you're mostly the second type, you probably sit a ways away from your screen and can probably justify a 4K display.
I'm sitting at the same distance I would for clerical work, and 4K monitors for PC rigs just honestly fail to justify their price point, as far as I'm concerned.
But hey - good for you if you can enjoy 4K!
You can only really see a difference with 4K on screens 50 inches and bigger; anything smaller, it's kind of a waste.
The flagship of the 2000 generation, and it barely passes 60 fps at 4K in newer games. I remember when every GPU generation was a night-and-day difference; now we get overpriced components that act as a midway point between GPU generations.
Well, how are you feeling about this new generation? 😬
@@JustMevix It looks good, but the 3000 series is what the 2000 series should have been.
The 1080 Ti was probably the most impressive card Nvidia has ever released.
The 3070 and 3080 might surpass it, though.
The 3070 being 2080 Ti tier for $500, and the 3080 being 2x a 2080 (which is basically a 1080 Ti)... for the same $700 the 1080 Ti launched at.
If you want 4K gaming, a 3080 is a great choice and will last you a few years.
If you plan to stay at 1440p, a 1080 Ti is still a monster. If you have a 1080 or below and want 1440p, upgrading to a 3070 is the wisest choice. The 3070 is a 4K-capable card on its own, so it would demolish 1440p titles and will stay relevant for years to come.
Only go for the 3090 if you REALLY want 4K 144 fps or 8K... That's the 0.1% of PC users, though, which is why the card is priced as such, but even so, it's FAR better than a Titan RTX, and that costs $2,500. This is literally $1,000 less.
Or if you want to future-proof yourself for 6+ years of 4K gaming.
The 2080 Ti cannot reach a stable 1440p 144 fps in most games, and the 3070 is going to have performance roughly equal to a 2080 Ti, so if you want pure 1440p 144 fps I recommend the 3080 or 3090; the 3070 won't cut it.
@@ZAINILEXTHE1ANDONLY Yes... most games? Certainly not many. But yes, 1440p at 144 fps is probably even harder than 4K at 60 fps.
@@grimm6jack ye
Watch the 3070 only match the 2080 Super. Remember, companies will choose the 2 or 3 best games or benchmarks and leave out the rest.
Not even close. The 3070 will crush the 2080 Super... look at its specs. It has over 30% more CUDA cores than a 2080 Ti, plus a higher clock speed, and in terms of ray tracing its specs are also higher (considerably higher, actually).
The 3070 will only lose out to a 2080 Ti in games that require more than 8 GB of VRAM, which are very few games at 4K ultra settings.
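A quick back-of-the-envelope check of the CUDA-core claim above, as a minimal sketch. The core counts are spec-sheet figures quoted from memory (treat them as assumptions), and Ampere counts two FP32 units per "CUDA core", so the ratio is an upper bound on throughput, not a predicted fps gain:

```python
# Hypothetical spec-sheet comparison of the cards discussed above.
# Core counts are quoted from memory and should be verified; Ampere's
# doubled-FP32 shader counting means the ratio overstates real gains.
CUDA_CORES = {
    "RTX 2080 Super": 3072,
    "RTX 2080 Ti": 4352,
    "RTX 3070": 5888,
}

baseline = CUDA_CORES["RTX 2080 Ti"]
for card, cores in CUDA_CORES.items():
    print(f"{card}: {cores} cores, {cores / baseline:.0%} of a 2080 Ti")
# -> the 3070 shows ~135% of a 2080 Ti's core count, i.e. "over 30% more".
```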
I'm curious what team people are on.
💚 Team Green (Nvidia)
❤️ Team Red (AMD)
💙 Team Blue (Intel)
💰 Team Empty Wallet
Great benchmark again!!
@Joseph Maher Gotta see what Team Red offers in 2 or 3 months.
Team Green GPU, Team Red CPU.
Do the same comparison when FSR becomes widely available and can be used on a 1080 Ti.
Imagine the difference between the 2080 Ti and 3080 Ti, accordingly...
If the 2080 Ti is still around $1,200, then I'd say 20-30 fps better and around $1,500-2k, unless AMD can help us out with prices this time around. Let's hope AMD Big Navi can deliver and force Nvidia to cut prices like they did to Intel.
Edit: I don't want to have to sell a kidney to afford a 3080 Ti.
Well, you have the 3090, which is basically the new Titan, around 50% faster than the Titan RTX.
What program is this for benchmarking?
How did you put the CPU clock in the overlay? In Afterburner I don't have that option with the 10900K.
You have to update MSI Afterburner to the newest version, or use the HWiNFO plugin and add the CPU clocks from it.
Dude, you should be OC'ing the CPU from the BIOS, not from the OS... there are more things involved than just Vcore and the multiplier...
The dude was asking how OP put the CPU clock speed in the MSI on-screen display. He wasn't asking how to OC using Afterburner lol.
Thank you this is awesome 👏🏻
Got the RTX 2080 Super new for $260, and coming from a GTX 1080 Ti it was a small upgrade. DLSS is amazing: at 1440p with DLSS Performance I usually get 40-60 more fps than my old GTX 1080 Ti, with the image looking almost native.
Upgrade from a 1080 Ti to a 3070? (1080p 240Hz)
yes
If it's at 1080p, not worth it.
@@austinknetzke5736 It's always worthwhile.
You need a very strong CPU for that.
@@semperis1552 lol, I have an i3-9100F and an RTX 3070, and in R6S I get 400+ FPS.
I have a 1080 Ti SC2 Hybrid I got used for $600 USD. The 2080 Ti is 30% better for 3x the price; dunno if that's a good trade-off lol.
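For what it's worth, that trade-off is easy to sanity-check with the figures in the comment above (a used 1080 Ti at $600, a 2080 Ti at roughly 3x the price and ~30% faster); a minimal sketch:

```python
# Perf-per-dollar sanity check using only the figures from the comment
# above ($600 used 1080 Ti, 2080 Ti at ~3x the price, ~30% faster).
# Relative performance of 100 for the 1080 Ti is an arbitrary baseline.
price_1080ti = 600.0              # USD, used (from the comment)
price_2080ti = price_1080ti * 3.0
perf_1080ti = 100.0
perf_2080ti = perf_1080ti * 1.30

value_1080ti = perf_1080ti / price_1080ti
value_2080ti = perf_2080ti / price_2080ti
print(f"2080 Ti delivers {value_2080ti / value_1080ti:.0%} of the 1080 Ti's perf per dollar")
# -> roughly 43%, i.e. more than double the cost per frame.
```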
And now, today, you can get a 2080 Ti for $600 used 😂
Ahh, I want to live in your country... here it's still $1,000 lol.
I got a 2080 Super X Trio five weeks ago for €440, with a year of use on it.
Not bad, not bad.
What I learned from this test: upgrading from my 1080 Ti to a 3080 will give me a 60 fps boost at 4K ultra. Can't wait!
4K is nothing but a waste of power. Go 1440p and upscale; you won't need to upgrade.
You are right, 4K is shit. First you have to get a 4K monitor, and it is very expensive.
@@ManofOneGod
The only problem with the 3070 is that it has 8 GB.
A 3070 refresh with 16 GB was accidentally leaked by Gigabyte. Just wait, or buy AMD if they offer a better product for the same price.
@@edge21str The 16 GB version was canned, per VideoCardz.
Unless you're gaming at 4K, it's no problem at this current time.
8 is plenty. I game at 1440p, everything always on ultra, and have seen a max of 3.5 GB used. Maybe 4K will use more, but I don't think it'd go over 8. People chase the wrong things.
What PSU are you using?
Corsair HX1000 - 1000W Platinum
Don't buy 3000-series cards yet, guys. Big Navi is gonna be the leap we've all been waiting for; 7nm TSMC is gonna be huge.
I'm honestly going to buy an RTX 3090 and also whatever Big Navi brings. It will be like Christmas twice in a year 😆😆
When is it?
But DLSS...
Regular Navi was already using 7nm TSMC, yet it barely competed against Nvidia cards on TSMC's 12nm process.
The thing that still bugs AMD is their drivers, and I bet it will plague Navi or whatever GPU they produce in their next release.
Bet everyone that got the 2080 Ti and didn't wait for the 3080 is crying now.
I don't think so, if you look at the stock we have today (I'm a 2080 Super user though; the 2080 Ti was too expensive and its 25% difference not worth it).
This comment didn’t age well
The point is, if you play at 1080p there's no need to upgrade. Pointless upgrade.
What? Some of these games show 30-60 more average FPS going from a 1080 Ti to a 2080 Ti. A 3080 could push some of them to like 50-100 more fps.
Why does the VRAM say 4000-ish MB on each card?
What are you talking about?
The RTX 2080 Super has 8 GB of VRAM, and both the GTX 1080 Ti and RTX 2080 Ti have 11 GB.
@@RandomBenchmark I know what the cards are supposed to have, but look at the VRAM category on the performance overlay. You see the word "VRAM:" followed by 43633 MB or something like that. Why does it say that? Is it the current memory usage? I don't see why else it would say that.
It's the current usage, and it's in MB; the memory speed is shown above it, in MHz.
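If you want to cross-check what the overlay is reporting, here's a minimal sketch that reads a used-VRAM counter through NVML, the same kind of per-GPU figure overlays display. It assumes the pynvml bindings (e.g. pip install nvidia-ml-py3) and an NVIDIA driver are present:

```python
# Minimal sketch: query current VRAM usage via NVML, the kind of
# per-GPU "used memory" counter overlays like Afterburner display.
# Assumes the pynvml package and an NVIDIA driver are installed.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)    # values are in bytes
print(f"VRAM used:  {mem.used / 1024**2:,.0f} MB")
print(f"VRAM total: {mem.total / 1024**2:,.0f} MB")
pynvml.nvmlShutdown()
```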
Doesn't the 10900 boost to 5.2?
It does, but with one core only; the max all-core boost is 4.9 GHz if the temperature does not exceed 70°C. www.techspot.com/amp/news/85045-intel-core-i9-10900k-official-boosting-up-53.html
@@RandomBenchmark Is it the same with the K processors as well? I think that would be 5.0 all-core and 5.3 on one core.
All frequencies are in the link I provided, with a full explanation of how each boost level works. The non-K 10900's max all-core turbo is 4.6 GHz.
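As a rough illustration of how those boost tiers interact, here's a sketch using only the frequencies quoted in this thread (5.3 GHz single-core / 4.9 GHz all-core for the 10900K, 5.2 GHz single-core / 4.6 GHz all-core for the non-K 10900), plus the assumption that Thermal Velocity Boost adds about 100 MHz only while the CPU stays under 70°C:

```python
# Hypothetical sketch of Comet Lake boost behaviour using the figures
# quoted in the thread above; the -0.1 GHz fallback when the 70°C TVB
# condition is not met is an assumption, not an official spec.
def expected_boost_ghz(active_cores: int, temp_c: float, k_sku: bool = True) -> float:
    if k_sku:                                    # i9-10900K figures from the thread
        tvb_clock = 5.3 if active_cores == 1 else 4.9
    else:                                        # i9-10900 (non-K)
        tvb_clock = 5.2 if active_cores == 1 else 4.6
    return round(tvb_clock if temp_c <= 70 else tvb_clock - 0.1, 1)

print(expected_boost_ghz(active_cores=10, temp_c=65))   # 4.9 (all-core TVB)
print(expected_boost_ghz(active_cores=10, temp_c=80))   # 4.8 (assumed fallback)
```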
AMAZING TEST THANK YOU!
Thanks for your video!
RTX is laughable when the previous gen isn't far behind in fps when playing at 4K. The price of the RTX 2080 Ti is unwarranted with maybe 20 fps over the 1080 Ti. Nvidia, while they make good cards, they set asinine prices.
20 fps at 4K is a lot, man.
How do you get 160 fps on the 1080 Ti at max settings? I get around 90 on low settings. Edit: I am on a GTX 1660 Ti.
Do I get 160 fps, in what game?
Check my benchmark system specifications; it's a high-end setup with a 10900K at 5 GHz and 4000 MHz CL17 RAM.
Random Benchmark Warzone, 1080p, high settings.
Something is wrong with your setup; I'm getting nearly 100...
th-cam.com/video/FSIiJxsyrlc/w-d-xo.html
The 1660 Ti is a low-end card compared to the 1080 Ti.
Sammy Livingston Which was released first?
The RTX 2080 Ti result is good, but in Metro Exodus and Assassin's Creed Odyssey 4K isn't a good option, because you get less than 60.
1080 still a beast
It's a 1080 Ti
Do you use the 10900 non-K?
Look at the screen; it says 10900K.
TL;DR: All are good. If you can only buy, say, the GTX 1080 Ti or the RTX 2080 Super, don't feel sad, because they are all about the same (in FPS)!
Put that 1080 Ti at 2 GHz and the gap narrows even more.
Counterpoint: OC the 2080S and 2080 Ti and the gap remains the same.
With the new RTX cards, your CPU and monitor will affect the frame rate too, FYI.
I have a Ryzen 5 3600 set to 4.2 GHz, 16 GB of 3000 MHz RAM, and a 2080 Super, and I can't even get a solid 120 fps.
What could be the issue? I was planning to get the Zotac 2080 Super AMP Edition. Comparing the performance of the 2080 Ti to the Super, the Super gives more bang for the buck.
Maybe the CPU is bottlenecking your GPU; you need at least 8 cores.
@@vegasxparty That's a dumb thing to say; the 3600 is a powerful CPU, and more cores doesn't mean a better CPU for gaming 🤦♂️🤦♂️🤦♂️
Something seems wrong with the 1080 Ti results....And what happened to the 2080 Ti in Far Cry? Seems whack...
the super would beat it
Thats why I bought RTX 2080 Ti
Congratulations, you paid a 90% premium for a 30% increase.
@@Derek_The_Magnificent_Bastard beacuse i don't care about money. I have invested in stocks and get 1 RTX 2080Ti for free every year
@@alinvornicu7734 Must be nice.
congratulations now a 3070 for half the price is gonna perform better.
@@nicane-9966 still don't care about money
DLSS actually boosts fps 🤔
Does anyone have any recommended settings (monitor and GFX) with this card for a 144 Hz G-Sync monitor (LG), for online FPS and offline story gameplay, etc.?
I'd read somewhere that someone recommended upscaling via the monitor to get good FPS with decent RTX settings on.
Can anyone with this MSI Ventus OC card advise, please? (1440p monitor)
LG 27GL850
@@xSETUMx Thanks, but I was talking about settings on the monitor, not the actual monitor itself.
Ah, finally some benchmarks.
What is DLSS?
www.nvidia.com/en-us/geforce/news/nvidia-dlss-2-0-a-big-leap-in-ai-rendering/
Random Benchmark thank you!
A feature (marketing gimmick) that blurs everything you don't focus on to make what you are looking at look better when it really isn't, lol...
RTX 2080 super gang
Gang represent
@@2quick614 My card cost my entire two-week paycheck at Walmart 😂😂 Good thing I bought it then, 'cause prices have skyrocketed. (I'm 16) 😂😂
3070 gang lol
Can anyone explain what DLSS actually does?
It basically renders at a lower resolution and then upscales it using AI. Don't ask me how that works, but it does.
@@mrc2176 Isn't that stupid? You can just run the intended resolution 🙄😂
@@daddyalpha2648 Are you new at this? DLSS (mainly at higher resolutions) is a game changer. When implemented correctly, it renders the image at a lower resolution and uses AI to upscale it back up, yet somehow in the process it doesn't degrade image quality at all, or only very slightly (in games with a good implementation). The benefit can literally be a 150% fps boost for no noticeable quality drop.
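To make "render lower, upscale back up" concrete, here's a small sketch of the internal render resolutions DLSS 2.0 is generally reported to use. The per-axis scale factors are approximate figures from public coverage (assumptions, not an official API), but they show where the fps headroom comes from:

```python
# Approximate DLSS 2.0 internal render resolutions for a 1440p output.
# Scale factors are assumed from public reporting (not an official API);
# the tensor-core upscaler reconstructs the image back to OUTPUT.
OUTPUT = (2560, 1440)

DLSS_MODES = {
    "Quality": 2 / 3,          # ~67% per axis
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 1 / 3,
}

for mode, scale in DLSS_MODES.items():
    w, h = round(OUTPUT[0] * scale), round(OUTPUT[1] * scale)
    share = (w * h) / (OUTPUT[0] * OUTPUT[1])
    print(f"{mode:>17}: {w}x{h} internally ({share:.0%} of the output pixels)")
```

Shading only about half or a quarter of the pixels is what buys the large fps gains mentioned above; the reconstruction pass itself is comparatively cheap.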
The 1080 Ti still has at least a year left. RDR2 showed that you can still get a smooth 30 fps.
Witcher 3? 😭
If you've got a 1080 Ti or 2080 Ti, just stick with it.
I wanna buy the 1080 Ti for my 4K TV... is it a good buy?
Well, my GTX 1080 Ti ROG Strix clocks at over 2 GHz and would gain an easy 10-15 frames on the Gigabyte card 😊
This Gigabyte can run at over 2 GHz as well...
th-cam.com/video/kJq_XgWE2F4/w-d-xo.html
All cards can be overclocked to run faster. Just keeping it apples to apples.
@@Bugbite0656 FACTS
R.I.P. 2080 Ti
What do you mean RIP 2080 Ti, mate?
I want to go 2080 Ti, but those graphics cards are pretty damn expensive. I know I'll be getting some hella performance out of that GPU, but for 1080p I know a GTX 1080 Ti will be right up my alley. I'm building a gaming PC and I have everything I need minus the GPU.
The 1080 Ti is overkill for 1920x1080.
@@ianthewhite7035 No, it's not overkill at all. I have the RTX 2080 Super and I play at 1080p; I don't reach 144 frames in some games.
144 is a tall order, especially if you're talking about new games when you say "some games".
@@xXJLNINJAXx lower settings??
My GTX 1080 Ti easily runs around 2000 MHz; it adds a couple of fps.
Why even include 1080p anymore?
No one plays at something that blurry.
Nobody plays at 1080p? So why are people buying 240 Hz and even 360 Hz 1080p monitors?
Don't know what you're talking about 😅 If you game on a high-refresh-rate monitor, you'd be playing at "that blurry" resolution.
You may be living under a rock; 1080p is STILL the most used resolution in the entire world, especially at higher Hz for competitive reasons.
Op knows he's wrong now no reply lul
Today you can get a 2080 Ti for $600 used 😅
YouGoLow IStayHigh and you can soon get the 3070 for $499
Not at the time of writing this you don't.
And still not at a time like this, ahahaha, almost the end of April... all I can see is prices going up, up, up.
Ultra Nightmare!
Gonna upgrade from a 5700 XT to a 2080 Super.
I hope the Nvidia drivers aren't as dogshit as AMD's.
The drivers are amazing, you won't regret it.
@@2quick614 good
Dumb decision with the 3080 around the corner.
@@spotifex3214 Yeah, I posted that before I had heard the prices being rumored at around $1,400 for the 3090, which I'm currently saving up for...
@@nocutlass3711 I will buy a 3090 too... let's play 8K lol
20 fps difference
My 1080 Ti after OC performs in between the 2080 Super and 2080 Ti.
I bought it used for 350 lmfao.
Until you OC the 2080S and Ti... you'll be in the exact same position.
@@joshc7765 win
Nice video
It all looks the same to me, or am I wrong?
same shit lol
THE 3070 GONNA BE LIKE THE 2080 TI JUST BUY A 2070 SUPER TRUST ME.....
Watching on an iPhone, as if that matters.
no charts