🟢 Are you sticking with the RTX 3070? Or are you upgrading to something with more than 8GB VRAM?
I am getting the 5070 Ti, 8GB VRAM is not enough anymore. Too bad the 5070 only has 12GB VRAM.
Yeah, I feel like Nvidia is trying to position the 5070 Ti as the most bang for the buck.
Not upgrading the 3070 Ti, I'll wait for like 5 years, there is no need at all.
A used 3070 is under 300 EUR, it can be found for between 220 and 280 EUR in Europe. The 3070 is faster than both the 4060 and the B580, and has zero compatibility issues, so if you're looking at the used market, the 3070 is a no-brainer, especially if you can find one under 250 EUR. A used 3060 Ti is another great option as well. The B580 costs over 400 euros here.
You can get them for around 180 USD in India and they perform really well, even let you use RT, and DLSS is a huge W, especially DLSS 4 that's coming out.
DLSS 4 is exclusive to the 5000 series only.
@@saiyan_fate1488 Only Multi Frame Generation is exclusive to the 50 series; the 20/30 series get Enhanced DLSS/DLAA. DLSS 4 is a suite of all the current technologies.
RT is actually rough on this GPU in modern games lmao
I have this card but it can't handle Path of Exile 2 very well :( Might need to get a 40-series card soon.
For the price point, the used RTX market is where the value is to be had.
Great testing, however I would recommend testing the competitive games like Valorant with higher settings, since they will be CPU bound.
Thanks for the feedback, I could do that, however this is how they would set things up in an actual game. Less visual clutter = faster kills. I'm mostly trying to show how driver overhead differs between GPUs in this game. 👍
The B580 is a good budget GPU if they fix the CPU overhead issues.
Yeah, let's hope the Intel Arc team is working on a fix for this known issue ASAP. If they can't fix it in time, it'll be dead in the water.
The B580 is being crushed by the RTX 50 series lol
@@lua5.1 The 50 series cards are more than double the price, so I hope they crush the B580.
@@gingobingo1567 it's called price to performance
@@lua5.1 ???? Then don't compare a budget card to mid/high-end cards. Stupid af.
There's a ton of mods available now to convert DLSS to FSR 3, not to mention XeSS. Doesn't seem fair to compare every game with DLSS.
Also, throw in some real games, not just "competitive" games for "gam3rz".
I cannot get FPS like this with my 3070 in Warzone. How's this possible?
Plenty of factors bro.. what are the rest of your specs?
The 3070 is around $300 on eBay while the B580 is $250 new. I'd wait for a B700-series card before drawing any conclusions.
Yeah, definitely wait for the 50 series reviews and see if there are any 40 series selloffs too. The B580 is in a pretty bad spot pricing-wise right now, $350+ at Newegg.
I have a 3070 Ti and I'm not gonna upgrade before the 60 series, no point.
Unless you absolutely want full ray tracing and DLSS frame gen, but I don't care about those.
Now in a lot of games you can use DLSS and FSR frame gen.
That's a wise choice, bro 👍
Do you have the 16GB version?
This is not a joke at you, but at Nvidia.
The 16GB version died along with the RTX 3080 Ti rumor back then.. (it went from 16GB to 12GB when released) 🤣
A 16GB version exists.. but only with a memory chip replacement lol
@Blaczek297 Yeah, lol.. Nvidia realised at that time they didn't need to add more VRAM because people would still buy it during the mining boom.
@@OverseerPC-Hardware Graphics cards back then were so expensive.. I bought a $900 laptop at that time, and today it's worth maybe 30% of that price, but it still gives me usable performance in games. I plan to switch to AMD in the next few months.
I had the very same card. Sadly it died on me
Why is your PUBG at 90+% GPU utilization on low settings? I only get 50-60% utilization. My CPU is an 11th gen i7 😔
ReBAR
My ReBAR is on @@antonhallergren588
Bro, he's using the best gaming CPU ever, this is only normal I think.
Yes, I'm using the 9800X3D, which pushes more frames out.
Which specific 11th gen i7 is that? If it's an 11700K, it should be able to push more frames.
@@breakuchihatv4595 Check if you have an antivirus (worse than a virus) on your PC too.
Good job on this video.
I currently have an MSI Suprim RTX 3070 with an R7 5700X3D, and I'm satisfied with ultra settings and high FPS at full HD.
But like you said, if I upgrade to 2K the 8GB VRAM will not be enough for high FPS, so I should wait for the RTX 50 or RX 90 series ❤
Please don't compare XeSS to DLSS man.. it's not the same technology, and fake rendered frames don't matter. If you compare GPUs from different manufacturers, just don't use upscaling and FG, just compare these GPUs in raster. And when comparing GPUs, use higher settings to make the differences more meaningful, because in some tests in the video the RTX 3070 doesn't get full load (I mean from a VRAM perspective, because every card can handle low settings lmao without much VRAM), and esports games tend to use more CPU resources, so it's pointless.
However, that wouldn't happen in a real-world scenario. You would use the upscaling tech designed for your GPU (for optimal quality) - XeSS for Intel, DLSS for Nvidia, FSR 3 for AMD. XeSS on Intel GPUs has almost the same level of image quality as DLSS. FG is disabled in all our tests as it affects latency, and your ability to 'frag'.
The lines below are copied from a previous comment I made. This is a long read but hopefully it gives you context.
"...most of the games we test here are all live-service games and mostly competitive/PvP.
In these scenarios, you don't really need the top tier GPU (unless if you're into Warzone at 300+ FPS), but you would still need to have a decent GPU to be paired with an amazing CPU. Settings are set to competitive for better response times and visibility (less clutter = faster kills). This is as real-world as you can get. A top-tier CPU is cheaper (and more effective), than going for a top-tier GPU for these type of application/games.
On mainstream tech channels, they normally benchmark single player games set to ultra and then run A to B, and that's it.. done. Personally, if I play single player games, 60 fps is all I need, and I just set it to the highest fidelity that the GPU can give me. Mainstream channels usually don't benchmark the games we test here because it's hard for them to do (and analyse), plus they don't actually play these games.
I know this is a long read but.. think of this as additional information to what you already know, and a different perspective. So if you want single player ultra settings benchmarks, you have the mainstream channels (which is mostly the same data). If you want to know live-service game performance, that's where you'll find us. More information is always better for the consumer." 👍
@@OverseerPC-Hardware Oh I get it, but using this GPU on low settings and with the upscalers is a bit of a shame.. but yeah, for esports titles this GPU will still hold up just fine, though the CPU needs to be a little better to keep those frames stable. For AAA games, though, only GPUs with 8GB+ VRAM are worth considering.
Yup, good CPU performance is essential in these applications, and you've got to do what you've got to do.. to get those kills (even if it means sacrificing fidelity). And yes, I do agree that having more VRAM is definitely good. However, my goal is also to help viewers make informed decisions - a bigger number is not always better (especially in these scenarios).
What's the point of testing a budget GPU like the B580, or now also the 3070, with a high-end CPU? No one is gonna pair these cards with a 9800X3D. Show us some real-world tests, geez. Especially since a GPU may have CPU overhead which would not be noticeable with a high-end CPU.
Also, only esports titles? Yeah, pointless video.
Hi there, I don't know if you've noticed, but most of the games we test here are live-service games, and mostly competitive/PvP.
In these scenarios, you don't really need a top-tier GPU (unless you're into Warzone at 300+ FPS), but you would still need a decent GPU paired with an amazing CPU. Settings are set to competitive for better response times and visibility (less clutter = faster kills). This is as real-world as you can get. A top-tier CPU is cheaper (and more effective) than a top-tier GPU for these types of applications/games.
Mainstream tech channels normally benchmark single-player games set to ultra, run from A to B, and that's it.. done. Personally, if I play single-player games, 60 FPS is all I need, and I just set the highest fidelity the GPU can give me. Mainstream channels usually don't benchmark the games we test here because they're hard to run (and analyse), plus they don't actually play these games.
I know this is a long read but.. think of it as additional information on top of what you already know, and a different perspective. So if you want single-player, ultra-settings benchmarks, you have the mainstream channels (which is mostly the same data). If you want to know live-service game performance, that's where you'll find us. More information is always better for the consumer. 👍
Could you use a better mic? Your voice is too muddled and sometimes it's hard to understand what you're saying.
Thanks for the feedback, bro. Will take note of it next time - I think I forgot to tweak the audio in post-production. It's also possible my accent makes it harder to understand.
That card is still very expensive in the Philippines.
Just wait for the RTX 50 series release bro, and the old RTX 30 stock might get discounted 😉