How AMD Zen Almost Didn't Make It - an interview: th-cam.com/video/RTA3Ls-WAcw/w-d-xo.html
Watch our best cases of 2023 (so far) round-up! th-cam.com/video/GYM9t4c_ev0/w-d-xo.html
Totally agree the 4060 looks more like a 4050, but why didn't you compare it to the 3050 then? Maybe even adding what should have been the 3050 Ti into the mix, meaning the 3060 8GB? Great video other than that, I love it. Keep leading the community into a better future, thanks GN team!!
Can you make a video hypothesizing that the 4060 is a 4050, the 4060 Ti a 4050 Ti, and so on all the way up, except the 4090? Also giving them a fair price from your POV would be nice. I think the community needs something like this for better discussions.
@@random-zr5km I think since the 4050 wasn't released and the 4060 got a 110 W TDP design, it looks like a 4050 behind the scenes. The 4060 sits in the 60-class power category at a 110 W TDP. I also think the 3050 was a mistake, since we'd never really seen a 50-class card at a 130 W TDP, but the 3050 is kicking ass for its TDP.
Let's go back to the stone age to review stuff!
550 Ti - 116 W
650 - 65 W
650 Ti - 110 W
750 - 55 W
750 Ti - 60 W
950 - 90 W
1050 - 75 W
1050 Ti - 75 W
3050 - a whopping 130 W TDP
I think the GPU die manufacturers will avoid low TDPs due to competition. Back then we even had the GT 610, GT 630, GT 710 and GT 730 on the Nvidia side, and the HD 6350, HD 6450, HD 7350 and HD 7450 on the AMD side. Now we don't even have a 50-class series. I still have my old HD 5550 lying on my desk, but I think it will show up in my Windows XP retro build, since it's not bad for XP.
How is power savings not relevant for the low end? People who buy low-end cards don't care how much they pay for electricity? And the ones who do care buy expensive cards? :D Nvidia's slides also show 10-20 load hours per week, which isn't that much. And by the end of the card's "life" (with its first owner), its power consumption matters much more - resale price is lower, power prices are probably higher. Not many people are interested in old furnace hardware: a 980 Ti outperforms a 1060 6GB, but most people would prefer the 1060/RX 580 or look for a 1070+. Plus it's better for SFF builds, game rooms/LAN parties where you need to populate lots of PCs without blowing fuses, offices, and kids'/family PCs.
There is only one question - why did Nvidia compare it to the 3060 on that chart? This whole thing was obviously a jab at the Arc A750.
So the short version of this, since I myself have a 3060 Ti, is to wait for the 5060 Ti and hope Nvidia has learned their lesson from this trash launch of the 4060 family.
@@Steellmor The question is still odd because the Arc A750 is a big boy with a 225 W TDP; it's better compared against 70-class cards, given its power usage. I think (and I suspect everybody thinks this) that a 110 W TDP card can't be put next to a 225 W TDP card. Also, anything can still come out of the Intel GPUs - Intel's GPU drivers keep evolving as time passes.
It's hard to get excited about anything Nvidia does anymore when the greatest generational increases are the increased prices and model numbers. This is a 4050, not a 4060.
@@2SaltyRecipes Meh, Nvidia hate bandwagon aside, I would say it does what it's supposed to, it's an entry level, ultra-low power consumption card. And I don't think nvidia's chart for power savings is completely off either, especially for us in Europe. If you mostly game esports games like CS or (whatever is current FOTM) and want to save on energy bill, it does seem like a decent choice to me.
We would have gone crazy about getting this card at this price just 15 months ago. Low performance segment certainly should celebrate either this card or other ones that got further discounted because of this release. It's going in the right direction one way or another.
@@vensroofcat6415 The fact that it is more "affordable" than when we were all scammed during the plandemic is not a good thing imho. That means we should be happy to get "just the tip" because a few months ago we would have got the whole sausage.
Thank you for mentioning this is a PCIE X8 card. For low to mid range cards, these are attractive upgrades for older systems still on PCIE 3.0. It's not fair to expect PCIE 3.0 benchmarks, but 3.0 users will likely need all 16 lanes, so it's a good thing to mention. Thanks
This is irrelevant. A 4090 loses 2% performance on Gen3 x16, 6% on Gen2 x16. You can look up PCI-E scaling benchmarks (TechPowerUp is one site). The only scenario where this could affect the 4060 is when you run out of VRAM, but then you're screwed anyway.
Looking at the performance of a used RTX 3060 Ti/3070 or an RX 6600 XT/5700 XT in these charts, I would suggest those way cheaper upgrade options with a similar performance level.
@@xii_yuki1935 Clearly you don't understand this at all. The number of lanes doesn't matter, the bandwidth does. The 4090 is 2.5 times faster and it's not bottlenecked by Gen3 x16, so why would this card be bottlenecked by Gen3 x8? New revisions of PCI Express are developed for professional applications; that's where it matters. There's a reason why none of the new cards use Gen5: it's pointless for gaming. It's purely marketing based on the illusion that a higher number means more performance.
@@THU31 I guess the performance drop for the 6500XT (another x8 card) going from PCIe 4.0 to 3.0 being anywhere between 10% to 20% was our lack of understanding. If you cut the number of lanes in half, you cut the potential bandwidth in half. PCIe 4.0 x16 won't bottleneck any of these cards, PCIe 3.0 x16 and PCIe 4.0 x8 will be the same and unnoticeably worse than the former, but PCIe 3.0 x8 will be a quarter of the bandwidth and will definitely impact performance enough to change the value proposition. And that's the issue here, a "budget" (formerly midrange...) card is more likely to be used in a system limited to PCIe 3.0, giving it x8 lanes handicaps it for the casual user who doesn't know better, since it won't show up in most benchmarks or be explained in any vendor datasheet. Also, as an aside, don't condescend to random strangers on the internet without doing the legwork. TLDR: PCIe 3.0 x8 bottlenecks even a 6500XT and the proof is within a mouseclick's reach.
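To put the lane math being argued about above in concrete terms, here is a minimal sketch (Python) using the nominal post-encoding per-lane rates from the PCIe spec; these are theoretical link limits, not measured game results, and the exact loss in any title still depends on how much traffic crosses the bus.

```python
# Rough sketch of the bandwidth argument above: theoretical PCIe throughput
# per direction, using approximate per-lane rates (GB/s) after link encoding.
PER_LANE_GBPS = {3: 0.985, 4: 1.969, 5: 3.938}  # PCIe gen -> GB/s per lane

def pcie_bandwidth(gen: int, lanes: int) -> float:
    """Theoretical one-direction bandwidth in GB/s for a PCIe link."""
    return PER_LANE_GBPS[gen] * lanes

gen4_x16 = pcie_bandwidth(4, 16)  # ~31.5 GB/s
gen3_x16 = pcie_bandwidth(3, 16)  # ~15.8 GB/s (same as Gen4 x8)
gen3_x8  = pcie_bandwidth(3, 8)   # ~7.9 GB/s -> one quarter of Gen4 x16

print(f"Gen4 x16: {gen4_x16:.1f} GB/s")
print(f"Gen3 x16 / Gen4 x8: {gen3_x16:.1f} GB/s")
print(f"Gen3 x8: {gen3_x8:.1f} GB/s ({gen3_x8 / gen4_x16:.0%} of Gen4 x16)")
```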
Because people don't always watch reviews, so people still buy these cards regardless of what everyone may think. A lot of these go into pre-built PCs, which is how far more people in the world get into PC gaming than by building their own.
it's crazy how only the 4090 had a meaningful performance jump compared to its predecessor while the rest of the line up is a glorified refresh with DLSS 3
This is exactly how I feel about this. Though I'll admit to liking the 4070, as it's basically a 4080 with a third less power draw. I really like fewer watts. Never got a 3080, probably never will. I can't game in a room with hardware that's that hot.
Speaking of “massive waste” of hardware, I’d like to see an updated review of AMD’s 6600 (XT) and 6700 XT reviews (maybe Intel Arc with its driver fixes, not necessarily as separate videos). Both reviews were (rightfully) hyper negative due to their launch price, but they’re now considered some of the less awful GPU choices for their prices.
Updating my wife's rig from 2017 with a 1060, and it's honestly so nice to see that card on some of these charts, you've really considered your audience and where they might be coming from at this price point. Much appreciated as always.
My MIL's computer is also 2017 but had a GT-1030 2Gb, which I upgraded with Rx 470 4Gb and 450w psu. But the 470 cannot keep up and glad I got the 4060 so I do not have to upgrade anything else for a few years of 1080p/60.
Hate to break it to you and other gamers, but Nvidia doesn't care about the gaming market. They haven't cared for some time. They make their money in enterprise, which means workstation and now ML use. The PC gaming market is just a side gig for Nvidia right now.
Yeah sadly they make more than 10x more selling enterprise cards. They should split the company up, make gaming products a different branch under their umbrella.
@@kristoffer3000 Well for the most part, the people who'll forget this news in 2 weeks wouldn't sit through or even skim a video like this anyways. The unfortunate part is that we only represent a small percentage of total consumers. The good news is that, based on sales, the average consumer is clearly taking notice of the news surrounding these cards.
@@Hybris51129 Why? Not everyone can or wants to upgrade to a higher resolution. Also, with micro-PCs like the Steam Deck or Aya Neo, 1080p is more than enough for 7-10" displays, and it is not as hard to run as a higher resolution, which is a big requirement on such small, constrained systems.
@@LAndrewsChannel On things like the Steam deck you are perfectly right but we are seeing people outright spec full size desktops for 1080p and companies are still marketing to those people. It's a very strange thing like what it would have been like to still be building a new desktop in 2004 that is meant to run at 480p or 720p instead of 1080p.
@@Hybris51129 Well, if there is a buyer then there will also be a seller. I bet in 2004 720p was still a very popular resolution for both old and new systems as well. Change takes a long time, even in this sector.
no worries we got you covered with 6xxx series nvidia gpu. We got for you DLLS 5.0 with fake frames 600. we will just upscale from 240p and you will have +600 fps. Nvidia: what texture quality? Please this interview is over, we give you 600fps with low power consumption! You always want more and more! Interviewer: But why is it that 3xxx series have less fps but games look so much crisp and detail? Nvidia: this interview is over! JENSEN: THE MORE YOU BUY THE MORE YOU(WE) SAVE
@@lok6314 I never thought I would use AMD parts after I got my 1060 some years ago with a very nifty i5 6400. Now I'm on a 5800X3D (cause really it was killer price to performance) and a 6900 XT (cause I accidentally got it really cheap; I was looking at a 3070). Who knows, maybe in a couple of years I'll be on an Intel GPU with an Nvidia CPU...
If it was 2016 you could get the top tier card, a 1080ti, for $699... Which still is better than this today (minus ray tracing ofc). I sure got a lot of use out of mine before I bought a 4090 when it launched.
Yeah, anyway I prefer this type of question more; technical ones will actually only ruin your 'buying experience'. It's like asking questions about food, you know.
@@puppy.shorts In some ways, I feel like I can't be mad at him. He said right at the beginning he was restricted in many ways. It's not like he acted stupid and tried to sell this shit. He tried to be neutral and said many times what he could or couldn't do. And if he can get some money, why not.
@@fridaycaliforniaa236 Jay's business is built on information, entertainment and consumer trust. If you see another "unimpressive" (at best) product and choose to basically sell it to your viewers, it's not a good look. And the problem with trust is: it's lost a lot more easily than it's gained.
@@puppy.shorts I think it depends on the size of the creator. I know RandomGamingHD did it, but he's quite small and doesn't really do benchmarks, so I'm not too annoyed at that. Bigger channels should know better than to be doing marketing.
The large cache with a reduced memory bus is primarily an advantage when working with small datasets. Which is either going to be low-res rendering. Or crypto-mining. They built a card to do something nobody is really interested in anymore.
@gardotd426 Also, even when they do support crypto mining, there have been a bunch of scams, so short of Ethereum, Bitcoin, or cryptos that have been around for a while, things like Luna, Terra and the NFT push etc. harmed crypto overall. Plus the federal government is trying to launch a digital dollar and is trialing it in June or July. It will probably be regulated, as cryptos become more regulated as an asset like stocks or something; the federal government is already getting involved.
My 11 year old HD7950 has a 384bit memory interface (240 GB/s), which is the reason I haven’t changed GPU to date. If I had to get a card that would handle the next 11 years, it surely wouldn’t be a 128bit 4060. And the $450 I paid for that card back in the day was a lot, but justified after 11 years and still running strong. I wouldn’t pay $450 (street price) for a subpar 4060 that won’t even make it over three years.
@@GamersNexus Damn, I don't know why I only noticed now; I went back to the first few vids I saw from about 4 years ago and they're there. Maybe the hella Computex stuff made me forget about them. But these darker background ones are new for sure, like when you compared the card specs with a white highlight.
@@spacechannelfiver yep. and the 6600XT isnt far off of the 7600 too, so in conclusion the 2 generations old 5700XT which you can get used for ~200€ gets quite close to the brand new 4060 for 50% more money
The 3060 beats the 4060 in some tests lol. Almost every 40 series card is gimped in some way this generation. The 4080 isn't bad, but the $500 jump in price over the 3080 is nuts.
@@walter1824 and the 2070S. These days my 150-buck ex-mining 5700 XT is faster than my Titan Xp and is knocking on 2080 territory in a bunch of games. The thing has aged insanely well, once again.
I bought a 4060 and I can say I'm pretty happy. It's an amazing 1080p card. Cyberpunk maxed out with all ray tracing options at a stable 60 fps. Looks so pretty :)
Do you have any problems with the shorter memory bus or the 8gb of vram? I'm on the fence about getting one and a lot of people seem to be talking about those two things. My 1060 6gb still works for what I use it for but I'd like to start playing some newer games. 1080p, ryzen 5 3600
@@loongcat6500 Well, it's difficult to know because there's no way to measure or diagnose that. I can't compare it to an identical card without those. I can tell you why it's designed like that - it's designed to be good at 1080p gaming, but not at AI work, because Nvidia doesn't want to cannibalize that market with cheaper gaming GPUs. It can do it, for the most part, but its performance isn't professional-level. With 16GB or 24GB of VRAM, a wider bus and that power efficiency it would be great at it. It's a fine 1080p gaming card with great power efficiency that can do AI art just fine with some care. You have to decide if that's what you want.
@@loongcat6500 For 1080p it runs every game maxed out at 60+ fps - Hogwarts Legacy, Cyberpunk 2077, Immortals of Aveum etc. Your CPU may bottleneck it a bit, but yes, it's very worth it. I recommend it 100% over the 3060, as you can even play at 1080p 120 fps with FG in any current title. Now, in a game like Ratchet and Clank: Rift Apart with ray tracing ON it will spike and stutter a bit on High settings, but it's a port from PS5. In Cyberpunk 2077 I can play at 1080p RT Overdrive + FG at 100+ FPS... CS2 I can play locked at 138 fps on my 144 Hz monitor with max settings, so it's really good.
@unholydonuts I went with 4060 due to A) it's newer and B) power efficiency. The cramped case I have doesn't have the best airflow and 4060 cards are typically smaller and much, much cooler at speed which is a benefit in my case. So they make most sense. However .. I've now got interested in AI art and the 8gb is sometimes an issue for fitting larger models. But if I'd gone with the 3060 I'd have had 50% more VRAM, but less processing speed and more issues around heat and airflow. Keep in mind that games tend to fluctuate load, whereas AI tends to just be consistent high load throughout the task ..
I feel like the 12GB RTX 3060 at $320 was a better value when it launched (if you got one at MSRP), and I like its performance at 1440p in modern games.
I have a RTX 3060, and while it's not the most high performing card out there, I am happy with what I got. I can play most games at mid to high quality in 1440p with 60+ fps.
I also got a 3060 12GB, and I then found out 5 days later that the 4060 was coming out that day, at the time I knew VERY little about GPUs, so I was a bit upset at myself for missing the 4060. But now after having some more knowledge on GPUs I don't think I missed out on much, honestly. Sure the frame gen and DLSS 3 is very good and all, but it's not the end of the world having to put a medium preset over high, or go to 1080p from 1440p for performance. Plus I have higher VRAM and in some cases the 4060 caps out with VRAM and performs worse at 1440p, or so I've heard, which feels absurd to me. I'll probably wait on swapping until there's a 5060 out there, or maybe even 6060.
You would probably help a lot of newbies out by specifying what bus width actually means. 128bit bus means 4 physical memory packages (32bit X 4 = 128) and this helps to illustrate why these cards should be cheaper, because they have far reduced cost due to fewer components, not just the memory itself, but the memory VRM and the heatsink mass, number of thermal pads, it all adds up.
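As a minimal sketch of the arithmetic in the comment above: each GDDR6 package exposes a 32-bit interface, so the bus width fixes the package count, and the package count times an assumed per-chip density (2 GB / 16 Gbit chips here) fixes the capacity. The densities are illustrative assumptions; the cost point being made is that fewer packages also means fewer VRM phases, pads and less heatsink mass.

```python
# Bus width -> number of 32-bit GDDR packages -> capacity for an assumed density.
def memory_config(bus_width_bits: int, gb_per_package: int = 2):
    packages = bus_width_bits // 32          # each GDDR6 package has a 32-bit interface
    capacity_gb = packages * gb_per_package  # e.g. 2 GB (16 Gbit) chips
    return packages, capacity_gb

for bus in (128, 192, 256):
    pkgs, cap = memory_config(bus)
    print(f"{bus}-bit bus -> {pkgs} packages -> {cap} GB with 2 GB chips")
# 128-bit -> 4 packages / 8 GB, 192-bit -> 6 / 12 GB, 256-bit -> 8 / 16 GB
```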
Hey, they are the "best offender" regardless. NOBODY is looking to help noobies, and I firmly believe that is why we are where we are. On the other hand, the noobies are a generation of p*****s that feel emotional pain if their hardware lacks a feature, so you have to walk that line as a tech channel. FFS: we can reconstruct a 4K image with only 1080p resolution data. It's 2-3x as fast as raw rasterization. It looks just as good, sometimes better and sometimes worse. It's probably the biggest performance-oriented feature ever created, by far. It should be the biggest, most hyped feature in this hobby's history, and would have been 10 years ago... Today, it's all about those feelings!
You know what else has a 128b memory bus? The RTX 3050. I don’t see this card coming in at 250 when it’s using a cutdown variant of the AD107 die, and underclocked memory to reduce board costs.
They (GPU makers) keep leaning on the fact that 1080p is the most popular resolution and it’s like yeah, duh, because most gamers want 1440p minimum but can’t afford to go bigger and you all refuse to bring higher resolution capabilities to lower end cards even though you easily could and actually set a new standard for games but no. So they’re stuck there because of you and you keep leaning on that fact to excuse your lack of improvements. Hence, time vortex!
Most PC gamers have a whole-system budget of about $1000 US. That's why everyone is still running 1080p and why gaming laptops are so popular even though they're by no means an ideal gaming experience.
Yep. It's a problem, or at least situation, that they had a significant hand in creating, and now they will milk it for every $ it's worth. 1080p was the goal over a decade ago.
It seems like both AMD and Nvidia are trying to segment their cards into resolution tiers. They want to create budget cards for 1080p, 1440p for the mid-tier and 4K for the highest tier. They are gimping the cards artificially by restricting VRAM, memory bus width etc.
I'm one of the many still rocking a 1060, so I appreciate it being included in the benchmarks. I've actually specifically been waiting for a new $300 card with a 100-125W TDP for years, and completely skipped the 2000 and 3000 series because of how much the power consumption at the $300 price point kept skyrocketing, so it's nice to see a return to that. But of course, they had to F* it up with the small memory bus and the reduction in memory from the 3060.
I'm with you...working with a 1070 8GB in my laptop from 2017. It still does fantastic on everything but rendering Stable Diffusion images quickly. Looks like I'll either keep rocking it for a couple more years, or go with a 3060 12GB desktop setup.
1060 represent! I would have said every sentence you said and it correctly applies to me. But to add, it's not just the cost of powering the GPU, it's also the kind of PSU I would need to get, and the amount of heat it would be blowing into the room in this already-hot climate. I'd be paying for the AC to offset the heat if the TDP was too high.
@@rhoharane 750i here! Building a new rig this week...going with a 5600G for now and will swap that out for a better CPU/GPU combo later...looking like another year's wait should do it, and meanwhile there are a lot of games to catch up on even from just going from a 8400Duo/750i to a 5600G.
I seem to remember these same reviewers complaining about the 1650. Now It's the most common gaming card in use. But yeah, when a manufacturer releases a sub 150 watt card, the reviewers do nothing but whine.
@@SIPEROTH The 6750XT was being sold for €337 at Spanish PCComponentes but there's no stock right now. The 6700XT is also being sold for €347 here in Portugal.
"People not buying stuff makes the price go down"... absolutely. Both Nvidia and AMD have put out hardware, especially this generation, that isn't worth their MSRP. I sincerely hope both companies wake the hell up and realize that most of their clientele aren't made out of money, and most of us who aren't walking, talking wallets aren't gonna buy at these prices. The theme of this generation continues to be "do better, guys". AMD, Nvidia, wake the hell up already, or the losses in gaming revenue will continue.
"AMD, Nvidia, wake the hell up already, or the losses in gaming revenue will continue." Here's the problem: neither of them care. They're both far more interested in selling parts to massive companies for use with AI. Insert clip of Jensen and his weird AI karaoke here.
The fact that this has a 128-bit bus makes it like an xx50-tier card or even lower. This should be no more than $220 max. And what makes it even worse is that the 7600 has an MSRP of $250 and performs very similarly.
Spoiler alert: all the 4000 cards below the 4090 have been dropped down a tier but kept their old naming scheme and a tasty price bump or reduced VRAM.
@@bluerendar2194 Cheaper due to having fewer features. Turning on all of the eye candy in AAA games, the 7600 looks bad. And since the 20 series, the plain x60 (no Ti or extra stuff) has still had a US$300 MSRP. Even going back to the 1060 6GB, aftermarket versions cost way more than the US$300 FE version.
Historically, graphics cards are supposed to lose half their value roughly every 18 months. I bought two $700-tier cards 6 years ago, sold one for over $500 3 years later, and sold the other for $300 a few weeks ago, LOL. The 1080 Ti is right up there with the 8800 GTX among the legendary cards of its era.
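For reference, the rule of thumb cited above works out like the sketch below (a hypothetical helper, not anyone's official depreciation model); the commenter's actual resale prices land well above it, which is the point about the 1080 Ti holding its value.

```python
# "Half the value every 18 months" rule of thumb from the comment above.
def rule_of_thumb_value(msrp: float, months: float, half_life_months: float = 18) -> float:
    return msrp * 0.5 ** (months / half_life_months)

print(rule_of_thumb_value(700, 36))  # ~175: what the rule predicts after 3 years
print(rule_of_thumb_value(700, 72))  # ~44:  after 6 years
# Actual resale ($500+ at 3 years, $300 at ~6 years) beat the rule by a wide margin.
```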
Man, that's a pretty pricey 4050 ... and yeah, this just reinforces my opinion that Nvidia simply downtiered every card except their halo product. In relation to performance, the 4080 would be closer to a xx70 card compared to the xx90 performance, the 4070 Ti is more of a xx60 Ti, the 4070 a xx60 and the 4060 looks like a xx50 card. The huge performance gap between the 4090 and 4080 also indicates that they simply skipped a proper xx80 SKU, that might come later as the 4080 Ti, sold at premium xx80 Ti prices of course. Nvidia has just straight up been overcharging and underdelivering this entire generation with the sole exception of the impressive 4090, but that's not relevant for ~98% of the customer base because very few people can afford or even need that card.
I don't care about the 4090, especially not for $2.5-2.6k CAD. The only ones I see actually needing a card that powerful are professionals and content creators; the average gamer doesn't need the 4090, and whoever does have one just means they own it the same way an average Joe buys a Lamborghini: for bragging rights.
It’s also not relevant because it’s a bit of a paper launch. I can’t find an Nvidia FE 4090 for less than $1700 - and these are all likely used despite being labeled new. Nvidia will not budge on this as long as they are envied.
And 98% is generous; in reality it's more than 99.9% of the consumer base that can't afford a 4090, and that's not an exaggeration, that's reality looking at the numbers. So except for the top 0.1% of consumers, the 40 series is absurdly overpriced.
So wait, somehow getting a 7600 for $250 is something we should be attacking AMD over? I'm glad I watched the first few seconds of this review that I thumbed down because of the sarcasm. The ONE valuable thing I learned is AMD is bundling a game with the 7600 which is a FANTASTIC deal. The new game is $60 USD, I found a Sapphire Pulse for $255. New games hold their price for about a year, more if it's really popular, but I'll depreciate its value and just say the game is a $50 USD value. I'm getting a Sapphire Pulse 7600 for $205 USD and the game for $50 USD. Well this is the BEST thing I found out yet from the reviews for the 4060. Already added to cart and about to buy. THANKS. And Steve you can keep your sarcasm. It doesn't benefit most people although there's a lot of assholes that seem to like it. I guess appealing to the assholes is good.
Australia's GPU prices have done nothing but go up over the last couple of years. A 4060 is around $550-625 AUD, with a 4070 costing $1100 minimum, according to PC Part Picker taking into account local vendors as well. The previous 3060 at launch was very briefly $350 AUD but has since risen due to NVIDIA's market manipulation. The future for buying a GPU in Australia is becoming staggeringly expensive, and people like myself are holding on to older "functional" products that still perform well within the 1080p and medium graphics settings range.
4060ti is currently hovering around USD$500 and $600 in the Philippines, about the same as the 3060ti. There's no way we're getting the 4060 anywhere close to $300 A lot of people are just switching to console tbh.
The greed from AMD and Nvidia has already made the A750 and A770 extremely competitive. Personally I'd say the A750 is the better buy than a 4060 or RX 7600. Hopefully Battlemage offers the same great value that Alchemist has, because it's clear that AMD and Nvidia are trying to milk consumers.
@@__aceofspades AMD? They literally have the best value, lmao. Love how you clowns conveniently ignore the myriad of issues with Intel. Keep sucking them off.
$200 for that performance on some games is actually a steal. I'm so happy to see Intel becoming competitive as quickly as they are. And that card will only do better as drivers keep coming out. Some of their drivers I've heard boost performance astronomically in many titles.
@@ForTheOmnissiah I have both the Arc A750 and Arc A770. Looooooots of work on many things is still needed: power draw, black screens, stuttering, 1% lows, many games that just perform badly, emulators etc. Also, I bought the A750 first and then the A770, and I have to say the A770 isn't even close to being worth the price difference; I feel like I overpaid even though I got it at the cheapest price one could find in Europe. The 16GB doesn't seem to matter much either. It starts having some value only when you go to 4K, but the card isn't powerful enough for 4K gaming anyway, so it's all for nothing.
You guys need to understand efficiency is not only about money saving. It's money, noise, PSU size/efficiency/longevity, and caring for the planet. It's a step in the right direction. The issue is only the market being flooded with 2nd hand cards at a similar price.
Hi Steve and Team, great video... like always. I want to point out that NVIDIA did not drop the MSRP for the 60 series in Europe: 3060 MSRP 330€ / 4060 MSRP 330€. So NVIDIA is ABSOLUTELY doing everything to keep prices high and disguise it, adding layer after layer, which an average consumer doesn't check. Also, the power prices in the "Less Power" slide are total BS. The max price for a kWh in Germany is 0.4€ (since January 2023, by law!)... the average price is 0.35€ and the latest price is 0.28€. I live in Germany and I am paying 0.275€. The prices for the UK also seem to be totally out of date... much lower now. This graphics card is a 4050 and NVIDIA could sell it for 180€, which would be a fair deal (for the customer and for NVIDIA). Even the "critical" reviews do not drive home this crucial point! Please don't have empathy for any marketing BS. Also: no investor will leave NVIDIA because they are offering fairly priced end-consumer products! Every manager who believes a statement like that is simply admitting to being bad at his job... or lying and greedy. Like all end customers, all investors can be "tricked". Anyway... like all companies reaching 80% market share... NVIDIA will also start to struggle hard in about 3-5 years, when the self-inflicted disappointment from investors and end customers kicks in. Hopefully with Intel in its 3rd gen and AMD doing AMD stuff, we will have a healthy market for GPUs back then as well. Looking forward to it.
My Vega64 died start of the pandemic and I borrowed a friend's 1070. This thing has been holding strong and I am terrified of the day it starts to die. I am not ready to pay these prices to upgrade 😂
At the time of this comment, there's a 3060 Ti on sale at Best Buy for $279.99. Price/performance wise there's zero reason to get a 4060 over the previous gen stuff. HOWEVER, with that low power consumption and heat output, AIBs would be shooting themselves in the foot if they don't make a low profile version of the 4060 to win over the SFF builders.
About 100+w in an sff system could be a challenge to cool down, though I can't say I wouldn't love to see it happen, I know that the most we can hope for is a 4050 low profile.
@@PackardKotch I'm running a 3090 in an SFF case on water at under 55C under full load. You can put a 4090 in SFF cases like the Dan C4. But if you want to go super small form factor then you need something like that.
Waiting definitely pays off. Sold my 3070 for $330 and almost paid $600 for the 4070. Waited, waited, waited until I got an Asus Dual OC 4070 for $480 in a quick promo. I'd say that's a much fairer price than what they started with.
This is the ONLY advantage, and the best one customers have: time. Companies have to meet their margins; we don't. If you need something RIGHT NOW at this moment there are cheaper options. If you can wait, you can get something better. The 4070 is a no-brainer for sub-500 bucks; that's an absolutely crazy value proposition.
This review just made the RX 6700 XT a better buy at $320.00, used even cheaper. 12 GB of VRAM, 25% better raster, similar RT performance - I will take more VRAM any day.
We have to hold on our bucks and keep our outdated cards, the more we wait, the more we win. Those flash sales that they did a couple weeks ago are proof that it's working. Keep holding my friends
@@abdullahgawish722 Of course it doesn't apply for people who are making their first build or if their current GPU died. But if you say that because you are playing on iGPU, you are mad but you have all my respect sir
It's not that much of a challenge when the needle barely moves in the mainstream segment. We're basically in the GPU version of the post-Sandy Bridge slump.
@14:30 Btw there were Intel Drivers today that improved their performance in F1 2022 by a decent margin (up to 36% at 1080p, and 20% at 1440p), making them more competitive than they were previously.
@@telleva7890 that's not an issue. AMD is close to Nvidia this gen, so depending on how things go in the future we might see AMD catching up to Nvidia in terms of raw performance. The advantage AMD has is that they're switching to chiplets and that they already have experience with chiplet architecture, meanwhile Nvidia has these huge, complicated and expensive-to-produce monolithic dies. There's something to be said about AMD's ability to deliver 80% of Nvidia's performance with an architecture that's far smaller than Nvidia's in terms of die size. Having a smaller die size is helpful because it makes scaling easier long term. Unless Nvidia makes drastic changes we might see them slowing down in performance gains gen over gen (which we're kind of seeing already). That's why I think AMD can match Nvidia at the top at some point soon. As for Intel, all they have to do is make sure they have good support for all relevant APIs and then they can keep launching midrange GPUs until they're ready to go harder. At this point Nvidia's main advantage is their software, and they've worked on it for decades, so it'll be difficult for others to compete in that aspect. But in terms of gaming all of that is kind of irrelevant.
if you can afford it, 310$ for the 6700xt is hands down the best deal right now for a new card. Or getting a used 3070 for around 310$~ though used is always a gamble.
22:15 I highly disagree! You're right that it might not be the number one criterion in deciding what GPU to buy, but with rising temperatures everyone should ALWAYS consider power consumption when buying anything that consumes power - it doesn't matter whether it's electricity or any other kind. For a single consumer it may not make a big difference, but - in the case where products sell very well - we are not talking about single-digit numbers but at least high hundreds of thousands, and the combined power consumption is what makes the difference. START THINKING BEYOND THE SINGLE CONSUMER HORIZON!
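To give the "beyond the single consumer" point a rough scale, here is an illustrative sketch; every input below is an assumption chosen only to show the multiplication, not a figure from the review or from Nvidia's slides (other than the 10 hours/week usage assumption the slides use).

```python
# Illustrative fleet-scale energy delta; all inputs are assumptions for scale.
cards_sold = 500_000        # assumed installed base
watts_saved_per_card = 60   # assumed saving vs. a less efficient card
hours_per_week = 10         # low end of the usage assumption in Nvidia's slides
weeks_per_year = 52

kwh_per_year = cards_sold * watts_saved_per_card / 1000 * hours_per_week * weeks_per_year
print(f"{kwh_per_year / 1e6:.1f} GWh saved per year across the fleet")  # ~15.6 GWh
```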
@@Louisthesaxman1 Dude, wasn't it super expensive a while ago? I looked online, it's like 275-300 dollars. Right now I have a GeForce 1060 6GB. Would that be a big upgrade, going from that to a 3060 Ti?
Still using 980ti. I had wanted to buy a 3080 when they came out but "paper launch" and all that. Now, seeing cards released these days that are aiming for 1080p, looks like I had gotten a 40 series all the way back in 2016!
I'm in the exact same boat but with a 1080. I just can't get excited over Nvidia right now; the prices are ridiculous and I refuse to pay them. But of course this also affects the 2nd hand market too...
Welcome to the review of what should have been a rtx 4050 at $200-220. 107 die ✅️ 128 bit bus ✅️ Performance a bit better than last gen 60 tier card ✅️
@@iequalsnoob 7500 doesn't exist yet though. And in case you misspelled the 7600, it's in the same ballpark with the same 8GB of VRAM. Heck, it even gets beaten by the 7600 in some scenarios, so the 4060 doesn't win it all. All that while being (realistically) $50-60 cheaper makes the 7600 the far better buy, if we were in a vacuum. Fortunately we're not, and the RX 6700 and 6700 XT exist, and they're the best option at around $300. Get them while stocks last.
So no need to upgrade my 2 rigs. #1, my 4K rig, is a 13700KF with a 3080, and #2, my 1080p rig, is a 12700K with a 3060 Ti. I realize the comparisons were done with new retail pricing, but I want to point out that I bought both the 3080 and the 3060 Ti for $670 used. That's less than one 4070 Ti and only a little more than one 4070. Buy used, and keep these 40 series cards on the shelf...
The efficiency is one of the few reasons for a 4060 BUT unfortunately it is NOT as efficient as advertised on paper... they removed the shunts from the pcb so the actual power draw can't be controlled exactly. RESULT actual power draw is more like 130 than 115. SECONDLY, idle power draw as of right now is 50 watts. Yup you read that right for some reason it is not clocking/volting down properly and draws a whopping 50 watt in IDLE....
Seems like the RX 7600 actually offers decent value for money at $250, at least in terms of gen over gen improvement. As well as when to compare it to higher end Turing cards like the 2080/2080 Super - or even midrange RDNA 2 like the RX 6700 10GB/6700 XT
@@iequalsnoob oh, pfft my bad I was listening in the background with my phone off as I was on my way driving to work. But still, $260 isn't that bad either but I'll admit most people might as well go for the 6700 XT if they can get it for around $300. I personally paid $340 for an OC model 3060ti, but that was also over a year ago when they were still going for considerably more than that & I've been really happy all around. Haven't encountered a game that "uses more than 8GB of VRAM", and I run it at 1080p/1440p high (DLSS enabled when needed)
The 7600 basically matches the Cores/TMUs/ROPs of the 6600/6650 XT, but they down clocked it and called it an improved upon 6600, so AMD is doing the right thing by not calling it 7600 XT. While Nvidia takes the 3060 and cuts everything, including the bus and calls it a 4060/TI. They are basically pulling their 4080 12 GB shenanigans all over again with the 4060 and 4060 TI cards instead of calling them 4050 series like they should.
@@anotherfinesith288 yes, exactly. The 30 series was so successful because each tier of card (xx50, xx60, xx70, ti variants, etc) all either matched or surpassed the performance offered by each higher tier of the 20 series. Such as 3050 (which succeeded the 1650) > 1660 Super/1660ti, 3060 ≥ 2070, 3060ti ≥ 2080 Super, 3070 ≥ 2080ti, 3070ti+ offering higher performance not previously seen. The 40 series has been a bit of a mess overall, but mainly just the 4060/4060ti. 4070 and above appear to (somewhat) follow the same trend as the 30 series did with gen over gen improvement
@@N_N23296 at least it will be on the pallets stored in front of the 4060s and 4060ti's 🤣 Edit: idk why the RX 6700 (non-XT) isn't talked about as an alternative to the 7600, it's got 10GB of VRAM which is still better than 8GB and the two cards are basically equal in performance & price. Only real difference I could think of off the top of my head is that the 7600 *might* have slightly better RT performance as it's on RDNA 3 but realistically none of the entry to midrange AMD cards should be considered if you want a "good" RT experience. Their value lies in standard rasterization (and more VRAM with cards like the 6700 /6700 XT)
it would be good to include the 3060 in your power consumption graph, it’s missing. Also, power consumption is an important parameter to judge cooling requirements, aside from $$$ on electricity. I think if I have to buy another couple fans, or increase case size to accomodate a hotter GPU, that additional cost is real and immediate.
Yeah I think people are too up their ass on the die size and all that over the giant power consumption cut while still getting a performance uptake over the old gens. I don't care about 4k at all for any gaming and never will. The 4060 is perfectly fine where it is and people need to get over it. I also like how 30 series and back cards were mostly compared by MSRP or sell price from 1st party sellers but now they are all comparing against used cards now for price. I want a new card then I want a new card with future benefits of DLSS3 and Frame Gen. This is only on the 4060 as every other after is kinda bad to pretty bad. Given how hot CPUs are going now along with hotter GPUs. I would be happy with the 4060 and the more they complain about it being a 4050 makes me want to buy it even more.
One thing to note: the 3060 12gb will perform better on workload apps than the 4060. Apps like blender and unity use well over 8gb of VRAM for high resolution work, and they often go into system RAM and load through the memory bus of the card. On these apps the bus speed and VRAM are often the bottlenecks while working on high res renders and animations. Reason i specify blender and unity is that they are free programs that indie, artists and small developers use, the kind to have a smaller budget for their work stations. The 3060 12gb is a workhorse that punches above its class with these apps, even over the 3060ti, the 4060 is the opposite.
As a creative hobbyist who uses software like Blender and Unreal for fun projects but also plays PC games from time to time, the RTX 3060 12GB has been great all-rounder card. The 4060 on the other hand can rot on shelves.
Reminder: all of their benchmarks fall to 1% of their results on a platform that doesnt have resizable bar enabled. Which makes no damned sense to me, but apparently intel wants to die on that hill.
@@Mr.Genesis That would be a thing, if a majority of games supported DLSS 3. None of the games I play support it, so there's no value for me in getting a 40 series card over a 30 series or an AMD card.
I'm still running a 1080Ti with a 7700k and I'm *literally right now* reaching the point where I'm looking at a system upgrade, both for gaming but also actual work use cases. Mainly the CPU isn't giving me what I need anymore, but also the 1080Ti to cut a long story short isn't in the most healthy state even though it still works meaning it makes the most sense for a complete refresh, so this kind of commentary on upgrade cycle pacing is very useful for me, even though as always it is quite frustrating to have to wait for sane pricing
I still have a 1080 Ti in my HTPC. Try it with FSR 2+... it'll still crush a lot of modern titles at 1440p. That 11GB frame buffer, lol... I thought for sure 20-24GB would be the norm by now.
What's funny is that the people bitching about paying $700-800 for a 1080 Ti back in the day and labeling it a poor value are the same ones complaining about mid-range cards. The people who "overpaid" for the top-end card years ago are only now really needing an upgrade. It really seems like a case of people realizing how they have been nickel-and-diming themselves to death on cheap low- and mid-range parts.
It's quite clear nvidia believes that the price point for 1080 low end cards is anything up to $500. They assumed the price increases due to the global health crisis / supply chain issues / crypto mining were permanent and acted accordingly.
The power argument is missing one crucial point. In moderate to cold climates there is a significant portion of a year when you need to use some sort of heater. At those times the card directly behaves as part of your home heating system. And that offsets corresponding costs.
Sure, going from cache to VRAM is expensive. But going from VRAM to system RAM is far, far more expensive. No amount of cache will help swapping from RAM.
Is it really that bad? Both RAM and GPU are connected directly to the CPU, and you can find even DDR4 with sub 10 nanosecond latency, have you already checked VRAM latency values?
@@RicardoSilvaTripcall Yes. Cache is on-die (the GPU itself). VRAM is on the same PCB. With system RAM you finally involve the CPU, the motherboard and the PCIe lanes. We're talking orders of magnitude. Think of the GPU going to system RAM like Windows using your SSD as virtual memory. It's that bad. Otherwise discrete GPUs would just use system RAM exclusively.
@@cole-nq1ru of course it is faster, but have you measured the impact? Probably not ... And please, don't compare accessing something on the RAM with something on a SSD ...
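As a rough way to picture the hierarchy being debated in this thread, the sketch below compares order-of-magnitude bandwidth figures: the cache and VRAM numbers are illustrative assumptions (large GPU L2 caches are typically TB/s-class, and a 17 Gbps GDDR6 chip on a 128-bit bus works out to 272 GB/s), while the PCIe figure is the Gen4 x8 link limit before any CPU/OS overhead. Actual measured impact will vary by workload.

```python
# Rough, illustrative bandwidth figures for the memory hierarchy above.
# Order-of-magnitude assumptions, not measurements of any specific card.
bandwidth_gbps = {
    "on-die L2 cache":            2000,  # TB/s-class on modern GPUs (assumed)
    "GDDR6 VRAM (128-bit)":        272,  # 17 Gbps x 128-bit / 8
    "PCIe 4.0 x8 to system RAM":    16,  # link limit, before overhead
}

vram = bandwidth_gbps["GDDR6 VRAM (128-bit)"]
for name, bw in bandwidth_gbps.items():
    print(f"{name:28s} ~{bw:5d} GB/s  ({bw / vram:5.2f}x VRAM)")
# Spilling out of VRAM drops you from hundreds of GB/s to the PCIe link rate,
# which is why no amount of on-die cache rescues that case.
```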
Not sure. I'm relatively new to gaming and graphics. I purchased an MSI prebuilt with the 4060 in it. I've been pretty happy with it, but it does have its limitations; still, for a thousand-dollar prebuilt I think I got what I paid for.
Ngl, I was tempted to opt for the 4060 when building my first PC early this month but decided to go for the 6700 XT instead. Bought it for 270 and I'm loving every moment of it. Games ran super smooth plus the white aesthetic of the Hellhound Spectral White version suits well with my white PC build.
@@david8FCB Used ofc. Brand new Asrock 6700 XT Challenger Pro OC is 400 for example in my place. Plus its not white which is something that I looked for after performance. Saw a listing of used 6700 XT Hellhound Spectral White for 270 and I ran with it XD.
Will stick with my 3060 thanks! Maybe the 5060 might be something if performance and price are finally in sync, but certainly not worth getting the 4060.
@@brutallica2944 If we keep giving Nvidia more money they will just keep raising prices through the roof. If people stopped buying the new 40 series, I'm willing to bet prices would come back in sync with performance. Kinda hypocritical for me to say that since I currently own a 4080, but yeah.
Azu, the 3060 Ti will need to be replaced next gen because of the VRAM alone. Consoles now come with 10GB of VRAM; that's the baseline for games now. When the PS6/Xbox-whatever comes out it'll go up more, and that will most likely be next gen or the middle of next gen hardware, since consoles usually last 6 years. We're currently 3 years into the PS5; by the time the 50 series comes out we'll be in year 5 of the PS5.
@@indent566 The mere fact that you didn't believe my sarcasm is a testament to how many idiots still support Nvidia even when they are pissing in your mouth, and who would make a comment like my sarcastic one except be dead serious about it. At least you are straight up about your situation; that's more than 90% of people today.
We *might* see such "generosity" on the 5060. 10GB on a 160-bit bus with GDDR7 could provide decent bandwidth too. Speeds of 36 Gbps are rumored, but even 26 Gbps on a 160-bit bus would provide 520 GB/s. That's a hair more than the 4070. And if we see that being the case, then my biggest worry would be another reduction in CUs. The more I think about GDDR7, I also worry that Nvidia might decide to put 12GB of 24 Gbps GDDR7 (good) on a 96-bit bus (bad) for 288 GB/s of total bandwidth (landing between a 4060 and a 3060 🤦♂). If 22 Gbps is the slowest GDDR7 speed (GDDR6X tops out at 24 Gbps), also consuming the least power, then making a 12GB/192-bit card would give us a respectable 528 GB/s (or 440 GB/s on a 10GB/160-bit card). Tl;dr: next-gen memory should help quite a bit. Just beware of even more cut-down dies.
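The bandwidth figures in the comment above all come from the same formula: peak bandwidth = per-pin data rate × bus width ÷ 8. A minimal sketch of that arithmetic follows; the GDDR7 speeds are the rumored values the comment cites, not confirmed specifications.

```python
# Peak memory bandwidth arithmetic used in the comment above.
def mem_bandwidth_gbps(data_rate_gbps_per_pin: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s = per-pin data rate (Gbps) * bus width (bits) / 8."""
    return data_rate_gbps_per_pin * bus_width_bits / 8

print(mem_bandwidth_gbps(26, 160))  # 520.0 GB/s - the rumored GDDR7 10GB/160-bit case
print(mem_bandwidth_gbps(24, 96))   # 288.0 GB/s - the worrying 96-bit case
print(mem_bandwidth_gbps(22, 192))  # 528.0 GB/s - 12GB/192-bit at the slowest rumored speed
print(mem_bandwidth_gbps(22, 160))  # 440.0 GB/s - 10GB/160-bit at the same speed
print(mem_bandwidth_gbps(17, 128))  # 272.0 GB/s - the 4060 as shipped, for comparison
```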
These companies should be banned from business and fined for shitting the planet with 8gigs 128bit bus GPUs that are already dated even before the launch in 2023! Period!
To be honest I would say that seeing so many gamers holding and not buying those trash cards is the best thing coming out from the 40XX series but those reviews definitively come second in my opinion 😂
You're asking to compare a top-of-the-line card vs a low-end card, with a retail price over double, not even factoring in the insane amount of inflation since the 1080 Ti released.
@@Sellout1738 Just thought it’d be interesting to see how Pascal holds up. And sure it may have been very expensive back then, but you can find the card used at an attractive price. I’m thinking of picking one up for a spare parts build
@@Sellout1738 Yes. The card is over 6 years old, technology advancements have always given more performance for the same or lower cost. Compare a 1060 with a GTX 480.
But also note that 1660 and 1060 cards popped up in this benchmark, so it wouldn't be that far of a stretch. I personally wouldn't buy a 1080 because I'm a weirdo that uses FP32 compute and Pascal compute was horrendous (a 1080 Ti is on par with a low-end AMD laptop dGPU from 2017).
@@Sellout1738 We are asking to get a card that costs about the same used. I have one. its not like comparing to a 480. the rasterization power is similar to a 3060 ti / 3070.
I'd like to see the 4060 go against the 3060/3060 Ti and 6700 XT on PCIe 3.0. They're similarly priced and at the budget end, and PCIe 3.0 was standard on B450 and 10th/11th gen Intel only one to two generations ago, so upgrading to this line of GPUs is reasonable.
It is a lot of eh. A 4090 barely has enough throughput demand to lose performance between gens, and that is a mere 2%. None of these cards can saturate PCIe Gen 3 bandwidth.
It's not an upgrade. This is a handful of benchmarks, but the 6700 XT generally outperforms it by a lot (it's generally considered a competitor to the 3070, not the 3060). Even if it were the case that most of the time they were neck and neck in performance, 8GB of VRAM is not enough unless you do not want to play many future games.
First off, thank you for the great work! I have a proposal for an additional segment in your videos, for which testing should not be too difficult: idle power consumption of CPUs and GPUs. I live in Germany, with electricity currently costing 0.44€/kWh. I am probably not speaking only for myself when I say that most of the time on my computer is not spent gaming or on productivity, but on mundane tasks such as browsing the internet, watching videos, looking at and moving files, and other activities that consume close to no power. I am therefore very interested in the idle power consumption of different products. For example, my...
i9 12900 consumed 9W idle
R9 5900X consumed 40W idle
3080 consumes 90W idle
These are things that would have been very good to know beforehand. Those 40W + 90W add up to 130W, which is more or less what I measured going into the PSU (~145W idle). What does this cost compared to a ~10W idle 1060 6GB and a ~10W 13500? Assuming a PSU loss of 50% at low power, this would be a comparison between 145W and 40W idle. That's ~100W of idle difference. 100W = 0.1kW. My computer runs idle for about 8 hours a day (I am a remote university student).
0.1 kW * 365.25 days/year * 8 h/day * 0.44 €/kWh = 128.57 €/year in IDLE power consumption.
I believe this is something one should be able to factor in when purchasing a product, and getting this information is currently very difficult.
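The arithmetic in the comment above can be wrapped in a small helper for trying other scenarios; the wattages, hours and electricity price below are the commenter's own figures, not measured or verified here.

```python
# Idle-cost arithmetic from the comment above, generalized into a helper.
def idle_cost_per_year(extra_watts: float, hours_per_day: float, eur_per_kwh: float) -> float:
    """Yearly cost in EUR of drawing `extra_watts` for `hours_per_day` every day."""
    return extra_watts / 1000 * hours_per_day * 365.25 * eur_per_kwh

# ~100 W extra at the wall, 8 h/day of idle, 0.44 EUR/kWh
print(f"{idle_cost_per_year(100, 8, 0.44):.2f} EUR/year")  # ~128.57
```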
100 watts vs 200 watts is a huge money pit. I use about 230 watts when gaming, which is $190 a quarter on my electricity bill. When I buy graphics cards I think about my back pocket; I will always give up 10 frames a second to save $50 on my electricity bill. The A750 might be 2% faster, but it is also $100 more expensive - this is why I bought the 4060!
I have an undervolted 6500 XT @ 5800X3D, play on medium/normal settings, use RTSS to get a nice flat 30 fps frame time, and use FSR Quality to sharpen the image and save even more. I get rid of all the blur, depth of field, god rays and everything like that, set the textures as high as the 4GB can take, and that's it. PS4/PS4 Pro equivalent, and as I only play old games anyway, being able to have the GPU hovering at 50 watts plus 60 watts for the CPU most of the time is gold. By the way, by luck my 5800X3D takes -30 PBO, and with the cache the 1% and even 0.1% lows improved so much compared to my old 5600 non-X. Basically a 150 watt system, screen included.
Nice to see numbers compared to a 1060 6gb. 👍 I'm still on one of those and it feels fine for most of what I do (encoding is more important than games for me). It's too bad they gimped this card compared to the 3060 12gb in bus width and total ram. 😢 AV1 encoding is a necessity for whatever I eventually upgrade to... Maybe I'll wait another generation, lol.
A750 is what you want if you can live with 8GB, if you want 16GB the A770. Both have AV1, and encode with higher quality and speed than Nvidia's 4000 series and of course RDNA 3, if encoding is your primary concern, Arc is absolutely for you. A750 has been $200 in the past, A770 is $300. In these benchmarks you can see both cards trade blows with the 4060 and RX 6700.
@@__aceofspades Yeah, I pretty much agree there that the Intel cards are an interesting option and have a high quality encoder. I've seen they still can occasionally hard crash and something odd happens with the encoder when the cards get fully loaded (vextakes channel's testing). Haven't looked into to much detail on that because my current setup doesn't support for resizable bar. Whenever I can upgrade it's probably going to be a full replacement. Thanks for the feedback!
80 Euro games as polished and finished as a Kickstarter scam covered by KiraTV. 150~200 Euro GPUs priced two to three or even four times more. Dictionaries need to redefine the word "progress", especially in the context of hardware.
As a very satisfied Scythe Mugen CPU cooler user of 15+ years, I feel the pain of a much appreciated product/company never being part of benchmarks. Feels weird!
Took a gamble and bought a 6700 XT a couple weeks ago (from a 1660S that was no longer meeting my needs) since I figured the 4060 would suck. It doesn't suck as much as I expected it, but the 6700 XT is definitely the way to go imo (if you can find one at the $340 mark or lower, of course). I do miss NVENC (I actually am a person who'd make use of it, now I'm on AMF for instant replay and x264 for standard encode), but the power of this card and the surprisingly quiet fans under medium load because of the downclocking is... well, I'll tell myself it's worth the tradeoff. Also 12GB of VRAM, which is nice future proofing - we'll see how quickly or (likely) not 8GB becomes a widespread problem.
also decided on a 6700xt at a price equivalent to 340 usd. can't wait till i can build my rig and get it running. value is the name of the game in gaming right now, hardware and software, so im not disappointed with lower RTX performance as majority of my games can't enable it anyway (outside of Riftbreaker which i'll be getting this summer)
I was going to replace my RTX 3060 Ti with the 4060 Ti but for the first time in decades there is no actual point of doing that, which is kinda embarrassing for Nvidia.
At least you’re in acknowledgement of that rather than just chucking the info given out of your head and assuming upgrading to new gen xx60 would be an improvement just based off marketing. You’re saving yourself money, headaches, and tons of confusion. Run that 3060 ti til you absolutely need to upgrade.. cause this 40 series launch has been bizarre to say the very least
3060 ti was a cut down 3070, and had much better value than 3060. 40(50) ti was never going to beat it. Most people shouldn't need to upgrade every gen anyway.
This is worse than a 4050. This freaking card can't even play games today without DLSS. So what the fuck is the point of this card?
But but muh frame generations....
Major eyeroll there.
@@Aquaquake $300 for entry level? a 1050 ti was entry level. that card was $130
Nvidia releases $300 4050 labeled as 4060, Nvidia shocked consumers unimpressed at value.
Hell No!
4050???!
More like 4030
Still outperforms the rx 7600
@@iequalsnoob which is also a terrible product. Hell, 90% of the GPUs launched so far in 2022 and 2023 are terrible in value.
@@LEV1ATAN lmao. The 4050 will be a 4010
It's not a 4050 in performance. How can you be so deluded.
Scamvidia did a great job of proposing an RTX 3050 "refresh" with DLSS 3.0
We would have gone crazy about getting this card at this price just 15 months ago.
Low performance segment certainly should celebrate either this card or other ones that got further discounted because of this release. It's going in the right direction one way or another.
@@vensroofcat6415 When these kinds of cards hit below $300 then I'll be happy.
@@vensroofcat6415 The fact that it is more "affordable" than when we were all scammed during the plandemic is not a good thing imho. That means we should be happy to get "just the tip" because a few months ago we would have got the whole sausage.
@wolfNflux So true. I mean, if we could have DLSS 3.0 compatible with the 3060 cards, this would crush the whole 40 series.
@@fridaycaliforniaa236 did you intentionally call it "plandemic"? How sad.
I like how Gamers Nexus has the 2 bars on the side to show you how long each slide is being presented.
They have insanely good editing
Yup. I first noticed it probably way after they started implementing that. So subtle yet helpful!
@@liarwithagun well.. It's not Michael Bay yet.. But yeah, I like these efforts on the viewer experience very much.
There was a guy who used to do the exact same thing with his Call of Duty 4 guides.
it's a good idea but personally I was feeling a little distracted by the bars on either side.
Thank you for mentioning this is a PCIE X8 card. For low to mid range cards, these are attractive upgrades for older systems still on PCIE 3.0. It's not fair to expect PCIE 3.0 benchmarks, but 3.0 users will likely need all 16 lanes, so it's a good thing to mention. Thanks
This is irrelevant. A 4090 loses 2% performance on Gen3 x16, 6% on Gen2 x16. You can look up PCI-E scaling benchmarks (TechPowerUp is one site).
The only scenario where this could affect the 4060 is when you run out of VRAM, but then you're screwed anyway.
@@THU31 4060 isn't x16, so putting 4090 here in comparison is just stupid
Looking at the performance of a used RTX 3060 Ti/ 3070 or a RX 6600 XT/ 5700 XT in this charts I would suggest those way cheaper upgrade options with similar performance level.
@@xii_yuki1935 Clearly you don't understand this at all. The number of lanes doesn't matter, the bandwidth does.
The 4090 is 2.5 times faster and it's not bottlenecked by Gen3 x16, so why would this card be bottlenecked by Gen3 x8?
New revisions of PCI-Express are developed for professional applications, that's where it matters. There's a reason why none of the new cards use Gen5, it's pointless for gaming. It's purely marketing based on the illusion that higher number means more performance.
@@THU31 I guess the performance drop for the 6500XT (another x8 card) going from PCIe 4.0 to 3.0 being anywhere between 10% to 20% was our lack of understanding.
If you cut the number of lanes in half, you cut the potential bandwidth in half. PCIe 4.0 x16 won't bottleneck any of these cards, PCIe 3.0 x16 and PCIe 4.0 x8 will be the same and unnoticeably worse than the former, but PCIe 3.0 x8 will be a quarter of the bandwidth and will definitely impact performance enough to change the value proposition. And that's the issue here, a "budget" (formerly midrange...) card is more likely to be used in a system limited to PCIe 3.0, giving it x8 lanes handicaps it for the casual user who doesn't know better, since it won't show up in most benchmarks or be explained in any vendor datasheet.
Also, as an aside, don't condescend to random strangers on the internet without doing the legwork.
TLDR: PCIe 3.0 x8 bottlenecks even a 6500XT and the proof is within a mouseclick's reach.
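To put rough numbers behind this thread's bandwidth argument, here is a minimal sketch using approximate published per-lane PCIe rates (the figures are assumptions for illustration, not taken from the video):

```python
# Approximate usable bandwidth per PCIe lane, in GB/s (after encoding overhead).
PER_LANE_GBPS = {
    "PCIe 2.0": 0.5,
    "PCIe 3.0": 0.985,
    "PCIe 4.0": 1.969,
    "PCIe 5.0": 3.938,
}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Total usable link bandwidth in GB/s for a given generation and lane count."""
    return PER_LANE_GBPS[gen] * lanes

for gen, lanes in [("PCIe 4.0", 16), ("PCIe 4.0", 8), ("PCIe 3.0", 16), ("PCIe 3.0", 8)]:
    print(f"{gen} x{lanes}: ~{link_bandwidth(gen, lanes):.1f} GB/s")
# PCIe 4.0 x16: ~31.5 GB/s
# PCIe 4.0 x8:  ~15.8 GB/s  (same ballpark as 3.0 x16)
# PCIe 3.0 x16: ~15.8 GB/s
# PCIe 3.0 x8:  ~7.9 GB/s   (a quarter of 4.0 x16, which is where an x8 card can hurt)
```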
Man, I remember/miss when the 60 series cards were the sweet spot for budget /performance.
The "sweet spot" depends entirely on your budget. There is no absolute "the".
70 cards were always sweet spot for me tbh. GTX 670 was super close to 680 for like $100 or more less lol
560ti was probably the best 60 series card there ever was. Back in the day before NVIDIA gimped SLI.
I too remember when *60 class was good choice, and *70 was for higher res or higher fps.
@@johnnypopstar I could afford a 4090. But I'm happy playing at 2K with my 3060 Ti. That's a sweet spot.
Why does Nvidia not just send its e-waste directly to recycling , it would be far more efficient.
They would lose too much money 🤑
Jensen wants to have a new jacket for every day of the calendar year, not just every week or month.
Bc people don't always watch reviews so people still buy these cards regardless of what everyone may think. A lot of these go into pre-built pcs where way more of the people in the world get into pc gaming over building their own.
Why doesnt AMD do the same?
@@ZeroHourProductions407 even if 1 jacket costs 350 USD, he will only spend 127,750 USD in a year. And he earns MUCH more...
it's crazy how only the 4090 had a meaningful performance jump compared to its predecessor while the rest of the line up is a glorified refresh with DLSS 3
Nvidia is flyfishing. "LOOK AT THE SHINY THING!" And people remember it's called "nvidia" something and then go buy "nvidia" somethings.
Nvidia market share for the last two months… 84%, so Nvidia is right…
😢😢😢
This is exactly how I feel about this.
Though, I'll admit to liking the 4070 as it's basically a 4080 with 1/3 fewer watts. I really like fewer watts. Never got a 3080, probably never will. I can't game in a room with hardware that's that hot.
they could boost the performance of the 4060, but they want to sell off all the old-generation stock of 3060 Ti and 3070 cards first
It's because the 3090 was a joke.
Speaking of “massive waste” of hardware, I’d like to see an updated review of AMD’s 6600 (XT) and 6700 XT reviews (maybe Intel Arc with its driver fixes, not necessarily as separate videos). Both reviews were (rightfully) hyper negative due to their launch price, but they’re now considered some of the less awful GPU choices for their prices.
"less awful choices" is a big understatement, they're some of the best GPUs you can get for their current price.
@@thegamerfe8751 What price? €450-€480 still pretty steep.
@@jamegumb7298 $270 where I am. (Used though)
@@jamegumb7298 320 dollars new in the US. The 4060 is 300 dollars, so pretty similar in pricing
@@thegamerfe8751 but you are still buying a GPU that's a generation and a half old.
Updating my wife's rig from 2017 with a 1060, and it's honestly so nice to see that card on some of these charts, you've really considered your audience and where they might be coming from at this price point. Much appreciated as always.
My MIL's computer is also from 2017 but had a GT 1030 2GB, which I upgraded with an RX 470 4GB and a 450W PSU. But the 470 cannot keep up, and I'm glad I got the 4060 so I do not have to upgrade anything else for a few years of 1080p/60.
I hope card sales continue to plummet for nvidia. I just hope future generations never forget these shenanigans.
Hate to break it to you and other gamers, but nVidia doesn't care about the gaming market. They haven't cared for some time. They make their money in enterprise, which means workstation and now ML use.
The PC gaming market is just a side gig for nVidia right now.
This is going to be forgotten in like 2 weeks, people are really stupid and honestly just don't care at all.
Yeah sadly they make more than 10x more selling enterprise cards. They should split the company up, make gaming products a different branch under their umbrella.
@@kristoffer3000 Well for the most part, the people who'll forget this news in 2 weeks wouldn't sit through or even skim a video like this anyways. The unfortunate part is that we only represent a small percentage of total consumers. The good news is that, based on sales, the average consumer is clearly taking notice of the news surrounding these cards.
Yeah there's no point in a new card when you can buy an older used card at a cheaper price
NVIDIA at 3000 series launch: "look, 8k gaming is the future!"
NVIDIA in the future: "Pay us ridiculous prices to play at 1080p..."
Exactly. So absurd. Talking about 8k but not giving enough vram in 2023 to run 1080p ultra
The fact that 1080p is still a thing is another problem.
@@Hybris51129 Why? Not everyone can or want to upgrade to a higher resolution. Also, with micro-PCs like Steam Deck or Aya Neo 1080p is more than enough for 7-10' displays and it is not as hard to run as a higher resolution which is a big requirement on such small, constrainted systems.
@@LAndrewsChannel On things like the Steam deck you are perfectly right but we are seeing people outright spec full size desktops for 1080p and companies are still marketing to those people.
It's a very strange thing like what it would have been like to still be building a new desktop in 2004 that is meant to run at 480p or 720p instead of 1080p.
@@Hybris51129 Well, if there is a buyer then there will also be a seller.
I bet in 2004 720p was still a very popular resolution for both old and new systems as well. Change takes a long time, even in this sector.
Wow such performance at 1080p. Imagine what the future holds... Wait this is the future, that was something I was thinking in 2016.
no worries we got you covered with 6xxx series nvidia gpu.
We've got DLSS 5.0 for you, with 600 fake frames.
we will just upscale from 240p and you will have +600 fps.
Nvidia: what texture quality? Please this interview is over, we give you 600fps with low power consumption! You always want more and more!
Interviewer: But why is it that the 3xxx series gets fewer fps but games look so much crisper and more detailed?
Nvidia: this interview is over!
JENSEN: THE MORE YOU BUY THE MORE YOU(WE) SAVE
@@lok6314 I never thought I would use AMD parts after I got my 1060 some years ago with a very nifty i5 6400. Now I'm on a 5800X3D (because it was killer price-to-performance) and a 6900 XT (because I accidentally got it really cheap; I was looking at a 3070). Who knows, maybe in a couple of years I'll be on an Intel GPU with an nVidia CPU...
@@TimisDaniel At this rate it'll be an intel gpu and corsair cpu
@@TimisDaniel everything is possible.
Most of the time I switch to whatever hardware gives me the best value for money.
If it was 2016 you could get the top tier card, a 1080ti, for $699... Which still is better than this today (minus ray tracing ofc).
I sure got a lot of use out of mine before I bought a 4090 when it launched.
Surprised Steve hasn’t turned to the bottle after reviewing gpus this year
yeah, anyway i prefer this type of questions more, technical actually will only ruin your 'buying experience'
its like asking questions abt food, you know
Seeing my EVGA 1060 SSC on the same charts as a 4090 is trippy as hell. Thank you GN for doing such fine whine work on Nvidia.
the sheer hubris of charging US$300 for that waste of sand.
Worst part is that you just *know* people are gonna buy it anyway, it's just pitiful.
Today I learnt a new word to describe Nvidia's antics
Thats 2020 gtx 1080 used money
It's not a waste of sand, it's a perfectly capable RTX4050
@@Domenico1997JuTube it's e-waste is what it is
Kudos to GN & HWUB for standing their ground and not bowing to Nvidia.
Very, very disappointed in Jayz2Cents, and I unsubbed from him
@@puppy.shorts In some ways, I feel like I can't be mad at him. He said right at the beginning he was restricted in many ways. It's not like he acted stupid and tried to sell this shit. He tried to be neutral and said many times what he could or couldn't do. And if he can get some money, why not.
It's the throwing of HWUB under the bus for me.. That was a bit shady; will he now call GN a salty channel for saying the exact same thing?
@@fridaycaliforniaa236 Jayz's business is built on information, entertainment and consumer trust. If you see another "unimpressive" (at best) product and choose to basically sell it to your viewers, it's not a good look. And the problem with trust is: it's lost a lot easier than gained.
@@puppy.shorts Think it depends on the size of the creator. I know RandomGamingHD did it, but he's quite small and doesn't really do benchmarks, so I'm not too annoyed at that; but bigger channels should know better than to be marketing
The large cache with a reduced memory bus is primarily an advantage when working with small datasets. Which is either going to be low-res rendering. Or crypto-mining. They built a card to do something nobody is really interested in anymore.
Silicon design is a slow train, what is released now started development 2-3 years ago
@gardotd426 my guy hasnt seen the underworld
@gardotd426 It will go berserk again next year
@gardotd426 Also, even when the cards do support crypto mining, there have been a bunch of scams, so short of Ethereum, Bitcoin or cryptos that have been around for a while,
Luna, Terra, NFTs being pushed etc. have harmed crypto overall.
Plus the federal government is trying to launch a digital dollar and is trialing it in June or July.
It will probably get regulated, as cryptos become more regulated as an asset like stocks or something; the federal government is already getting involved.
The 16GB version could be compelling for budget AI enthusiasts. But that still means giving money to Nvidia.
My 11-year-old HD 7950 has a 384-bit memory interface (240 GB/s), which is the reason I haven't changed GPU to date. If I had to get a card that would handle the next 11 years, it surely wouldn't be a 128-bit 4060. And the $450 I paid for that card back in the day was a lot, but justified after 11 years and still running strong. I wouldn't pay $450 (street price) for a subpar 4060 that won't even make it over three years.
Kudos to the editors for adding more legible highlights and color coordinated bars in their charts during comparison.
Did that about 7 years ago!
@@GamersNexus Damn, I don't know why I only noticed now; I went back to the first few vids I saw from about 4 years ago and they're there. Maybe the hella Computex stuff made me forget about them. But these darker background ones are new for sure, like when you compared the card specs with a white highlight.
No wonder EVGA left Nvidia. I could not imagine trying to deal with Nvidia's bs. Must have been pretty bad and getting worse.
I plan on buying whatever EVGA makes in the future - to do my part in supporting a quality oriented company that has good business ethics ...
We need to publicly belly-ache EVGA into making AMD cards.
@@creed5248 well, in that GN extras video, EVGA's BIOS engineers are leaving the company. At least the top two.
@@SinexTitanwhy? Intel is crap.
@@VoldoronGaming He means for Battlemage and beyond
Would have been fun to compare the 4060 to 5700XT, an 8GB $400 card from 4 years ago.
Just use the 6600XT as a proxy for the 5700XT, they are very close in raster.
@@spacechannelfiver yep. and the 6600 XT isn't far off the 7600 either, so in conclusion the two-generations-old 5700 XT, which you can get used for ~200€, gets quite close to the brand new 4060 that costs 50% more
@gardotd426 You mean the 1080 Ti
The 3060 beats the 4060 in some tests lol. Almost every 40 series card is gimped in some way this generation. The 4080 isn't bad but $500 jump in price over the 3080 is nuts.
@@walter1824 and 2070s. These days my 150-buck ex-mining 5700 XT is faster than my Titan Xp and is knocking on 2080 territory in a bunch of games. The thing has aged insanely well once again
i love how each time a new card gets reviewed it shows how much the Intel cards are getting better over time.
Aging like fine scotch.
Aging like a fine 4790K. What a beautiful era of CPUs.
same at this rate intel will win.
@@Adamchevy They did, didn't they?
Yes it is!
I bought a 4060 and I can say I'm pretty happy. It's an amazing 1080p card. Cyberpunk maxed out with all ray tracing options runs at a stable 60fps. Looks so pretty :)
Same here. I like its power efficiency, too. Although I'd have happily paid extra for 16GB of VRAM.
Do you have any problems with the shorter memory bus or the 8gb of vram? I'm on the fence about getting one and a lot of people seem to be talking about those two things. My 1060 6gb still works for what I use it for but I'd like to start playing some newer games. 1080p, ryzen 5 3600
@@loongcat6500 well, it's difficult to know because there's no way to measure or diagnose that. I can't compare it to an identical card without those.
I can tell you why it's designed like that - it's designed to be good at 1080p gaming, but not at AI creativity, because Nvidia don't want to cannibalize that market with cheaper gaming GPUs. It can do that, for the most part, but its performance isn't professional-level. With 16GB or 24GB of VRAM and a wider bus and that power efficiency it would be great at it.
It's a fine 1080p gaming card with great power efficiency that can do AI art just fine with some care. You have to decide if that's what you want.
@@loongcat6500 for 1080p it runs every game maxed out at 60+ fps: Hogwarts Legacy, Cyberpunk 2077, Immortals of Aveum etc... your CPU may bottleneck it a bit, but yes, it's very worth it. I recommend it 100% over the 3060 as you can even play at 1080p 120 fps with FG in any current title. Now, in a game like Ratchet and Clank: Rift Apart with ray tracing ON it will spike and stutter a bit on High settings, but that's a port from PS5; in Cyberpunk 2077 I can play at 1080p RT Overdrive + FG at 100+ FPS... CS2 I can play locked at 138 fps on my 144Hz monitor with max settings, so it's really good.
@unholydonuts
I went with 4060 due to A) it's newer and B) power efficiency.
The cramped case I have doesn't have the best airflow and 4060 cards are typically smaller and much, much cooler at speed which is a benefit in my case. So they make most sense.
However .. I've now got interested in AI art and the 8gb is sometimes an issue for fitting larger models. But if I'd gone with the 3060 I'd have had 50% more VRAM, but less processing speed and more issues around heat and airflow. Keep in mind that games tend to fluctuate load, whereas AI tends to just be consistent high load throughout the task ..
I feel like the 12GB RTX 3060 at $320 was a better value when it launched (if you got one at MSRP), and I like its performance at 1440p in modern games.
I have a RTX 3060, and while it's not the most high performing card out there, I am happy with what I got. I can play most games at mid to high quality in 1440p with 60+ fps.
Yeah honestly that's so fire. I do love how many watts my 4070 saves tho, for real. But the 12GB 3060 at $320 is honestly such a great deal
Wish they gave the 3060 ti the same treatment with the VRAM.
I also got a 3060 12GB, and I then found out 5 days later that the 4060 was coming out that day, at the time I knew VERY little about GPUs, so I was a bit upset at myself for missing the 4060. But now after having some more knowledge on GPUs I don't think I missed out on much, honestly.
Sure the frame gen and DLSS 3 is very good and all, but it's not the end of the world having to put a medium preset over high, or go to 1080p from 1440p for performance. Plus I have higher VRAM and in some cases the 4060 caps out with VRAM and performs worse at 1440p, or so I've heard, which feels absurd to me.
I'll probably wait on swapping until there's a 5060 out there, or maybe even 6060.
You would probably help a lot of newbies out by specifying what bus width actually means. A 128-bit bus means 4 physical memory packages (32-bit × 4 = 128), and this helps to illustrate why these cards should be cheaper: they have a far reduced cost due to fewer components - not just the memory itself, but the memory VRM, the heatsink mass, the number of thermal pads. It all adds up.
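To put rough numbers on that comment, here's a minimal sketch; the 17 Gbps and 15 Gbps data rates are the commonly listed speeds for these cards, assumed here rather than taken from the video:

```python
# Minimal sketch: package count and bandwidth from bus width (assumed per-pin data rates).
# Each GDDR6 package exposes a 32-bit interface, so bus width / 32 = number of memory chips,
# and peak bandwidth (GB/s) = (bus width in bits / 8) * per-pin rate in Gbps.

def memory_config(bus_bits: int, data_rate_gbps: float):
    packages = bus_bits // 32                      # 32-bit-wide GDDR6 packages
    bandwidth_gbs = bus_bits / 8 * data_rate_gbps  # bytes transferred per second, in GB/s
    return packages, bandwidth_gbs

# 4060-class card: 128-bit bus with commonly listed 17 Gbps GDDR6 (assumed)
print(memory_config(128, 17.0))   # (4, 272.0)  -> 4 packages, ~272 GB/s
# 3060 12GB-class card: 192-bit bus with commonly listed 15 Gbps GDDR6 (assumed)
print(memory_config(192, 15.0))   # (6, 360.0)  -> 6 packages, ~360 GB/s
```

Fewer packages means less PCB routing, fewer thermal pads and a smaller memory VRM, which is the cost argument the comment is making.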
Hey, they are the "best offender" regardless. NOBODY is looking to help newbies, and I firmly believe that is why we are where we are. On the other hand, the newbies are a generation of p*****s that feel emotional pain if their hardware lacks a feature, so you have to walk that line as a tech channel.
FFS: we can reconstruct a 4K image with only 1080p resolution data. It's 2-3x as fast as raw rasterization. It looks just as good, sometimes better and sometimes worse. It's probably the biggest performance-oriented feature ever created, by far.
It should be the biggest, most hyped feature in this hobby's history, and would have been 10 years ago... Today, it's all about those feelings!
You know what else has a 128b memory bus? The RTX 3050. I don’t see this card coming in at 250 when it’s using a cutdown variant of the AD107 die, and underclocked memory to reduce board costs.
Also fewer PCB layers, potentially. Or less copper wiring at the very least.
@@dex6316 amazing stuff, isn't it?
It even looks cheap, without labeling you would have a hard time telling this apart from a 1650
They (GPU makers) keep leaning on the fact that 1080p is the most popular resolution, and it's like, yeah, duh, because most gamers want 1440p minimum but can't afford to go bigger, and you all refuse to bring higher resolution capabilities to lower end cards even though you easily could and actually set a new standard for games. But no. So gamers are stuck there because of you, and you keep leaning on that fact to excuse your lack of improvements. Hence, time vortex!
Most PC gamers have a whole-system budget of about $1000 US. That's why everyone is still running 1080p and why gaming laptops are so popular even though they're by no means an ideal gaming experience.
6700xt says hello.
Honestly I just wish monitors weren't as expensive.
Yep. It's a problem, or at least situation, that they had a significant hand in creating, and now they will milk it for every $ it's worth. 1080p was the goal over a decade ago.
It seems like both AMD and Nvidia are trying to segment their cards into resolution tiers. They want to create budget cards for 1080p, 1440p cards for mid-tier and 4K for the highest tier. They are gimping the cards artificially by restricting VRAM and memory bus width etc.
I'm one of the many still rocking a 1060, so I appreciate it being included in the benchmarks. I've actually specifically been waiting for a new $300 dollar card with a 100-125W tdp for years, and completely skipped the 2000 and 3000 series because of how much the power consumption at the $300 price point kept skyrocketing, so it's nice to see a return to that. But of course, they had to F* it up with the small memory bus and reduction in memory from the 3060.
I'm with you...working with a 1070 8GB in my laptop from 2017. It still does fantastic on everything but rendering Stable Diffusion images quickly.
Looks like I'll either keep rocking it for a couple more years, or go with a 3060 12GB desktop setup.
1060 represent!
I would have said every sentence you said and it correctly applies to me.
But to add, it's not just the cost of powering the GPU, it's also the kind of PSU I would need to get, and the amount of heat it would be blowing into the room in this already-hot climate. I'd be paying for the AC to offset the heat if the TDP was too high.
@@rhoharane 750i here! Building a new rig this week...going with a 5600G for now and will swap that out for a better CPU/GPU combo later...looking like another year's wait should do it, and meanwhile there are a lot of games to catch up on even from just going from a 8400Duo/750i to a 5600G.
Yup. Nvidia had a potential big win here with the power consumption. And then they did Nvidia things...
You can get a 3060 TI rn for around 260 on second hand market and undervolt it to draw about 160watts full load.
I seem to remember these same reviewers complaining about the 1650. Now It's the most common gaming card in use. But yeah, when a manufacturer releases a sub 150 watt card, the reviewers do nothing but whine.
This just makes me that much happier to have snagged a new 6750 xt for $319
oof paid $350 for mine
Unfortunately can't find it cheaper than 380-390 euro in Europe. If i found a 350 i would jump on it.
@@SIPEROTH The 6750XT was being sold for €337 at Spanish PCComponentes but there's no stock right now. The 6700XT is also being sold for €347 here in Portugal.
"People not buying stuff makes the price go down"... absolutely. Both Nvidia and AMD have put out hardware, especially this generation, that isn't worth their MSRP. I sincerely hope both companies wake the hell up and realize that most of their clientele aren't made out of money, and most of us who aren't walking, talking wallets aren't gonna buy at these prices.
The theme of this generation continues to be "do better, guys". AMD, Nvidia, wake the hell up already, or the losses in gaming revenue will continue.
"AMD, Nvidia, wake the hell up already, or the losses in gaming revenue will continue."
Here's the problem: neither of them care. They're both far more interested in selling parts to massive companies for use with AI. Insert clip of Jensen and his weird AI karaoke here.
CEO Quote: "The more you buy, the more you save." Me: "You smoking something, mate?"
The fact that this has a 128-bit bus makes it an xx50-tier card or even lower. This should be no more than $220 max.
And what makes it even worse is that the 7600 has an MSRP of $250 and performs very similarly.
7600 has the same bit bus width and 8GB.
@@AlfaPro1337 It's also cheaper, more warranting that price. Still not great, but *better* at least. As stocks of previous gens run out, at the
Spoiler alert: all the 4000 cards below the 4090 have been dropped down a tier but kept their old naming scheme and a tasty price bump or reduced VRAM.
@@bluerendar2194 Cheaper due to having fewer features.
Turning on all of the eye candy in AAA games, the 7600 looks bad.
And since the 20 series, the plain x60 - no Ti or extra stuff - has stayed at a US$300 MSRP.
Even going back to the 1060 6GB, aftermarket versions cost way more than the US$300 FE version.
Lmao 260 is the lowest it is and that's not a constant price. Not worth buying it vs the 4060
I hope Intel can keep making and improving their GPUs, we desperately need a solid third player in this market, especially in the mid-range.
Every new Nvidia card review this generation is just gonna make me keep the 1080 Ti in my new build.
This is firmly planted in my head as well.
historically graphics cards are supposed to lose half their value roughly every 18 months
bought two $700 tier cards 6 years ago
sold one over $500 3 years later
sold one for $300 weeks ago
LOL 1080Ti is right up there with the 8800 GTX for legendary cards of its era
I feel the same with a 2080.
I wish I never got rid of my 1080ti. That thing was a beast.
@@lawrencelin272 My GTX 1080 feels left out
As a 1060 user looking at new cards, this was quite a nice surprise to see the 1060 still mentioned in comparisons.👍
Me too! Sniff, just put the old boy out to pasture. The new 4060 is five times faster and doesn't complain like the old 1060.
Man, that's a pretty pricey 4050 ... and yeah, this just reinforces my opinion that Nvidia simply downtiered every card except their halo product. In relation to performance, the 4080 would be closer to a xx70 card compared to the xx90 performance, the 4070 Ti is more of a xx60 Ti, the 4070 a xx60 and the 4060 looks like a xx50 card. The huge performance gap between the 4090 and 4080 also indicates that they simply skipped a proper xx80 SKU, that might come later as the 4080 Ti, sold at premium xx80 Ti prices of course.
Nvidia has just straight up been overcharging and underdelivering this entire generation with the sole exception of the impressive 4090, but that's not relevant for ~98% of the customer base because very few people can afford or even need that card.
I don't care about the 4090, especially not for $2.5-2.6k CAD. The only ones I see actually needing a card that powerful are professionals and content creators; the average gamer doesn't need the 4090, and whoever does have one just means they own it the same way an average Joe buys a Lamborghini: for bragging rights.
It’s also not relevant because it’s a bit of a paper launch. I can’t find an Nvidia FE 4090 for less than $1700 - and these are all likely used despite being labeled new. Nvidia will not budge on this as long as they are envied.
and 98% is generous; in reality more than 99.9% of the consumer base can't afford a 4090, and that's not an exaggeration, that's reality looking at the numbers. So except for the tiny fraction of consumers at the top, the 40 series is absurdly overpriced
@@ResidentWeevil2077 then you want to care about the 4060. The 60 has always been the most popular card for mid-end gaming and this is just disgusting
Lmao still a better buy than a $260+ rx 7600
Why AMD didn't just release the 7600 at 249 out of the gate is beyond me. It's like they're allergic to good marketing😅
early-adopter profit + when the price does fall, now it's a 'good deal'
AMD has a product manager in serious need of being let go kekw
Some are already at 250, and they will keep falling once the 6600/XT stock on shelves is gone
Marketing strats
Because they didn't know nvidia was going to drop the 4060 for 300. They probably expected 350 before they announced it after AMD announced the 7600.
The biggest takeaway I have from all the RTX 4060 reviews is Intel Arc hype.
Yup, Intel is the only one that is offering a good value in terms of price, performance and features.
Thank you for always including FFXIV benchmark, you are probably the only channel that has a comparison for this benchmark, and it helps a lot. 😊
So wait, somehow getting a 7600 for $250 is something we should be attacking AMD over?
I'm glad I watched the first few seconds of this review that I thumbed down because of the sarcasm. The ONE valuable thing I learned is AMD is bundling a game with the 7600 which is a FANTASTIC deal.
The new game is $60 USD, I found a Sapphire Pulse for $255. New games hold their price for about a year, more if it's really popular, but I'll depreciate its value and just say the game is a $50 USD value. I'm getting a Sapphire Pulse 7600 for $205 USD and the game for $50 USD.
Well this is the BEST thing I found out yet from the reviews for the 4060. Already added to cart and about to buy.
THANKS.
And Steve you can keep your sarcasm. It doesn't benefit most people although there's a lot of assholes that seem to like it. I guess appealing to the assholes is good.
Australia's GPU prices have done nothing but go up over the last couple of years. A 4060 is around $550-625 AUD, with a 4070 costing $1100 minimum, according to PC Part Picker taking into account local vendors as well. The previous 3060 at launch was very briefly $350 AUD but has since risen due to NVIDIA's market manipulation. The future for buying a GPU in Australia is becoming staggeringly expensive, and people like myself are holding on to older "functional" products that still perform well within the 1080p and medium graphics settings range.
Why is that? Australia have big import tax?
@@SwingArmCity Australia Tax.
4060ti is currently hovering around USD$500 and $600 in the Philippines, about the same as the 3060ti.
There's no way we're getting the 4060 anywhere close to $300
A lot of people are just switching to console tbh.
Buy a 6700xt, it's pretty well priced in Aus right now
Import taxes, inflation and weak AUS dollar is a bad combination. You can practically watch your buying power fall in real time.
these companies are getting wayy too comfortable with doing nothing, hope Intel puts up a good fight
The greed from AMD and Nvidia have already made the A750 and A770 extremely competitive. Personally id say the A750 is the better buy than a 4060 or RX 7600. Hopefully Battlemage offers the same great value that Alchemist has, because its clear that AMD and Nvidia are trying to milk consumers.
@@__aceofspades amd? they literally have the best value lmao love how you clowns conveniently ignore the myriad of issues with intel keep sucking them off
Nvidia users, you're doing great 👍. Continue allowing the 40 series cards to sit on store shelves.
Yup. Boycott ScamVidia
Excellent review about a mediocre card!
No regrets about getting my A750 when it was on sale for $200 last month.
$200 for that performance on some games is actually a steal. I'm so happy to see Intel becoming competitive as quickly as they are. And that card will only do better as drivers keep coming out. Some of their drivers I've heard boost performance astronomically in many titles.
@@ForTheOmnissiah I have both the Arc750 and Arc770. Looooooots of work on many things is still needed. Power draw, black screens, stuttering, 1% lows, many games that just perform bad, emulators etc.
Also I bought the A750 first and then the A770. I have to say the Arc 770 isn't even close to being worth the price difference, and I feel like I overpaid even though I got it at the cheapest price one could get in Europe. The 16GB doesn't seem to matter much either; it starts having some value only when you go to 4K, but the card isn't powerful enough for 4K gaming anyway, so it's all for nothing.
You guys need to understand efficiency is not only about money saving. It's money, noise, PSU size/efficiency/longevity, and caring for the planet.
It's a step in the right direction. The issue is only the market being flooded with 2nd hand cards at a similar price.
Hi Steve and Team,
great video... like always.
I want to point out, NVIDIA did not drop the MSRP for the 60 series in Europe.... 3060 MSRP 330€ / 4060 MSRP 330€. So NVIDIA is ABSOLUTELY DOING EVERYTHING to keep prices high and disguising it, adding layer after layer, which the average consumer doesn't check.
Also, the power prices in the "Less Power" slide are total BS. The max price for a kWh in Germany is 0.40€ (since January 2023, by law!)... the average price is 0.35€ and the latest price is 0.28€. I live in Germany and I am paying 0.275€. The prices for the UK also seem to be totally out of date... much lower now.
This graphics card is a 4050 and NVIDIA could sell it for 180€, which would be a fair deal (for the customer and for NVIDIA).
Even the "critical" reviews do not drive home this crucial point! Please don't have empathy for any marketing BS.
Also: no investor will leave NVIDIA because it is offering fairly priced end-consumer products! Every manager who believes a statement like this is simply admitting to being bad at his job... or lying and greedy! Like all end customers, all investors can be "tricked".
Anyway... like all companies reaching 80% market share... NVIDIA will also start to struggle hard in about 3-5 years, when the self-inflicted disappointment from investors and end customers kicks in. Hopefully with Intel on its 3rd gen and AMD doing AMD-stuff, we will have a healthy GPU market back by then as well. Looking forward to it.
Enjoying my 3070ti for the coming years. I held on to my 1080 as long as i could.
I would not switch out a 3 series card for a 4 series either.
if i could i would have done the same thing but i couldn't get a 3070ti for a reasonable price so i ended up with a 6700XT
I've got a 3070 that I overclocked a bit. 1440p monitor. It's decent enough. I'll probably wait for the next generation.
My Vega 64 died at the start of the pandemic and I borrowed a friend's 1070. This thing has been holding strong and I am terrified of the day it starts to die. I am not ready to pay these prices to upgrade 😂
@@MrSystane I am, since it seems like it’s only gonna get worse lol…
At the time of this comment, there's a 3060 Ti on sale at Best Buy for $279.99. Price/performance wise there's zero reason to get a 4060 over the previous gen stuff. HOWEVER, with that low power consumption and heat output, AIBs would be shooting themselves in the foot if they don't make a low profile version of the 4060 to win over the SFF builders.
About 100+w in an sff system could be a challenge to cool down, though I can't say I wouldn't love to see it happen, I know that the most we can hope for is a 4050 low profile.
@@PackardKotch I'm running a 3090 in an SFF case on water at under 55C under full load. You can put a 4090 in SFF cases like the Dan C4. But if you want to go super-small form factor then you need something like that
Waiting definitely pays off. Sold my 3070 for $330 and almost paid $600 for the 4070. Waited, waited, waited until I got an Asus Dual OC 4070 for $480 in a quick promo. I'd say that's a much fairer price than what they started with.
this is the ONLY advantage, and the best one customers have: time. companies have to meet their margins, we don't. if you need something RIGHT NOW at this moment there are cheaper options. if you need something better then you can wait. the 4070 is a no-brainer for sub-500 bucks; that's an absolutely crazy value proposition
That's not bad
This review just made the RX 6700 XT a better buy at $320.00; used, even cheaper.
12 GB of VRAM, 25% better raster, similar RT perf - I will take more VRAM any day.
I'm still glad I bought a used 6700XT last year for £290. No new release has come close to this kind of value.
We have to hold on to our bucks and keep our outdated cards; the more we wait, the more we win. Those flash sales they did a couple of weeks ago are proof that it's working. Keep holding, my friends
Agreed
What if I don't have any cards lol
@@abdullahgawish722 Of course it doesn't apply for people who are making their first build or if their current GPU died. But if you say that because you are playing on iGPU, you are mad but you have all my respect sir
The more you wait, the more you save
It's not that much of a challenge when the needle barely moves in the mainstream segment. We're basically in the GPU version of the post-Sandy Bridge slump.
@14:30 Btw there were Intel Drivers today that improved their performance in F1 2022 by a decent margin (up to 36% at 1080p, and 20% at 1440p), making them more competitive than they were previously.
@@telleva7890 nah, 10 is way too much. They just need better support for other APIs
@@telleva7890 disagree heavily
@@telleva7890 that's not an issue. AMD is close to Nvidia this gen, so depending on how things go in the future we might see AMD catching up to Nvidia in terms of raw performance. The advantage AMD has is that they're switching to chiplets and that they already have experience with chiplet architecture, meanwhile Nvidia has these huge, complicated and expensive-to-produce monolithic dies. There's something to be said about AMD's ability to deliver 80% of Nvidia's performance with an architecture that's far smaller than Nvidia's in terms of die size. Having a smaller die size is helpful because it makes scaling easier long term. Unless Nvidia makes drastic changes we might see them slowing down performance gains gen over gen (which we're kind of seeing already)
That's why I think AMD can match Nvidia at the top at some point soon. As for Intel, all they have to do is make sure they have good support for all relevant APIs and then they can keep launching midrange GPUs until they're ready to go harder.
At this point Nvidia's main advantage is their software and they've worked on it for decades so it'll be difficult for others to compete in that aspect. But in terms of gaming all of that is kind of irrelevant
Buy 6600xt honestly. Best budget GPU at this point.
agreed
if you can afford it, 310$ for the 6700xt is hands down the best deal right now for a new card. Or getting a used 3070 for around 310$~ though used is always a gamble.
27:22 "Game on AMD" *Intel processor box*. You ok Newegg? You trying to tell us something?
22:15 I highly disagree! You're right that it might not be the number one criterion in deciding which GPU to buy, but with rising temperatures everyone should ALWAYS consider power consumption when buying anything that consumes power, whatever form that power takes. For a single consumer it may not make a big difference, but - in the case where products sell very well - we are not talking about single-digit numbers but at least high hundreds of thousands of units, and the combined power consumption is what makes the difference. START THINKING BEYOND THE SINGLE CONSUMER HORIZON!
It seems buying 3060ti on launch was great decision, even after 4000 series full launch.
I'm in the same boat heh, I think I'll be holding on to my 3060ti for a lonnnnggg while
I'm thinking of buying one new today. You can get them for £300 new in the UK
Is a 3060ti pretty good ?
@@Cognitoman yea
@@Louisthesaxman1 dude wasn't it super expensive like a while ago? I looked online, it's like 275-300 dollars. Right now I have a GeForce 1060 6GB. Would that be a big upgrade going from that to a 3060 Ti?
Still using 980ti. I had wanted to buy a 3080 when they came out but "paper launch" and all that. Now, seeing cards released these days that are aiming for 1080p, looks like I had gotten a 40 series all the way back in 2016!
I'm in the exact same boat but with a 1080. I just can't get excited over Nvidia right now; the prices are ridiculous and I refuse to pay them. But of course this also affects the 2nd hand market too...
Honestly the used market is the only worthwhile option right now.
@@Dfm253 not even the used market is worth it; you don't know how long a used card will last.
Welcome to the review of what should have been a rtx 4050 at $200-220.
107 die ✅️
128 bit bus ✅️
Performance a bit better than last gen 60 tier card ✅️
At least it isn't as bad as the rx 7500
*than
@@iequalsnoobThe 7500 hasn't been announced yet, afaik. Certainly not released.
@@iequalsnoob The 7500 doesn't exist yet though. And in case you misspelled the 7600, it's in the same ballpark with the same 8GB of VRAM. Heck, the 4060 even gets beaten by the 7600 in some scenarios, so it doesn't win it all. All that while being (realistically) $50-60 cheaper makes the 7600 the far better buy, if we were in a vacuum. Fortunately we're not, and the RX 6700 and 6700 XT exist, and they're the best options at around $300. Get them while stocks last.
So no need to upgrade my 2 rigs. #1, the 4K rig, is a 13700KF with a 3080, and #2, the 1080p rig, is a 12700K with a 3060 Ti. I realize comparisons were done with new retail pricing, but I want to point out that I bought both the 3080 and the 3060 Ti for $670 used. That's less than one 4070 Ti and only a little more than one 4070. Buy used, and keep these 40 series cards on the shelf...
Efficiency is one of the few reasons to get a 4060, BUT unfortunately it is NOT as efficient as advertised on paper... they removed the shunts from the PCB so the actual power draw can't be controlled exactly. RESULT: actual power draw is more like 130 W than 115 W. SECONDLY, idle power draw as of right now is 50 watts. Yup, you read that right: for some reason it is not clocking/volting down properly and draws a whopping 50 watts at IDLE....
Twice the performance and price vs 1060 but only 33% more vram. Classy NVIDIA with the planned obsolescence.
Seems like the RX 7600 actually offers decent value for money at $250, at least in terms of gen over gen improvement. As well as when to compare it to higher end Turing cards like the 2080/2080 Super - or even midrange RDNA 2 like the RX 6700 10GB/6700 XT
Where is it 250? The cheapest any store has had it is 260 even look at the video and don't listen to his words
@@iequalsnoob oh, pfft my bad I was listening in the background with my phone off as I was on my way driving to work. But still, $260 isn't that bad either but I'll admit most people might as well go for the 6700 XT if they can get it for around $300. I personally paid $340 for an OC model 3060ti, but that was also over a year ago when they were still going for considerably more than that & I've been really happy all around. Haven't encountered a game that "uses more than 8GB of VRAM", and I run it at 1080p/1440p high (DLSS enabled when needed)
The 7600 basically matches the Cores/TMUs/ROPs of the 6600/6650 XT, but they down clocked it and called it an improved upon 6600, so AMD is doing the right thing by not calling it 7600 XT. While Nvidia takes the 3060 and cuts everything, including the bus and calls it a 4060/TI. They are basically pulling their 4080 12 GB shenanigans all over again with the 4060 and 4060 TI cards instead of calling them 4050 series like they should.
@@anotherfinesith288 yes, exactly. The 30 series was so successful because each tier of card (xx50, xx60, xx70, ti variants, etc) all either matched or surpassed the performance offered by each higher tier of the 20 series. Such as 3050 (which succeeded the 1650) > 1660 Super/1660ti, 3060 ≥ 2070, 3060ti ≥ 2080 Super, 3070 ≥ 2080ti, 3070ti+ offering higher performance not previously seen. The 40 series has been a bit of a mess overall, but mainly just the 4060/4060ti. 4070 and above appear to (somewhat) follow the same trend as the 30 series did with gen over gen improvement
@@N_N23296 at least it will be on the pallets stored in front of the 4060s and 4060ti's 🤣
Edit: idk why the RX 6700 (non-XT) isn't talked about as an alternative to the 7600, it's got 10GB of VRAM which is still better than 8GB and the two cards are basically equal in performance & price. Only real difference I could think of off the top of my head is that the 7600 *might* have slightly better RT performance as it's on RDNA 3 but realistically none of the entry to midrange AMD cards should be considered if you want a "good" RT experience. Their value lies in standard rasterization (and more VRAM with cards like the 6700 /6700 XT)
4050 renamed to 4060 while asking for 4070 price.
Seems to be the running theme of this generation.
it would be good to include the 3060 in your power consumption graph, it's missing. Also, power consumption is an important parameter for judging cooling requirements, aside from $$$ on electricity. I think if I have to buy another couple of fans, or increase case size to accommodate a hotter GPU, that additional cost is real and immediate.
Yeah, I think people are too up their ass about the die size and all that, and are ignoring the giant power consumption cut while still getting a performance uptick over the old gens. I don't care about 4K at all for any gaming and never will. The 4060 is perfectly fine where it is and people need to get over it. I also like how 30 series and older cards were mostly compared by MSRP or sell price from 1st party sellers, but now they are all comparing against used cards for price. If I want a new card then I want a new card with the future benefits of DLSS 3 and Frame Gen. This only applies to the 4060, as every other card after it is kinda bad to pretty bad. Given how hot CPUs are running now, along with hotter GPUs, I would be happy with the 4060, and the more they complain about it being a 4050 the more I want to buy it.
One thing to note: the 3060 12gb will perform better on workload apps than the 4060. Apps like blender and unity use well over 8gb of VRAM for high resolution work, and they often go into system RAM and load through the memory bus of the card. On these apps the bus speed and VRAM are often the bottlenecks while working on high res renders and animations.
The reason I specify Blender and Unity is that they are free programs that indie artists and small developers use - the kind to have a smaller budget for their workstations. The 3060 12GB is a workhorse that punches above its class in these apps, even over the 3060 Ti; the 4060 is the opposite.
If you're working on something that uses this amount of VRAM you would be better off with a stronger GPU
As a creative hobbyist who uses software like Blender and Unreal for fun projects but also plays PC games from time to time, the RTX 3060 12GB has been great all-rounder card. The 4060 on the other hand can rot on shelves.
Is it only me, or are the Intel cards suddenly awesome value?
Reminder: all of their benchmarks fall to 1% of these results on a platform that doesn't have Resizable BAR enabled. Which makes no damned sense to me, but apparently Intel wants to die on that hill.
@@ZeroHourProductions407 Nvidia also wants to die on the hill that you **should** use DLSS3 and Frame generation to top charts lol
@@ZeroHourProductions407 not a bad hill to die on; most budget card buyers will have a brand new PC with Resizable BAR included
@@Kamilr97 "most budget card buyers will have a brand new PC" press X to doubt
@@Mr.Genesis That would be a thing if a majority of games supported DLSS 3. None of the games I play support it, so there's no value for me in getting a 40 series card over a 30 series or an AMD card.
thanks for including more older cards (5600xt, 2060, etc)
I'm still running a 1080Ti with a 7700k and I'm *literally right now* reaching the point where I'm looking at a system upgrade, both for gaming but also actual work use cases. Mainly the CPU isn't giving me what I need anymore, but also the 1080Ti to cut a long story short isn't in the most healthy state even though it still works meaning it makes the most sense for a complete refresh, so this kind of commentary on upgrade cycle pacing is very useful for me, even though as always it is quite frustrating to have to wait for sane pricing
i still have a 1080ti in my htpc
try it with fsr2+...it'll still crush a lot of modern titles at 1440p
that 11GB frame buffer lol..i thought for sure 20-24GB would be the norm by now
@@lawrencelin272 My EVGA 1080 Ti is still playing everything just fine. Nvidia won't make a "mistake" like this ever again.
Lemme get that 1080ti bruh lol
My 970 needs a refresh
Get a ryzen 7600 and 6700xt combo.
What's funny is that the people bitching about paying $700-800 for a 1080 Ti back in the day and labeling it a poor value are the same ones complaining about mid range cards. The people who "overpaid" for the top end card years ago are only now really needing an upgrade. It really seems like a case of people realizing they have been nickel-and-diming themselves to death on cheap low and mid range parts.
Why is the 3070 missing from the benchmark comparisons? You show a couple in the B-roll at 27:40.
If I had a penny for every time I read that prices are falling, I could've gotten a 4060 by now.
I'll help: prices are falling
(they are, but not on any card I'd want to buy)
They are, got a 4070 on offer for CHF 480. We just have to wait
But they are, by $20! Line-ups around the block ensue.
It's quite clear Nvidia believes that the price point for low end 1080p cards is anything up to $500. They assumed the price increases due to the global health crisis / supply chain issues / crypto mining were permanent and acted accordingly.
Damn, intel arc beat the 4060 in some tests!
The A750 has also been $200 vs the 4060 that starts at $300...
The problem is… in some. In general even 4060 is much faster than 770…
😢
My 1660 still serves me well enough. I'm not buying anything until the values are worth it again like the 60 series was.
The power argument is missing one crucial point. In moderate to cold climates there is a significant portion of a year when you need to use some sort of heater. At those times the card directly behaves as part of your home heating system. And that offsets corresponding costs.
Thanks Steve. This review of the 4050 is a great one.
Sure, going from cache to VRAM is expensive. But going from VRAM to system RAM is far, far more expensive. No amount of cache will help swapping from RAM.
Correct!
Is it really that bad? Both RAM and GPU are connected directly to the CPU, and you can find even DDR4 with sub 10 nanosecond latency, have you already checked VRAM latency values?
@@RicardoSilvaTripcall Yes. Cache is on-die (the GPU itself). VRAM is on the same PCB. System RAM you finally involve the CPU and motherboard and PCI lanes. We're talking orders of magnitudes. Think of the GPU going to system RAM like Windows using your SSD as virtual memory. It's that bad. Otherwise discrete GPUs would just use system RAM exclusively.
@@cole-nq1ru of course it is faster, but have you measured the impact? Probably not ... And please, don't compare accessing something on the RAM with something on a SSD ...
This is a horribly, horribly overpriced 4050. Not even a Ti, just a regular 4050.
Just wait until Nvidia releases a heavily cut-down 4050 for $250!
Enjoy!
😂
Not sure. I'm relatively new to gaming and graphics. I purchased an MSI prebuilt with the 4060 in it. I've been pretty happy with it, but it does have its limitations; still, I think for a thousand-dollar prebuilt I got what I paid for.
Ngl, I was tempted to opt for the 4060 when building my first PC early this month but decided to go for the 6700 XT instead. Bought it for 270 and I'm loving every moment of it. Games ran super smooth plus the white aesthetic of the Hellhound Spectral White version suits well with my white PC build.
You made the right choice
Really good choice, one of the best price/perf value at the moment
How did you find one for 270
@@david8FCB Used ofc. Brand new Asrock 6700 XT Challenger Pro OC is 400 for example in my place. Plus its not white which is something that I looked for after performance. Saw a listing of used 6700 XT Hellhound Spectral White for 270 and I ran with it XD.
I don’t get the appeal of white PCs. Look ugly to me compared to the classic black or gray.
Will stick with my 3060 thanks! Maybe the 5060 might be something if performance and price are finally in sync, but certainly not worth getting the 4060.
I think you should buy more nvidia now, the prices will most likely come back to normal if you do.
@@brutallica2944 if we keep giving Nvidia more money they will just keep raising prices through the roof. If people stopped buying the new 40 series, I'm willing to bet prices would come back in sync with performance. Kinda hypocritical for me to say that since I currently own a 4080, but ya
Why do people even decide to upgrade gpu every 1-2 generation? They are there to last 3-5 gens…
Azu, the 3060 Ti will need to be replaced next gen because of the VRAM alone. Consoles now come with 10GB of VRAM; that's the base for games now. When the PS6/Xbox-whatever comes out it'll go up more, which will most likely be next gen or the middle of next gen hardware, since consoles usually last 6 years. We're currently 3 years into the PS5. By the time the 50 series comes out we'll be into year 5 of the PS5.
@@indent566 The mere fact that you didn't believe my sarcasm is a testament to how many idiots still support Nvidia even when they are pissing in your mouth, and who would make a comment like my sarcastic one except be dead serious about it.
At least you are straight up about your situation; that's more than 90% of people today.
This makes the Arc 750 look good lmao
The 4060 loses to Arc A750 in Cyberpunk at 2K Ultra? What!!
Would be interested in the 4060 if it had 10 GB of VRAM.
We *might* see such "generosity" on the 5060. 10GB on a 160-bit bus with GDDR7 could provide decent bandwidth too. Speeds of 36 Gbps are rumored, but even 26 Gbps on a 160-bit bus would provide 520 GB/s. That's a hair more than the 4070. And if that's the case, then my biggest worry would be another reduction in CUs. The more I think about GDDR7, I also worry that Nvidia might decide to put 12GB of 24 Gbps GDDR7 (good) on a 96-bit bus (bad) for 288 GB/s of total bandwidth (landing between a 4060 and a 3060🤦♂). If 22 Gbps is the slowest GDDR7 speed (GDDR6X tops out at 24 Gbps), also consuming the least power, then making a 12GB/192-bit card would give us a respectable 528 GB/s (or 440 GB/s on a 10GB/160-bit card). Tl;dr next-gen memory should help quite a bit. Just beware of even more cut-down dies.
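A quick sanity check of those hypothetical configurations; the speeds are the rumored figures from the comment above, not confirmed specs:

```python
# Sketch only: bandwidth (GB/s) = bus width in bits / 8 * per-pin data rate in Gbps.
# All speeds below are the rumored/assumed GDDR7 figures from the comment, not confirmed specs.

def bandwidth_gbs(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps

scenarios = [
    ("10 GB, 160-bit @ 26 Gbps", 160, 26.0),  # ~520 GB/s, a hair above a 4070's ~504 GB/s
    ("12 GB,  96-bit @ 24 Gbps",  96, 24.0),  # ~288 GB/s, between a 4060 (~272) and 3060 (~360)
    ("12 GB, 192-bit @ 22 Gbps", 192, 22.0),  # ~528 GB/s
    ("10 GB, 160-bit @ 22 Gbps", 160, 22.0),  # ~440 GB/s
]
for label, bits, rate in scenarios:
    print(f"{label}: {bandwidth_gbs(bits, rate):.0f} GB/s")
```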
These companies should be banned from business and fined for shitting up the planet with 8-gig, 128-bit-bus GPUs that are already dated even before launch in 2023! Period!
Are the reviews the only good thing to come out of the 40-series? Otherwise these launches are kinda depressing.
To be honest I would say that seeing so many gamers holding and not buying those trash cards is the best thing coming out from the 40XX series but those reviews definitively come second in my opinion 😂
I love greedy company being trashed by Steve and the crew
Would've loved to see the 1080Ti in this comparison. With the way things are going right now it's still viable.
You're asking to compare a top of the line card vs a low end card with retail prices of over double not even factoring in the insane amount of inflation since the 1080 ti released
@@Sellout1738 Just thought it’d be interesting to see how Pascal holds up. And sure it may have been very expensive back then, but you can find the card used at an attractive price. I’m thinking of picking one up for a spare parts build
@@Sellout1738 Yes. The card is over 6 years old, technology advancements have always given more performance for the same or lower cost. Compare a 1060 with a GTX 480.
But also note that 1660 and 1060 cards popped up in this benchmark so it wouldnt be that far of a stretch.
I personally wouldn't buy a 1080 bc I'm a weirdo that uses fp32 compute and pascal compute was horrendous (a 1080ti is on par with a lowend amd laptop dgpu from 2017)
@@Sellout1738 We are asking about a card that costs about the same used. I have one; it's not like comparing to a 480. The rasterization power is similar to a 3060 Ti / 3070.
I'd like to see the 4060 go against the 3060/3060 Ti and 6700 XT on PCIe 3.0. They're similarly priced and in the budget segment, and PCIe 3.0 was standard on B450 and 10th/11th-gen Intel only one to two generations ago, so upgrading to this line of GPUs is reasonable.
Don't even consider the 4060 for PCIe 3.0…
It is a lot of eh. Even a 4090 barely pushes enough throughput to lose performance between PCIe generations, and that's a mere 2%.
None of these cards can saturate PCIe Gen 3 bandwidth.
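For a rough sense of scale, here's a minimal sketch of one-direction PCIe link bandwidth per generation; the per-lane rates and encoding factors are the standard published figures, and the x8 example reflects the 4060's physical x8 interface.

```python
# Per-lane transfer rate (GT/s) and encoding efficiency per PCIe generation.
PCIE_GENS = {
    3: (8.0, 128 / 130),   # PCIe 3.0: 8 GT/s, 128b/130b encoding
    4: (16.0, 128 / 130),  # PCIe 4.0: 16 GT/s, 128b/130b encoding
}

def pcie_bandwidth_gbs(gen: int, lanes: int) -> float:
    """Usable one-direction bandwidth in GB/s for a PCIe link."""
    rate_gt, efficiency = PCIE_GENS[gen]
    # Each GT/s per lane carries 1 Gbit/s of raw signal; divide by 8 for bytes.
    return rate_gt * efficiency * lanes / 8

# The 4060 only wires up x8 lanes, so a Gen 3 board gives it half the link of Gen 4.
print(f"Gen 3 x8: {pcie_bandwidth_gbs(3, 8):.2f} GB/s")   # ~7.88 GB/s
print(f"Gen 4 x8: {pcie_bandwidth_gbs(4, 8):.2f} GB/s")   # ~15.75 GB/s
```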
It's not an upgrade. This is a handful of benchmarks, but the 6700 XT generally outperforms it by a lot (it's generally considered a competitor to the 3070, not the 3060). Even if they were neck and neck most of the time, 8GB of VRAM is not enough if you want to play many future games.
First off, thank you for the great work! I have a proposal for an additional segment in your videos, for which testing should not be too difficult: idle power consumption of CPUs and GPUs.
I live in Germany, with electricity currently costing 0.44€/kWh.
I am probably not speaking only for myself when I say that most of my time using the computer is not spent gaming or on productivity, but on mundane tasks such as browsing the internet, watching videos, looking at and moving files, and other such activities that consume close to no power.
I am therefore very interested in the idle power consumption of different products.
For example, my ...
i9-12900 consumed 9W idle
R9 5900X consumed 40W idle
RTX 3080 consumes 90W idle
These are things that would have been very good to know beforehand.
These 40W + 90W add up to 130W, which is more or less what I measured going into the PSU (~145W idle)
What does this cost compared to a ~10W idle 1060 6GB and a ~10W 13500?
Assuming a PSU loss of 50% at low power, this would be a comparison between 145W and 40W idle. That's ~100W idle difference.
100W = 0.1kW
My computer runs idle for about 8 hours a day (I am a remote university student)
0.1 kW * 365.25 days/year * 8 h/day * 0.44€/kWh = 128.57€/year of IDLE power consumption.
I believe this is something one should be able to factor in when purchasing a product and getting this information is currently very difficult.
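To sanity-check the arithmetic above, here is a minimal sketch; the 100W delta, 8 hours/day, and 0.44€/kWh are the commenter's own assumptions, not measured figures for any particular hardware.

```python
# Commenter's assumptions, not measured values.
IDLE_DELTA_W = 100         # extra idle wall draw vs. a more efficient setup
HOURS_PER_DAY = 8          # idle hours per day
PRICE_EUR_PER_KWH = 0.44   # electricity price quoted above

kwh_per_year = (IDLE_DELTA_W / 1000) * HOURS_PER_DAY * 365.25
cost_per_year = kwh_per_year * PRICE_EUR_PER_KWH
print(f"{kwh_per_year:.1f} kWh/year -> {cost_per_year:.2f} EUR/year")
# 292.2 kWh/year -> 128.57 EUR/year
```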
How well the 6700 XT is aging, that's the takeaway from this video 🤔
Got my Sapphire 6700 XT Nitro when it came out, and to think I was going to get a 1660 Super for $300 less. Glad I spent the extra dollars.
100 watts vs 200 watts is a huge money pit. I use about 230 watts when gaming, which is $190 a quarter on my electricity bill. When I buy graphics cards I think about my back pocket; I will always give up 10 frames a second to save $50 on my electricity bill. The A750 might be 2% faster, but it is also $100 more expensive; this is why I bought the 4060!
I have an undervolted 6500 XT with a 5800X3D, play on medium/normal settings, use RTSS to get a nice flat 30fps frame time, and FSR Quality to sharpen the image and save even more. I get rid of all the blur, depth of field, god rays and everything like that, and set the textures as high as the 4GB can take, and that's it. PS4/PS4 Pro equivalent, and as I only play old games anyway, being able to have the GPU hovering at ~50 watts plus ~60 watts for the CPU most of the time is gold. By the way, by luck my 5800X3D takes -30 PBO, and with the cache the 1% and even 0.1% lows improved so much compared to my old 5600 non-X. Basically a 150-watt system, screen included.
Nice to see numbers compared to a 1060 6gb. 👍 I'm still on one of those and it feels fine for most of what I do (encoding is more important than games for me). It's too bad they gimped this card compared to the 3060 12gb in bus width and total ram. 😢 AV1 encoding is a necessity for whatever I eventually upgrade to... Maybe I'll wait another generation, lol.
The A750 is what you want if you can live with 8GB; if you want 16GB, the A770. Both have AV1, and they encode with higher quality and speed than Nvidia's 4000 series and of course RDNA 3. If encoding is your primary concern, Arc is absolutely for you. The A750 has been $200 in the past, and the A770 is $300. In these benchmarks you can see both cards trade blows with the 4060 and RX 6700.
@@__aceofspades Yeah, I pretty much agree that the Intel cards are an interesting option and have a high-quality encoder. I've seen they can still occasionally hard crash, and something odd happens with the encoder when the cards get fully loaded (vextakes channel's testing). Haven't looked into too much detail on that because my current setup doesn't support Resizable BAR. Whenever I can upgrade it's probably going to be a full replacement. Thanks for the feedback!
Thank you SO much for including the 1060. That's my current card, and now that I'm looking to upgrade, it's so wonderful having direct comparisons
You will never be a miqo'te. Go back to second life.
80 Euro games as polished and finished as a Kickstarter scam covered by KiraTV.
150~200 Euro GPUs priced two to three or even four times more.
Dictionaries need to redefine the word "progress", especially in the context of hardware.
Great review as always, but as a 3070 owner I always miss it in the charts.
As a very satisfied 15+ year Scythe Mugen CPU cooler user, I feel the pain of a much-appreciated product/company never being part of benchmarks. Feels weird!
Took a gamble and bought a 6700 XT a couple weeks ago (coming from a 1660S that was no longer meeting my needs) since I figured the 4060 would suck. It doesn't suck as much as I expected, but the 6700 XT is definitely the way to go imo (if you can find one at the $340 mark or lower, of course). I do miss NVENC (I actually am a person who'd make use of it; now I'm on AMF for instant replay and x264 for standard encodes), but the power of this card and the surprisingly quiet fans under medium load because of the downclocking is... well, I'll tell myself it's worth the tradeoff. Also 12GB of VRAM, which is nice future-proofing; we'll see how quickly or (likely) not 8GB becomes a widespread problem.
Also decided on a 6700 XT at a price equivalent to $340 USD. Can't wait till I can build my rig and get it running.
Value is the name of the game in gaming right now, hardware and software, so I'm not disappointed with lower RTX performance since the majority of my games can't enable it anyway (outside of Riftbreaker, which I'll be getting this summer).
@@junk3996 I do absolutely 0 raytracing so that was entirely not a concern for me. All about that raster baby.
Feel that it's gonna be a fun review. Thank you for the good work
The power efficiency gain is pretty spectacular. That could very well be the deciding factor for me considering the price of power where I live.
Yeah, I got the 1650 for power reasons as well. FPS per watt is nearly my most important factor.
Thank you for the honest review. You saved the hard-earned money of lots of people.
I was going to replace my RTX 3060 Ti with the 4060 Ti, but for the first time in decades there is no actual point in doing that, which is kinda embarrassing for Nvidia.
At least you acknowledge that, rather than just chucking the given info out of your head and assuming upgrading to the new-gen xx60 would be an improvement just based off marketing. You're saving yourself money, headaches, and tons of confusion. Run that 3060 Ti til you absolutely need to upgrade.. cause this 40-series launch has been bizarre to say the very least.
The 3060 Ti was a cut-down 3070 and had much better value than the 3060. The 4060 Ti (really a 4050 Ti) was never going to beat it. Most people shouldn't need to upgrade every gen anyway.