These kinds of reviews just expose how poor TH-cam reviews can be. 4060 TI is the best GPU to get for $400 and here's why is that. + DLSS 3 capability is the future + Strong Ray Tracing performance + Excellent 1080p performance (100 fps on average at Tom's Hardware) + Competitive 1440p performance (75 fps on average at Tom's Hardware) + Reliable 8-pin power adapter + Low power consumption (160W TDP) + HDMI 2.1 Connector + Compact physical size + New, launched on May 24th, 2023 You can run any 2-5 years old game with +90 FPS in FullHD/QHD, you can run the latest AAA games with DLSS and Ray Tracing ON. In future games you can utilize the new techs like AI support, Ray Tracing, DLSS 3 etc. Very competitive product, but I truly cannot understand why somebody would buy an AMD GPU? Most probably Intel will replace AMD in GPU market in coming years.
Nvidia basically dropped the entire mid-range & entry-level SKUs down one GPU tier. Now the RTX 4060 is based on AD107, which always was a 50-class chip, and the 4060 Ti is based on AD106, which in the past was used for the non-Ti 60-class. But they can't hide that fact from your reviews.
I'd be interested to see this compared with the 3060 (non-Ti) for this reason. Yes, the price/performance is the same, but what could NVidia's current gen be giving us?
That's not the story. The one is Samsung and huge and power hungry. The other is TSMC. Smaller and more powerful and more efficient. Not the same apples to oranges
@@djchotus1 That's not how die binning works. All the good quality dies capable of running at the lowest voltages go into pro cards tuned for best efficiency and get sold at a premium to data centres with deep pockets, and gamers get the crap left over running at speeds way that require voltages way above the efficiency sweet spot with half the VRAM and double power consumption for 10% more performance.
@@lmaoded4854 Why lie? The 4050m is 40% faster at the same power draw of the 3050m. And the 3000 series absolutely did have bigger dies compared at each model level compared to previous generations.
At this point, if you don't feel that your intelligence had been insulted by Nvidia, then you have been thoroughly brainwashed by their marketing department.
@@paradigmshift7541 I undervolted it and it's using 180W, 20W more than a 4060Ti. Also the undervolt has improved the performance by about 2-3% so for 1440p (the res I'm using) it's basically a 4060Ti. Energy is relevant to me, but I don't think 20W extra power draw for another 1-2 years justifies spending 520 USD (That's how much a 4060Ti is in Europe), especially since it's basically the same performance. Y'all Nvidia fanboys need to find something else to convince us, talk about FG idk.
@@Relex_92 You're right, I wasn't very clear in my comment. My intention was to say that Nvidia keeps my 3060Ti competitive with the 40-class equivalents. I know that now, with the RX 6700XT and RX 6800 the 3060Ti/3070 is not really competitive, only if you're the biggest fan ever of Ray Tracing. I am fully aware that 8gb of VRAM will be a serious issue for people playing 2024-2025 AAA games on this card. The thing is I'm not really excited for any newly announced games and probably won't be until GTA 6 or smth. I mainly play simulator stuff like Asseto Corsa, Automobilista, F1 with some story games like HZD and The Witcher when I'm bored. This card will be enough for me to enjoy the games I play at 1440p high settings for the next 1-2 years, can't say the same for everyone and that's why you're right. By the time I want new games that need more GPU muscle and VRAM, I will probably already be on an AMD GPU with 16 or 20gb of VRAM. Probably gonna buy the RX 7700XT/7800 long after release when you start to get those sweet deals.
1080 is NOT matched by 2060, but rather the 2070. The GTX-16 Series/Super Variants were okay (for the low-end/midrange), but the Vanilla RTX-20 Series was poor value all round. AMD did have the edge on value with the RX-5000 duo cards. The RTX-30 Series was actually not great, it was decent, it only seems great since the RTX-20 Series was so bad. Again, AMD had the edge on value with the RX-6000 cards. Same with the RTX-40 Series, it's actually not that bad. It only seems bad because the RTX-30 Series was decent. All that Nvidia needs to do is shave off at $100 price from the midrange cards ($200 from high-end), and it's looking good again. This time AMD does NOT have the edge on value, the RX-7000 cards are similarly disappointing, and also need a substantial price drop.
NVidia has way less fails to their name than AMD. Like 30 series was awesome and with good pricing (the COVID BS messed it up). The 10 series was also incredible. 12 was ok. And like... you can go into the past and at least 2/3rds of gens were really good, and half of the rest was at least ok. 40 series is a disaster.
@@korinogarowhat matters is what we pay and the cards were way too expensive.... amds 6000 series is more vaiable now because its cheaper and more vram with same Performance lol
@@wiLdchiLd2k the only Nvidia cards that can match those 2 in performance are bottlenecked in VRAM and cost double that, and the 4000 series is unaffordable unless you're willing to burn money/have enough for a 4090. most people don't
I got a new 3060TI for $305 recently, and given all the info on this channel it feels like that's more than a reasonable price for the performance it delivers.
It's a shame Hardware Unboxed doesn't share Nvidia's vision for what games really want: more AI-generated frames than rendered frames, and 10 more years of 1080p.
TBF I'll probably still be on 1080p in ten years. My 3DTV is 1080p, and they don't make 3D monitors or TVs anymore, so any monitor or TV you can buy right now would be a massive downgrade.
@@astronemir There are 1080p monitors that are up to $700-800. Just because you don't see the benefits of playing 1080p on a Fast TN panel, doesn't mean some people still do that lol.
Reality is the 4060 Ti doesn't have the grunt to need 16gb. 12 would have been heaps. Meanwhile the 4070Ti looks like being limited by ram going forwards.
@@Eleganttf2 The first mid-ranged graphics card with 8GB of Vram, the 8GB version of the RX 480, was released SEVEN years ago, for a mere 240 USD (about 305 USD if you adjust for inflation). It's RIDICULOUS that 8GB is still being used on 400 dollar cards SEVEN years after it became normal for a card in the 300 dollar range (after adjusting for inflation).
The people that were shouting in the wind that the 4060 SKU was in reality a 4050 should be feeling fairly vindicated about now... jesus, this is hard to believe
Reviewers need to be a bit cagey regarding leaks, rumors, and predictions to protect their reputations (and avoid time-wasting semantic arguments with the peanut gallery) but it's pretty easy for laypeople to look at die size, bus with, and VRAM capacity as well as what's going on with the rest of the stack to make informed predictions. The positioning of these products shouldn't be surprising to anyone who has been paying attention to how die and bus cuts have worked across generations. GTFO with these claims that a few dozen MB of cache is some sort of performance magic. That's meaningful for code (especially crypto mining code) but means little for general rasterization, as has been shown empirically.
The problem is that there is *no* Lovelace GPU design that fits the performance uplift what one would expect from a 4060/4060Ti. It's not like they have something in their back pocket to fill the gap from the 4070 to the 4060 (which is HUGE!). It was a conscious decision to _really_ segregate the market between midrange and upper-midrange. The 4070 does a decent job trouncing a 3080 in modern games and gets close to, or sometimes beats, a 3080Ti whereas the 4060/4060Ti struggle to match same-tier prior products!
@@chronossage "Remember back in the day when the new xx60 card was matching last years xx80 card..." No, because that's complete bullshit and never happened. *NO* xx60 (non-Ti) card has ever matched a xx80 (non-Ti) card, EVER.
Igors Lab also discovered that there might be something wrong with the telemetry of the 40 series cards. They apparently sip more energy then we thought they would. Which is quite the scandal if you ask me.
Here in India 4060 ti on sale is 35 k and 4070 is 60k minimum and 3060 ti 33k minimum so yeah 4060ti is a good deal, country wise prices vary alot,AMD is expensive here than Nvidia,4060ti is definitely a good buy for people in india
@@hunterhearsthelmsy1 prob with intel right now is its wonky drivers. they still have a huge hill to climb to make them relevant. If you take the a770 model, the 3060ti is basically better than the a770... its a sad state
The whole product stack doesn't make sense. The RTX 4080 is cut down to 60% of the power of an RTX 4090. Historically speaking, 60% of the top tier GPU has always meant that it's a 70 class card, not an 80 one. And this means that basically every other card aside from the 4090 this generation has the wrong price and name. The 4070 TI and 4070 are both 60 TIs and 60s while the 4060 TIs and 4060s are actually 50 class GPUs. Very very concerning.
@@wiLdchiLd2k the issue is ...imagine this; a car company offers a 1500 series truck. meaning it has a payload of roughly 1500 lbs. 1000 in the bed, 500 in the cab. But then suddenly reduced the payload to 1200. but charges the 1500 price..so you think the price is fair, i mean thats what they charged for the previous version.... Meanwhile they are making a truck that has an actual 1500lb payload, but they are calling it the 2000 series truck (its got a bigger number it has to better right?)... for an extra few grand. See the problem? Youre paying more and getting less while also being decieved about what the product should be able to handle.
Rx 6800 non-XT can be had for $450 in the US and not much more than a 4060Ti in other regions so I really don't see why anyone would buy the 4060Ti right now
Only reason I can think of is sff pc builders, for whom these low power cards are an absolute god send. Then again, they're low power precisely because they're using a tier down silicon per price point, hence the non-existant performance uplift. Sff builders are used to overspending on their parts so it'll be an easier pill to swallow for them ig.
@@wiLdchiLd2k that's not a selling point, it's a gimmick - a nice to have, and shouldn't dictate the price whatsoever. And it should always comes AFTER a raw performance uplift. Same with FSR and all other variations of DLSS. For GPUs a feature becomes a selling point only when it's an industry standard and/or is supported by the vast majority of applications relevant at the time. For example, RT was entirely a gimmick back in 2018, but not today. Even if you were to argue that DLSS3 will eventually reach where RT is today, today's hardware will fade into irrelevance by then, just like how the entire RTX2000 series did.
@@hasnihossainsami8375Totally disagree with your points. First of all, 20 series is quite relevant today. It hasn’t fade into irrelevance. Second, features are an important part of product costing as it improves the product experience so a company can charge more if they are providing you more feature. Why are you paying for any software then!? lol. Softwares are actually like features to improve experience. This is not to support the Nvidia greediness to increase price. But as a general rule, features do have value if they are improving experience, as simple as that. And dlss3 do improve that experience in several games now like Spiderman, Hogwarts legacy, etc and don’t be ignoring the fact that Amd sponsored titles don’t have dlss3 due to Amd scummy practices
@@pinakmiku4999 Do you really need to buy a new gpu to only get a new software improvement???? I mean if you were getting more rasterization and an improved software i could understand going for nvidia.
@@pinakmiku4999 spoken like a true Nvidia shill. Everything below the 2080Ti, a $1200 card, is borderline unusable with RT on in modern games without DLSS. Fact check yourself. And DLSS3 is not a selling point because it fosters the idea that same hardware performance for the same price 2.5 years later with only software uplift is okay. It is not okay. We're paying for the full package, hardware included, not just the software. Imagine if the 3070 had the same performance as the 2070 but could do DLSS2 even better - same scenario. The 4060ti is not even a stagnation, but instead a downgrade from the 3060Ti in every physical performance metric except power consumption, and it makes up for lost ground with software. That's disgusting.
I can’t be the only person to switch from Nvidia to AMD (6000 series) when I saw what my options were from the green team. I’m interested to see what the steam hardware survey looks like this time next year.
I don't think the numbers will shift big on that scale. Steam will probably report that plenty of people will still use 1060s and 1650s. NVIDIA is, on the big numbers scale, just losing to NVIDIA of a half-decade ago. There's very little push to do much upgrading ever since 1080p reliable FPS was cracked all the way back then. Because NVIDIA of today is practically refusing to make 1440p reliable and affordable. Why else would their x60 card cost way more than it used to, but still target 1080p?
You're not, switched both gaming machines in my house. 2060 6GB to 6700XT and 3070ti to 7900XT. We should be good for a couple of generations. We don't ray trace and team red gives us the power to get all the detail and FPS we want without team greens "special features". No regrets here.
I dont got preference, but I actually wouldve prefered to try an Nvidia GPU again after two AMD cards (latter had better price/performance). But Nvidias GPUs this gen are just so bad/overpriced theres no point even trying. Ended up with 6800 XT. Sometimes AMD drivers can be wonky, but they have gotten a lot better these days, and any issue cant be as bad as 8 or 12GB of VRAM with a slow memory bus. I just hate that memory stutter and screwed up frametimes.
AMD currently holds 20% market share, last year it was 12%. So you're not the only one. RTX5000 better be really good with plenty of VRAM otherwise Nvidia will lose more gaming customers. On the other hand, their gaming revenue is nothing compared to their AI revenue so I wouldn't be surprised if they just keep going down this road. The RTX4080, a powerful chip with only 16GB VRAM that will 100% be bottlenecked by VRAM in 2024 games at high resolutions/settings, just dropped to $999.. it used to be over $1200.. absolutely bonkers when some 2023 AAA games already come close to using all of that 16GB VRAM. Despite being slower, I would choose a 20GB 7900XT over a 16GB RTX4080 if I could get either of them for free. That 20-24GB VRAM on AMD's 7900 cards will not go unused within a normal card's lifespan (4 years on average before people upgrade), it's not there for show lol.
Steam survey isn't accurate, due to the way it works it gives huge boosts to whatever is in use by gaming cafes, 1000 accounts on 1 pc with a 3060 and steam counts it as 1000 3060s.
Nice to know my 3060 TI is almost the same as the 4060 TI, only problem is I wanted an upgrade with more VRAM for video editing. Guess it's time to look at used options. As a bonus I can get about 5% more FPS from a small OC/UV.
@@adamek9750 Anyone only interested in 1080p, med - high settings are still set for the next 2 years. Where the 3060 ti starts to struggle is 1440p high. Steam hardware charts show an uptick in 1440p monitors but 1080p still reigns supreme on the platform so most are still safe.
You need to upgrade it now, if you are interested in 4k or 1440p ultra and ray tracing. Just because 4060ti sucks doesn't mean you don't have to upgrade 3060ti. It just means you have to spend more money, this was nvidia's plan all along.
Usually efficiency, which it's not even good at. A 6700xt uses less power for better performance. And consider it's a 50 class card the powerdraw is really bad as they usually had enough with just 75W.
@@saynotomanifestv3101 Indeed i know of one that is all about the "technologies" you do not get with AMD and how said tech makes the product future proof.
@@XX-_-XX420 Funny thing is, you can often buy a better AMD card, downclock it, and you might even get better performance/frame than some new Nvidia GPUs.
@@termitreter6545 not sure about rtx4000 vs RX7000 as I haven't looked into that, but a 6700XT os very effecient you basically get 3070-3070ti performance at 130W. In some games you'll be far better due to the 50% more Vram you have. And since a 4060ti is worse than a 3060ti draws about 200W it's pretty easy to look at the 130W card offering more performance and pick that for any possible reason as its better at literally everything. For RX7000 I saw some guy having decent gains with a 7900XT after undervolting he was talking about 250W vs like 400 or something like that. But with the 6700XT I know exactly how good it is as I had one. Even stock it only uses 186W under full load. So Rx6000 vs rtx4000 it's a no brainer as AMD is better in every single way. (looking at the mid range atleast).
I get worried seeing the latest model of cards doing worse than the previous gen thinking nvidia would nerf the previous gen in an update to make the newer gen look better.
They're too clever for that. People would immediately know their cards got 10-20% slower and why. What they would do is simply make sure new games perform better and of course additional features like DLSS and RT become the selling point.
im more worried nvidia might release decent cards than not to long later gimp them without telling people so people expect a certain performance but dont get it because the box doesnt actually tell you the difference between the gpus. ie kinda like what they did with the 3060's 1060's 1030's and what not without reviewers people would get absolutely scammed and they wouldnt know it.
Well the difference here is that it isn't simply a refresh pushed to the limit. It would be if the 11th gen chips were ~50% smaller and the memory controller was essentially running everything in single channel. This is worse because it truly is a generational improvement but instead of passing that to us they cut it down until it performs the same as last gen to maximize profits. It would be extremely impressive what Nvidia has done if it wasn't so damn malicious.
@@JJFX- without getting into architectural changes I’m talking about one generation you’re getting 100 FPS in one title and 90 in another, then the newer gen gets 110 in first title and 87 in second like huh? 😂 this would been more impressive as a 3060 “Super” as you mentioned rather than a new generation but I think that involves admitted defeat on the entry level 4000 series as you release a new 3000 series card
@@FIVESTRZ Oh no you're absolutely right, I'm simply saying this is even worse because Intel was essentially making the best of bad situation whereas Nvidia is making the worst of a good situation lol
@@JJFX- 😆 facts facts. Either way it is what it is can’t be too mad if they are phasing out 3000 series with each launch - just means nothing changed. If the 3000 series lingers this will become a head scratcher
Just like how I didn't have any reason to upgrade my i7 2700k for 7 years. We just went over the CPU dark ages. Now we're in the GPU dark ages where every generation is garbage.
@@DyceFreakpretty sure after the r9 390X you had to get win 10 or newer. Specifically didn't buy RX5000 at the time as it required win 10 or newer and I was using a win 8.1/10 dual boot.
Not at all if you care about it's power consumption per performance. It will pay for itself over time vs the 200watt behemoths. But if you're expecting a linear performance graph between generations in line with cancerous consumeristic progress, then this isn't your girl.
@@andrew6978yeah this would be a decent 4050, but the powerdraw is definitely way to high as 50 class cards typically had enough with 75w, this thing on the other hand....
Wait... they sell a 4060 ti version at 16gb for $499 what the frick. Its very clear nvidia tried to shift all of the 40 series cards up an entire tier and is why they "cancelled" the 4080 12gb and just rereleased it as the 4070 ti. It should have been the 4070. Remember the 3060 had a bus width of 192 which is the same as the current 4070 ti. The 3050 has 128 bit bus like this 4060 ti card.
14:10 - "The 4060Ti is a terrible product at $400." Great video. Sums it up nicely. Would be good to know at what price reviewers believe this card becomes reasonable and/or desirable.
It's simple , Nvidia want's to sell you Frame gen and AV1 stuff , not extra performance , they really think people will buy in to that and will rush out to buy 40XX , i guess that has its good sides to , more used 30Xx series on the market = less $$ for used parts
a few months before the big price crash? ouch. I sold my 6700XT for the same price I bought it for in April 2022, just as prices were collapsing. A month later they had lost 40% of their value.
@@Takisgr07afaik, they're mostly grading the performance of the GPUs in rasterization, which is still the predominant way of rendering games. don't blame them for catering to most gamers instead of using tech that's exclusive to this generation.
@@nickyang1143start at the 680, that's when Nvidia started their fuckery with using smaller dies meant for smaller cards. The 680 was meant to be a 660 but AMD screwed that generation up so bad Nvidia could get away with it.
This is the same business model that the automotive manufacturers are making="If we make our affordable lower tier products suck enough, then our customers will be forced to bump up to our higher priced product line where our profit margins are much larger"...AKA "Feed the lower class a SH1T Sandwich and they'll eventually pay for our overpriced menu items" :/
Don't forget to mention the PCI-E Port on this card is capped to x8. That means that for everyone owning a PCI-E Gen 3 system this card isn't even a option when upgrading from a older series of GPU's. And there are a lot of 3.0 systems out there.
It "is" a 4050ti. Nvidia bumped the entire stack down because not enough people bought the 3090 and Nvidia thinks everyone is absolutely loaded. The truth is that the 4080 should have been a cut down AD102 with 20GB VRAM, the 4070ti should have been the fully enabled AD103 die, the 4070 a slightly cut down AD103, both with 16GB. The 4060ti should have been a fully enabled AD104 with 12GB, etc. The fully enabled AD106 die should have been the 4050ti (where 8GB makes any sense at all!).
@@jurgengalke8127 I think you probably got the better deal. And, unless your playing at 4k or are unhappy with 120-144fps, then the 6800 is probably plenty.
@@sparkythewildcat97 i mean if you want to paly in 4k then the RTX 4060ti will suck even more, the card cant even put out 60 fps in 1440p games, how you except to play in 4k then ? By lowering graphic to low settings just to play in 4k ? kinda stupid.
I've seen two german reviewers tackling this issue. First was der8auer in a 4060ti/3060ti video and PCGH in a 4060 on pcie3 website article (diagrams should be self explanatory even for non-natives). Results were of course mixed but some were really interesting. Because in some games you'd lose up to 10 % on 1080p max. details E.g. Forza 5 and Hogwarts Legacy.
It's entirely possible considering the the 4060 and Ti only use 8 lanes. So if you drop it down a gen you're operating with the modern equivalent of a 4 lane pcie bus. Or to go the other direction, the equivalent of a full slot at PCIE 2.0 speeds. It's like they saw the RX 6400 putting a mobile class GPU on a desktop card, halved bandwidth and all, and said "we could do that", but instead of making it an XX40 or even an XX50 they put it in the 60 class which feels all kinds of wrong. That wouldn't be so bad if it was a clear improvement on the 3060 ti, but nah it trades blows with its previous gen counterpart.
Just look for PCI 16x vs 8x comparisons of similar nVidia cards like this one. th-cam.com/video/COcHHX2MdKs/w-d-xo.html Most games it doesn't really matter with only a few FPS difference but some will suffer more. It's the 128bit memory bus that kills the performance of this product not being limited to 8x PCIE.
Nvidia is using DLSS as an excuse to release new products with hardly any generational improvements. They claim that DLSS 3 only works on the newest GPUs, but I'm sure they just designed it that way. In the future we will probably have newer and better upscalers that only work on the newest products to encourage upgrading.
Day 1 3060ti buyer, $800aud(+shipping) and worth every cent. Have not bought a GPU with less than a 256bit bus since 2003, I also replaced my 2060 Super with the 3060ti, was expecting to get the 4060ti as speculation suggested it should have been 3080 level in performance but this is what we got instead. FX5200 vibes
FX5200 is actually currently a go-to card for Win98, so it's held up quite well despite the FX's bellyflop of an existance. I just went from a VEGA64 to a RX7600, I waited 4 years and the entire price gouging fiasco before buying again. I pity your rampant consumerism.
yeah dude, and we thought that 20-Series was the worst gen ever give price to performance..... 2060 super managed to match The Legendary GTX 1080, for 100$ less Msrp and new features, In this generation, u would at least hope that 60 Ti card would get close to 3080.... oh well it cant even beat 3060 ti properly lmao
What's so stupid about all this is if they just did a 160bit bus with 10gb and maybe 10% more cuda cores than it has, pretty much we wouldn't be having this conversation, and Nvidia would still have fantastic margins.
Yup, it doesn't matter how much vram they put on this card if the memory bit bus is too small its pointless. Memory bit bus size is more important than vram capacity imo.
Steve is there a way to simulate a 4060ti with a 4080,like you did with 3dv cache 12/16 core Where u disabled cores etc. Just to show what numbers we could get on a bigger bus and more memory(vram) ?(keeping cores normal etc) Or is the sort of simulation not possible? If it is possible it would be great to see how it should've performed if Nvidia never cut back soo much
No, GPUs are too complicated for that. I think theres going to be a 4060TI with 16gb of memory? Not sure if that card isnt gonna differ in other ways, though.
No ones saying its not a solid buy but by itself in a vacuum, its stagnant and and even downgraded and only rely on power consomption/ Frames Gen to look decent. as a new buy from a basement of cards its a pretty solid $400 card but really isn't doing anything for the mid range 3060ti/RX6700 Card owners like me, so I retract my complements.
4:29 Tech System Specs, build within Sponsor's Case, for only: CPU: AMD Ryzen 7 7800X3D 4.2 GHz 8-Core Processor ($439.00) CPU Cooler: Thermaltake TOUGHLIQUID Ultra RGB 107 CFM Liquid CPU Cooler ($279.99) Motherboard: Gigabyte X670E AORUS MASTER (rev. 1.0) EATX AM5 Motherboard ($489.99) Memory: Corsair Dominator Platinum RGB 32 GB (2 x 16 GB) DDR5-6000 CL30 Memory ($159.99) Storage: Corsair CSSD-F4000GBMP600PXT 4 TB M.2-2280 PCIe 4.0 X4 NVME Solid State Drive ($394.99) Case: Thermaltake CTE C750 ARGB ATX Mid Tower Case ($199.99) Power Supply: Thermaltake Toughpower GF3 TT Premium 1200 W 80+ Gold Certified Fully Modular ATX Power Supply ($199.99) Operating System: Microsoft Windows 11 Home OEM - DVD 64-bit ($117.98) Monitor: Gigabyte AORUS FI32Q-X 32.0" 2560 x 1440 270 Hz Monitor ($649.99) Case Fan: 4 x Thermaltake SWAFAN EX14 RGB 81.6 CFM 140 mm Fans 3-Pack (4x $119.99), becasue your Case can fit up to 14 140mm FAN and only comes with 3 and you won't enjoy of enough turbulent air for your at least 6 more fans of your GPU and radiator of your CPU. You need at least 60W to 100W of fans...but not in one noisy and windy fan. Total: $3411.87 or more than 4000€... without GPU. Please could you tell me how terrible really are GPUs from Intel, Nvidia or even AMD 6800, 6700, 7600, 6600 or 6400, and prices, and performances and RT, FSR, DLSS and so on...and how important are FPS, oh yes and classic missing resolution or settings: 1080p 1440p and 2160p not being tested together, and testing ULTRA and not high and medium settings, or RT and upscaling performance to quality...
I think what upset me the most,i was legit waiting for this gen cards and then i end up getting an 6700XT, yes i could get this card 2y ago, oh well. I am happy with this card coming from an RX570, the uplift was substantial but still just shows how new-gen card not always is worth waiting for ( talking about entry / mid - level cards) now I'm deciding if i go AM5 or just get an 5600/ 5800X3D , currently i am using an R5 1600AF.
@@ThePurplePassage ye but in order for me to get 5800X3D i need to swap mobo and ram , as my ancient Gigabyte GA-B350M Gaming 3 is not good for the CPU , yes supports but dealing with Bios updates from gigabyte is just pain , is not simple as download bios and update , you actually need to do a lot steps and update other stuff. This is why i was considering AM5 or INTEL (never used a intel setup tbh but i have no preferences) but I've heard AM5 gives some headaches as is not matured yet , not sure how is it now. My budget rn is 400€, 5800X3D is 300€ , new mobo around 100€ , 32GB RAM 3600MHZ CL18 70€ (CL16 100€) , surpasses my budget at the moment but i dont mind wait a lil longer and keep R5 1600AF as is alright in most scenarios ( currently playing BDO and works) I have 2 options here:, -Get tomorrow R5 5600/B550/32GB ram (411€). -Wait a month and get 5800X3D instead (or AM5 setup with an 7700 or 7600) I'm having some kind of fomo here , will 5800X3D sell out? i never had fomo buying an component lol but since 5800X3D is out for a year or so and AM4 is in the end of line. Could grab CPU now and get rest of stuff next month ( unless the CPU will go down in price but hard to know those things)
If a bios up date ie downloading installing @bios and the latest bios f51g then 3 clicks is alot of work not sure you would like the half a day work for building a new pc also gigabyte have that security issus so every one on a gigabyte mb should update thier bios now! that includes if you buy and new mb first job always update bios , that said no point building a new am4 system just go am5 or intel if your building fresh
Do you think could be a drive problem not optimized for the new Lovelace architecture , the 3000 series use the Ampere old but very established architecture ?
@@Zuggeerr Yeah there's a reason they don't/can't make x030 chips anymore. The tiniest and cheapest chips all moved up a couple of tiers (in price, not performance)
I recently upgraded a friend's system to a 3060 Ti Strix for $300 and it has 2 years of warranty coverage still. The seller even drove it to me. Could of nabbed a 3070 for $350 from other sellers but warranty transfers sounded iffy or the sellers were unprofessional. No brainer versus a 4060 Ti
Great video and so glad you didn't go down the Nvidia DLSS false frame road. When you buy a gpu you want it significantly better than the previous series and not have to depend on high-latency false ugly DLSS frame software to make it appear better - thank you very much.
The issue is we can't opt-out of paying for the optical tensor flow, which I think is what's inflate the price. NVidia is pushing for DLSS3 to be massively implanted. The cards do have insane energy efficiency though. 4070 has 3080 performance at 200W, which is a big heap. If it was priced correctly, it would have been the real 4060Ti at $400, and this 4060Ti should've been a 4060 at $300
I really don't understand why people keep saying the 3070 graphics card is not great! I own more than like 600 games on steam and every last one of them plays amazingly. I'm not talking indie games either! I mean all the heavy-hitting games!
Because Nvidia crippled it by not giving it enough VRAM. 8 gigs is not enough for that class of card. The 2080 Ti will outperform it in Ray tracing because it has more VRAM than it does
Love these comparisons keep up the great work! So glad i got a 3070 after a age of waiting for only £250 about six months ago. I had a feeling 40 series was going to be out of my budget and the 4060ti just confirms I made the right choice. A little worried about VRAM moving forward but so far its been great.
Unless you're worried about maxing out settings your good. I have a 3070 as well. 80 something games in my steam library. No problem getting high frames for my 170hz monitor at 1440p.
Still rocking 3070 with little to no issues. Overpaid like crazy here in Sweden when it got released, it was almost 800euro, and this card took me months of searching to get.. Prices are still crazy high in my country for GPUs, they're still at 680-700euro retail price
4070(which isnt a great buy) matches the 3080 which is 2 tiers above it, while having 50% more vram than its last gen tier, yet the 4060ti can barely match its own tier, with the same amount of ram AND less bandwidth😂
The real mystery is why Nvidia is releasing gaming GPUs at all right now. They'd probably save money with a tax break by cancelling them, and then reusing the chips for AI and other data center usage.
Best guess is they feel the need to release regularly to maintain mindshare and AIB relationships. If GPU investment for AI goes tits up and Nvidia needs to drop back to gaming, it will be a bit of a problem if they've been out of the market for several years.
I think this is simply to keep onto existing workforce right. They have a lot of people working specifically on GeForce stuff they feel some kind of obligation to keep drivers up to date etc. So the easiest thing for them is to just keep the whole thing at least afloat to avoid any riscs associated with leaving the market all together. It's for convenience sake. Nothing more. They have no need or want to produce compelling products to normal people anymore. That is clear.
Great review!! I just bought my friend a 6750 for $350 on Newegg as I'm putting together a new gaming rig for him. I wanted to get him something by team green but the 40 series just stinks. It was more to do about the VRAM since the 6750 has 12GB and I think it will age better over time. He only plays at 1080p and doesn't care about RT. He's coming from a GTX 760 so the 6750 will be a monumental upgrade. Let's see if the 50 series course corrects!
the Rx6750 is way way WAY better then the 4060ti. You wanted him something from team green ? every time i build a PC for friends i get them something from AMD because the price to performance was Always better and this is what matters the most, then the downgrade is reality on team green because a friend back then with a GTX1060 has now worser performance in the same games as back then then a friend with a Rx580 nowdays.
@@allxtend4005 Not a Nvidia schill, but users benchmark has the 4060 ti 13% faster than the RX 6750. I came from a 1060 card so either one of the RX6750 or 4060 ti is a huge improvement over what i have. I have to ask myself if that 13% faster is worth $394 on microcenter vs $429.00.
@@Annihilator-zv6xh Except that it is. I just returned a cash happy, jittery, driver uninstaller POS of an RX 6750 TX - this was a bloody daily occurrence. I replaced it with a 4060 Ti and it's faster, cooler, has not crashed a single time, and is even more efficient by about 35-40%; running in the 130 watt range for most games I tested. Even Starfield, unoptimized crap that it is, runs at 60-80 FPS on average.
At this point, I don’t realistically see myself buying a Nvidia card any time soon. Intel's and AMD's look more appealing, especially potential future cards.
Got fed up and took the plunge. Bought a merc 310 7900 xtx. Great card to replace my favorite gpu the 1080 ti but now I need to upgrade cpu lol. Which I will do in a month or two
The memory bandwidth is a big deal in AI workloads. Paperspace's "GPU Memory bandwidth" has a blog post, as well as a post in huggingface "Run a Chatgpt-like Chatbot on a Single GPU with ROCm". Both of them said that memory bandwidth is more important than pure Tflops due to idling. Sadly, I can't find any good benchmark sources for the 4060 series.
They overproduced RTX 3000 and Radeon 6000 GPUs during the Etherium mining craze...once it ended...they had a bunch of stuff that wasn't sold yet. Now they've gotta give sub-par value on the new generation so the old stuff will still sell at good prices. Next gen (a year or so from now) should be considerably better. If not...then I don't know what to say.
the 4060 Ti is a 4050 Non-Ti card for $400 dollars. Can not understand how NVidia fans still defend NVidia, come on people stop fanboying and start screaming or next generation you're going to get RTX 5080's that should be RTX 5060 Non-Ti cards for $1000! Think im joking? We charge you more and you get less, Suck it. - NVidia 2022-23. Frame Generation and DLSS is just there to make a less powerful card look more powerful. Reviewers need to remove all these "Performance" tricks out of testing and do PURE raster P E R I O D ! ! ! ! !
Those results are wild! My buddy was looking for a pc around $1,000 to game on 1080p and maybe eventually 1440p. I convinced him to get a prebuilt (he’s not trying to do it himself) with a 6750xt and some other decent specs for exactly $1000. For a good budget pc at 1080p especially the AMD 6000 series is superior to Nvidia’s 3000 series IMO!
@@gerthddyn heck yea! He mainly wants to play competitive shooters and get 160+fps… he’s coming from an Xbox one 🤣 🤮. He should get 200+ in most games (low to medium graphic settings)
@@wallacesousuke1433 nah 6,000 series out performs Nvidia’s 3,000 series at 1080p and 1440p. Nvidia shits on AMD’s 6,000 series at 4K but I only play competitive shooters so idgaf about 4K. I can’t speak on the next gen GPUs though. I’m pretty sure the 4090 is the king in all aspects but shits so absurdly priced.
@@notblazedfordays9536 "I only play competitive shooters" Yikes 🤢, well, in this case yes, you're good to go with shitty Radeon cards since you only play shitty games, but for less niche uses, Nvidia is, sadly, the way to go. Better features such as DLSS, AI upscaling, more stable drivers, better for emulation, VR, better RT, better for productivity, better optimization for most games on release (since it is the dominant brand and having better OpenGL and DX11 performance)
If people are willing, the RX7600 is just that. Outside of the US, this card is probably the first card post covid with a reasonable price tag. Close to 3060Ti performance while being cheaper than the 4060TI and 4060.
@@wiLdchiLd2k it can. don't forget that there are plenty games that do not have these features. Unfortunately Steve completely failed to mention any of this, making this the most misleading GPU review I've seen from a major channel.
@@wiLdchiLd2k actually it'll still easily lose as the 4060 gets demolished by the 3060 in demanding games and dlss3 requires Vram, which the 4060 is already out of its super slow memory. Not to keep in mind the latency dlss3 adds vs native so even with a higher fps the experience will be worse. Further more if you are at a gpu limit dlss3 doesn't seem to do a whole lot its more for cpu bound games as the dlss3 frames do not happen at all, so the fake frames not actually happening in the game can increase your fps by not Involving anything besides the gpu. And Considering how bad dlss3 is even on a 4090 and that it just seems like a pretty bad technology in general, you ain't winning with that either way tbh. Normal dlss kinda has more of a argument but Xess and fsr also exist and only get a few fps less than dlss if sometimes maybe not more.
@@wiLdchiLd2k also dlss3= frame generation Frame generation = frames that literally do NOT happen in the game and is just something the the gpu guesses so it ends up massively adding latency. Because keep in mind not just latency from your gpu to the entire pc. But also from your pc to the monitor, the monitor to the pixels, the pixels to your eyes, which then goes you probably know where, which you then need to process. And then after that you can process the real frames. So a lot of added latency vs only real frames. It ends up playing like a weird game that has way to much input lag really so not great for most games.
@@XX-_-XX420 "It ends up playing like a weird game that has way to much input lag really so not great for most games..." and I'm sure you are talking from hours of direct experience with DLSS and frame generation on midrange Lovelace, not just talking out of your ass, blindly repeating the bullshit you've "heard" online?
you said the 3070 ti is now at $350 us ? where did you see that ? all I can find is around $600 - $850. under those prices are open boxes or refurbished stuff...
i don't. i love seeing what game devs can do when they can let their graphics stretch their legs. the mainstream experience will always be dictated by the consoles, and this represents another generation of custom PCs that can't offer a console experience for console money. that sucks. your 3060ti isn't going to be able to deliver fewer frames just because the newest hardware can do more.
@@Bruhhhoff agreeing in youtube comments?! FORBIDDEN ...but yeah as a launch day 6900xt owner i'm very happy to sit this generation out. probably gonna sit the next one out too. saving money is OK with me!
The other problem that i see with the 4060s (ti or not) is that they use 8x PCIE, so even in the older computers that you'd probably put this card in to give it a little oomf (like a Ryzen 2000 series) it won't be able to use as much as the card wouldve been used if it was a full 16x. So basically even the market that would buy it cant even use it as well as a 3060.
It has already been tested that the PCIe x8 interface makes this an even worse product than it already is for upgrading older systems that use PCIe 3. The reduced bandwith of the memory bus and the insufficient frame buffer capacity make it access the main memory more frequently - and these hurt even more if the PCIe bandwidth is halved.
AMD 7900xtx is the king of all cards no question about that, we are only around the corner from the new fsr 3 which will then put its on par with the 7900xtx
What bothers me most about this is the labor and materials going in to such a marginal improvement (and not even across the board). They had to retool or create new assembly lines for this, and 6 years from now, the 4060ti will be sitting on shelves unsold, or for those who did sell, likely already replaced. Eventually they'll be recycled - yet more labor going into salvaging this product - or dumped in a landfill; all because Nvidia wanted to fill out their product sheet to satisfy shareholders.
Any info on the rumors that Nvidia is threatening to limit or block AIBs from receiving supply if they work on intel battlemage GPUs? There is an article up on notebookcheck
One of your best videos in a while. Clear question, straightforward data collection, powerful (if demoralizing) conclusion. At least now we know exactly where we stand.
Realistically the comparison should be done on a pci-e 3.0 board, because that's what 90% of people thinking of upgrading in this price category will be running.
These kinds of reviews just expose how poor TH-cam reviews can be. 4060 TI is the best GPU to get for $400 and here's why is that.
+ DLSS 3 capability is the future
+ Strong Ray Tracing performance
+ Excellent 1080p performance (100 fps on average at Tom's Hardware)
+ Competitive 1440p performance (75 fps on average at Tom's Hardware)
+ Reliable 8-pin power adapter
+ Low power consumption (160W TDP)
+ HDMI 2.1 Connector
+ Compact physical size
+ New, launched on May 24th, 2023
You can run any 2-5 years old game with +90 FPS in FullHD/QHD, you can run the latest AAA games with DLSS and Ray Tracing ON. In future games you can utilize the new techs like AI support, Ray Tracing, DLSS 3 etc. Very competitive product, but I truly cannot understand why somebody would buy an AMD GPU? Most probably Intel will replace AMD in GPU market in coming years.
Just Buy It!
Ohh bro about to get cooked!! 🔥
Suuure...
Not even worth the time
My man is onto NOTHING🔥🔥🔥🗣💯💯
Nvidia basically dropped the entire mid-range & entry-level SKUs down one GPU tier. Now the RTX 4060 is based on AD107, which always was a 50-class chip, and the 4060 Ti is based on AD106, which in the past was used for the non-Ti 60-class. But they can't hide that fact from your reviews.
I'd be interested to see this compared with the 3060 (non-Ti) for this reason. Yes, the price/performance is the same, but what could NVidia's current gen be giving us?
@@therealsunnyk it's just so they can clear the 30 series stock
Yeah but muh AI muh framegen
This. Since 4080 all are scummy named. And I am not NV fanboy. Nor AMD at current practices. I do own 6800xt(sons pc) and 4090 myself...
I bet they will phase out all mid and low range cards, and their future products would only consist of 80 and 90 class.
Use my comment to show NVIDIA how many of us hate the 4060 Ti
Me .
Ngreedia " The Way Gamer Meant To Be Played"
The entire 40 series
Too late for that. Plus they won't care. They know what they were doing.
This thirsty for likes?
Look at the die size, the 4060 Ti is the successor to the 3050 mobile.
That's not the story. The one is Samsung and huge and power hungry. The other is TSMC. Smaller and more powerful and more efficient. Not the same apples to oranges
@@djchotus13060m is more power efficient than 4050m
@@djchotus1This isn't how generational die tapouts work...
@@djchotus1 That's not how die binning works. All the good quality dies capable of running at the lowest voltages go into pro cards tuned for best efficiency and get sold at a premium to data centres with deep pockets, and gamers get the crap left over running at speeds way that require voltages way above the efficiency sweet spot with half the VRAM and double power consumption for 10% more performance.
@@lmaoded4854 Why lie?
The 4050m is 40% faster at the same power draw of the 3050m.
And the 3000 series absolutely did have bigger dies compared at each model level compared to previous generations.
At this point, if you don't feel that your intelligence had been insulted by Nvidia, then you have been thoroughly brainwashed by their marketing department.
TRUTH!
As a 3060Ti owner, thanks Nvidia for keeping my GPU competitive so I can stretch it another 1-2 years before moving to AMD.
and you'll be using way more power for those 1-2 years, it's like energy is irrelevant to people lmao
@@paradigmshift7541 I undervolted it and it's using 180W, 20W more than a 4060Ti. Also the undervolt has improved the performance by about 2-3% so for 1440p (the res I'm using) it's basically a 4060Ti. Energy is relevant to me, but I don't think 20W extra power draw for another 1-2 years justifies spending 520 USD (That's how much a 4060Ti is in Europe), especially since it's basically the same performance. Y'all Nvidia fanboys need to find something else to convince us, talk about FG idk.
@@paradigmshift7541yeah i would like to save the 20$ yearly bucks than a 4060ti would save against my actual 3060ti, ty nvidia
@@Relex_92 You're right, I wasn't very clear in my comment. My intention was to say that Nvidia keeps my 3060Ti competitive with the 40-class equivalents. I know that now, with the RX 6700XT and RX 6800 the 3060Ti/3070 is not really competitive, only if you're the biggest fan ever of Ray Tracing. I am fully aware that 8gb of VRAM will be a serious issue for people playing 2024-2025 AAA games on this card. The thing is I'm not really excited for any newly announced games and probably won't be until GTA 6 or smth. I mainly play simulator stuff like Asseto Corsa, Automobilista, F1 with some story games like HZD and The Witcher when I'm bored. This card will be enough for me to enjoy the games I play at 1440p high settings for the next 1-2 years, can't say the same for everyone and that's why you're right. By the time I want new games that need more GPU muscle and VRAM, I will probably already be on an AMD GPU with 16 or 20gb of VRAM. Probably gonna buy the RX 7700XT/7800 long after release when you start to get those sweet deals.
Fingers crossed yours doesn't have the faulty Hynix memory
1060 matches 980. 2060 matches 1080. 3060 matches 2070. 4060Ti fumbles against a 3060Ti. Someone is not trying at all.
put in mind that graphics cards have more bandwidth Memory bus than the 4060 ti😂😂😂
But but but! bUT! fRaMe GeNeRaTiOn! More FPS for free!!1!1
@@lloydaranyes aNd dLsS yeah evErYboDy wanTs tHAt
Oh they are and have been, upselling low tier cards as a tier higher for years they just got cought this time.
1080 is NOT matched by 2060, but rather the 2070. The GTX-16 Series/Super Variants were okay (for the low-end/midrange), but the Vanilla RTX-20 Series was poor value all round. AMD did have the edge on value with the RX-5000 duo cards.
The RTX-30 Series was actually not great, it was decent, it only seems great since the RTX-20 Series was so bad. Again, AMD had the edge on value with the RX-6000 cards.
Same with the RTX-40 Series, it's actually not that bad. It only seems bad because the RTX-30 Series was decent. All that Nvidia needs to do is shave off at $100 price from the midrange cards ($200 from high-end), and it's looking good again. This time AMD does NOT have the edge on value, the RX-7000 cards are similarly disappointing, and also need a substantial price drop.
Nvidia never fails to disappoint.
And we never fail to buy their shit.
NVidia has way less fails to their name than AMD. Like 30 series was awesome and with good pricing (the COVID BS messed it up). The 10 series was also incredible. 12 was ok. And like... you can go into the past and at least 2/3rds of gens were really good, and half of the rest was at least ok. 40 series is a disaster.
Come on it is 5-8% faster that is nothing to zzzzzzzzzzzzzzz
and it has only been 30+ months since the 3060Ti launched 😂
@@korinogarowhat matters is what we pay and the cards were way too expensive.... amds 6000 series is more vaiable now because its cheaper and more vram with same Performance lol
AMD always fails to capitalize.
Just get a 6700XT or an RX 6800. Nothing else is even closely worth the money.
Me personally gonna buy xfx rx 6900xt merc for only 400$ in my region. It's a used card but as good as new
@@AmirEpix
6900XT for $400?
What the Hell?
absolutely lol
@@wiLdchiLd2k the only Nvidia cards that can match those 2 in performance are bottlenecked in VRAM and cost double that, and the 4000 series is unaffordable unless you're willing to burn money/have enough for a 4090. most people don't
@@wiLdchiLd2k vram is the answer. textures on medium or low is much worse than low fps.
I got a new 3060TI for $305 recently, and given all the info on this channel it feels like that's more than a reasonable price for the performance it delivers.
It's a shame Hardware Unboxed doesn't share Nvidia's vision for what games really want: more AI-generated frames than rendered frames, and 10 more years of 1080p.
And irony is Frame Generation uses even more VRAM for a poor 8 GB card. Gamers definitely want more 8 GB cards for 10 more years.
LOL😆
TBF I'll probably still be on 1080p in ten years. My 3DTV is 1080p, and they don't make 3D monitors or TVs anymore, so any monitor or TV you can buy right now would be a massive downgrade.
My phone is 1080p. Why the fuck would I spend $300+ on a gpu to play games on a $20 monitor? You can literally get 1080p monitors for free 2nd hand
@@astronemir There are 1080p monitors that are up to $700-800. Just because you don't see the benefits of playing 1080p on a Fast TN panel, doesn't mean some people still do that lol.
This product shouldn't have cost more than $250, and probably should have also been a 4050 Ti.
My exact sentiments! And the 4060 should've been a 4050 at 170 USD.
Almost eveything in the 40 series is marketed as a full tier higher card and priced accordingly this is nothing new unfotunately.
You got it right
250 sounds too low, they obviously wouldn't take losses
At least only a 4060 at most.
Would've much rather had 16GB on the 4070 ti
With 280W Power target that would actually be a card i would buy. Well maybe a tiny bit cheaper too.
Yep idfk what they were thinking tbh i really hope mid next year they introduce 4000 Super series to fix this horrible mess
yes, and for 4080 then 20GB
Reality is the 4060 Ti doesn't have the grunt to need 16gb. 12 would have been heaps.
Meanwhile the 4070Ti looks like being limited by ram going forwards.
@@Eleganttf2 The first mid-ranged graphics card with 8GB of Vram, the 8GB version of the RX 480, was released SEVEN years ago, for a mere 240 USD (about 305 USD if you adjust for inflation).
It's RIDICULOUS that 8GB is still being used on 400 dollar cards SEVEN years after it became normal for a card in the 300 dollar range (after adjusting for inflation).
The people that were shouting in the wind that the 4060 SKU was in reality a 4050 should be feeling fairly vindicated about now... jesus, this is hard to believe
Reviewers need to be a bit cagey regarding leaks, rumors, and predictions to protect their reputations (and avoid time-wasting semantic arguments with the peanut gallery) but it's pretty easy for laypeople to look at die size, bus with, and VRAM capacity as well as what's going on with the rest of the stack to make informed predictions. The positioning of these products shouldn't be surprising to anyone who has been paying attention to how die and bus cuts have worked across generations. GTFO with these claims that a few dozen MB of cache is some sort of performance magic. That's meaningful for code (especially crypto mining code) but means little for general rasterization, as has been shown empirically.
The problem is that there is *no* Lovelace GPU design that fits the performance uplift what one would expect from a 4060/4060Ti. It's not like they have something in their back pocket to fill the gap from the 4070 to the 4060 (which is HUGE!). It was a conscious decision to _really_ segregate the market between midrange and upper-midrange. The 4070 does a decent job trouncing a 3080 in modern games and gets close to, or sometimes beats, a 3080Ti whereas the 4060/4060Ti struggle to match same-tier prior products!
@@awebuser5914 Remember back in the day when the new xx60 card was matching last years xx80 card.
@@chronossage "Remember back in the day when the new xx60 card was matching last years xx80 card..." No, because that's complete bullshit and never happened. *NO* xx60 (non-Ti) card has ever matched a xx80 (non-Ti) card, EVER.
@@awebuser5914 If they can't build a 4060Ti, then they shouldn't sell a 4060Ti. It's not really that hard of a concept.
Igors Lab also discovered that there might be something wrong with the telemetry of the 40 series cards. They apparently sip more energy then we thought they would. Which is quite the scandal if you ask me.
Pretty sure HUB gets their power figures "from the wall" instead of software
As far as I know this is only the case for the entry level 4060 non-ti. They saved basically on the supervision chip.
Here in India 4060 ti on sale is 35 k and 4070 is 60k minimum and 3060 ti 33k minimum so yeah 4060ti is a good deal, country wise prices vary alot,AMD is expensive here than Nvidia,4060ti is definitely a good buy for people in india
How about intel?
@@TonyChan-eh3nz AMD chips are a bit cheaper but motherboards are too expensive.
@@hunterhearsthelmsy1 I was also talking about GPU’s
Arc 770 for 39k and 750 for 31k, a bit pricey as we can see
@@hunterhearsthelmsy1 prob with intel right now is its wonky drivers. they still have a huge hill to climb to make them relevant. If you take the a770 model, the 3060ti is basically better than the a770... its a sad state
Why do people insist on calling it a 60? It's a REBRANDED 50!!! That's why it sucks at higher resolutions!
The whole product stack doesn't make sense. The RTX 4080 is cut down to 60% of the power of an RTX 4090. Historically speaking, 60% of the top tier GPU has always meant that it's a 70 class card, not an 80 one.
And this means that basically every other card aside from the 4090 this generation has the wrong price and name. The 4070 TI and 4070 are both 60 TIs and 60s while the 4060 TIs and 4060s are actually 50 class GPUs.
Very very concerning.
Because that's the name Nvidia gave it, so we have to use it so that other people will know what product is being discussed?
@@wiLdchiLd2k No, names are meaningless anyway. But they carry a specific expectation with them in terms of power and performance.
@@wiLdchiLd2k the issue is ...imagine this; a car company offers a 1500 series truck. meaning it has a payload of roughly 1500 lbs. 1000 in the bed, 500 in the cab. But then suddenly reduced the payload to 1200. but charges the 1500 price..so you think the price is fair, i mean thats what they charged for the previous version.... Meanwhile they are making a truck that has an actual 1500lb payload, but they are calling it the 2000 series truck (its got a bigger number it has to better right?)... for an extra few grand. See the problem? Youre paying more and getting less while also being decieved about what the product should be able to handle.
4090 should've been called Titan (fitting for its size too) while 4080 should've been 4090 and etc
Rx 6800 non-XT can be had for $450 in the US and not much more than a 4060Ti in other regions so I really don't see why anyone would buy the 4060Ti right now
Only reason I can think of is sff pc builders, for whom these low power cards are an absolute god send. Then again, they're low power precisely because they're using a tier down silicon per price point, hence the non-existant performance uplift. Sff builders are used to overspending on their parts so it'll be an easier pill to swallow for them ig.
@@wiLdchiLd2k that's not a selling point, it's a gimmick - a nice to have, and shouldn't dictate the price whatsoever. And it should always comes AFTER a raw performance uplift. Same with FSR and all other variations of DLSS.
For GPUs a feature becomes a selling point only when it's an industry standard and/or is supported by the vast majority of applications relevant at the time. For example, RT was entirely a gimmick back in 2018, but not today. Even if you were to argue that DLSS3 will eventually reach where RT is today, today's hardware will fade into irrelevance by then, just like how the entire RTX2000 series did.
@@hasnihossainsami8375Totally disagree with your points. First of all, 20 series is quite relevant today. It hasn’t fade into irrelevance. Second, features are an important part of product costing as it improves the product experience so a company can charge more if they are providing you more feature. Why are you paying for any software then!? lol. Softwares are actually like features to improve experience. This is not to support the Nvidia greediness to increase price. But as a general rule, features do have value if they are improving experience, as simple as that. And dlss3 do improve that experience in several games now like Spiderman, Hogwarts legacy, etc and don’t be ignoring the fact that Amd sponsored titles don’t have dlss3 due to Amd scummy practices
@@pinakmiku4999 Do you really need to buy a new gpu to only get a new software improvement???? I mean if you were getting more rasterization and an improved software i could understand going for nvidia.
@@pinakmiku4999 spoken like a true Nvidia shill.
Everything below the 2080Ti, a $1200 card, is borderline unusable with RT on in modern games without DLSS. Fact check yourself.
And DLSS3 is not a selling point because it fosters the idea that same hardware performance for the same price 2.5 years later with only software uplift is okay. It is not okay. We're paying for the full package, hardware included, not just the software. Imagine if the 3070 had the same performance as the 2070 but could do DLSS2 even better - same scenario.
The 4060ti is not even a stagnation, but instead a downgrade from the 3060Ti in every physical performance metric except power consumption, and it makes up for lost ground with software. That's disgusting.
mad lad benches 50 games because he was curious, love this channel
I can’t be the only person to switch from Nvidia to AMD (6000 series) when I saw what my options were from the green team. I’m interested to see what the steam hardware survey looks like this time next year.
I don't think the numbers will shift big on that scale. Steam will probably report that plenty of people will still use 1060s and 1650s. NVIDIA is, on the big numbers scale, just losing to NVIDIA of a half-decade ago. There's very little push to do much upgrading ever since 1080p reliable FPS was cracked all the way back then. Because NVIDIA of today is practically refusing to make 1440p reliable and affordable. Why else would their x60 card cost way more than it used to, but still target 1080p?
You're not, switched both gaming machines in my house. 2060 6GB to 6700XT and 3070ti to 7900XT. We should be good for a couple of generations. We don't ray trace and team red gives us the power to get all the detail and FPS we want without team greens "special features". No regrets here.
I dont got preference, but I actually wouldve prefered to try an Nvidia GPU again after two AMD cards (latter had better price/performance). But Nvidias GPUs this gen are just so bad/overpriced theres no point even trying.
Ended up with 6800 XT. Sometimes AMD drivers can be wonky, but they have gotten a lot better these days, and any issue cant be as bad as 8 or 12GB of VRAM with a slow memory bus. I just hate that memory stutter and screwed up frametimes.
AMD currently holds 20% market share, last year it was 12%.
So you're not the only one.
RTX5000 better be really good with plenty of VRAM otherwise Nvidia will lose more gaming customers. On the other hand, their gaming revenue is nothing compared to their AI revenue so I wouldn't be surprised if they just keep going down this road.
The RTX4080, a powerful chip with only 16GB VRAM that will 100% be bottlenecked by VRAM in 2024 games at high resolutions/settings, just dropped to $999.. it used to be over $1200.. absolutely bonkers when some 2023 AAA games already come close to using all of that 16GB VRAM. Despite being slower, I would choose a 20GB 7900XT over a 16GB RTX4080 if I could get either of them for free.
That 20-24GB VRAM on AMD's 7900 cards will not go unused within a normal card's lifespan (4 years on average before people upgrade), it's not there for show lol.
Steam survey isn't accurate, due to the way it works it gives huge boosts to whatever is in use by gaming cafes, 1000 accounts on 1 pc with a 3060 and steam counts it as 1000 3060s.
Nice to know my 3060 TI is almost the same as the 4060 TI, only problem is I wanted an upgrade with more VRAM for video editing. Guess it's time to look at used options.
As a bonus I can get about 5% more FPS from a small OC/UV.
I'm glad that I won't need to upgrade from my 3060Ti for a very long time
You do because 8gb vram 😂
Don’t worry I’ll have to upgrade mine myself😢
@@adamek9750 Anyone only interested in 1080p, med - high settings are still set for the next 2 years. Where the 3060 ti starts to struggle is 1440p high. Steam hardware charts show an uptick in 1440p monitors but 1080p still reigns supreme on the platform so most are still safe.
@@UWG3 Me with my 12gb 3080 and a 5k monitor:
“I’m in danger!”
If you play in 1440p I have some bad news.
You need to upgrade it now, if you are interested in 4k or 1440p ultra and ray tracing. Just because 4060ti sucks doesn't mean you don't have to upgrade 3060ti. It just means you have to spend more money, this was nvidia's plan all along.
I wonder what are excuses that the shills will use to justify this card.
Muh fake frame generation
Usually efficiency, which it's not even good at. A 6700xt uses less power for better performance. And consider it's a 50 class card the powerdraw is really bad as they usually had enough with just 75W.
@@saynotomanifestv3101 Indeed i know of one that is all about the "technologies" you do not get with AMD and how said tech makes the product future proof.
@@XX-_-XX420 Funny thing is, you can often buy a better AMD card, downclock it, and you might even get better performance/frame than some new Nvidia GPUs.
@@termitreter6545 not sure about rtx4000 vs RX7000 as I haven't looked into that, but a 6700XT os very effecient you basically get 3070-3070ti performance at 130W. In some games you'll be far better due to the 50% more Vram you have.
And since a 4060ti is worse than a 3060ti draws about 200W it's pretty easy to look at the 130W card offering more performance and pick that for any possible reason as its better at literally everything.
For RX7000 I saw some guy having decent gains with a 7900XT after undervolting he was talking about 250W vs like 400 or something like that. But with the 6700XT I know exactly how good it is as I had one. Even stock it only uses 186W under full load.
So Rx6000 vs rtx4000 it's a no brainer as AMD is better in every single way. (looking at the mid range atleast).
I get worried seeing the latest model of cards doing worse than the previous gen thinking nvidia would nerf the previous gen in an update to make the newer gen look better.
It will happen when they get close to release the 5000 series, they will nerf 3000 so they can clear stock of 4000
Yes it will happen at some point, but then just revert to older drivers.
They're too clever for that. People would immediately know their cards got 10-20% slower and why. What they would do is simply make sure new games perform better and of course additional features like DLSS and RT become the selling point.
Is why i am still using 3050 i got last year...
im more worried nvidia might release decent cards than not to long later gimp them without telling people so people expect a certain performance but dont get it because the box doesnt actually tell you the difference between the gpus. ie kinda like what they did with the 3060's 1060's 1030's and what not without reviewers people would get absolutely scammed and they wouldnt know it.
Gives me the 11900K vs 10900K vibes for sure!
Well the difference here is that it isn't simply a refresh pushed to the limit. It would be if the 11th gen chips were ~50% smaller and the memory controller was essentially running everything in single channel.
This is worse because it truly is a generational improvement but instead of passing that to us they cut it down until it performs the same as last gen to maximize profits. It would be extremely impressive what Nvidia has done if it wasn't so damn malicious.
@@JJFX- without getting into architectural changes I’m talking about one generation you’re getting 100 FPS in one title and 90 in another, then the newer gen gets 110 in first title and 87 in second like huh? 😂 this would been more impressive as a 3060 “Super” as you mentioned rather than a new generation but I think that involves admitted defeat on the entry level 4000 series as you release a new 3000 series card
@@FIVESTRZ Oh no you're absolutely right, I'm simply saying this is even worse because Intel was essentially making the best of bad situation whereas Nvidia is making the worst of a good situation lol
@@JJFX- 😆 facts facts. Either way it is what it is can’t be too mad if they are phasing out 3000 series with each launch - just means nothing changed. If the 3000 series lingers this will become a head scratcher
EVGA's decision to drop nvidia making more and more sense now
I'm thrilled with the 4060Ti release - means my 6700XT will be relevant for AAA gaming for quite a few years
LOL felt the same
Who cares about AAA gaming, the 6000 series supports Win7 so you can make your computer into a classic gaming behemoth instead!
Just like how I didn't have any reason to upgrade my i7 2700k for 7 years. We just went over the CPU dark ages. Now we're in the GPU dark ages where every generation is garbage.
my thoughts exactly! finally AMD cards will get some spotlight and marketshare
@@DyceFreakpretty sure after the r9 390X you had to get win 10 or newer. Specifically didn't buy RX5000 at the time as it required win 10 or newer and I was using a win 8.1/10 dual boot.
Wow, the performance of that GPU is just sad
Not at all if you care about it's power consumption per performance. It will pay for itself over time vs the 200watt behemoths. But if you're expecting a linear performance graph between generations in line with cancerous consumeristic progress, then this isn't your girl.
@@DyceFreak It's the price that's sad, not the performance.
@@DyceFreakyeah if you care about effecieny it's even worse. A 6700XT will use less for far better performance.
@@andrew6978yeah this would be a decent 4050, but the powerdraw is definitely way to high as 50 class cards typically had enough with 75w, this thing on the other hand....
Performance is really good if you imagine it's a 4050Ti for 250$
Wait... they sell a 4060 ti version at 16gb for $499 what the frick. Its very clear nvidia tried to shift all of the 40 series cards up an entire tier and is why they "cancelled" the 4080 12gb and just rereleased it as the 4070 ti. It should have been the 4070. Remember the 3060 had a bus width of 192 which is the same as the current 4070 ti. The 3050 has 128 bit bus like this 4060 ti card.
14:10 - "The 4060Ti is a terrible product at $400."
Great video. Sums it up nicely.
Would be good to know at what price reviewers believe this card becomes reasonable and/or desirable.
200 pounds
It's simple , Nvidia want's to sell you Frame gen and AV1 stuff , not extra performance , they really think people will buy in to that and will rush out to buy 40XX , i guess that has its good sides to , more used 30Xx series on the market = less $$ for used parts
Awesome content from the Benchmarking God,
Your content helped me buy the 6700xt in Jan 22,
Keep up the great work
Got mine in Nov. Very happy with my choice. HUB was also my starting point for researching which card was the best buy.
a few months before the big price crash? ouch. I sold my 6700XT for the same price I bought it for in April 2022, just as prices were collapsing. A month later they had lost 40% of their value.
imagine being that gullible to spend money on someone whos clearly biased and dont even use the newer cards to their fullest lmao.
@@Takisgr07afaik, they're mostly grading the performance of the GPUs in rasterization, which is still the predominant way of rendering games. don't blame them for catering to most gamers instead of using tech that's exclusive to this generation.
@@Takisgr07 RT doesn't interest me in the least. I got the card for the longevity(12gb vram) and I'm still very happy with it.
Thanks for grinding out all these benchmarks really interesting picture of the generational performance "increase"
Yeah I'd love to see a graph of dollar per frame adjusted for inflation from GTX 780 onwards
@@nickyang1143start at the 680, that's when Nvidia started their fuckery with using smaller dies meant for smaller cards. The 680 was meant to be a 660 but AMD screwed that generation up so bad Nvidia could get away with it.
This is the same business model that the automotive manufacturers are making="If we make our affordable lower tier products suck enough, then our customers will be forced to bump up to our higher priced product line where our profit margins are much larger"...AKA "Feed the lower class a SH1T Sandwich and they'll eventually pay for our overpriced menu items" :/
Don't forget to mention the PCI-E Port on this card is capped to x8. That means that for everyone owning a PCI-E Gen 3 system this card isn't even a option when upgrading from a older series of GPU's. And there are a lot of 3.0 systems out there.
Doesn't really matter unless you run out of VRAM, at which point you're stuffed anyway.
@@Hardwareunboxed Yes you're right. This should be my point as you also mentioned the 8gb of vram.
As I said, it makes no real difference.
Consumer oriented vids and the actual facts makes your vids worth all my time and more, thanks.
It’s why they are the best in the business rn!
Yep, compared to the last two drama videos it's good to see benchmarks and real consumer oriented content.
It'd be a superb 4050ti if priced as such.
250 Max
I'd buy one for sure.
It "is" a 4050ti. Nvidia bumped the entire stack down because not enough people bought the 3090 and Nvidia thinks everyone is absolutely loaded. The truth is that the 4080 should have been a cut down AD102 with 20GB VRAM, the 4070ti should have been the fully enabled AD103 die, the 4070 a slightly cut down AD103, both with 16GB. The 4060ti should have been a fully enabled AD104 with 12GB, etc. The fully enabled AD106 die should have been the 4050ti (where 8GB makes any sense at all!).
This should be $200, if made at all
Exactly, as a 4050ti for like 200 bucks, it would be a great card..
I've seen the 6800 non-xt for just over $300 recently, and used 6800xt's are selling for very close to $400
Yep I paid $330 inc tax and shipping for my new 6800, would have got the 6800xt for $400 but it sold out as I carted
@@jurgengalke8127 I think you probably got the better deal. And, unless your playing at 4k or are unhappy with 120-144fps, then the 6800 is probably plenty.
And undervolted RX 6800 can eat like 120-150W while being faster that thar 4060ti crap :D
@@sparkythewildcat97 i mean if you want to paly in 4k then the RTX 4060ti will suck even more, the card cant even put out 60 fps in 1440p games, how you except to play in 4k then ?
By lowering graphic to low settings just to play in 4k ? kinda stupid.
But it's Radeon 🤮
It would be nice if you could test PCIE3.0. Many of us are with AM4 300&400 mobos. I bet that 5% difference at 1440p might be gone.
Agreed! There's many systems that still use PCI gen 3. It could be a big performance penalty vs Gen 4, let alone gen 5.
I've seen two german reviewers tackling this issue. First was der8auer in a 4060ti/3060ti video and PCGH in a 4060 on pcie3 website article (diagrams should be self explanatory even for non-natives).
Results were of course mixed but some were really interesting. Because in some games you'd lose up to 10 % on 1080p max. details E.g. Forza 5 and Hogwarts Legacy.
It's entirely possible considering the the 4060 and Ti only use 8 lanes. So if you drop it down a gen you're operating with the modern equivalent of a 4 lane pcie bus. Or to go the other direction, the equivalent of a full slot at PCIE 2.0 speeds.
It's like they saw the RX 6400 putting a mobile class GPU on a desktop card, halved bandwidth and all, and said "we could do that", but instead of making it an XX40 or even an XX50 they put it in the 60 class which feels all kinds of wrong. That wouldn't be so bad if it was a clear improvement on the 3060 ti, but nah it trades blows with its previous gen counterpart.
@@fabianrosenthal4644 Oh wow, 10% is really bad. I know the 6600 XT had the issue, but only 2% or so average.
Just look for PCI 16x vs 8x comparisons of similar nVidia cards like this one. th-cam.com/video/COcHHX2MdKs/w-d-xo.html Most games it doesn't really matter with only a few FPS difference but some will suffer more. It's the 128bit memory bus that kills the performance of this product not being limited to 8x PCIE.
The more you buy... the less you get -Jensen Huang
Nvidia is using DLSS as an excuse to release new products with hardly any generational improvements. They claim that DLSS 3 only works on the newest GPUs, but I'm sure they just designed it that way. In the future we will probably have newer and better upscalers that only work on the newest products to encourage upgrading.
@@wiLdchiLd2k Just wait for Intel
Day 1 3060ti buyer, $800aud(+shipping) and worth every cent. Have not bought a GPU with less than a 256bit bus since 2003, I also replaced my 2060 Super with the 3060ti, was expecting to get the 4060ti as speculation suggested it should have been 3080 level in performance but this is what we got instead. FX5200 vibes
Me too. I had some trepidation at the time but ultimately very happy I got it.
FX5200 is actually currently a go-to card for Win98, so it's held up quite well despite the FX's bellyflop of an existance. I just went from a VEGA64 to a RX7600, I waited 4 years and the entire price gouging fiasco before buying again. I pity your rampant consumerism.
Simp
yeah dude, and we thought that 20-Series was the worst gen ever give price to performance..... 2060 super managed to match The Legendary GTX 1080, for 100$ less Msrp and new features, In this generation, u would at least hope that 60 Ti card would get close to 3080.... oh well it cant even beat 3060 ti properly lmao
3060ti also, and running with a 5800x3D, 32gb ram.. get 140-150fps 1440p on tarkov and WZ2 👌
The synthetic gimping is what kills this... They are making this hardware slower on purpose. Which is very counter productive...
What's so stupid about all this is if they just did a 160bit bus with 10gb and maybe 10% more cuda cores than it has, pretty much we wouldn't be having this conversation, and Nvidia would still have fantastic margins.
watching this with my 6800xt 16 GB and a bag of popcorn
Same
Yup, it doesn't matter how much vram they put on this card if the memory bit bus is too small its pointless. Memory bit bus size is more important than vram capacity imo.
Steve is there a way to simulate a 4060ti with a 4080,like you did with 3dv cache 12/16 core Where u disabled cores etc. Just to show what numbers we could get on a bigger bus and more memory(vram) ?(keeping cores normal etc) Or is the sort of simulation not possible?
If it is possible it would be great to see how it should've performed if Nvidia never cut back soo much
It's impossible other than heavily editing the vbios, which even enthusiasts can't easily do
No, GPUs are too complicated for that. I think theres going to be a 4060TI with 16gb of memory? Not sure if that card isnt gonna differ in other ways, though.
not really, you cant disable SMs on gpus like you can disable CPU cores in bios
No ones saying its not a solid buy but by itself in a vacuum, its stagnant and and even downgraded and only rely on power consomption/ Frames Gen to look decent. as a new buy from a basement of cards its a pretty solid $400 card but really isn't doing anything for the mid range 3060ti/RX6700 Card owners like me, so I retract my complements.
4:29 Tech System Specs, build within Sponsor's Case, for only:
CPU: AMD Ryzen 7 7800X3D 4.2 GHz 8-Core Processor ($439.00)
CPU Cooler: Thermaltake TOUGHLIQUID Ultra RGB 107 CFM Liquid CPU Cooler ($279.99)
Motherboard: Gigabyte X670E AORUS MASTER (rev. 1.0) EATX AM5 Motherboard ($489.99)
Memory: Corsair Dominator Platinum RGB 32 GB (2 x 16 GB) DDR5-6000 CL30 Memory ($159.99)
Storage: Corsair CSSD-F4000GBMP600PXT 4 TB M.2-2280 PCIe 4.0 X4 NVME Solid State Drive ($394.99)
Case: Thermaltake CTE C750 ARGB ATX Mid Tower Case ($199.99)
Power Supply: Thermaltake Toughpower GF3 TT Premium 1200 W 80+ Gold Certified Fully Modular ATX Power Supply ($199.99)
Operating System: Microsoft Windows 11 Home OEM - DVD 64-bit ($117.98)
Monitor: Gigabyte AORUS FI32Q-X 32.0" 2560 x 1440 270 Hz Monitor ($649.99)
Case Fan: 4 x Thermaltake SWAFAN EX14 RGB 81.6 CFM 140 mm Fans 3-Pack (4x $119.99), becasue your Case can fit up to 14 140mm FAN and only comes with 3 and you won't enjoy of enough turbulent air for your at least 6 more fans of your GPU and radiator of your CPU. You need at least 60W to 100W of fans...but not in one noisy and windy fan.
Total: $3411.87 or more than 4000€... without GPU.
Please could you tell me how terrible really are GPUs from Intel, Nvidia or even AMD 6800, 6700, 7600, 6600 or 6400, and prices, and performances and RT, FSR, DLSS and so on...and how important are FPS, oh yes and classic missing resolution or settings: 1080p 1440p and 2160p not being tested together, and testing ULTRA and not high and medium settings, or RT and upscaling performance to quality...
This Thumbnail got me! Great work Steve. Click attracting w/o being clickbait. Thanks Steve.
Forget the name on the box, it's a $400 50 class GPU.
I think what upset me the most,i was legit waiting for this gen cards and then i end up getting an 6700XT, yes i could get this card 2y ago, oh well.
I am happy with this card coming from an RX570, the uplift was substantial but still just shows how new-gen card not always is worth waiting for ( talking about entry / mid - level cards) now I'm deciding if i go AM5 or just get an 5600/ 5800X3D , currently i am using an R5 1600AF.
5800X3D
Yeah, the 5800X3D would be attractive here imo; keep your existing DDR4 RAM and mobo (assuming compatible),
Get the 5800x3D. You don't need to go with AM5 just yet.
@@ThePurplePassage ye but in order for me to get 5800X3D i need to swap mobo and ram , as my ancient Gigabyte GA-B350M Gaming 3 is not good for the CPU , yes supports but dealing with Bios updates from gigabyte is just pain , is not simple as download bios and update , you actually need to do a lot steps and update other stuff.
This is why i was considering AM5 or INTEL (never used a intel setup tbh but i have no preferences) but I've heard AM5 gives some headaches as is not matured yet , not sure how is it now.
My budget rn is 400€, 5800X3D is 300€ , new mobo around 100€ , 32GB RAM 3600MHZ CL18 70€ (CL16 100€) , surpasses my budget at the moment but i dont mind wait a lil longer and keep R5 1600AF as is alright in most scenarios ( currently playing BDO and works)
I have 2 options here:,
-Get tomorrow R5 5600/B550/32GB ram (411€).
-Wait a month and get 5800X3D instead (or AM5 setup with an 7700 or 7600) I'm having some kind of fomo here , will 5800X3D sell out? i never had fomo buying an component lol but since 5800X3D is out for a year or so and AM4 is in the end of line. Could grab CPU now and get rest of stuff next month ( unless the CPU will go down in price but hard to know those things)
If a bios up date ie downloading installing @bios and the latest bios f51g then 3 clicks is alot of work not sure you would like the half a day work for building a new pc also gigabyte have that security issus so every one on a gigabyte mb should update thier bios now! that includes if you buy and new mb first job always update bios , that said no point building a new am4 system just go am5 or intel if your building fresh
Steve, you blokes are doing a great job of calling this out garbage customer abuse. I'm not going to touch them.
Do you think could be a drive problem not optimized for the new Lovelace architecture , the 3000 series use the Ampere old but very established architecture ?
The radeon 6700xt is the goat of this generation of cards
I'd say it was, until the 6800/xt came down under 500 dollars. But still happy with my choice, 6700xt is great midrange
My popcorn is ready for the 4050 benchmarks! Let's go! 🥳
Damn if 4060 is really 4050 then what exactly is 4050? GT4030?
Yeah...I agree. They can't release lower class cards when the 4060 is AD107 which is the smallest die lol
@@Zuggeerr Yeah there's a reason they don't/can't make x030 chips anymore. The tiniest and cheapest chips all moved up a couple of tiers (in price, not performance)
If they keep the same silicon as the laptop version should be around 20% slower than the 4060
there is no and will be no 4050, they don't have a smaller die than this...
Steve I was wondering if you have ever asked nvidia why they are nerfing the memory bus bandwidth this generation?
Save money
@@Hardwareunboxed and planned obsolescence?
@@Hardwareunboxed greedy bastards they are
@@Hardwareunboxed Well that's rubbish....... But thanks for replying.
I hope they saved lots of money, otherwise they wont sell enough volume to pay for all the costs until prices drop.
I recently upgraded a friend's system to a 3060 Ti Strix for $300 and it has 2 years of warranty coverage still. The seller even drove it to me. Could of nabbed a 3070 for $350 from other sellers but warranty transfers sounded iffy or the sellers were unprofessional. No brainer versus a 4060 Ti
Thx for this HUB. Your video highlights just how important it is to do research first before buying any new cpu's or gpu's.
4060ti = a 4050 in disguise. 150usd max.
Great video and so glad you didn't go down the Nvidia DLSS false frame road. When you buy a gpu you want it significantly better than the previous series and not have to depend on high-latency false ugly DLSS frame software to make it appear better - thank you very much.
The frames are real in my heart.
Maybe the real frames were the friends we made along the way.
@@wiLdchiLd2k asking for a friend are you a left leaning democrat ?
Cope.
@@Takisgr07 I mean, my friend just wanted to know cause usually the corporate shills are moronic Left Ln., Democrats he just wanted to know
The issue is we can't opt-out of paying for the optical tensor flow, which I think is what's inflate the price. NVidia is pushing for DLSS3 to be massively implanted.
The cards do have insane energy efficiency though. 4070 has 3080 performance at 200W, which is a big heap. If it was priced correctly, it would have been the real 4060Ti at $400, and this 4060Ti should've been a 4060 at $300
this isn't a 4060, only 128 bits bus and 8gb ram. Needed 192 and 12gb, respectively.
The issue is Nvidia making a 4050 Ti and then calling it / pricing it as a 4060 Ti.
I really don't understand why people keep saying the 3070 graphics card is not great! I own more than like 600 games on steam and every last one of them plays amazingly. I'm not talking indie games either! I mean all the heavy-hitting games!
Because Nvidia crippled it by not giving it enough VRAM. 8 gigs is not enough for that class of card.
The 2080 Ti will outperform it in Ray tracing because it has more VRAM than it does
Do you know if they're going to sell the DLSS3+AV1 DLC separately?
not if you buy now, mine came with both enabled :P
🤣
Love these comparisons keep up the great work!
So glad i got a 3070 after a age of waiting for only £250 about six months ago. I had a feeling 40 series was going to be out of my budget and the 4060ti just confirms I made the right choice. A little worried about VRAM moving forward but so far its been great.
Unless you're worried about maxing out settings your good. I have a 3070 as well. 80 something games in my steam library. No problem getting high frames for my 170hz monitor at 1440p.
I think the vram thing is overblown.
I got scalped for a 3070 back in 2021but fk it anyway, does good for me on a 1440p
Still rocking 3070 with little to no issues. Overpaid like crazy here in Sweden when it got released, it was almost 800euro, and this card took me months of searching to get.. Prices are still crazy high in my country for GPUs, they're still at 680-700euro retail price
@@andrew6978 It IS overblown, but still definitely a thing.... especially if you want to play maximum everything...
Nvidia trying to bring rx6700xt up on steam hardware survey
Even more pleased as a 2080Ti owner, after five years it still smokes this garbage and probably is within 25% of a 4070 in many games ;)
4070(which isnt a great buy) matches the 3080 which is 2 tiers above it, while having 50% more vram than its last gen tier, yet the 4060ti can barely match its own tier, with the same amount of ram AND less bandwidth😂
The real mystery is why Nvidia is releasing gaming GPUs at all right now. They'd probably save money with a tax break by cancelling them, and then reusing the chips for AI and other data center usage.
Best guess is they feel the need to release regularly to maintain mindshare and AIB relationships. If GPU investment for AI goes tits up and Nvidia needs to drop back to gaming, it will be a bit of a problem if they've been out of the market for several years.
I think this is simply to keep onto existing workforce right. They have a lot of people working specifically on GeForce stuff they feel some kind of obligation to keep drivers up to date etc. So the easiest thing for them is to just keep the whole thing at least afloat to avoid any riscs associated with leaving the market all together.
It's for convenience sake. Nothing more. They have no need or want to produce compelling products to normal people anymore. That is clear.
I look forward to many more years with my RX 6600 at this rate.
UE5 will kill most of entry level cards
I upgraded from a 3060ti to a 4070. The extra vram is very nice and overall the card is faster. But I wouldn't have gone with the 4060ti.
Let's not forget it's only PCIe 4.0 x8. It reminds me of the RX6500XT
Isn't 5% in the margin of error?
Great review!! I just bought my friend a 6750 for $350 on Newegg as I'm putting together a new gaming rig for him. I wanted to get him something by team green but the 40 series just stinks. It was more to do about the VRAM since the 6750 has 12GB and I think it will age better over time. He only plays at 1080p and doesn't care about RT. He's coming from a GTX 760 so the 6750 will be a monumental upgrade. Let's see if the 50 series course corrects!
the Rx6750 is way way WAY better then the 4060ti.
You wanted him something from team green ? every time i build a PC for friends i get them something from AMD because the price to performance was Always better and this is what matters the most, then the downgrade is reality on team green because a friend back then with a GTX1060 has now worser performance in the same games as back then then a friend with a Rx580 nowdays.
@@allxtend4005 Not a Nvidia schill, but users benchmark has the 4060 ti 13% faster than the RX 6750. I came from a 1060 card so either one of the RX6750 or 4060 ti is a huge improvement over what i have. I have to ask myself if that 13% faster is worth $394 on microcenter vs $429.00.
@@Annihilator-zv6xh Except that it is. I just returned a cash happy, jittery, driver uninstaller POS of an RX 6750 TX - this was a bloody daily occurrence. I replaced it with a 4060 Ti and it's faster, cooler, has not crashed a single time, and is even more efficient by about 35-40%; running in the 130 watt range for most games I tested. Even Starfield, unoptimized crap that it is, runs at 60-80 FPS on average.
Very informative video as always. Thank you for the hard work!
glad i got my 3060 ti for $399 right on release
Lucky
Bought a 3070 used for under 300$ one year ago. Happy for now on a 1440p monitor, but I don't play the latest games.
@@WeeManXL where did you buy it from? i have a 3070 barely used i've been wanting to sell
For under $500 you can now find used but really good 3080Ti 12GB; period. If your budget is $350-$400, get 3070/Ti used.
At this point, I don’t realistically see myself buying a Nvidia card any time soon. Intel's and AMD's look more appealing, especially potential future cards.
Got fed up and took the plunge. Bought a merc 310 7900 xtx. Great card to replace my favorite gpu the 1080 ti but now I need to upgrade cpu lol. Which I will do in a month or two
On what cpu are you thinking?
I want to know. :)
Maybe a 5800X3D if you're on AM4.
@@Aegie for my budget was thinking an i5 13600k. then maybe upgrade to an i9 later when prices drop
@@peterpeter5666 you can take something from am5 for future. Like 7 7700x
This make me wonder how good is the 4060 Ti on AI workloads, given that they have dedicated so much of the silicon to AI, or have they ?
The memory bandwidth is a big deal in AI workloads. Paperspace's "GPU Memory bandwidth" has a blog post, as well as a post in huggingface "Run a Chatgpt-like Chatbot on a Single GPU with ROCm". Both of them said that memory bandwidth is more important than pure Tflops due to idling.
Sadly, I can't find any good benchmark sources for the 4060 series.
mostly they just didn't include as much silicon...
Why is pc gaming hardware moving backwards?
Jensen, Lisa and their investors need new leather jackets, oc
Because people are morons and buy whatever shit these companies put out.
Greed.
They overproduced RTX 3000 and Radeon 6000 GPUs during the Etherium mining craze...once it ended...they had a bunch of stuff that wasn't sold yet. Now they've gotta give sub-par value on the new generation so the old stuff will still sell at good prices. Next gen (a year or so from now) should be considerably better. If not...then I don't know what to say.
the 4060 Ti is a 4050 Non-Ti card for $400 dollars. Can not understand how NVidia fans still defend NVidia, come on people stop fanboying and start screaming or next generation you're going to get RTX 5080's that should be RTX 5060 Non-Ti cards for $1000! Think im joking? We charge you more and you get less, Suck it. - NVidia 2022-23. Frame Generation and DLSS is just there to make a less powerful card look more powerful. Reviewers need to remove all these "Performance" tricks out of testing and do PURE raster P E R I O D ! ! ! ! !
Please don't confound memory bandwidth and memory bus. The problem is low memory bandwidth, bus width alone doesn't tell anything.
Those results are wild! My buddy was looking for a pc around $1,000 to game on 1080p and maybe eventually 1440p. I convinced him to get a prebuilt (he’s not trying to do it himself) with a 6750xt and some other decent specs for exactly $1000. For a good budget pc at 1080p especially the AMD 6000 series is superior to Nvidia’s 3000 series IMO!
6750xt is a good card. I got one for my niece. Great price.
@@gerthddyn heck yea! He mainly wants to play competitive shooters and get 160+fps… he’s coming from an Xbox one 🤣 🤮. He should get 200+ in most games (low to medium graphic settings)
AMD is trash on the GPU side though
@@wallacesousuke1433 nah 6,000 series out performs Nvidia’s 3,000 series at 1080p and 1440p. Nvidia shits on AMD’s 6,000 series at 4K but I only play competitive shooters so idgaf about 4K. I can’t speak on the next gen GPUs though. I’m pretty sure the 4090 is the king in all aspects but shits so absurdly priced.
@@notblazedfordays9536 "I only play competitive shooters"
Yikes 🤢, well, in this case yes, you're good to go with shitty Radeon cards since you only play shitty games, but for less niche uses, Nvidia is, sadly, the way to go. Better features such as DLSS, AI upscaling, more stable drivers, better for emulation, VR, better RT, better for productivity, better optimization for most games on release (since it is the dominant brand and having better OpenGL and DX11 performance)
AMD and Intel have an opportunity to blow Nvidia away in the low end segment, will they take it?
People will continue to buy nvidia, and nvidia will justify the poor performance with DLSS.
GPU market's F-ed.
If people are willing, the RX7600 is just that.
Outside of the US, this card is probably the first card post covid with a reasonable price tag. Close to 3060Ti performance while being cheaper than the 4060TI and 4060.
I would really like the same comparison between the 4060 and 3060ti since they cost the same
It would probably lose in a 4060 vs 3060 comparison. Also don't forget to compare it to the 6700XT as its a bit cheaper than the 4060 non TI.
@@wiLdchiLd2k it can. don't forget that there are plenty games that do not have these features. Unfortunately Steve completely failed to mention any of this, making this the most misleading GPU review I've seen from a major channel.
@@wiLdchiLd2k actually it'll still easily lose as the 4060 gets demolished by the 3060 in demanding games and dlss3 requires Vram, which the 4060 is already out of its super slow memory. Not to keep in mind the latency dlss3 adds vs native so even with a higher fps the experience will be worse. Further more if you are at a gpu limit dlss3 doesn't seem to do a whole lot its more for cpu bound games as the dlss3 frames do not happen at all, so the fake frames not actually happening in the game can increase your fps by not Involving anything besides the gpu. And Considering how bad dlss3 is even on a 4090 and that it just seems like a pretty bad technology in general, you ain't winning with that either way tbh.
Normal dlss kinda has more of a argument but Xess and fsr also exist and only get a few fps less than dlss if sometimes maybe not more.
@@wiLdchiLd2k also dlss3= frame generation
Frame generation = frames that literally do NOT happen in the game and is just something the the gpu guesses so it ends up massively adding latency.
Because keep in mind not just latency from your gpu to the entire pc. But also from your pc to the monitor, the monitor to the pixels, the pixels to your eyes, which then goes you probably know where, which you then need to process. And then after that you can process the real frames. So a lot of added latency vs only real frames.
It ends up playing like a weird game that has way to much input lag really so not great for most games.
@@XX-_-XX420 "It ends up playing like a weird game that has way to much input lag really so not great for most games..." and I'm sure you are talking from hours of direct experience with DLSS and frame generation on midrange Lovelace, not just talking out of your ass, blindly repeating the bullshit you've "heard" online?
you said the 3070 ti is now at $350 us ? where did you see that ? all I can find is around $600 - $850. under those prices are open boxes or refurbished stuff...
4:45 I'm guessing the smoke indicates that the mainboard is running an older BIOS that still has those voltage issues with the 7800X3D?
RTX 4060 Ti, top contender for Waste of Sand 2023
That's to generous, it's a waste of oxygen.
What a great, precise and surgical review, thank you for it Steve. Congratulation and keep posting.
As a 3060Ti Owner I see this an absolute W 🗿
i don't. i love seeing what game devs can do when they can let their graphics stretch their legs. the mainstream experience will always be dictated by the consoles, and this represents another generation of custom PCs that can't offer a console experience for console money. that sucks.
your 3060ti isn't going to be able to deliver fewer frames just because the newest hardware can do more.
@@SB-pf5rc I agree..
@@Bruhhhoff agreeing in youtube comments?! FORBIDDEN
...but yeah as a launch day 6900xt owner i'm very happy to sit this generation out. probably gonna sit the next one out too. saving money is OK with me!
The other problem that i see with the 4060s (ti or not) is that they use 8x PCIE, so even in the older computers that you'd probably put this card in to give it a little oomf (like a Ryzen 2000 series) it won't be able to use as much as the card wouldve been used if it was a full 16x. So basically even the market that would buy it cant even use it as well as a 3060.
amd cards are just so much better on older cpus because of lower driver overhead
PCIe 3 x8 won't bottleneck those (suppossedly) 50 class 128bit GPUs
x4 would make a difference. x8 no way
It has already been tested that the PCIe x8 interface makes this an even worse product than it already is for upgrading older systems that use PCIe 3. The reduced bandwith of the memory bus and the insufficient frame buffer capacity make it access the main memory more frequently - and these hurt even more if the PCIe bandwidth is halved.
@@GewelReal well derbauer already tested it
AMD 7900xtx is the king of all cards no question about that, we are only around the corner from the new fsr 3 which will then put its on par with the 7900xtx
The 4080 should have been the 4070, and so on. Nvidia duped us all with the namings / memory bus
What bothers me most about this is the labor and materials going in to such a marginal improvement (and not even across the board). They had to retool or create new assembly lines for this, and 6 years from now, the 4060ti will be sitting on shelves unsold, or for those who did sell, likely already replaced. Eventually they'll be recycled - yet more labor going into salvaging this product - or dumped in a landfill; all because Nvidia wanted to fill out their product sheet to satisfy shareholders.
Is that all this is about, satisfy share holders
Any info on the rumors that Nvidia is threatening to limit or block AIBs from receiving supply if they work on intel battlemage GPUs? There is an article up on notebookcheck
One of your best videos in a while. Clear question, straightforward data collection, powerful (if demoralizing) conclusion. At least now we know exactly where we stand.
Realistically the comparison should be done on a pci-e 3.0 board, because that's what 90% of people thinking of upgrading in this price category will be running.
It seems like they really screwed themselves with that 4080 12GB unlaunch. It screwed everything up somehow.