YouTubers are doing reviews WRONG - nVidia RTX 4060 Release Review

  • Published Jun 25, 2024
  • Hooooo boy... The nVidia RTX 4060 has been launched to an absolute $hit storm of hellfire from hardware reviewers big and small. And the numbers they're coming up with ARE accurate, but do they tell the entire story? If everything you review is garbage now, it might be time to re-evaluate the way we review hardware.
    Grab yourself a Pint Glass at craftcomputing.store before the internet white knights ensure I'm never heard from again.
    Links to items below may be affiliate links for which I may be compensated
    Check out the PNY XLR8 RTX 4060 8GB - www.pny.com/pny-geforce-rtx-4...
    XFX Speedster Swift Radeon RX 7600 8GB - amzn.to/3XweTMK
    Gigabyte A520 ITX - amzn.to/440lT7l
    AMD Ryzen 5600G - amzn.to/3Ny98tq
    16GB Geil Orion DDR4-3200 - amzn.to/3JDq4h3
    WD_Black SN770 2TB NVMe - amzn.to/3NPcxFA
    Cooler Master RC-130 - amzn.to/3NBtMJl
    Fractal Design Terra ITX - bit.ly/45YKMS6
    Follow me on Mastodon @Craftcomputing@hostux.social
    Support me on Patreon and get access to my exclusive Discord server. Chat with myself and the other hosts on Talking Heads all week long.
    / craftcomputing
    0:00 - BRING ON THE HATE BECAUSE YOU HAVEN'T WATCHED THE VIDEO YET
    1:34 - Standardized Test Benches
    5:42 - My $800 PC Setup and Methodology
    8:37 - DOOM Eternal
    9:13 - Project CARS 3
    10:10 - Spiderman Remastered
    10:50 - Tiny Tina's Wonderlands
    11:39 - Cyberpunk 2077
    12:43 - Jedi Survivor
    13:31 - Red Dead Redemption 2
    14:26 - Benchmark Overview
    15:45 - Time To Rethink How We Review
    17:42 - How Did We Get Here (it's nVidia's own fault)
    22:12 - Where do we go from here?
  • Science & Technology

Comments • 1.4K

  • @Hardwareunboxed
    @Hardwareunboxed 1 year ago +517

    Since I’m in the thumbnail I’ll reply :)
    Quality settings don’t matter, low, medium, high or ultra, the RTX 4060 sucks all the same relative to the parts it doesn’t want to suck in comparison to: 3060, 3060 Ti, 6600 XT, 6650 XT, 6700 XT.
    We didn’t and never do compare lower end cards with flagship models, the RTX 4080 and 4090 for example weren’t in our review data. The 4070 Ti and 4070 were and we never made a single comparison with them.
    The machine used for testing should ensure it’s not limiting the performance of the GPU. It’s also a very poor argument if you’re suggesting testing with lower quality settings which are more likely to see a CPU or system limit rear its ugly head. Of course people can also upgrade so it’s just not a good argument to make. Test individual components without limits as you have no idea what games people play and the quality settings they use, you’re just asking for trouble when deliberately introducing bottlenecks.
    4:00 - It’s less about showing people exactly how many frames a GPU will produce and more how it compares to relevant parts, such as the RTX 3060 in this example. So again whether you test the 4060 and 3060 using medium or ultra, the margins should be much the same, assuming you didn’t introduce a system bottleneck to cap performance.
    As for DLSS, we dedicate extremely in-depth content to cover those technologies, though they were covered in reviews for higher end parts. Benchmark graphs with DLSS3 Frame Generation enabled are extremely misleading for sound reasons that Gamers Nexus and ourselves have explained in our content.
    5:35 - Buy a fire sale 6700 XT is the answer, the data is clear on that one.
    My advice would be this. Just test the way you want to test and let the viewers decide if you’re doing it better than everyone else, or not. In my opinion your review was seriously lacking in comparative data to make any real conclusion and was not nearly as nuanced (as you put it) as most other reviews I’ve read/watched.
    Anyway, nothing personal, you’re a great guy and I know the intention here wasn’t to start drama, so I’m just giving you some direct feedback in the same vein.

    • @CraftComputing
      @CraftComputing 1 year ago +49

      Had to stand on my head to read this one, but I do appreciate the feedback :-)
      As I said, you and GN are my 1A/1B stops for definitive benchmarks, due to your excruciating detail. And this video was in no way meant to offend any of you.
      The overall idea here was to say that standardized benchmarks are a great way to develop comparisons across multiple generations. And that research IS valuable and relevant. So keep doing it.
      But if a gamer has $300 for a GPU, what can they expect for performance when they install it in their own hardware. What about their *experience* overall with some tweaking of settings?
      Again, my point was both testing methodologies can lend relevant information for those looking to make a buying decision. And as readily available as the 6700 XT or RTX 3060 are today, that does nothing helpful one month from now for someone with $300 in their pockets in a Microcenter.
      The title said 'Wrong' (and I'll admit, it did get clicks), but my opinions on benchmark methodology were just as nuanced.
      Again, much respect Steve. Hope I didn't ruffle any feathers over there.

    • @shanent5793
      @shanent5793 1 year ago +4

      Why did you guys cave on the 1080p CPU benchmarks? You said all the reasons for high resolution benchmarks were "in no uncertain terms, wrong" yet here we are with 1440P and 4K in recent CPU comparisons

    • @marcinkarpiuk7797
      @marcinkarpiuk7797 1 year ago +3

      The 6700 XT is a great choice, better than the 4060 atm, but there is also one large drawback to it - not everyone has a 650W PSU for it, if you just want to upgrade the GPU and only have something like a 450-550W PSU...

    • @Hardwareunboxed
      @Hardwareunboxed 1 year ago +51

      @@CraftComputing no worries mate, I get that it's about clicks. I reckon there are better ways of going about it, but I do get it.
      @shanent5793 caved? We still use and heavily focus on 1080p and did so for the content in question. Also 1080p is still exclusively (or the primary res) shown in CPU reviews. That said for years now we have often looked at other resolutions for some CPU content, so nothing has changed.

    • @CraftComputing
      @CraftComputing 1 year ago +28

      I'd love to have a conversation face to face if you'd be game for it. I'm sure there is more to the conversation on both sides than we can convey in YouTube comments. Can I shoot you a DM and work out a time?

  • @MrFluteboy1980
    @MrFluteboy1980 1 year ago +638

    I watched the GN review of the 4060. They didn't hate it because it didn't test well compared to the 4090. They hated it because it was no better, and in fact was sometimes worse, than the prior generation 3060. That's what the hate is about.

    • @cardboardsnail
      @cardboardsnail 1 year ago +52

      The only part in the GN review where the 4060 was slower than the 3060 was Cyberpunk 2077 4K Ultra setting where the average FPS was slower by 0.7fps. That's an edge case scenario and nobody in their right mind would be playing Cyberpunk at that resolution with those settings on either of those cards unless they're some weirdo who enjoys framerate dips into the mid-teens. The 4060 was faster in every other test. The disappointing part is how it performs against cheaper cards like the 6600XT ($220) and 7600 ($255) and how it's unable to beat the 3060 Ti ($330) or the 6700XT ($310). It's not generally slower than the 3060, but it needs a price cut ($300 is too much) in light of what's currently available on the market.

    • @thewallpersonified3152
      @thewallpersonified3152 1 year ago +72

      @@cardboardsnail Justify it however you want, it's not OK that in some situations it can stay even with or even lose to a 3060, which I have seen happen in some scenarios. That should be completely unacceptable.

    • @CFWhitman
      @CFWhitman 1 year ago +17

      @@thewallpersonified3152 In other words, you are objecting to the misleading naming of the card compared to the previous generation. This seems to be a valid criticism. When you consider the misleading naming along with the inflated price, you realize that someone who hasn't gone online and scouted out the reviews might think that they are getting a better card than they really are, thus justifying the price, and it was the model name that brought them to that wrong conclusion.

    • @ssplayer
      @ssplayer 1 year ago +13

      @@thewallpersonified3152 I crunched some numbers. Adjusted for inflation, the launch price of the 4060 is 20% less than the 3060. Does that help a little?
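A rough sketch of the arithmetic behind that claim, assuming the commonly cited launch MSRPs ($329 for the RTX 3060 in early 2021, $299 for the RTX 4060 in mid-2023) and an assumed ~15% cumulative US inflation between those dates; the exact CPI factor is an assumption for illustration, not a figure from the comment:

```python
# Back-of-the-envelope check of the "20% less after inflation" claim.
# Assumptions (not from the comment): launch MSRPs and ~15% cumulative inflation 2021 -> 2023.
MSRP_3060_2021 = 329.0          # USD, RTX 3060 launch MSRP (early 2021)
MSRP_4060_2023 = 299.0          # USD, RTX 4060 launch MSRP (mid 2023)
CPI_FACTOR_2021_TO_2023 = 1.15  # assumed cumulative US inflation over the period

# Express the 3060's launch price in 2023 dollars, then compare.
msrp_3060_in_2023_dollars = MSRP_3060_2021 * CPI_FACTOR_2021_TO_2023
relative_change = MSRP_4060_2023 / msrp_3060_in_2023_dollars - 1.0

print(f"3060 MSRP in 2023 dollars: ${msrp_3060_in_2023_dollars:.0f}")  # ~ $378
print(f"4060 vs inflation-adjusted 3060: {relative_change:+.0%}")      # ~ -21%
```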

    • @thewallpersonified3152
      @thewallpersonified3152 1 year ago +27

      @@ssplayer LMAO no

  • @DimitrisChr
    @DimitrisChr 1 year ago +404

    You are ignoring the whole 3000 series of cards just to suit your argument. The 4060 is not an upgrade, nor is it a good value compared to previous gen cards from both AMD and Nvidia.

    • @whiskizyo2067
      @whiskizyo2067 1 year ago +107

      He's also making a blatant straw-man argument (arguing against an argument that wasn't even made) by saying everybody is expecting high performance out of this low-tier/bracket card - but nobody is. In fact it's mainly been 1080p reviews, and it's still a terrible card for basic gaming in comparison to both last gen and other current offerings. Blatant shilling imo, I just wonder for how much.

    • @Pumciusz
      @Pumciusz 1 year ago +24

      @@whiskizyo2067 If it was a good card we should expect high performance. 3060ti can play 1440p so 4060 should play it better.

    • @MarcoLoves360gamer
      @MarcoLoves360gamer 1 year ago +29

      He's been paid, that is clear. And unsubbed.

    • @CraftComputing
      @CraftComputing 1 year ago +46

      I'm saying if you test every card at ultra settings, is THAT the only information you present as a reviewer to someone who is trying to make a buying decision? The 4060 is NOT the improvement from the 3060 everyone wanted. But if you are building a PC and have $300, ultra settings numbers don't matter to you nearly as much.

    • @Fay7666
      @Fay7666 1 year ago +15

      @@CraftComputing So I'm still watching, but I expect the point of the comments to be that parity in performance at standardized, unrealistic settings would translate to parity in performance at more realistic settings.
      Even if FrameGen were the be-all of performance (it isn't), it's still limited to the small lineup of games that support it and the whim of any new game supporting it. If there's an expectation that every game going forward will support it then alright, but that's a very unrealistic ask. Unless everything you plan to play will support those features, there's a possibility that those features end up unused in the future and so leave you with an overall worse product.

  • @morbidjin
    @morbidjin 1 year ago +83

    The 4060 is really the 4050

    • @nathangamble125
      @nathangamble125 1 year ago +6

      AD107 146 mm². Yep.

    • @sjneow
      @sjneow 1 year ago +4

      Nah it is the 4030. The 4060Ti is 4050

  • @sybreeder86
    @sybreeder86 1 year ago +223

    No comparison to the RTX 3060 / 3060 Ti, so for someone who doesn't know the difference in performance this can be misleading. If this card is slower than the 3060 Ti it is not a good value, no matter what DLSS can do.

    • @Angus_CLC
      @Angus_CLC 1 year ago +9

      Especially here in Hong Kong, where some brand new 3060 Ti matches the 4060 MSRP of HKD$2399.

    • @Tritone_b5
      @Tritone_b5 1 year ago +3

      I was going to say the same thing. Why get the 4060 if you can get better performance with the 3060 Ti?

    • @HosakaBlood
      @HosakaBlood 1 year ago

      @@Tritone_b5 Because it's not supposed to replace a 3060 Ti in the first place. You're still buying an older-gen card, just at a discounted price, with more heat, more power usage, etc.

    • @Tritone_b5
      @Tritone_b5 1 year ago +6

      ​@@HosakaBlood So let me get this straight, if you were to buy a new GPU today, you have no problems buying a new GPU for the same price, because it has a lower power consumption than an older but more powerful GPU? I mean more power to you.
      However, I highly doubt that most people think that way.
      IMO, as others have stated, this should be a 4050; heck, even if it was a 4050 Ti people would have welcomed it.

    • @arayramadhan8340
      @arayramadhan8340 1 year ago +3

      @@HosakaBlood The old rule of thumb was "the new xx60 will be on par with or better than the old xx70", so if you're comparing the last-gen xx60 Ti to the new xx60, the new one should have better performance.

  • @paulshardware
    @paulshardware 1 year ago +16

    hey I didn't even review the 4060

    • @ofmcmxxxvii2788
      @ofmcmxxxvii2788 11 months ago

      😢

    • @CraftComputing
      @CraftComputing 11 months ago +3

      It seems the only winning move was not to play.

  • @mikeycee432
    @mikeycee432 1 year ago +249

    Bro they’re just chopping chips down a whole tier and trying to sell you a 4050 at 4060 prices. What do architectural gains matter when you’re down 20% in shader cores. Stop giving this company your money.

    • @HoldinContempt
      @HoldinContempt 1 year ago

      He's an Nvidia shill. Just unsub from his channel and block him.

    • @emlyndewar
      @emlyndewar 1 year ago +25

      But… but.. I just want to publish a shit take to get rage clicks.

    • @mikeycee432
      @mikeycee432 1 year ago +9

      You guys own nvidia stock or something ? 😂😂

    • @eDoc2020
      @eDoc2020 11 months ago +1

      Why would shader core count matter if it's still faster?

    • @loganmedia1142
      @loganmedia1142 11 months ago +1

      @@user-by2eh6el9y But it isn't $30 cheaper. It is really a xx50 class card. It is therefore $50 more expensive, but offers a very good boost in performance over its predecessor. Power consumption though is about the same.

  • @subrezon
    @subrezon 1 year ago +106

    This video needs a different title: "How to feel good about giving Nvidia $300 for a dogshit GPU"

  • @thedandyp
    @thedandyp 1 year ago +177

    18:16 The $299 MSRP of the 1060 6GB was exclusively for the limited Founder's Edition; the board partner price was $249. If we're talking relative price to performance, that's an enormous difference.
    Another point that virtually all tech reviewers seem to be excluding is that, at the time, this wasn't the lowest entry point; the 1050 2GB, 1050 Ti and 1060 3GB were obviously part of the same generation as the 1060 6GB, and were substantially cheaper while still offering decent, acceptable performance (for the time). These options, at these kinds of price points ($110, $140, and $200 respectively), simply don't exist anymore.

    • @thomasgeekohoihanssen9242
      @thomasgeekohoihanssen9242 1 year ago +18

      Hard like on this!
      The price scale comparison needs to start with the baseline, as you say. Nvidia is obviously trying a 1984-style use of language, where reviewers try to make sense of a $300 ‘60 card by citing “inflation” to say the ‘60 series has not gone up in price.
      But when it used to be $120-$150 for the ‘50 class and now the stack starts with the ‘60s at $300, that becomes the new entry-level.

    • @Doug-mu2ev
      @Doug-mu2ev 1 year ago +4

      @@thomasgeekohoihanssen9242 Does inflation not exist where you live? In the USA from 2020-2023 price change due to inflation is 16%. Also, I had a 1050ti and 1060 6GB, they were ok GPUs. If you adjust a 1060 $249 price from 2016 to now it is over $300. Our options today are just as good as they were before… do you remember when a cup of coffee at Starbucks was a buck fifty? Yeah, that was back in 2005, now it is $2.65. Inflation is real, and it stinks. Ok, back to reminiscing about the golden age of computing on my 8086 (which cost $1200 in 1988… don’t want to think about how much that is in today $$)
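A similarly hedged sketch of the "$249 in 2016 is over $300 today" adjustment quoted above; the cumulative inflation figure is an assumption for illustration, not an official statistic:

```python
# Hypothetical CPI-style adjustment behind the 1060 figure quoted above.
def adjust_for_inflation(price: float, cumulative_inflation: float) -> float:
    """Convert an old price into today's dollars given cumulative inflation (0.25 = 25%)."""
    return price * (1.0 + cumulative_inflation)

# Assumed ~25% cumulative US inflation from 2016 to 2023 (assumption, not a quoted number).
print(adjust_for_inflation(249.0, 0.25))  # GTX 1060 partner price -> ~$311 in 2023 dollars
```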

    • @xingbairong
      @xingbairong 1 year ago +7

      @@Doug-mu2ev Yeah, but as the guy said the new entry level has basically been moved to $300. Also the high-end prices have gone through the roof. Right now PC parts prices are among the best we've had in years, so obviously inflation doesn't affect tech the same way it does other sectors, which shouldn't be a surprise. During inflation food, utility bills, mortgage/loans become the most important, while tech becomes secondary and prices tend to go down(no demand = lower prices).
      The issue is that they refuse to bring back GPU prices to what they were or at least close to those numbers and to name them appropriately.
      The RTX 4060 is realistically an RTX 4050 and it should be priced accordingly, but people, for some reason, try to argue otherwise. The name of the card doesn't matter as much when it doesn't match the established ladder of performance they've built over many years.

    • @ragtop63
      @ragtop63 1 year ago +9

      I told people, as soon as I started seeing everyone jumping on that "parts shortage" train, that it was going to permanently reset the market. I'm absolutely certain that the shortage they were referring to was just an excuse to raise their MSRPs. If you looked at the availability of GPUs vs cell phones at the time, MILLIONS of phones were still available and being sold. Wouldn't a raw material shortage have affected them too? Then there were the pallets of GPUs that were being sold directly to miners (I have photographic proof of this). These greedy manufacturers can most definitely reduce the MSRP of their products to something more in line with standard inflation, but since the market has been successfully reset, they have no reason to.

    • @syrynx4454
      @syrynx4454 1 year ago +6

      @Doug-mu2ev That's because you are assuming the RTX 4060 is a 60-class card. It's not, it's a 4050 or 4050 Ti at best. Now price it according to that and see how much it should be in today's market.

  • @nathangamble125
    @nathangamble125 1 year ago +177

    "the answer can't be to avoid buying anything new and only look for fire sales of previous-generation cards"
    It literally can though. If a product isn't worth buying, we won't buy it, and prices will come down.

    • @CraftComputing
      @CraftComputing 1 year ago +11

      "Not worth buying"
      Cool. In that case, the PS5 and Series X are on aisle 12 sir.

    • @svn5994
      @svn5994 1 year ago +98

      @@CraftComputing Fanboy take goes to: Crafting Computers.

    • @nathangamble125
      @nathangamble125 1 year ago +41

      ​@@CraftComputing The consoles are based on last-gen hardware, and are a good choice for some people, but most of the games I want to play don't run on the PS5 and Xbox, and it's more convenient to have one machine that does everything, especially when my space is limited.
      People who mostly want to play new AAA games buying the consoles instead of new graphics cards will hopefully have a similar effect on Nvidia's pricing as PC gamers buying older graphics cards or refusing to buy a card at all.

    • @OldBuford
      @OldBuford 1 year ago

      At first I thought this craft guy was bought out by nvidia...now I'm starting to think he's just a high functioning stooge

    • @thetruestar6348
      @thetruestar6348 1 year ago +26

      @@CraftComputing You are a fanboy, this is funny as hell to see in the flesh.

  • @antiv0dka
    @antiv0dka 1 year ago +21

    "If everything your review is garbage now, it might be time to re-evaluate the way we review hardware"
    But, they ARE garbage.

    • @emlyndewar
      @emlyndewar 1 year ago

      Just another washed up hardware channel. There’s so many of them, and the attitude after this video is so bad. Just find a better one. 👍

    • @damasterpiece08
      @damasterpiece08 1 year ago +2

      aka move the goalposts to make yourself feel better for being scammed by ngreedia

  • @yasminesteinbauer8565
    @yasminesteinbauer8565 1 year ago +36

    You didn't mention one essential factor with a single word - consoles. It used to be easy to outperform consoles with a PC relatively cheaply. Today, if a 400€/$ graphics card can't display the same texture resolution as a console that's over 2 years old because it doesn't have enough VRAM, it's rightly worthy of criticism. And the consoles are the leading platforms for which games are developed.

    • @CraftComputing
      @CraftComputing 1 year ago +9

      Guess what? Consoles are using ALL the FSR tricks, lower rendered resolution then upscaling, plus have unified memory between CPU and GPU (and a lot of it, along with no overhead for Windows).
      And not a single console player complains about not getting 100% rasterized, non-fake upscaling, faux-K output. They plug in the console and play their games.

    • @yasminesteinbauer8565
      @yasminesteinbauer8565 1 year ago +11

      @@CraftComputing Upscaling mainly reduces the required processing power of the GPU. It does not solve the problem of non-loading or low-resolution textures caused by too little VRAM. And even weaker GPUs can display high resolution textures if they have enough memory.
      The consoles can have more than 8 GB of vram if they want. (XBox has 10 GB of memory attached faster -> obviously vram) So a mid-range PC should have at least as much to not provoke problems that otherwise occur when games are ported. Not all games will get elaborate memory management routines to run optimally with the 2-4 GB less vram of 8 GB cards. I wouldn't recommend a card with less than 12 GB to anyone who wants to play at least at console level.
      However, I have nothing against upscaling in principle. My point was that consoles also shape customer expectations. Not just the comparison to previous generations.

    • @classicallpvault8251
      @classicallpvault8251 11 months ago +2

      Nonsense. A 6700 XT is significantly below 400 and has 12GB of VRAM and can handle all textures that one of the consoles can. It's more powerful than the GPU in the PS5(which is a 6700 non-XT but clocked lower than a desktop version of that GPU) and is about on par with the Series X, which has 30% more cores but again is clocked slower for power efficiency and cooling reasons.

    • @queenform
      @queenform 11 months ago +4

      @@CraftComputing and guess what? they're getting a complete machine for $500
      meanwhile a $300 - $500 CARD in 2023 still comes with 128 bit 8GB VRAM because Nvidia decided to rebrand a tier this generation

    • @masterdftw4983
      @masterdftw4983 11 months ago

      @@queenform But you can't upgrade a console. And its cooling sucks. Waste of money.

  • @BastianNoffer
    @BastianNoffer 1 year ago +253

    Actually, most of the reviews that I have seen don't complain about the performance compared to a 4080 or 4090 - those weren't even on the charts - but more about the generational uplift in the same product class. And especially when comparing with older platforms, one of the most critical things today is cards shipping with 8 PCIe lanes instead of 16, making their use in PCIe 3.0 based boards more and more questionable - and this includes chips like Ryzen 3000 and Intel 10th Gen, which are still relevant for gaming. This fact hits AMD and NVidia alike though.

    • @RandarTheBarbarian
      @RandarTheBarbarian 1 year ago +14

      Craft Computing being one of the proponents of the used xeon market really should take something like that into account. I mean I understand that testing every old or odd configuration is an impossibility, but shaving lanes on the "budget class" GPU means a heck of a lot when it's common for a budget minded consumer to retain parts of their previous system. Even 50 class cards from previous gens still retained their full 16 lanes, which makes the 4060 ti feel more like it should be a 4040 or something, which again if taken into account with its $300 price tag feels absurd. If you're gonna leave performance on the table because the lane count has been halved then a comparison with an old card that didn't do that is not only justified, it's more necessary than ever when calling it "budget"

    • @cannaroe1213
      @cannaroe1213 1 year ago +3

      @@RandarTheBarbarian imagine trying to be a youtuber in a space where everyone buys second-hand stuff. who's going to sponsor you? cleaning products?

    • @informatikabos5481
      @informatikabos5481 1 year ago +1

      @@cannaroe1213 So the Video is an Nvidia sponsored ad? That sounds disingenuous.

    • @MrDarcykampe
      @MrDarcykampe 1 year ago

      @@RandarTheBarbarian There's plenty of budget pc's out now that have multiple nvme slots, and if you use more than one of them, the pcie slot gets cut in half. So an 8x slot video card is fine for that if you want to use all the nvme slots.

    • @Doug-mu2ev
      @Doug-mu2ev 1 year ago +1

      I don’t think you understood his point: the other reviewers are not comparing against the flagships but they are testing them as if they were in the same class as the flagships (using the same benchmark settings and crazy high end test rig). It is not a realistic test suite for this tier of product.

  • @jasonweisberger8257
    @jasonweisberger8257 1 year ago +133

    The point of a review is to review that part for itself. By posting any performance numbers at all without eliminating other variables, you've basically reviewed that particular PC, not the card itself. This is a "Jeff's random PC" review. Wrong title.

    • @nhand42
      @nhand42 1 year ago +3

      But it seems GPU reviews have fallen into the trap of reviewing specs like it's the specs that matter. It would be like reviewing cars by putting them all on the dyno and dividing the number of HP at the wheel by the cost of the car. It's weird. Who reviews cars like that? Instead the reviewers will take it for a test drive and say "it feels sluggish" or "its responsive and nimble". Qualitative descriptions that matter more to consumers than "21 jigawatts of multi-core Deloreans".
      They both have their place, for sure, and maybe you can argue there is no "quality" to measure in GPU only raw performance. But after recently buying a Steamdeck and playing games at 800p/40fps and having a blast, it suddenly became clear I don't need 4k/120fps to enjoy a game.
      And I don't know what a qualitative review would look like. I'm not in the business of reviewing cards. But I'm losing interest in endless barcharts and % points and FPS/$ or CUDA/year comparisons. I don't care!? Tell me something I can't figure out from the spec sheet.

    • @rett.isawesome
      @rett.isawesome 1 year ago +3

      @@nhand42 It's really only PC enthusiasts that are like that too, isn't it? Car people don't talk about cars that way, music gearheads aren't that way; literally no one cares about performance in a vacuum in any field of interest except for the PCMR.

    • @nhand42
      @nhand42 1 year ago +2

      @@rett.isawesome Well I said there's a place for both because *some* car enthusiasts definitely do put cars on dynos and compare HP. And some music enthusiasts probably get excited about the instruments more so than the music. However in our hobby of gaming there's an overwhelming emphasis on specs, it's borderline obsessive.
      There are rare exceptions. Digital Foundry certainly does some analysis of spec, but mostly their content is comparing how the game looks, whether it runs smooth, the visual differences between FSR and DLSS. These are qualitative analysis and that's a hugely popular channel for a reason. They are a breath of fresh air in an otherwise stale landscape. And sure they do compare frame rates and dollar figures too, but it's not their only focus or even their primary focus.
      Now imagine the next level of qualitative analysis. What's the best GPU for RPG on a budget? Or the best GPU for racing games? Or the best GPU for streamers? Can I look purely at the specs and barcharts to figure that out. I think not. I think we need a review channel who takes that next step. Maybe not those examples I gave because I'm not in the business of reviewing so I don't know what's actually required. But hopefully you get the idea that we need more than barcharts.

    • @MrMisticZ
      @MrMisticZ 1 year ago +4

      @@rett.isawesome You're very wrong about music. Headphone audiophiles care only about sound quality and prices, the same way PCMR cares about performance and prices.
      Also you can't compare GPUs to cars when cars' performance is being forcefully bottlenecked, because otherwise you would kill yourself with it. It would be, however, appropriate if GPUs had a chance to blow up past a certain performance point. But they don't. Every other quality is being ignored because you aren't reviewing the cards, you're reviewing the GPUs, which is the engine of a car and not a car itself.
      You can't be objective if you don't eliminate variables. Yes, it's not ideal for choosing a part to buy, but reviewing a system with those bottlenecks is even less useful, because you may have different bottlenecks and then you suddenly have no idea how it will perform, whereas with isolated testing you can estimate the difference yourself.

    • @vincent_9494
      @vincent_9494 11 months ago

      @@nhand42
      "But it seems GPU reviews have fallen into the trap of reviewing specs like it's the specs that matter."
      Is it possible that it is just you (plus a few others) who misinterpret those reviews? Because I certainly do not view them that way 😉

  • @AliceC993
    @AliceC993 1 year ago +376

    If there's any situation where a 4060 is _worse_ than the preceding 3060, it's a bad card. Sorry.
    I also cannot justify a $300 card having an AD-107 GPU.
    It's not AS disgustingly bad as the 4060 Ti, at least.

    • @CraftComputing
      @CraftComputing 1 year ago +42

      That matters on the 'hold accountable' side of GPU reviews, not the side of "which of these two should you buy because you have $300 for a GPU today"
      That's the point of this video. Both testing methods are useful and valid.

    • @Tuxic
      @Tuxic 1 year ago +127

      @@CraftComputing but why do you limit the choice to these two GPUs at $300 when there's plenty of better fish in the sea? That side doesn't make sense in my mind, and feels apologetic even if you explicitly say that it's not.

    • @mikeymaiku
      @mikeymaiku 1 year ago +4

      @@Tuxic let him just whip out every 300 dollar GPU in existence that's relevant today.

    • @joeyjojojr.shabadoo915
      @joeyjojojr.shabadoo915 1 year ago +6

      @@Tuxic Same Generation, Direct Competitors that can both utilize all of the latest features offered by each manufacturer.

    • @killingtimeitself
      @killingtimeitself 1 year ago +5

      @@Tuxic Older GPUs and other manufacturers are ignored for good reason: same-generation competitors have the same stuff, or very similar stuff, and can generally perform the same tasks, whereas a 10 series card has very different NVENC support and hardware to a 30 series card, for example.

  • @sgtotaku
    @sgtotaku 1 year ago +212

    I will say some of the reviewers have pointed out the differences between the 4060, 4060 Ti, 3060, 3060 Ti, and even 2060, all of which are in the same class, and even in the same rough price range. Removing the bottlenecks with an overkill test bench allows the GPU to be the only factor, and that is what tells me that even though it's a brand new card down in my price bracket, it just doesn't have anywhere near enough power to justify upgrading from my RX 6600.

    • @stupidoldgamer
      @stupidoldgamer 1 year ago +6

      You're not bringing anything new to the party. The reviewers could be missing the point, though? I think maybe you should be comparing power usage per FPS now. It's not just about FPS but the card's ability to manage a low power profile. I am astonished that there are next-gen cards that use less power than older-gen cards and are more efficient.

    • @Bob_Smith19
      @Bob_Smith19 1 year ago +6

      @@stupidoldgamer Power usage is why I upgraded to a 4070. It’s not the best card, but since I game maybe 20% of the time the computer is on, the power usage has made a big difference. Idle power matters to a lot of people.

    • @Fay7666
      @Fay7666 1 year ago +7

      @@stupidoldgamer Power consumption has improved, but unless you live in one of those places where they're price gouging for electricity it doesn't really make sense to be a factor unless you have Pascal or lower (and even then I'd say Pascal and Maxwell are still alright-ish on that front).

    • @marv6424
      @marv6424 1 year ago +1

      @@mongoworldofthebizarre Tom's Hardware does that. Their GPU charts are very interesting.

    • @MrGts92
      @MrGts92 1 year ago +11

      The power efficiency argument is total BS as an excuse for buying a poorly priced card.

  • @loki42dnd
    @loki42dnd 1 year ago +89

    Remember when Nvidia cut the memory bandwidth in half from the 3060 to the 4060?

    • @tyre1337
      @tyre1337 1 year ago +27

      jeff: "i'm gonna pretend i didn't see that"

    • @loganmedia1142
      @loganmedia1142 11 months ago +14

      Because it is really a 4050.

    • @bdhale34
      @bdhale34 11 months ago +3

      They cut the PCIe lanes in half as well between those cards.

    • @xPhantomxify
      @xPhantomxify 11 months ago +1

      The real problem is people always skipping benchmark videos to where they benchmark Cyberthrash 2077 and then act as if it's the most next-gen game with the best graphics. The game is terribly unoptimized, rushed and just a sellout tech demo for Nvidia. STOP WATCHING BENCHMARKS FOR CYBERTHRASH 2077. 95% OF PC GAMERS WILL NOT PLAY THIS GARBAGE GAME. The same goes for TLOU, Jedi Survivor and some others. The 4060 is incredibly power efficient for the performance and price. Efficiency > raw performance with terrible pricing and power draw.

    • @youreright7534
      @youreright7534 11 months ago +4

      ​@xPhantomxify tf are you on about? You clearly bought a 4060 and are trying to justify it. Cyberpunk is an incredible game in 2022-23, easily the most beautiful game in existence and a phenomenal story. The path tracing is just unreal, even the ray tracing is incredible. 4060 is just garbage. 4070, 4080, and 4090 are the only cards worth buying this gen. 3060 ti beats 4060 pretty badly while being cheaper

  • @LordApophis100
    @LordApophis100 1 year ago +125

    The 4060 would have been a good 4050 at $250. This generation Nvidia just moved every card one up in the stack and increased the price.

    • @EV3RGREEN
      @EV3RGREEN 1 year ago +13

      This.

    • @xPhantomxify
      @xPhantomxify 11 months ago

      The real problem is people always skipping benchmark videos to where they benchmark Cyberthrash 2077 and then act as if it's the most next-gen game with the best graphics. The game is terribly unoptimized, rushed and just a sellout tech demo for Nvidia. STOP WATCHING BENCHMARKS FOR CYBERTHRASH 2077. 95% OF PC GAMERS WILL NOT PLAY THIS GARBAGE GAME. The same goes for TLOU, Jedi Survivor and some others. The 4060 is incredibly power efficient for the performance and price. Efficiency > raw performance with terrible pricing and power draw.

    • @EV3RGREEN
      @EV3RGREEN 11 months ago +6

      @@xPhantomxify You're absolutely wrong, and that's because Nvidia is messing with you. Cyberpunk is terribly optimized, true.
      However - it's a 4050. Not even a Ti. The 4060 uses an "AD107" chip. Afaik, the last time Nvidia released a card using an xx7 chip it was the 1050 Ti (GP107). So it's not more power efficient, since it's just not a 60 class. It makes it easy to claim a power efficiency gain when you're releasing a product with the same class name, but it just isn't the same class at all - and it confuses most people.

    • @EV3RGREEN
      @EV3RGREEN 11 months ago +1

      @@user-by2eh6el9y I'm living in Western Europe where the 4060's MSRP is 329€, while the 3060's MSRP was 325€.
      Now if you compare apples to apples: the 1050 (GP107 chip - the 3050 used a GA106 chip, a cut-down 3060) was 125€ MSRP, while the 4060 (AD107) is more than a 2x increase in price. Even the 1050 Ti (GP107) was cheaper (155€), and so was the 3050's MSRP (250€). All prices not adjusted for inflation, as it is slightly different per country in Europe - but inflation isn't 2x, even for a downclassed GPU. Nvidia does it however.

    • @xPhantomxify
      @xPhantomxify 11 months ago

      @@EV3RGREEN The performance increases in the low to mid range have been terrible to say the least, yes. 3060 to 4060, 3060Ti to 4060 and 4060Ti, 3070 to 4060, 4060Ti. But I do think the 4060 is still a good card for the price and especially the power draw(110W!) I think someone buying a 4060 will not really play all these heavy singleplayer games anyway. Most likely just indie/multiplayer games which will be very fine.

  • @carmonben
    @carmonben 1 year ago +40

    Jeff, I think the situation is different from the past mostly because both nvidia and amd are still selling their "last generation" product stack brand new for "reasonable" prices, and judging from the L/D ratio, people seem to feel that those cards should be considered part of their CURRENT lineup

    • @CraftComputing
      @CraftComputing 1 year ago +4

      That's absolutely part of it.

    • @nohay4549
      @nohay4549 1 year ago +22

      @@CraftComputing It wasn't part of your video, though. That's everyone's concern.

    • @FlakAttack0
      @FlakAttack0 1 year ago +6

      100% this is why I disliked the video. Actions speak louder than words: AMD/Nvidia have priced last and current generations as if they are all part of the same generation. That leads right into the next words that should be said by any competent PC builder: the RX 7600 and RTX 4060/4060 ti should be avoided in the majority of situations. As a matter of principle, I don't support buying last gen either unless you do not have a choice.
      I will wait until the next generation. They can keep these stinkers.

    • @mikeymaiku
      @mikeymaiku 11 months ago

      @@FlakAttack0 And the more stinkers they release, the more "these stinkers" stay on the lot.
      What's stopping Nvidia from going "the 4060 is now a 5060 because we still have so much die stock sitting around"?
      Didn't AMD do this with all their RX lineup? For several years, to boot?

  • @WarriorProphet
    @WarriorProphet 1 year ago +100

    I dunno, when I can pick up a 6600 XT for $220, why in heck would I pay $299 for a 4060???

    • @pk_ripper
      @pk_ripper 1 year ago +2

      Because in the end, no one cares what you have. Have fun.

    • @nathangamble125
      @nathangamble125 1 year ago +12

      @@pk_ripper That doesn't answer the question, it's just irrelevant.

    • @lukew2194
      @lukew2194 1 year ago +1

      $220 for 6600 xt, I will sell you one for $170

  • @CheapBastard1988
    @CheapBastard1988 1 year ago +63

    I appreciate that the standardised testing isn't necessarily a realistic condition for end users, but just building a random cheap (new) system is not the solution as it would only compare to a very specific system. This testing methodology is worse I think.
    Realistically, a card in this price category is very likely to be installed into an existing build. Very likely powered by an R5 3600 on a B450 motherboard. And I expect this card to run very badly in a board with PCIe 3.0 because this graphics card only has 8 PCIe lanes. So there's already a testing discrepancy with your "realistic" test.
    You'd need to set up several scenarios if you want to test realistic performance properly with all kinds of older hardware. "How would it work with a 9900K, or maybe a 4th gen i5? Is it now bottlenecked by PCIe bandwidth? Would SMT on a quad core CPU matter? What about a hexa core?" The list would go on and on. Ultimately the numbers from just a single hardware config (like in this video) are pointless (from a realistic scenario perspective).

    • @CraftComputing
      @CraftComputing 1 year ago +4

      I'm not saying standardized test benches shouldn't be used. In fact, I 100% endorsed them for how they're being used by the channels I mentioned.
      The problem is, again, they don't tell a potential buyer of a GPU if it can game at 1440P and 120 FPS with a couple tweaks. The only information it can provide is "74 FPS at ultra settings, and 16% slower than this other card on the same machine". It reviews the hardware, not the experience.

    • @nohay4549
      @nohay4549 1 year ago +19

      @@CraftComputing In that regard your title should read "$800 PC with 4060/7600 review", not just "4060/7600 review". You could add a 6700 XT to this build and get even better performance for almost the same price, if not a little more.

    • @CraftComputing
      @CraftComputing 1 year ago +7

      Would you have clicked?

    • @nohay4549
      @nohay4549 1 year ago +10

      ​ @CraftComputing I am a sub and actually find your other videos very helpful and sometimes even binge watch even the old ones, not because of clickbaits. Haven't watched your streams but never missed a video since I started my own homelab. So, yeah I would've clicked. I was really expecting something more sensible than what you presented as you usually do but things took a different turn. All the best

  • @Phynellius
    @Phynellius 1 year ago +23

    The numbers reviewers have generated do tell a story: the story of how Nvidia would rather sell you hardware designed with their specific software solution in mind to make it adequate.

  • @ha231
    @ha231 1 year ago +13

    "I avoid drama like the plague" is what every drama llama says.

    • @CraftComputing
      @CraftComputing 1 year ago

      Find another drama piece on my channel. There's hundreds to choose from.

  • @Raphael_Campos
    @Raphael_Campos 11 months ago +14

    Dude managed to piss everyone off and still make a bad review.
    No comparison to other generations, and a useless price comparison, when the xx60 was always on par with the previous generation xx70 - except for this gen.
    Also, there's a reason standardized testing exists: it's to remove the CPU from the equation, testing the card's raw power, as is testing rasterization performance, since DLSS/FSR isn't available in every game.

  • @rustybobdotca
    @rustybobdotca 1 year ago +23

    The biggest problem with your argument is that you leave out the 3000 series. At times the 4060 barely beats the 3060. At times it gets embarrassed by the 3060 (Last of Us at 1440P). It gets embarrassed by the 3060 Ti. The fact that the 4060 can't beat the previous gen card one step above it is embarrassing.
    If the 4060 was priced at $200, nobody would be upset. It's not. It's barely faster than the RX7600. It loses to the RX7600 in ray tracing at times!

    • @CraftComputing
      @CraftComputing 1 year ago

      The 3060 doesn't matter to a consumer with $300 in their pocket at Microcenter today.

    • @hoihoi8
      @hoihoi8 1 year ago +16

      @@CraftComputing Who shops in an actual store? Amazon has tons of 3060s.

    • @FukurouMafia
      @FukurouMafia 1 year ago +19

      @@CraftComputing yes it does. and it's sad to see that someone's willing to die on this particular hill, instead of letting it go. you're obviously wrong.

    • @mikeymaiku
      @mikeymaiku 1 year ago

      I still don't understand this stance. If the GPU was 200 dollars, what type of technology are you expecting? Cost of services, cost of employees, cost of business has gone up; how can you expect a company not to increase prices when their bottom line / operating costs also increase?

    • @vincent_9494
      @vincent_9494 11 months ago +1

      @@CraftComputing
      So you say that just because somebody has $300, he/she should spend all of it automatically, even if he/she can buy cheaper (even much cheaper) alternatives with a far better performance/price ratio? For example the RX 6600/XT or Intel Arc, all still sold brand new?
      "Hell yeah, throw that extra $100 into it, baby, you will get on average 20% more performance for 50% more money, but DLSS3 is the king and it consumes somewhat less electricity, well worth it! Do not even think about anything else!"
      And you are trying to pretend that you are on the side of the budget customer and that your intention is to give better information to them than those who you criticise? WTF? Do not be surprised if common folks grab their pitchforks after listening to you!

  • @CoryMT
    @CoryMT 1 year ago +168

    My opinion about DLSS & FSR is similar to SLI back in the day. If it's not supported by all games and can't be relied on, then don't buy a card using that technology as a criteria. Similarly I don't want benchmarks to take it into consideration.
    From a channel that endorses buying used older generation technology and showing that it's still usable, I find it disappointing that there were no comparisons to older generation cards.
    We as consumers don't need to consider Nvidia and AMD needing to make more money, and just roll over for them, when we feel like they are trying to shift the value equation into their own favor even further.
    Generation to generation these cards are a bad value, and even if you don't want to sound like an apologist, you do.
    I assume that you wanted to spin your video into this direction to stand out in the sea of critical videos saying the same thing.

    • @mjc0961
      @mjc0961 1 year ago +2

      Well said. It's been discovered that AMD is intentionally blocking games from introducing DLSS, based on that alone we cannot factor DLSS into our purchasing decisions. And having to limit benchmarks to only the games that didn't get kneecapped by AMD is no good.

    • @llynellyn
      @llynellyn 1 year ago +4

      My opinion about DLSS & FSR is the same as FXAA, it's just a way of sacrificing quality to raise FPS, may as well just turn down the shadow quality instead.

    • @dex2531
      @dex2531 1 year ago +5

      @@mjc0961 I don’t think AMD is blocking anything; developers are choosing FSR because it’s much easier to implement. Most of the game engines have FSR support out of the box and it literally takes minutes to enable, then further optimization to make it work better; DLSS is not as simple and takes much more development time to implement. FSR is also 100% open source, supports all graphics cards and has very similar gains on Nvidia cards (yes, in upscaling, Hardware Unboxed has explained this); the only downside is the slightly worse image quality.

    • @CraftComputing
      @CraftComputing 1 year ago +9

      FSR works on any card. DLSS doesn't. If you had to choose one to implement, which would it be?
      Also, AMD develops FSR, so of course when they sponsor a game, they're going to use their tech in that project. Why pay for DLSS to be included when it's not your software anyway?

    • @minhduong1484
      @minhduong1484 1 year ago +13

      ​@@CraftComputing "FSR works on any card. . ." Except his argument was FSR and DLSS was not on every game which would mean reviewers could not use some games as a comparison unless they turned off FSR and DLSS. Which is what they do.

  • @SB-pf5rc
    @SB-pf5rc 1 year ago +7

    Videos like this are why YouTube removing the dislike button is bad for the viewer. There's no way to flag bad/misleading content now.

  • @Luinzito
    @Luinzito 1 year ago +11

    can you smell it? it's a nvidia fan

  • @BansheeBunny
    @BansheeBunny 1 year ago +82

    4:30 Nvidia does not want reviewers to use rasterization to test their cards because it would show only a small uplift from the last generation, regardless of what components you use in your bench. The other problem is DLSS (pick a version) is only available in a limited number of games and Jedi Survivor is not one of them.

    • @mapesdhs597
      @mapesdhs597 1 year ago +7

      Plus, direct native raster comparisons make the 4060 look especially bad at 1440p.

    • @igelbofh
      @igelbofh 1 year ago +1

      So, technology has plateaued. Game devs should become less stupid and wasteful. Games with visuals that were easily achievable in 2015 on a 960, if released now, would require a 4080, for no good reason and no improvement in visuals.

    • @marcogenovesi8570
      @marcogenovesi8570 1 year ago

      @@igelbofh (laughing in crappy console port)

    • @efad3215
      @efad3215 1 year ago +2

      @igelbofh Has tech plateaued, or is there just not as much R&D on rasterization compared to the R&D on DLSS/FSR/frame-gen/the AI boom?

    • @Karthex
      @Karthex 11 months ago +2

      For me it's this, very few of the games I play have DLSS support so the raster performance is more important.

  • @adamhafiddin9564
    @adamhafiddin9564 1 year ago +33

    Honestly it all comes down to the value of these cards. Why would anyone buy a 300-350 dollar card (based on the prices I'd seen in my country) when you can achieve the same performance with an RX 6600/6650 XT for around 100-ish dollars less? The last gen cards are the much more obvious choice given the lack of performance gain over the previous gen.

    • @CraftComputing
      @CraftComputing 1 year ago +3

      Someone would buy a $300 card if shopping for a new $300 card and the options are between $279 and $299. Gains over the previous generation don't matter to someone wanting to buy a card today, and are talking points to rake nVidia over the coals about, not tell consumers to maybe buy a new bike instead.

    • @DeathlyShadowXD
      @DeathlyShadowXD 1 year ago +20

      @@CraftComputing Wouldn't they just buy a much better last gen card at $300 then?

    • @emlyndewar
      @emlyndewar 1 year ago +10

      @@DeathlyShadowXD he doesn’t get it at all. Maybe he needs to calm down on the craft beer.

    • @rayjaymor8754
      @rayjaymor8754 11 months ago

      @@CraftComputing > Gains over the previous generation don't matter to someone wanting to buy a card today
      Sort of.
      For sure, a customer that is buying a brand new card or isn't upgrading has very little to lose with buying a 4060 compared to a 3060 Ti.
      (Except for the fact that in most cases a 3060Ti is still slightly cheaper, and has better performance).
      The issue is that for what the customer is paying, the performance SHOULD be better.
      nVidia are clearly shifting expectations and price points and this deserves to be called out.

  • @RyugaHidekiOrRyuzaki
    @RyugaHidekiOrRyuzaki 1 year ago +61

    In a decade of YouTube, this is the first time I'll bother to dislike a video, despite being a long-term fan of the channel. I won't repeat what everyone else is saying in the comments.

    • @CraftComputing
      @CraftComputing 1 year ago +6

      I respect your opinions. You can politely disagree with mine. Can't wait to watch your comprehensive coverage of GPUs when I fade into obscurity.

    • @kazuhu580
      @kazuhu580 1 year ago +25

      @@CraftComputing Bro thinks he's important 💀

    • @Doug-mu2ev
      @Doug-mu2ev 1 year ago +11

      @@CraftComputing your video was so powerful it motivated someone to break a decade of silence! That is impressive!

    • @CraftComputing
      @CraftComputing 1 year ago +3

      Michael Superbacker was the first comment on this video, so you've got to go further than that to impress me.

    • @emlyndewar
      @emlyndewar 1 year ago

      @@CraftComputing your head is so far up your arse. This seems extremely out of character for you. Hey, keep attacking the fanbase though. Great strategy… Where’s that unsubscribe button?

  • @TR.Pixels
    @TR.Pixels 1 year ago +19

    When a 3060Ti is faster than a 4060 and is ever so slightly more or the same price, it kinda makes sense that the majority of reviewers wouldn't recommend it.
    Also high graphic settings are needed to remove the CPU bottlenecks as much as possible and put the majority of the strain on the GPU, changing settings just because it's a slower card would give cards a deceiving advantage which is not accurate to relative performance of other cards at all.

    • @nathangamble125
      @nathangamble125 1 year ago +1

      Using lower settings in GPU benchmarks can be useful in some situations. Sometimes one GPU will scale better at lower settings than another, especially when differences in VRAM capacity and bandwidth become relevant. The relative performance of different parts of the GPU (e.g. ROPs, TMUs, shaders) can also affect performance scaling at different settings.
      Testing at maximum settings to eliminate bottlenecks makes sense, but reviewers should _also_ test at lower settings, at least in some cases.

    • @TR.Pixels
      @TR.Pixels 1 year ago +2

      @@nathangamble125 Thats where different resolution testing comes in.

    • @cardboardsnail
      @cardboardsnail 1 year ago

      That really depends on where you live. On Newegg for example, the 3060 Ti isn't cheaper than the 4060 at the time of writing. 4060 is going for $300, 3060 Ti is going for $333.

    • @minhduong1484
      @minhduong1484 1 year ago

      @@cardboardsnail From one of the reviews, I remember them recommending the 3060 if it had more VRAM for the same price as it would be more future-proofed as games are leaning towards more VRAM. The 3060 Ti was a soft recommend if the viewer wanted better performance for slightly more money.

    • @loganmedia1142
      @loganmedia1142 11 months ago

      @@minhduong1484 I'm inclined to question whether the 3060 has enough processing power to really make use of the additional RAM it has.

  • @Kleptophobia
    @Kleptophobia 1 year ago +21

    It's the memory downgrade.
    It's the mere 20% performance upgrade over the RTX3060 while the 4090 is 60% faster than a 3090.
    Nvidia is not making much effort at delivering Ada's generational upgrades to the midrange.

    • @CraftComputing
      @CraftComputing 1 year ago

      170W vs 110W. There are improvements, just not to performance.

    • @Pand0rasAct0r_
      @Pand0rasAct0r_ 1 year ago +6

      @@CraftComputing the wattage point makes no sense as you won't save much between the two.

    • @CraftComputing
      @CraftComputing 1 year ago

      Unless electricity is expensive in your region, or you don't want an extra 60W of heat in your gaming room.
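For scale, a minimal sketch of what that 60W delta can mean in running cost; the daily hours and electricity price are assumptions picked purely for illustration:

```python
# Hypothetical yearly cost of an extra 60W of GPU power draw while gaming.
# Usage hours and electricity price are assumptions, not figures from the video.
extra_watts = 60        # 170W (3060) - 110W (4060), per the exchange above
hours_per_day = 3       # assumed gaming time per day
price_per_kwh = 0.30    # assumed price per kWh; varies a lot by region

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
cost_per_year = extra_kwh_per_year * price_per_kwh
print(f"{extra_kwh_per_year:.0f} kWh/year, about {cost_per_year:.0f} per year")
# ~66 kWh/year, ~20 per year at 0.30/kWh; proportionally less where electricity is cheap
```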

    • @SIW808
      @SIW808 1 year ago +6

      @@CraftComputing I don't think anyone was complaining about 170W power consumption. We just wanted faster products.

    • @Kleptophobia
      @Kleptophobia 11 months ago +1

      ​@@CraftComputing
      That lower power limit still leads to worse performance per watt relative to its generational peers.
      Out of the box, the RTX 4060 is less power efficient compared to the RTX 4090, RTX 4080 and even the 7900 XTX, according to TechPowerUp's tests.
      Less VRAM, less efficient, less generational uplift. But hey it's only 115W.

  • @jeffreyparker9396
    @jeffreyparker9396 1 year ago +79

    Please explain how those test benches are not helpful. By eliminating bottlenecks elsewhere in the system you see what the maximum performance of the card is. Compare that to similarly done CPU benchmarks and the lower number should be roughly what you can expect, assuming you didn't create an issue with cooling or somewhere else in the system. By eliminating bottlenecks you get the raw performance of the part and by looking at the raw performance of each part when building a system you can reasonably see where your bottleneck will be and that will show you an estimated performance.
    By testing with hardware that might be reasonably paired with the part, in order to get the same information I outlined above you essentially have to test every single possible combination. Basically by allowing bottlenecks to happen in different parts of the system this data actually becomes useless unless someone builds the exact system that is tested while the other reviewers tests remain useful no matter what parts are used in the build.
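A minimal sketch of the estimation logic described above, where the GPU-bound and CPU-bound numbers come from standardized, bottleneck-free reviews and the expected in-game figure is roughly their minimum; the example numbers are placeholders, not measured data:

```python
# Rough "slowest component wins" estimate from isolated benchmarks:
# whichever part caps out first sets the frame rate you actually see.
def estimate_fps(gpu_bound_fps: float, cpu_bound_fps: float) -> float:
    """Expected in-game FPS is roughly the lower of the two isolated results."""
    return min(gpu_bound_fps, cpu_bound_fps)

# Placeholder numbers purely for illustration (not from any review):
gpu_fps = 95.0    # GPU tested on a top-end CPU (no CPU limit)
cpu_fps = 120.0   # CPU tested with a top-end GPU (no GPU limit)
print(estimate_fps(gpu_fps, cpu_fps))  # -> 95.0, the GPU is the limit here

# With a budget CPU that only manages 70 FPS in the same title, the CPU becomes the cap,
# and GPU-to-GPU margins compress toward zero:
print(estimate_fps(gpu_fps, 70.0))     # -> 70.0
```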

    • @jeffreyparker9396
      @jeffreyparker9396 1 year ago +26

      Also the different settings argument is completely invalid because the Gamers Nexus, LTT and JayzTwoCents reviews included lower settings that were more reasonable for this card.

    • @diabloii72
      @diabloii72 ปีที่แล้ว +24

      This was honestly the most BAFFLING part of the video to me.

    • @CraftComputing
      @CraftComputing  ปีที่แล้ว +10

      I didn't say the test benches weren't helpful. I said that for compiling absolute performance lists, that is EXACTLY HOW IT SHOULD BE DONE.
      What I also said was that it does little to show a potential buyer of a budget card what performance they should expect in a budget PC. And THAT, you could argue, is a more important part of a GPU review than evaluating the product stack as a whole.

    • @jeffreyparker9396
      @jeffreyparker9396 ปีที่แล้ว +23

      @@CraftComputing and I explained how a buyer can derive the potential performance from the absolute performance numbers and how doing it differently is less helpful.

    • @jeffreyparker9396
      @jeffreyparker9396 ปีที่แล้ว +15

      @@CraftComputing You said that it was unhelpful for buyers; I explained exactly how it helps buyers. The fact that you talked about performance lists, which I never once mentioned, indicates that you didn't actually read my entire comment.

  • @liffordius8630
    @liffordius8630 ปีที่แล้ว +20

    Hi Craft Computing, I think you forgot that the previous 30 series and 6000 series both have DLSS 2 and FSR 2. The feature the 4060 brings to the table is DLSS 3 frame generation, which you did not showcase. That being said, even in your comparison the 4060 with DLSS still performs within margin of error of a 3060. You can also still buy new-in-box last-gen cards with warranty.

  • @zom.
    @zom. ปีที่แล้ว +76

    Hey Jeff, I do see your point...
    However, there are some key points that I believe you are missing, especially when targeting lower-cost builds and cards. It's just impossible to ignore the fact that Nvidia's value proposition isn't great this time around. When options such as the 6700 XT, 6600 XT, 3060 Ti, and last gen's higher-VRAM 3060 exist, all with mostly better gaming performance (give or take the 3060), I believe it's just wrong to point people toward the budget end of current-gen graphics from team red or green. The only reason I believe AMD remotely weasels out of this is their consistently aggressive pricing (not to mention constant sales) on the low end.
    I agree with your argument that FSR/DLSS is a perceived user value add and an experience add, and I agree that we should be using that data in reviews. However, I don't think it can crutch bad raster performance in terms of card value. There is still quite a lot out there that does not support these upscaling techniques, and therefore I don't think it can be put into the equation when calculating complete gaming value. Also, take into account that the 3000 series supports DLSS 2.0. It gets even more complicated with FSR 2.0.
    I believe your argument stands while comparing within a generation, but I don't think that's what people are mad about. It's the erosion in value, the loss of generation-over-generation improvement, and better options from last generation at the same price. Cross-comparison between generations is what kills this argument in my mind.
    (Edit: I'm seeing other people talking about PCIe x8 and the smaller memory bus. I think the point about PCIe x8, especially on older systems still on PCIe 3.0 where x8 is a huge hit, is a massive one, especially as these types of consumers will be shopping at the same price point.)

    • @CraftComputing
      @CraftComputing  ปีที่แล้ว +5

      What are your alternatives if the answer is "everything sucks, don't buy anything"? I hear the PS5 and Series X are pretty great.

    • @Fay7666
      @Fay7666 ปีที่แล้ว +53

      @@CraftComputing 6600/6650/6700/7600? Even the 3060s are still being sold new. Not to mention the used market if you're willing to go for that.

    • @RawmanFilm
      @RawmanFilm ปีที่แล้ว +2

      This.

    • @MayonesaMayhem
      @MayonesaMayhem ปีที่แล้ว +6

      I think the single largest hit to the value of this card and the Ti is the cut to PCIe lanes. More often than not, I see people with older systems buy into 60-class cards, so I'd urge anyone with a PCIe 3.0 system to consider a different card.

    • @zom.
      @zom. ปีที่แล้ว +24

      @@CraftComputing Well, the point I was making was that the alternatives are last gen in the budget segment. NVIDIA is competing poorly with itself. That was literally the point of half of my post! 😂
      Last gen is cheap, performant, and abundant!

  • @Pheatrix
    @Pheatrix ปีที่แล้ว +28

    You have valid points.
    However, there is one big point I can't agree with:
    Last-gen card comparisons are important!
    Especially if they are still available and cheaper. A review that ignores a cheaper alternative you can still buy is ignoring the reality. Why would I buy a shiny new card for more money if I can buy a less shiny old (not used!) card that has better performance for less money?
    That is not only important for holding the companies accountable, but also important for making an informed choice when spending your money.

    • @Timi7007
      @Timi7007 11 หลายเดือนก่อน

      This! Even with the focus on price-point this review was lacking the 3060 and 6700.

  • @RmnGnzlz
    @RmnGnzlz ปีที่แล้ว +34

    He's right guys, we have to think about sustainability for the company. They can't just give us more, look at their finances lately, they'd go bankrupt.

    • @cristiano14068
      @cristiano14068 11 หลายเดือนก่อน +3

      Who cares if they go bankrupt? Does Nvidia care if you (the consumer) go bankrupt? No. Then why care about the opposite? We need quality GPUs with good pricing. Otherwise, let them go bankrupt; some other company will fill the gap. Simple.

    • @AIC_onyt
      @AIC_onyt 11 หลายเดือนก่อน

      Nvidia is a datacenter AI manufacturer anyway. The consumer cards don't mean shit to them. They sell us a 4050 as a 4060 with lower memory bandwidth, while their servers have 100+ GB of HBM VRAM (afaik). Every machine they sell to a data center makes them 300k. Consumer GPUs are a waste of silicon for them.

  • @queenform
    @queenform 11 หลายเดือนก่อน +5

    imagine being THIS CONFIDENTLY wrong

  • @Kelekona_808
    @Kelekona_808 ปีที่แล้ว +20

    I can't get behind the car comparison, because with cars you come down to comfort/style/quality-of-life differences between products, while between computer parts you're mostly just looking at raw performance across the stack. Sure, there are quieter cards and cards that lean toward a certain aesthetic, but they are primarily judged on how they perform and less on the other factors.

    • @CraftComputing
      @CraftComputing  ปีที่แล้ว

      So my plan for an automotive spinoff channel should just be to test every car for top speed on a banked oval, then? Or are you saying there is room for nuance in reviews?

    • @kopasz777
      @kopasz777 ปีที่แล้ว +17

      @@CraftComputing let me know when subjective aspects become benchmarkable. A benchmark is to provide the ground truth, forming an opinion is on the viewer's end.

    • @Kelekona_808
      @Kelekona_808 ปีที่แล้ว +4

      @@CraftComputing There's room for nuance on quality of life features all things being equal. Price to performance for an entire line/level of card probably will take priority before nuance when comparing between products for me.

    • @davemichael1859
      @davemichael1859 ปีที่แล้ว +12

      @@CraftComputing That analogy is plain wrong. Reviewers are testing cards ENSURING that they receive the same workloads; then and only then can a reviewer show the RELATIVE performance gain from generation to generation or family to family. DLSS has this particular issue that it isn't supported on older graphics cards or non-Nvidia ones. How should we test the relative performance of the 4000 series vs. the 3000 series if the workload isn't the same? Maybe we can do it for 5000 vs 4000, but right now it's impossible. There's some analysis of DLSS performance by Hardware Unboxed, and they even went out of their way to ensure that the DLSS version is THE SAME. In your vid, one comparison is an AMD card + FSR vs an Nvidia card + DLSS; what am I going to do with that? I can't use that information. I can, however, use the information that card X is ~10% faster than card Y at a 20% price increase. I can extrapolate/estimate based on that. I can't do it on "nuances".
      As for testing with components which may bottleneck your card... it's tricky and time-consuming. I'd love to see a graph of how different CPUs affect a card, but you provided a single test point. A single test point is useless to me. Finding out which CPU family starts to seriously bottleneck a card is something I can use to see if I need to upgrade or not.
      And btw, "top speed on a banked oval" is analogous to testing games on low settings, not ultra.

  • @nohay4549
    @nohay4549 ปีที่แล้ว +33

    The only issue with the 4060/4060 Ti and 3060/3060 Ti is that they are not offering much performance uplift, compared to what the 30 series offered over the 20 series and the 20 series over the 10 series, for the same or a higher price. Previously, 60-class cards matched or beat the 70-class cards of the older generation. That didn't happen with the 40-series cards.

    • @SomePotato
      @SomePotato ปีที่แล้ว +4

      Absolutely. Nvidia has been taking the piss for several generations.

    • @mkvalor
      @mkvalor ปีที่แล้ว +1

      It's faster than the 3060. And it runs cooler and uses less power. So let's not pretend there's no benefit whatsoever.

    • @andreewert6576
      @andreewert6576 11 หลายเดือนก่อน +1

      @@mkvalor Yes, but the new 60 should match or beat the old 70 card. This has been true for the 660, 760 (which *was* a relabeled 670), 960, 1060, 2060, and 3060. For a decade, we had steady progress: the same performance for less money every generation.
      Then, with the 2xxx series, we got the same performance for the same money. Sure, you could buy a 2060, but it cost the same as a 1070. Since then, Nvidia have found ways to make it look even worse.

  • @garwynrosser8907
    @garwynrosser8907 ปีที่แล้ว +60

    As you said, "it's NVIDIA's fault". So we don't have to accept it. They have to sell cards to us... We aren't forced to give them money.
    Additionally, most of these reviews do focus on performance. And we are being given cards that perform equal to or only slightly better than last gen, at much higher prices, with the exception of the top-tier cards you want us to ignore.
    They advertise the extreme performance of those top-tier cards, then offer us leftovers, as if simply selling a 4090 gives them permission to put a premium on everything else they sell.
    So, what's the point of this video exactly? It doesn't even sound like reviewers are doing the wrong thing. It seems your argument is that people are greedy and we should accept that. Guess what? No, I'm not going to accept that. I'll hand my money over when they sell something worth the value.

    • @ignacio6454
      @ignacio6454 ปีที่แล้ว +6

      It's called being an Nvidia shill. He's a wannabe shill, you see? He's expecting Nvidia to reach out with checks or GPUs for him to play with and review for free. THAT is the point of the video, brother; I found it for you.

  • @esoel
    @esoel 11 หลายเดือนก่อน +4

    When you have a bad take, and you know you have a bad take, but you still decide to put it out in the world...

  • @FeintMotion
    @FeintMotion ปีที่แล้ว +22

    Steve @ HUB took this argument apart already doe

    • @CraftComputing
      @CraftComputing  ปีที่แล้ว +2

      As I said, I haven't watched or talked to any other reviewers prior to publishing. I'm curious to hear what he had to say.

  • @9techpc
    @9techpc ปีที่แล้ว +28

    I don't think it's the lineup (naming) in general. I think many people expected more (for the price) from this new generation. While the RTX 3000 series cards had a 20%+ uplift in performance over the 2000 series, the 4000-series mid tier is just around 10%, sometimes even less. And while the high end brings the uplift we were hoping for, it costs way more than the previous gen.
    The 4000 series lineup feels more like a "refresh" generation or a "tick" generation.

    • @CraftComputing
      @CraftComputing  ปีที่แล้ว

      How much uplift from Pascal did Turing have? How about Kepler to Kepler 2.0?
      Consumers always want paradigm-changing leaps. That's not always in the cards.

    • @Savant_Ananya
      @Savant_Ananya ปีที่แล้ว +1

      @@CraftComputing So you would be fine with no performance uplift and high prices?
      Just say you are an Nvidia employee; we wouldn't care. This video is just a bad way to trash talk big YouTube channels.

    • @CraftComputing
      @CraftComputing  ปีที่แล้ว

      When did I trash the channels? I spent 4 minutes praising their work, but asking if more data with real-world builds would be beneficial to budget gamers.

  • @MarcoLoves360gamer
    @MarcoLoves360gamer ปีที่แล้ว +71

    This is an xx50-class card, so it needs all the backlash. It should have been priced at $200. I won't be changing GPUs for the next few years.

    • @CraftComputing
      @CraftComputing  ปีที่แล้ว +3

      No, it's not.

    • @aapje
      @aapje ปีที่แล้ว +33

      @@CraftComputing The die size is smaller than nearly every x50 released in the past.

    • @jmiller007
      @jmiller007 ปีที่แล้ว +2

      @@aapje It's made on a smaller process. The AD106 has 22.9B transistors vs 12B on the GA106 (3060).

    • @MarcoLoves360gamer
      @MarcoLoves360gamer ปีที่แล้ว +4

      @@aapje too much cope on his part.

    • @aapje
      @aapje ปีที่แล้ว +14

      @@jmiller007 You conveniently leave out that AD106 is not in the 4060. The AD107 that is in there has fewer shader cores than GA106.

  • @ledoynier3694
    @ledoynier3694 ปีที่แล้ว +26

    As many said, you nicely dodged the 3060 elephant in the room. The only reason I see to buy a 4060 is if you can't find a 3060 in stock anywhere.
    For me the benchmarks should ALWAYS be about raw power in a standardized test rig, simply because, as you mentioned, it's easier to compare cards (and realize the 4060 is worthless with the 3060 in existence), and the performance differences will translate to our lower-powered rigs.
    Knowing the exact FPS we get with our mid-range CPU is not really relevant. Well, it's nice to know, but then you leave behind another question: should I get this or that GPU if I plan to maybe upgrade my CPU later on? The raw-power benchmark gives you figures for what you can get in that case.
    Basically your approach and the other reviewers' are complementary, and I bet if it wasn't so time consuming they would probably do both, but they have to have a life too ^^

    • @MoireFly
      @MoireFly 10 หลายเดือนก่อน +1

      It's sometimes healthy to validate benchmark assumptions by testing illogical combos. However, typically, the key assumption behind testing with high-end everything except the hardware-under-test is validated. That key assumption: that in the vast majority of situations, the vast majority of bottlenecking occurs within 1 component. While that holds, what benchmarks limited by GPU really tell you is what perf that GPU _can_ reach. And similarly, if you then check CPU benchmarks, you can find what perf that CPU _can_ reach. In that way, you can figure out which pairings make sense, i.e. wherein you avoid in investing extra cash into components that don't matter. Oh, and as a cherry on top - other reviewers did test non-maxxed out settings at occasion, and have _in detail_ looked into FSR vs. DLSS, contrary to the lies in this video, and IIRC they all mention the difference in the 4060 review conclusions too.
      This whole video is sheer nonsense. It's a comparison of limited relevance, with a methodology that's poor, and throwing around false accusations vs. other reviewers that even if they were true aren't even all that problematic for the purposes of a review.
      The one tidbit that is kind of relevant is the request to test at non maxxed out settings. That's at least reasonable, albeit potentially infeasible. And the extra effort of figuring out which settings will in many cases not even impact the rankings in the benchmark, making it a moot point. Well - except for specific settings in which one of the competitors has a significant advantage, i.e. raytracing - and hey, professional benchmarkers _do_ split that out.

  • @elihere1242
    @elihere1242 ปีที่แล้ว +13

    I can't agree with you. I am tired of Nvidia draining my wallet, tired of complaining and then buying Nvidia anyway. I have a 3090, but I'm picking up a 6700 XT for my son. No more Nvidia in my house after this.

    • @CraftComputing
      @CraftComputing  ปีที่แล้ว

      You chose to spend $1500 on a 3090. What game that you like to play ISN'T an option on a 3070, or a 4070?

    • @elihere1242
      @elihere1242 ปีที่แล้ว +2

      @@CraftComputing The 4070 was not around; I bought my card about a month after release. I play Witcher, CoD, Metro Exodus, and Red Dead Redemption. The 3070 didn't have 12 GB of memory, which is why I didn't pick it, plus I had $500 cash back on my card, so I decided to pull the trigger.

    • @Bob_Smith19
      @Bob_Smith19 ปีที่แล้ว +1

      @@elihere1242 Be glad you didn't buy a 3070. I ditched it for a 4070 because memory was an issue, especially if you turned on RTX. I don't regret the upgrade one bit. Better performance at lower power draw, and the memory should buy me a couple more years.

    • @elihere1242
      @elihere1242 ปีที่แล้ว

      @@Bob_Smith19 agree

  • @freddiequin6328
    @freddiequin6328 ปีที่แล้ว +75

    "The answer can't be to avoid buying anything new, and to only look for fire sales of previous generation cards."
    Why not? If Nvidia has priced their new product so badly that it is more economical to buy older cards or wait, then that's what reviewers should tell their audience.
    The only way to get Nvidia to lower their prices is to boycott, and that's what reviewers need to encourage us to do.

    • @DioBrando-qr6ye
      @DioBrando-qr6ye ปีที่แล้ว +11

      Especially when those previous generations are still available new in retail stores.

    • @derekphilp9622
      @derekphilp9622 ปีที่แล้ว +11

      Because he is a consumer who needs the new shiny thing.

    • @Doug-mu2ev
      @Doug-mu2ev ปีที่แล้ว +2

      Or reviewers should review the NEW features of this generation of GPUs so we know what we are getting for the “upgrade”. I am sick of GPU reviews where they pretend that GPUs are just CPUs that do graphics. It is insulting to us to leave out things like encode/decode, AI, RT, etc when they are actually important factors to day to day life with a GPU!

    • @derekphilp9622
      @derekphilp9622 ปีที่แล้ว +2

      @@Doug-mu2ev Yeah, you are right. The amount of time my friends and I use AI, RT, etc. each day is almost 90% of our waking time; just 10% of our waking time is for eating, and it's only a matter of time before we use AI, RT, etc. for that.

    • @Doug-mu2ev
      @Doug-mu2ev ปีที่แล้ว +1

      @@derekphilp9622 Neat, you win the sarcasm award. But seriously, there are people who use GPUs for more than just making frames go faster. Have you tried video super res? It’s pretty sweet but it can chug a bit on my RTX 3050, wondering if the upgrade would help.

  • @ha231
    @ha231 ปีที่แล้ว +6

    I finally subscribed last night and woke up to this. I don't know what to think but it's nothing good. Geez

    • @CraftComputing
      @CraftComputing  ปีที่แล้ว

      I said in this video I respect and appreciate outlets who use standardized test benches, but question if we could provide budget gamers with more real world expectations instead of only drag race numbers.
      I also mentioned nuance is a lost art.

  • @rainerduensch509
    @rainerduensch509 ปีที่แล้ว +26

    I wonder how a 3060 Ti or a 6700 XT, which cost almost the same as a 4060, would perform. In my opinion, the 3060 Ti and the 6700 XT are the true winners of the latest launches.

    • @Pand0rasAct0r_
      @Pand0rasAct0r_ ปีที่แล้ว +2

      The 6700 XT has about 10-15% more performance, the 3060 Ti about 10%.

  • @SpawnBootcamp
    @SpawnBootcamp ปีที่แล้ว +12

    All the other reviewers showed was that the RTX 3060 12 GB is a better choice, or that you can get even more performance from an old RX 6700 XT for roughly the same price. That's how bad this card is. Hardware Unboxed ran a short poll which showed that most of us are running 1440p. So why not test it at 1440p?

    • @minhduong1484
      @minhduong1484 ปีที่แล้ว

      Because it would be pointless, as that would make the 4060 look worse. With 8GB of VRAM and lower memory bandwidth, there would be more performance bottlenecks. Frankly, these 1080p benchmarks are as good as it gets for the 4060.

    • @Matias-wu6xp
      @Matias-wu6xp ปีที่แล้ว

      I can assure you most people are not running 1440p monitors no matter what a youtube poll of a tech reviewer says

    • @SpawnBootcamp
      @SpawnBootcamp ปีที่แล้ว +1

      @@Matias-wu6xp tbh I trust the poll of Hardware Unboxed more than your assurance.

    • @Matias-wu6xp
      @Matias-wu6xp ปีที่แล้ว

      @@SpawnBootcamp If you think a YouTube poll from one tech reviewer is representative of most gamers, then you must be incredibly gullible in your daily life.

    • @SpawnBootcamp
      @SpawnBootcamp ปีที่แล้ว +1

      @@Matias-wu6xp I am saying I trust it more than the opinion on youtube of a complete stranger namely you. Learn to read instead of throwing 'gullible' around.

  • @v4714v
    @v4714v ปีที่แล้ว +16

    What an absolute dog take. As a budget-conscious buyer, I watch reviews of GPUs paired with maxed-out parts, in comparison with other cards, to know which to get when no other variables have changed. That's the whole point: which card has the best value for money IN COMPARISON to other cards in a CONTROLLED environment. Then, knowing I don't have maxed-out parts, I adjust my FPS expectations accordingly, but the rank order of GPUs by value for money remains the same when I adjust my parts list to what I can afford.
    By comparison, your video gives me no value, because I won't know what the bottleneck is. To my eye, your review isn't one of the 4060, but rather of the 4060 plus whatever bottlenecking may come from the rest of your build. How can I tell whether the 4060 is the source of some issue, or your non-B- or X-series mobo?
    Your idea of catering to budget-conscious consumers only has merit if you test other GPUs using the very same parts you listed in this video. For this, my proxy is a combination of channels that do 1080p + 1440p settings analyses; then I sometimes look up videos that show gameplay in 20 or so games with summary stats like FPS.
    At best, the idea behind this review needs more workshopping, and at worst the idea is just sensationalism with objectivity as an afterthought. I 100% give you the benefit of the doubt, but come on, man.

    • @CraftComputing
      @CraftComputing  ปีที่แล้ว +1

      As a PC builder, you also do nothing but put together a collection of variables into a PC, the end result of which is more important to your experience and enjoyment than the performance of each individual part in a vacuum.

  • @skatar01887
    @skatar01887 ปีที่แล้ว +4

    Got to say you lost me on this. Just about every reviewer states their tests are done at lower settings, refresh rates, etc. Rather than guessing at what they test with, watch, learn, or at least ask them under what conditions they conduct their tests.

  • @g00mbasv
    @g00mbasv ปีที่แล้ว +9

    A rather arrogant take on the situation. If you ignore the historic performance increases gen over gen, and overlook the fact that this level of performance has been available in this price range for a while, then sure, you can fill your mouth with the 'hurr durr, only I'm doing it right' argument.

  • @eduardofrances
    @eduardofrances ปีที่แล้ว +54

    The 1060, when launched, could play any game of its generation at 1080p at high settings.
    With that in mind, the fact that both cards can't run Cyberpunk or Jedi Survivor well, which are the most recent titles, just tells you how little you are receiving for your money today.
    Our expectations are set by how good cards like the 1060 6GB were in their own generation; if current cards can't run current games well, where is the value of the 4060 or 7600?
    If they can't run current games now, they will struggle with the next ones, so people are investing in cards that won't provide what past generations could. That's not a good value proposition at all.

    • @CraftComputing
      @CraftComputing  ปีที่แล้ว +4

      I went back and looked at 1060 reviews before writing this video. Yes, it could play almost every game at 1080P max at 60 FPS. Our expectations are also much higher now, and 60 FPS isn't the goal. BTW, framerates from my tests at 1440p today... 188, 141, 91, 84, 23 (Cyberpunk w/rtx), 41, and 71. This was also mostly at the second highest presets, with only Cyberpunk cranked to Ultra.
      But I guess contextualizing results isn't your strong suit.

    • @Bob_Smith19
      @Bob_Smith19 ปีที่แล้ว +7

      Is no one going to address the developers releasing incomplete games that are poorly optimized? They use initial buyers as beta testers. I seem to be the only one who has a problem paying for unfinished games. It's apparent they weren't finished when patches later fix the issues.

    • @SB-pf5rc
      @SB-pf5rc ปีที่แล้ว +7

      @@Bob_Smith19 Developers are releasing games optimized for the current console generation because that's where the market is and it's a single target spec. When you have a computer that doesn't have the hardware to meet those demands, you're going to have a bad time. Eventually devs tend to go back and fix things up for people with lower-spec hardware.
      Buying an 8GB card in 2023 is a poorly optimized build.

    • @Fay7666
      @Fay7666 ปีที่แล้ว +1

      1060 launched alongside the PS4/Xbone gen, which was famously a very underpowered generation. So while I get the point, it is a bit disingenuous to make it when they actually gave us good consoles this time around.

    • @eduardofrances
      @eduardofrances ปีที่แล้ว +3

      @@CraftComputing The games that got higher framerates launched alongside cards from previous generations; it's only normal that they get high FPS.
      Current-generation games like Returnal have higher demands, and the recommended settings aren't exactly on par with what the 4060 and 7600 can deliver; these games target the capabilities of current-gen consoles, which these cards don't meet.
      Also, DLSS frame generation and FSR aren't supported in every game, and it's even worse since some games have one or the other and not both. Relying on these technologies for decent performance means the card you choose may or may not take advantage of them, and you may not receive the benefits they provide at all.
      And it is unwarranted that you have to resort to sarcasm to close your reply. I haven't been disrespectful to you; I would have expected the same in return.

  • @ThePhilosogamer
    @ThePhilosogamer ปีที่แล้ว +3

    My god, that opening was like consuming distilled essence of strawman. Not a single hardware reviewer maxes out the game's settings indiscriminately when testing GPUs. They use settings that are within the GPU's power profile and positioning within the company's product stack. There are no 4K ultra with ray-tracing enabled tests for a 60 class card. They do not test lower end GPUs against flagship GPUs from the same or even previous generations. They test them against other GPUs within the same class and similar price points from multiple accessible generations. They do not usually test DLSS and FSR, not because "there's no objective way to measure performance differences." They don't usually test or focus on them because they aren't widely usable in games. A consumer cannot rely on them to get the performance they may be looking for. The 8gb issue IS significant, because it does NOT just affect the most extreme ends of gaming, and is an issue that is becoming increasingly pervasive reducing or extending the longevity of the product being reviewed. Most importantly of all, test benches using expensive hardware to isolate variables are not performed to tell consumers how they can expect a particular GPU to perform in their home setup. They use that expensive hardware to single out GPU performance and nothing more. There WOULD be a problem if this were done in comparison to flagship GPUs. But, again, reviewers compare GPUs within the same or similar weight classes. There is never going to be such a significant difference in scalability that is going to magically reverse the value proposition between cards at such a low power profile and resolution.
    Virtually every reviewer has stated how poor the value proposition is on the 4060. They are not saying that solely or even primarily because of its historically poor generational uplift. They're saying it because you can spend EXACTLY THE SAME amount of money *right now* on another new card and get better performance. *Right now* you can pay $50 less and get nearly identical performance. Right now, you can pay $20 more and get significantly better performance. And all of these are from new cards that should have longer shelf lives than this one due to their larger VRAM. There are no issues with methodology, misleading arguments, or clever ways to spin it. You are basically setting money on fire purchasing the 4060 at its current price point. It is a ripoff.

  • @minhduong1484
    @minhduong1484 ปีที่แล้ว +6

    I'm sorry, but I don't understand your points. Reviewers simply cannot take into account all the hardware differences and individual game settings of each person, so they have to standardize how they do things to get some objective numbers. To be clear, most reviewers do not use the most expensive hardware ever for performance tests. They try to use the exact same hardware so that there is some level of comparison. Generally this means the best hardware, so that it is not the bottleneck in the system. On another point, FSR or DLSS is not in every game or on every card. Some reviewers test them but show them in fairer comparisons, like Nvidia cards only, for games that had DLSS turned on.
    But to address your hardware setup: it does not reflect the vast majority of people these days. Very few people are using an ITX A520 MB with limited support for Zen and Zen 3 processors.
    The other thing is that I do not understand your point about comparing against the best GPUs. As a matter of practicality, the high-end GPUs are the ones released first in every generation. In this case the 4090 was released in Oct 2022, with the 4060 only now being released. So every new card is compared to every previously released card. Could reviewers go back and redo benchmarks every single time a new card is released? Yes, they could, but then every old review would be effectively useless as reviewers shift numbers.
    Then there is your statement, "Why is it today that reviewers think there is such a disparity between top end and bottom end cards?" What? There has ALWAYS been a disparity. This is not the problem today. The problem today is that for almost the same money as the 3060, Nvidia is launching a 4060 that in some cases does not beat the 3060. It is not because the 4080 or 4090 exist. To me, you seem to focus too much on the 4080/4090 numbers and ignore the 4060/3060 comparison. That is why reviewers are saying it does not provide a lot of value for the performance.
    And the last thing: these reviewers are not saying these are the worst cards known to man and that someone upgrading from an older GPU will see no value. They are all saying the 4060, value-wise, is not good compared to a 3060. With some 3060s on sale and having more VRAM for the same price as a 4060, people should probably get the 3060.

  • @randonkbay
    @randonkbay ปีที่แล้ว +4

    Clickbait title says they're all wrong, first 30 secs of video says, "They're not wrong." Never seen your videos before but that's how far I made it.

  • @juanvalencia9378
    @juanvalencia9378 ปีที่แล้ว +20

    Although it's true that Nvidia, Intel, AMD, and especially Apple, among others, don't care about us consumers, that's not entirely the issue. It's about how blatantly bad and prevalent it's gotten. Simply because others do it or have done it in the past doesn't make it OK. The only way things will change is if consumers, not all but enough, protest the status quo with their wallets.

  • @Atheismo9760
    @Atheismo9760 ปีที่แล้ว +4

    21:15 No, it doesn't. It can't beat the 3060 ti, when it should be beating the 3070 ti.

  • @zacharylewis417
    @zacharylewis417 ปีที่แล้ว +6

    The 4060 makes sense for someone who is several generations behind, or who needs to look at power consumption because the cost of electricity has gone up.

    • @CraftComputing
      @CraftComputing  ปีที่แล้ว

      What would you build brand new today? You have $800. No used parts. Make a list here.
      CPU
      Motherboard
      Memory
      GPU
      Storage
      Power Supply
      Case

    • @cardboardsnail
      @cardboardsnail ปีที่แล้ว +3

      @@CraftComputing
      12400F ($150)
      Asrock B660M Pro RS ($95)
      2x16GB Team T-Create DDR4-3200 ($55)
      Asrock Phantom Gaming D 6750XT ($360)
      Team MP33 1TB ($38)
      Thermaltake Toughpower GX2 600W ($67)
      Gamemax M60 Dual Mesh Case ($35)
      Prices from Newegg as of time of posting

  • @iguanac6466
    @iguanac6466 ปีที่แล้ว +18

    This card is aimed at low/midrange systems, but it takes a performance nosedive on anything that doesn't have a GEN4 PCIe slot. How much design+$$$ would it have cost them to support PCIe GEN3 X16? Was the memory bandwidth nerf a cost savings thing or an intentional nerf to make sure this fit the performance profile they wanted for the 4060?
    Too many caveats when I can get a card that is objectively faster for less money.
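    For context, these are the theoretical one-direction PCIe throughput figures for the x8 link this card uses (spec numbers, not measurements):
    PCIe 4.0 x8 ≈ 15.8 GB/s (what the 4060 gets on a current platform)
    PCIe 3.0 x8 ≈ 7.9 GB/s (what it drops to in a Gen3 slot)
    PCIe 3.0 x16 ≈ 15.8 GB/s (what a full x16 card keeps even on Gen3)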

    • @CraftComputing
      @CraftComputing  ปีที่แล้ว

      Ryzen 3000 series introduced PCIe 4.0 in July 2019. That's another point reviewers do bring up, but it only applies to systems that predate the standard. It would make a great conversation in an upgrade video, not necessarily for initial reviews or build guides.

    • @lordofhyphens
      @lordofhyphens ปีที่แล้ว +2

      Nvidia tried to minimize their die size with the 4060. That limits the I/O width in terms of pins. This is also why the memory bandwidth is anemic. They tried to compensate with cache and DLSS3 Frame gen, but if your games don't support the tech and you have less sensitivity to power costs you can do quite well with a 3000 series card from a value proposition.

    • @CraftComputing
      @CraftComputing  ปีที่แล้ว

      @iguanac... The 5600G is PCIe 3.0 only.

    • @Fay7666
      @Fay7666 ปีที่แล้ว +1

      Z490 and 10th Gen were still PCIe3, and those platforms are modern enough not to trash them but to consider a 4060 class card just fine. Even on the new side as I think the i3 10100 is still on sale and recommended as a solid baseline for a budget system. And as someone likely mentioned, 5600G (and similar) are PCIe3.
      Those basically are the kings of budget, that are still modern enough and would likely aim at the 4060 camp.

  • @alhdgysz
    @alhdgysz 11 หลายเดือนก่อน +4

    Pretty ignorant take. No one is complaining about the relative performance against the 4090.
    Cannot wait to hear this on the WAN Show.

  • @ThatsTheMidnightGamer
    @ThatsTheMidnightGamer 11 หลายเดือนก่อน +5

    Imagine trying to call out Gamer's Nexus for bad review practices

    • @christopherjames9843
      @christopherjames9843 11 หลายเดือนก่อน +2

      Yeah, that takes some balls or stupidity.

  • @jasonscherer2631
    @jasonscherer2631 ปีที่แล้ว +7

    From what I've seen from most people I know, 1080p is still really the prime spot for gaming. With inflation people aren't really just upgrading systems in general lately. I'm even still running a GTX1080 due to money constraints and life in general. I would love to have even that much money to just buy a new card right now.

  • @arthuralford
    @arthuralford ปีที่แล้ว +5

    "Things are going to get awkward at LTX." Wiser words have never been spoken

    • @CraftComputing
      @CraftComputing  ปีที่แล้ว +3

      I actually welcome the conversation. My video was about how we as reviewers might serve budget markets better, and in the end, everyone should win.

  • @markkoops2611
    @markkoops2611 ปีที่แล้ว +8

    The point the other reviewers were making was that this doesn't compete with last-gen cards at this price. The 4060 is not competitive at this price unless you're looking for new features such as AV1 or the new DLSS.
    At 1080p, where Nvidia wants to place this GPU, it's not a good value.

    • @ssplayer
      @ssplayer ปีที่แล้ว

      If you adjust the launch price of the 3060 for inflation the 4060 is about 20% cheaper.
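      A rough check of that claim, taking the 3060's $329 launch MSRP and assuming roughly 15% cumulative US inflation from early 2021 to mid 2023 (an approximation):
      $329 x 1.15 ≈ $378 in mid-2023 dollars
      $299 / $378 ≈ 0.79, i.e. about 20% cheaper in real terms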

    • @loganmedia1142
      @loganmedia1142 11 หลายเดือนก่อน +1

      @@ssplayer But it is not in the same class as the 3060. It's actually a 4050. Therefore it is nearly 20% more expensive.

  • @MadsonOnTheWeb
    @MadsonOnTheWeb ปีที่แล้ว +5

    This is a $300 GPU in place of a $200 GPU. There's no such thing as a bad GPU, only bad prices. Quoting RGHD.

    • @CraftComputing
      @CraftComputing  ปีที่แล้ว

      Eggs are $4 today vs $1.50 last year.

    • @MadsonOnTheWeb
      @MadsonOnTheWeb ปีที่แล้ว +1

      @@CraftComputing Eggs and other stuff fluctuate, a lot. I get what you're trying to say, but it isn't a good comparison. Usually prices settle at a new, normal, acceptable value, which isn't happening here, and I guess it won't. Nvidia is doing the Apple move: if people pay, why would they decrease prices?
      There's nothing wrong with fighting for better prices instead of just accepting them.

  • @gucky4717
    @gucky4717 ปีที่แล้ว +8

    There are two problems. First, people can compare the cards by die size. In that case, the 4060 has the size of a 3050, and thus they treat it like that.
    But there is also the problem of cost: a 5nm TSMC chip is MUCH more expensive than an 8nm Samsung one.
    Nvidia also didn't do themselves a favour with the naming scheme. In MY opinion, the 4070 Ti should have been the 4070, and everything below it one class lower. And ALL the prices below the 4080 should have been about $100 lower. In that case some people might have called a $699 4070 expensive, but with almost-3090 performance there wouldn't have been such an uproar.
    For more details:
    4080 = $999
    4070 Ti = 4070 = $599
    4070 = 4060 Ti = $499
    4060 Ti 8GB = 4060 = $349
    4060 = 4050 = $249
    Fun fact: here in Germany, those lower prices are almost the actual street prices right now... with the exception of the 4070 Ti, which is about 100€ more.

    • @CraftComputing
      @CraftComputing  ปีที่แล้ว

      Relating to die size, how much more is TSMC charging for their 5nm process than the 14nm of Pascal?

    • @gucky4717
      @gucky4717 ปีที่แล้ว

      @@CraftComputing There are estimates of $17,000 to $20,000 for one 300mm wafer at 5nm. I don't know the exact number, but 14nm TSMC in 2015 must have been around $4,000 per wafer.
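      A very rough per-die sketch from those wafer estimates, assuming the AD107 die is around 159 mm² (a commonly cited figure, treat it as an assumption) and ignoring defect yield:
      300mm wafer area ≈ 70,700 mm², so roughly 390 die candidates per wafer after edge loss
      $18,000 / 390 ≈ $46 of raw wafer cost per die, before defects, packaging, memory, board, cooler, or margin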

    • @loganmedia1142
      @loganmedia1142 11 หลายเดือนก่อน +1

      Not just die size, but the other specifications, like power consumption, bus width and number of cores. They all point to this being a 4050.

    • @gucky4717
      @gucky4717 11 หลายเดือนก่อน

      @@loganmedia1142 "What would historically be a 4050."
      But in the end, Nvidia decides what a card is named...

    • @vincent_9494
      @vincent_9494 11 หลายเดือนก่อน

      Not only the die size, but also the chip naming convention, core count, and power consumption (basically everything) line up to declare that the 4060 is really a 4050.

  • @ZomgZomg007
    @ZomgZomg007 ปีที่แล้ว +6

    Lol dislike bar shows what the consumers want

  • @greenprotag
    @greenprotag ปีที่แล้ว +40

    An interesting video. I agree about a budget appropriate build, but I find it odd that you reference the comparisons that still make sense, but are not included. There are new, old stock offerings of previous gen cards that negate the "used GPU" issue while still offering good performance. I feel like the 3060ti & 6700xt arguments are VERY valid . 'Without talking to anyone, I bet people are going to mention these cards' yes because they are relevant in your $800 build and when you are a profit oriented company, making a 4060 that doesn't even beat the 3060 ti is kind of a problem when that previous gen card is available NEW and USED with VERY competitive pricing. I think the reason I was personally considering the 4060ti was not necessarily the max performance, but rather the efficiency vs the previous gen, but that was mainly because I was focused on VERY small formfactor with restrictions on power budget and cooling.

    • @RandarTheBarbarian
      @RandarTheBarbarian ปีที่แล้ว +3

      That, and for a budget-oriented consumer, the cut to 8 lanes on the "budget" $300 card is something we have to take into account. If someone is doing a budget build, it's not out of the realm of possibility that something older is going to be retained, and if the 4060 isn't always beating the 3060 Ti with its full bandwidth, what happens when it's cut down to an effective x4 by modern standards because the connections aren't there? A 3060 Ti performing at full power on half the lanes is the exact equivalent of running a one-gen-out-of-date PCIe standard; if the 4060 is already at half the lanes, does it fully perform with a quarter of what would be full bandwidth? If not, then the loss from the 40-series budget card to its 30-series predecessor looks even worse on paper, given how budget building actually goes.
      In the 10 series we didn't even do that as standard with a 50-class card. As I see it, with that and the fact it sometimes loses to a 3060 Ti, it should be named and priced as a 4050 Ti at worst, and I'm half tempted to call it the 4040 because of the lane cut.

    • @minhduong1484
      @minhduong1484 ปีที่แล้ว

      "when you are a profit oriented company, making a 4060 that doesn't even beat the 3060 ti is kind of a problem when that previous gen card is available NEW and USED with VERY competitive pricing. "
      Personally it makes sense to me that a profit oriented company would brand a 4050 as a 4060 even with 3060s still around for more profit. I would imagine that any 3060s out there even new are probably the last of the production run so profit oriented company might take a hit in the short term but wait it out for more profit in the next few months as the 3060 stock is depleted and consumers have no choice.

  • @Devilion901
    @Devilion901 11 หลายเดือนก่อน +3

    Expensive test benches are needed so that they don't bottleneck the card. Such standardization may not be realistic for the potential end user with a $299 budget, BUT these tests are absolutely needed. There is a difference between what's realistic and what's real; I prefer the latter.

  • @EmperorTerran
    @EmperorTerran ปีที่แล้ว +34

    I like being contrarian, and I will be buying 4060 for my father to replace his old 1060 and I love that it has plenty of power for his games while being so power efficient. But I think hardware unboxed summed it up well, you could have for a year been playing on 6650 XT for the same price and get same performance. It is embarrassing for nvidia.

    • @minhduong1484
      @minhduong1484 ปีที่แล้ว +2

      My take from some of the reviews is that if you can get a 3060 with 12GB for same price (and you can), get that instead. The 8GB memory limit is starting to become a problem on some games. Or for slightly more, get a 3060Ti albeit there is less VRAM.

    • @four11
      @four11 ปีที่แล้ว +5

      I don't think contrarianism is a valid purchasing strategy when you're just buying a bad product. If anything, buying an Intel or AMD graphics card is what would make you an actual contrarian, not the current market leaders which most people go for.

    • @Shapershift
      @Shapershift ปีที่แล้ว +6

      "being contrarian"
      "Buys Nvidia"
      LOL

  • @eugkra33
    @eugkra33 ปีที่แล้ว +5

    I'd be curious to know how you benchmarked each game. Did you use the exact same place in the game doing the exact same actions, or use a built-in benchmark? Because those FSR and DLSS gains and losses in some games seem really odd.

    • @CraftComputing
      @CraftComputing  ปีที่แล้ว +3

      I am very meticulous in benchmarking: same scene or save points, using repeatable areas in games as often as possible, the same general gameplay, and I record three different passes, each 3+ minutes, restarting between runs.
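      A minimal sketch of how multi-pass captures like that might be reduced to an average and 1% lows; the frame times below are invented placeholders, not real capture data:

passes_ms = [
    [8.3, 8.5, 9.1, 8.4, 16.2, 8.6],  # pass 1 frame times in milliseconds (placeholders)
    [8.2, 8.4, 8.9, 8.5, 15.8, 8.7],  # pass 2
    [8.4, 8.6, 9.0, 8.3, 16.5, 8.5],  # pass 3
]

def summarize(frame_times_ms):
    # Convert frame times to instantaneous FPS, then report the average and the 1% lows.
    fps = sorted(1000.0 / t for t in frame_times_ms)
    avg = sum(fps) / len(fps)
    worst = fps[: max(1, len(fps) // 100)]  # slowest 1% of frames
    return avg, sum(worst) / len(worst)

for i, p in enumerate(passes_ms, 1):
    avg, low = summarize(p)
    print(f"pass {i}: avg {avg:.1f} fps, 1% low {low:.1f} fps")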

  • @NinthSettler
    @NinthSettler ปีที่แล้ว +4

    Why would I buy this when I can buy a used 1080 Ti for less?

    • @CraftComputing
      @CraftComputing  ปีที่แล้ว +1

      Warranty, driver support, DLSS, RTX, and it hasn't been running in a mining rig for 5 years or an enthusiast's gaming rig for 7.

  • @jordanmartinetti8224
    @jordanmartinetti8224 ปีที่แล้ว +5

    You have tons of content buying old and used hardware. And your suggestion in this video is we have to buy new? I was on the fence about unsubscribing but now I know I will and not miss anything

    • @OldBuford
      @OldBuford ปีที่แล้ว +1

      Did the same. I used to at least learn something about old tech/networking from this guy, but this entire video has stripped away any credibility he had as a tech reviewer.

    • @CraftComputing
      @CraftComputing  ปีที่แล้ว

      Me saying "we should provide more relevant benchmarks for budget gamers" has ruined your perception about whether I can be trusted to review budget products.....cool...

  • @im36degrees
    @im36degrees ปีที่แล้ว +21

    I see your point about comparing the 7600 to the 3060, but it was Nvidia and AMD that decided to keep making last-gen cards to fill out their product stack, so comparing this card to the 6700 XT is completely valid... it's not a fire sale if they continued producing these cards well into the new generation.

  • @IrocZIV
    @IrocZIV ปีที่แล้ว +5

    Wow, that dislike ratio.

  • @MarcosCodas
    @MarcosCodas ปีที่แล้ว +5

    No, sorry mate. Love your videos, just not this one. That's a XX50-tier die at XX60-tier pricing. There's plenty of options, too. Brand new, in box, with warranty. So, no. Particularly HU's coverage, which is super thorough when it comes to dollar-per-frame, new-and-used market pricing, etc, is so complete, there's not a question in my mind that it's the correct approach.

  • @derekphilp9622
    @derekphilp9622 ปีที่แล้ว +6

    I think Jeff has an incredibly bad take and needs to step away for a while.

  • @evilgeek87
    @evilgeek87 ปีที่แล้ว +4

    I don't really disagree with the argument that lower-end cards shouldn't just be benchmarked at high resolution and ultra settings, but I've seen plenty of reviews that don't do that. I've seen reviews use both the highest-end CPU and an i5-12400, running games at medium settings. Also, it's not unfair to say $300 is a lot of money to spend on a card that doesn't have enough VRAM to max out 1080p and that stutters at 1440p. We're still looking at a supposed 60-class card that has only gained 2 GB of VRAM since the 1060. Furthermore, the fact that it can be outperformed by other cards currently on the market at lower prices makes it a poor value. DLSS 3 on a low-end card is not worth $50 over last-gen cards, or the diminished VRAM. Given that the RX 7600 is already selling for $250 and competes so well with the 4060 based on your own numbers, the 4060 is already the worse value IMO just within this generation. Additionally, the last-gen cards are worth considering because overproduction from both AMD and Nvidia means the last-gen options remain part of the product stack until they manage to sell through the glut of cards produced for what was an insatiable market.

  • @UncleLayne
    @UncleLayne ปีที่แล้ว +17

    If someone wants to find out what sort of performance they can get out of a GPU with their current CPU, they just need to look up reviews of both items and see whichever one performs worse, because that will be the bottleneck for that title. It's that simple. They don't need specific testing methodologies made for them to show what an arbitrarily picked and bottlenecked system can do, they can just compare the existing performance numbers of their current and prospected PC parts. It's really not that hard. All of the information in the standardized testing methods applies to ALL builds, because at least one of the performance metrics will be accurate and most relevant. You're not making things easier for budget PC gamers, you're making the information less reliable and harder to extrapolate by adding unnecessary variables. Trying to fix a problem that doesn't actually exist doesn't make you smart.

    • @Doug-mu2ev
      @Doug-mu2ev ปีที่แล้ว

      OK, try to do the simple thing you just said while also factoring in FSR 2.0, DLSS, frame generation, and ray tracing, and tell me what the performance result is. I know it comes as a shock to a lot of people, but GPUs are not CPUs!

  • @donsph
    @donsph 11 หลายเดือนก่อน +2

    Someone obviously drank the Nvidia marketing Kool-Aid.

  • @SteelSkin667
    @SteelSkin667 ปีที่แล้ว +5

    The naming is a very good point, and something I think most enthusiasts have identified as well, but I'm not sure it is the biggest complaint about the current range of graphics cards. Renaming the whole range does allow the different series to fall in line with their usual price brackets, but it doesn't fix the dubious price-to-performance of these newer cards compared to the outgoing models. In the mid-range especially, GPU vendors used to be able to offer significantly faster silicon gen-on-gen for the same dollar amount. It just feels like progress on that front has all but stalled.

  • @thedarkdade
    @thedarkdade 11 หลายเดือนก่อน +4

    That DISLIKE-to-like ratio. I knew this guy would shoot himself in the foot eventually.

  • @vitalinov
    @vitalinov 11 หลายเดือนก่อน

    What's the $170 1440p monitor?
    I have the Gigabyte M27Q-SA and MSI G274QPF-QD on my wishlist,
    so getting a cheap 1440p monitor to replace the second screen interests me.

  • @aemonblackfyre4159
    @aemonblackfyre4159 ปีที่แล้ว +4

    The problem with this whole "use a lower-powered system" idea is also that it might mean upgrading. If you want to put a 4060 on a PCIe 3.0 motherboard, you're gonna have a bad time.
    In my opinion there's just too much cost cutting going on. Everything but the 4090 seems to be in the wrong price category. It just feels like we're missing some cards.
    The 4060 feels like a 4050, and then it seems to make more sense. The 4070 is what I'd expect a 4060 or Ti to be, and it just seems like we're missing a true 4080 and a 4080 Ti, plus huge price cuts across the whole lineup.
    It used to be that the 90 or Titan cards were prohibitively expensive compared to the rest of the stack, and now the 4090 is basically the best-value card, since everything else is both too expensive and too slow at the same time.

    • @CraftComputing
      @CraftComputing  ปีที่แล้ว

      The 5600G is PCIe 3.0 only. So try again.

    • @aemonblackfyre4159
      @aemonblackfyre4159 ปีที่แล้ว

      @@CraftComputing Now compare it to a 3060: it's got the full 16 lanes, plus more memory, plus more bandwidth, and you're even paying less.
      I'm pretty sure der8auer made a video about this topic for the 4060 Ti, and I'd guess it translates to the non-Ti as well.
      The only thing I understand is the power argument (which you didn't make): since it's a 50-class chip it just sips power, so if you're buying new you might go for the 4060 since it dumps less heat into your room and saves you some money depending on prices (living in Germany, with the second-highest energy prices in the world, helps that argument).

    • @loganmedia1142
      @loganmedia1142 11 หลายเดือนก่อน

      It is a 4050. Nvidia have just called it a 4060 in an attempt to fool people into thinking the xx60 has had a price reduction. And some people are actually falling for it.

  • @paulbrooks4395
    @paulbrooks4395 ปีที่แล้ว +7

    Definitely appreciate the logical nuance. Where the questions of dubious value come up inevitably relates to last gen products at the same tier. At a time when inflation is killing discretionary budgets, people want more for their money. Given the relatively small increase over last gen and the lack of triumph over the 3060 Ti, it’s reasonable to ask “what’s the point?” For, in the best case scenario, I buy a new card that’s worse than the last card. This evaluation is based on performance-per-dollar because dollars are so scarce.
    Other considerations such as power usage simply no longer matter. People just want to ensure that they can eke every bit of performance potential out of their purchase. Or said another way: people would spend more liberally on minor improvements if they had the cash.
    The evaluation of the 40 series has always come down to a single word: value. I hear it on every channel listed. The value isn’t there, despite the advances in technology, nor is the future proofing that even the 3060 had with 12 GB (it shows at 1440p, there is less frame variance due to this). It leads us to the conclusion that the 40 series hasn’t been able to effectively appeal over previous generations (again in performance per dollar).
    I agree that the 4060 isn’t bad *per se*, and it’s definitely not bad at 1080p and with reduced settings (I concur that reduced settings are a fair expectation). However, the comparisons are made in light of last gen still being available for sale as new or second hand.
    I have argued that the technology of the 40 series is excellent in terms of performance per watt (under appropriate cases) and other architectural improvements, but people-average people-can’t afford the very cards that provide a meaningful increase in *graphical* performance.
    I have to agree with Linus here, there’s no such thing as a bad product (to a point), just a bad price. To a certain extent, Nvidia and AMD would probably have had more sales by remanufacturing last gen and selling them at a lower price than creating a new generation of unaffordable and unspectacular products (in terms of relative price to performance).

  • @hescominsoon
    @hescominsoon ปีที่แล้ว +5

    Also, the 6700 XT often outruns the 4060 and 7600 for the same or less money. 8 GB of VRAM is quickly becoming a bottleneck as well. FSR/DLSS reduce the rasterization load by rendering at a lower resolution and then upscaling; I would rather it truly rendered at the resolution I set.

  • @leucome
    @leucome ปีที่แล้ว +2

    Look, 10 years ago I got a brand new GPU with a 128-bit bus, a 160 mm² chip, and x16 PCIe for $100 Canadian pesos. Now it's a 146 mm² chip with a 128-bit bus and x8 PCIe, and they ask over $500 Canadian pesos. It is technically a cheaper, cut-down board design, and it is five times the price. There is a limit to the bullshit; inflation was not that crazy over the last 10 years. Seriously, people are already too lenient in the first place; we need to be even harsher on these benchmarks and comparisons.

  • @Elinzar
    @Elinzar ปีที่แล้ว +4

    I get that maybe using a flagship CPU in a flagship motherboard with flagship RAM is not the use case for a 4060, since it is a budget card, and there is value in using lower-spec CPUs and platforms to review it. I also totally support you in using 1440p as the new standard for low/mid-tier cards, since the panels are so cheap now (I will snipe that $200 ASRock monitor in the next 2 months or so).
    But as someone who can only buy budget hardware, ignoring other budget cards, even if they are not in the same generation, is honestly a kick in the balls.
    And I think you are missing the point of why everyone says the card is dogshit: it's because, versus similarly priced cards that can be bought new with warranty, the 4060 (and the 7600 too) just sucks arse, and loads of arse. The 6700 XT puts the 4060 to shame, the 6700 non-XT puts both to shame while costing basically the same, and the 6650 XT isn't that far off either, and I'm not talking sale prices here.
    Like, c'mon, if you want to be fair, be fair and do not ignore that they exist, as pretty much the entire Nvidia 3000-series and AMD 6000-series lineups are also being sold NEW WITH WARRANTY TODAY at far better pricing.
    When they run out of stock, you can say that for $300 the 4060 is as solid a choice as the 7600. Until then, be fair.

  • @devinbaines
    @devinbaines ปีที่แล้ว +4

    I see and appreciate your take on this. However, just how many test benches should the big reviewers be running? And how do they determine which one to use in which scenario? Steve from HUB addresses this frequently in his various channels. There is always the call for "why didn't you test that new GPU with my mobo, cpu and memory setup"? There are good explanations for why one would do it the way they do. How far down the test bench stack does a reviewer go with a new piece of kit and remain relevant (or sane)?

  • @gustersongusterson4120
    @gustersongusterson4120 11 หลายเดือนก่อน +2

    At least I now know that the 'bring back youtube dislike button' extension is working.

  • @DeathFlame500
    @DeathFlame500 ปีที่แล้ว +2

    The point is to see the card in its best environment

    • @CraftComputing
      @CraftComputing  ปีที่แล้ว

      What about its natural environment? Because a 4060 ain't gonna share a case lid with a 13900KF.

  • @Szala89r
    @Szala89r 11 หลายเดือนก่อน

    Do the RTX 4060 results in Cyberpunk include DLSS, or DLSS + frame generation?

  • @nikzel
    @nikzel ปีที่แล้ว +11

    Everyone else is wrong. Here are some totally arbitrary choices and tests, with some b-roll. These cards aren't so bad. You should buy! Totally coincidentally, I have kickback links in the description. 10/10 infomercial.

    • @tyre1337
      @tyre1337 ปีที่แล้ว

      somebody please think of the company's profit margins

  • @branno9596
    @branno9596 ปีที่แล้ว +10

    I love how the thumbnail shows Linus, but none of the issues raised here are relevant to LTT's review.
    This is just video clickbait.

    • @CraftComputing
      @CraftComputing  ปีที่แล้ว

      As I mentioned, I didn't talk to anyone at LTT about their impressions prior to publication. But did anyone in the thumbnail not call the 4060 'an absolute waste of silicon'?

  • @mandasantoso
    @mandasantoso ปีที่แล้ว +2

    Well, it's called standardized for a reason you know...

  • @djchotus1
    @djchotus1 11 หลายเดือนก่อน +3

    I disagree that its closest competitor is the 7600. It's actually the 3060/3060 Ti. Or also the Intel cards: 16 GB, now great performance, and continual driver updates. Or even the 6700 XT as a new option.