Also, having the best card, the 4090, kinda lets them control prices. Which allows AMD to keep its relevancy to a point... meaning midrange, good-value cards. Basically AMD cannot compete with the best, so they tried to ruin Nvidia's midrange. And they've done well. But that is all.
Technically there was a total of 4GB, it's just that the last 512MB was much slower. So yeah, they can and will get away with anything, from lying about performance to melting connectors and blaming it on user error lol. All these tech channels are bought and paid for by these companies to make these ads, whether positive or negative.
@AnonymousUser-ww6ns The only design flaw going on with that was a missing chromosome or something from the mouth breathers that cannot operate a simple plug.
That makes it even worse, because a 3050 is fucking obsolete in 2022. I'd give them an excuse if it was 2018-2019, but launching a card that far behind the times in 2022 is a fucking scam.
@@acid1787 It's only obsolete if you exclusively play the newest games at maxed-out settings. Not sure why anyone would subject themselves to that though; new games always come out unfinished for $70, and max settings only look marginally better than medium-high these days for like half the performance.
They forget about DLSS too. It is not a bad GPU just because it had a high price during the mining era. This is a Hababox thing; they didn't get samples, and their numbers are pure rasterization only. In my country the RX 6600 was $170 more expensive and it is not that much faster if you don't include Tiny Tina's Wonderlands. The RTX 3050 is even faster in newer games. @@bignose1752
At this point I wouldn't put it past Nvidia to be trying to obfuscate and fuck up the GPU market before they pull out in all but an official way, just to make it more difficult for AMD/Intel when they try to pick up the pieces.
Yeah, I remember before inflation hit as well. Quit yelling at the sky, prices will not come down. Your groceries aren't, so what the hell makes you think GPUs will? I remember when ground beef was under $3 a pound as well.
I agree. A year ago I bought a 3080 12GB card, which has more VRAM and a wider memory bus (equivalent to the 3080 Ti), over a 3080 10GB card. The naming convention needs to be rethought; it's getting into insane Intel naming territory.
The trouble with companies the size of Nvidia, they only want your respect when they need you to buy their stuff. Because gamers are now such a small part of their business, they don't give a damn about gamers. And as AI markets grow, their need for gamers shrinks
The RTX 4080 desktop is literally a 70-class card. The RTX 4080 has a die size of 389mm^2 while the RTX 3070 has a die size of 392mm^2... The RTX 4070 Ti is a 60-class card, the 4070 is a 50 Ti-class card, the 4060 Ti is a 50-class card and the 4060 is a 30/40-class card... What is worse is when they sold a sub-mid-range GPU like the GTX 1080, which was in fact a 60-class GPU, for $600-700 when it should have been $350 at launch. And now they are doing the same with the RTX 4000 series...
With the 4080 Super situation they just left the problem to the sellers, who can't sell the 4080 at the original MSRP. Nvidia doesn't have to cope with that problem, and it left sellers with a huge stock of 4080s that will never sell because of the new, cheaper 4080 Super, and they can't take them back because it's a "different model". Nice one NVIDIA ;)
There were 3 versions of the GT 710, 3 versions of the GT 730 and 2 vastly different versions of the GT 1030. They have done this for years, it makes life so difficult at the low-end (or for retro machines). You have to use Wikipedia just to know which variant you are buying, if you can
Now I want to see Jay rant on how car companies give a completely different car the same name. Like sometimes it will go from a big-body 4-door to a short compact 2-door sports car or vice versa and have the same damn name.
*Requested Follow Up Video:* What is a list of 15-20 cards, which are actually available now, that are - what you would consider to be - good value for their price points in terms of performance v. cost? It'd be nice to get a genuine rundown on what cards are worth buying (in said price ranges) in order to be able to sort through the clutter of predatory marketing and not get screwed over by it. A solid list of current market cards to buy ranging from $200 and up would be really helpful for people who don't compare hardware specs on spread sheets as part of their job (the way you do).
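In the spirit of that request, here is a minimal sketch of how anyone can rank cards by performance per dollar themselves. The card names, prices, and FPS figures below are made-up placeholders, not real benchmark data; substitute numbers from trusted reviews.

```python
# Minimal sketch: rank GPUs by performance per dollar.
# All prices and FPS numbers are placeholders, not real benchmark results.

cards = [
    {"name": "Card A", "price_usd": 180, "avg_fps_1080p": 60},
    {"name": "Card B", "price_usd": 190, "avg_fps_1080p": 85},
    {"name": "Card C", "price_usd": 300, "avg_fps_1080p": 110},
]

def fps_per_dollar(card: dict) -> float:
    """Higher is better: average FPS delivered per dollar spent."""
    return card["avg_fps_1080p"] / card["price_usd"]

for card in sorted(cards, key=fps_per_dollar, reverse=True):
    print(f'{card["name"]}: {fps_per_dollar(card):.3f} fps/$')
```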
Nvidia actually released at least 6 different SKUs of the GTX 1060: 1060 6GB, 1060 6 GB Rev. 2, 1060 6GB with GDDR5X, 1060 6GB 9Gbps, 1060 5GB, 1060 3gb.
Technically the 1650, 1660, etc. can be considered part of the same family, so add that to the pile of trash. I can see an RTX 4670 Ti Super 8GB coming in the same SFF form as this 3050 in a couple of years while performing like a 4050, if they ever officially release one; I mean, the 4060 technically doesn't count as one hehehe
@@r3tr0c0e3 I disagree. Different architecture + it's not named 1060 so kinda misses the whole point of this discussion. However, I forgot two. Both 1060 6gb and 1060 3gb also released with GP104 dies later down the line. So that'll make it 8 desktop variants. 9 if you count the unreleased 8gb version, 11 if you count both mobile variants.
Nvidia will continue to treat us like crap. They don’t give a shit about the gaming market anymore, it’s not even a quarter of their revenue. AI chips is their new income stream. Prepare for darker times.
Yep. Also gaming used to be about half their revenue, now it's about a fifth. GPU companies are focused on making GPUs for AI data centres, not for gamers anymore. Cloud gaming may be the most affordable way in future.
And this is why I went with the Intel A770. I'm done supporting a company that has tried to do this multiple times. At this point gamers need to turn away and show them some financial harm. Maybe then they'll stop doing this. Until then though they aren't going to care because they know people are still just going to buy it.
It feels like Intel is worse in this regard. Look at what happened during the trickle of the CPU improvement releases a couple years back or so.@@jeffb.6642
@@jeffb.6642 lol. Intel is the better value over AMD. You can get a non-Z Intel board with an i3 (which is the old i7: 4 cores, 8 threads) or a non-K i5 (8 to 10 cores) for hundreds less than AMD, and it won't stutter with TPM on Win 11 like the AMD bug.
Financial harm isn't coming to Nvidia. I had been using a 1080 Ti since 2018 and now Nvidia is worth a trillion US dollars. I bought a RTX 4080 for $930 brand new last week instead of waiting for some scenario that will never appear.
Speaking of the GT 1030, they did the same thing: launched a DDR4 VRAM version after the GDDR5 VRAM version, gave it slower clocks, and left it named GT 1030 although the performance was very different between the two cards.
The GTX 970 3.5GB VRAM, the 3060 8GB vs 12GB VRAM and performance differences, the 3080 10GB vs 12GB price and performance differences, the GT 1030 DDR4 vs GDDR5 fiasco, and now this lol
I've had a few people ask why I chose a 7900XTX over a 4070Ti/4080 and basically call me stupid for it. This video explains exactly why. Nvidia does nothing but continue to abuse their market leader position to squeeze gamers until they quit PC gaming. I'm not targeting anything that needs a 4090's performance, and most of what I play doesn't benefit from ray tracing, so I have zero reason to support such abusive, monopolistic practices. And even if I did.... I cannot in good conscience pay money to a company that is so incredibly, transparently gloating about their price gouging. Screw Nvidia. Screw Jensen's stupid leather jackets.
As you said, the whole reason behind it is that most people (except those that REALLY DIG DEEP before buying) will see that it's a 3050 6GB compared to a 3050 8GB & think "well 2 GB less RAM but I save so much money" & not realise that they're getting a cut down card with so much less CUDA power AS WELL as less RAM.
I just got a 3060ti and called it a day. I know it doesn't have 16gb of vram and it really doesn't matter to me since I just play at 1080p anyway. All the naming conventions by either company intentionally obfuscate their true specs, so I just look up the actual specs and decide based on actual research.
As a previous 970 owner, this is incorrect. It did have 4GB, but if you played any game that required more than 3.5GB, you discovered that the last half gig ran at half the speed of the rest of the VRAM (thus killing the performance of the card overall).
Even shadier is the laptop card branding, same exact naming completely different cards. It is crazy that they can get away with it. This is what happens when there’s little to no competition in the market.
@@MudSluggerBP What happened with the 970 was that instead of cutting an entire block out of the memory sub-system and making the defective dies into a 3gb/192-bit card, they partially disabled one block in a way where two of the memory controllers were fighting for resources. It still had 256-bits worth of memory controllers that were connected to 4gb of GDDR5 at the advertised speed... but there was a bottleneck elsewhere in the system, in a part of the die not usually discussed, that made it act more like a 224-bit card. And while the last 0.5gb did run slower, it was still faster than running out of VRAM and having to use system RAM. I'm kinda inclined to give them the doubt on that one, because it really seems like the engineers got too clever with how to disable as little as possible in a defective 980, and marketing didn't know how to explain it. I owned one (it was the last Nvidia card I put in my main PC), and the drivers really managed it pretty well.
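To illustrate why that last 0.5GB still hurt, here is a rough back-of-the-envelope sketch. The partition bandwidths used are the commonly cited approximate figures (about 196 GB/s for the 3.5GB pool, about 28 GB/s for the 0.5GB pool), not official specs, and the averaging is deliberately naive; the point is just the weighted-average effect once a game spills past 3.5GB.

```python
# Rough sketch of the GTX 970's split memory pool.
# Bandwidth figures are approximate, commonly cited values, not official specs.

FAST_GB, FAST_BW = 3.5, 196.0   # main partition: size (GB), bandwidth (GB/s)
SLOW_GB, SLOW_BW = 0.5, 28.0    # crippled partition: size (GB), bandwidth (GB/s)

def avg_bandwidth(working_set_gb: float) -> float:
    """Naive average bandwidth if a working set fills the fast
    partition first and spills into the slow one."""
    if working_set_gb <= 0:
        return FAST_BW
    fast = min(working_set_gb, FAST_GB)
    slow = max(0.0, working_set_gb - FAST_GB)
    return (fast * FAST_BW + slow * SLOW_BW) / working_set_gb

for gb in (3.0, 3.5, 3.8, 4.0):
    print(f"{gb:.1f} GB in use -> ~{avg_bandwidth(gb):.0f} GB/s average")
```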
I'll not buy anything Nvidia unless they decide to pull their act together, AMD does everything I need, and if they don't get their act together, Intel is improving. And both of them are less slimy than Nvidia.
As long as the customer isn't savvy enough, companies will try to squeeze as much money out of them as possible. Good thing in the PC world we have people like Jay & Steve.
I stopped buying Nvidia cards during the 20 series. Maybe if more people actually voted with their wallets we wouldn't be in quite the mess we are in. Cue the comments about how Nvidia doesn't care about selling cards to the consumer and it doesn't matter, etc.
I get the rant, but I'm still over here since release day trying to get my hands on a ROG Strix 4080 Super OC. It's getting on my nerves as well as my builder's nerves. MC is 2 hours away, so we don't have the luxury of just striking it lucky or gambling on one being in stock when we get there, and NE hasn't had any in stock to order... yet literally all the other cards they have plenty of. 90's, 70's, all of them. No 80's. People are selling them on Amazon for close to 2 grand, which is the same price everyone has all the 90's listed at. It's ridiculous. It's obvious the 80's are selling, so why aren't these stores stocked up with more, knowing that people are cleaning them out every chance they get? It's literally the last thing my build needs and here I am absolutely STUCK. Really nerve-wracking.
I work with 3D (Blender, UE5) so I have a 3060 12GB because I really need the extra VRAM. I've looked into upgrading to a better card but the 4060Ti 16GB is quite deceptive for the price (I live in Brazil btw).. it's only PCIe 8x and the memory bus is smaller than my 3060. I won't be upgrading anytime soon.
Living in Brazil as well, I am about to build a system as I am way overdue. The market here is hard to get used to as many parts have high import duties. It seems to me Intel processors are taxed less than their AMD counterparts, and AMD video cards do not look to be priced competitively either. Unless you have a good muambeiro it's hard to put anything together reasonably and avoid the Chinese no-names that flood the market here. Tracking prices and exchange rates and buying when something is less than 25% over the USD price seems to be the best option. I expect it to take a few months before I have all I need.
@@cabrageo I wish you good luck with that. I ended up buying my 3060 near the end of the GPU price craze so I paid a lot for it, but considering I had a 1050 Ti before, it was worth every cent.
As if Nvidia isn't already making enough money with the AI boom. Big thanks to Jay for keeping "regular" people informed in a way they can understand. His videos helped me a lot, as they must have for a lot of other people I'm sure.
Speaking of questionable: frame generation and upscaling have been around for 15 years in TVs. Funny how you now need their AI cores to use frame gen while AMD and everyone else use software, and how 2012 Skyrim and Unreal Engine have global illumination, reflections, etc. Hmmmm
$2k for an 80 series card in Australia is absolutely crazy... at this point I don't even want to upgrade ever. I only upgraded to a 3070 last year from a GTX 1080, and the performance difference is barely anything.
If I was still in the market for a graphic card, I might consider some of these lower price point cards. I agree with the recommendation for the naming scheme - Sometimes marketing people should NEVER be allowed to make decisions about product names. That said, I don't plan on any more upgrades soon, as I recently upgraded from a GT610 to a GTX 1660 TI. That's plenty on graphics power for me, and the price point was at the top of my budget as marked down from $359 to just $229.
The 75W power limit is for low-profile cards in SFF cases that don't have extra power connectors. Until now the GTX 1650 LP was the fastest card for SFF.
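For context, a PCIe x16 slot is specified to supply up to 75W on its own, which is why sub-75W cards can skip the auxiliary connector entirely. A trivial sketch of that rule of thumb, using the TDP figures mentioned in the comments here:

```python
# The PCIe x16 slot itself is specced for up to 75 W, so a card rated at
# or below that (like the 70 W RTX 3050 6GB) needs no 6/8-pin connector.

PCIE_SLOT_BUDGET_W = 75

def needs_aux_power(board_power_w: float) -> bool:
    """True if the card's rated board power exceeds what the slot supplies."""
    return board_power_w > PCIE_SLOT_BUDGET_W

print(needs_aux_power(70))    # RTX 3050 6GB  -> False, slot power only
print(needs_aux_power(130))   # RTX 3050 8GB  -> True, needs a connector
```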
I'm sure somebody already pointed it out: the 1060 3GB had the same 192-bit memory interface as the 1060 6GB. So compared to the 1060 3GB, the 3050 6GB is even worse, since its 96-bit memory bus is actually narrower than the 128-bit bus of the 3050 8GB.
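A quick way to see what the narrower bus costs: peak bandwidth is roughly bus width in bytes times the effective memory data rate. The sketch below assumes 14 Gbps GDDR6 on both 3050 variants and 8 Gbps GDDR5 on the 1060 6GB; actual board memory speeds may differ, but the bus-width penalty is the point.

```python
# Peak memory bandwidth ~= (bus width in bits / 8) * effective data rate (Gbps).
# Memory speeds are assumptions for illustration; check the real board specs.

def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(peak_bandwidth_gbs(128, 14))  # RTX 3050 8GB: ~224 GB/s
print(peak_bandwidth_gbs(96, 14))   # RTX 3050 6GB: ~168 GB/s, a 25% cut
print(peak_bandwidth_gbs(192, 8))   # GTX 1060 6GB (8 Gbps GDDR5): ~192 GB/s
```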
What frightens me is the pricing; it increased so much over the last 6 years. The 5xxx series will topple everything; I can't imagine the 5090 will be lower than $3000.
Also, believe it or not, these cards sell REALLY well in 3rd world countries, where having a 60 or even 70 tier card means 6 to 10 months of paychecks...
I was one of those dummies who bought the 1060 3GB card. I mainly bought it instead of the 6GB card because of the card length. I had a drive cage in front of the card that would've made fitting the longer cards an issue.
Who TF cares about the 3050 anyways? These aren't gaming cards. In this current market, Nvidia has almost zero incentive to act in good faith. Ppl will buy their products regardless of how deceptive their advertising/marketing is.
It's crazy that they are releasing a new 30 series card after the 40 series has already been out and they're just launching the 40 Super series. The worst thing is they are doing this on the down low. Most people wouldn't even know about this release if they weren't paying attention to the market.
The 3050 8GB came out in early 2022 at the performance level of a 2016 card (GTX 1070). The 3050 6GB is a 2024 card at the performance level of a 2018 card (GTX 1650 Super). Good job Nvidia.
My first PC was a 486DX2 with a separate math coprocessor and an external, single-speed CD-ROM. I bought a Sound Blaster long before I needed a discrete graphics card... Graphics cards weren't even really a thing until the internet moved into smaller towns and cities, and even then the games were better optimized than they are today. Quake and EverQuest ran just fine on a Pentium. My bet is WoW drove one of the biggest PC upgrade revolutions in modern recorded history, and that's when I'd bet graphics cards really started selling mainstream.
I've never bought a new Nvidia card for this very reason. They are slimeballs looking for the easy money grab, and they don't give a crap about their customer base. I don't care if they do perform better. I have owned a few used ones only because I was able to get them either at a decent price or it came with a machine I purchased.
They did the same thing with the 770s way back in the day. I fell victim to it back when I was new to PC building and more naive than I am now. This has always been their M.O.
The reason why they call it a 3050 6GB instead of calling it a 3040 is easy: To mislead people that aren't very tech savvy into thinking their OEM PC is better than it really is.
captain obvious award of the year over here 😂
@@elpoderosaso Still 150 likes, so imagine the average viewer 😂
@@elpoderosaso good job, Captain Know-It-All. Now go back to studying, in a nice way
@@elpoderosaso We appreciate your input General Sarcasm!
As general of peace i want you guys to get along
I've been buying nvidia cards for ~20 years. Their rampant predatory marketing with the 40 series pushed me to buy my first AMD card, the 7800XT. No regerts.
How is their marketing predatory? Y'all keep buying GPUs day one, which signals to them that they're giving people what they want.
Same
@AnonymousUser-ww6ns I got a 7800XT back in early December and yes, it was super buggy in certain games and it hated to work with DX12, resulting in many video driver timeouts and CTDs. I just got back to gaming on PC two weeks ago and haven't had a single problem since.
Just because people are dumb doesn't mean NVIDIA isn't predatory. The fact is they intentionally mislead customers into thinking they are getting a better product than what they receive and on top of this they price gouge their consumers.@@Overlord277
Same here. After 24 years with Nvidia I finally bought AMD 7900XTX.
Fantastic card and I love Adrenalin. But Nvidia tech is just waaay ahead. That's why they can and will charge premium.
Someone should start a site where the cards are renamed adequately. A chart if you will. So that normal people can just look and see exactly where the cards truly fall in line. And the same for cpus.
YES THIS 110%
You can do that with the dies. 4090 should be 4080 with the 102. 4070 should have the 103, 60 should have the 104, 50 should have the 106, and under that like a 3030 for that new 6GB card.
Take Nvidia's 40 series and 30 series cards, subtract 20 from the model number, and you have what the real model should be, sold at a price point two tiers above where it should be.
if UserBenchmark was honest and accurate that would be it... BUUUUUUUUUUUUUUUUT.... No
I generally use passmark for this.
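Purely as an illustration of the renaming-chart idea floated above, here is a tiny sketch encoding the die-based mapping and the "subtract 20" rule those comments propose. The "honest" tiers are the commenters' opinion, not any official naming; the die codes themselves (AD102 in the 4090, AD103 in the 4080, and so on) match Nvidia's actual die assignments.

```python
# Sketch of the "honest naming" chart idea from the comments above.
# The die -> tier mapping reflects the commenters' proposal, not an official scheme.

die_to_honest_tier = {
    "x02 die (AD102, sold as the 4090)": "x080",
    "x03 die (AD103, sold as the 4080)": "x070",
    "x04 die (AD104, sold as the 4070 / 4070 Ti)": "x060",
    "x06 die (AD106, sold as the 4060 Ti)": "x050",
}

def honest_name(marketed_model: int) -> int:
    """The 'subtract 20' rule of thumb for 30/40-series model numbers."""
    return marketed_model - 20

print(honest_name(4070))  # -> 4050 under that rule of thumb
for die, tier in die_to_honest_tier.items():
    print(f"{die} -> {tier}-class card")
```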
The naming scheme reminds me of the story between A&W and McDonalds. A&W sold a 1/3lb burger for either the same or less price than McDonald's 1/4lb burger but because people are dumb they assumed 1/4 is bigger than 1/3 because 4 > 3 🙄
How is that a scam? Sounds like people are just stupid
@@cdude665 Where did they say it was a scam? Yes people are dumb.
@@TheOriginalCoda Because he refers to a similar situation as to where restaurants were presumably naming things a certain way in order to sell less of a product....He's implying that it is a scam and I ask how is that a scam. Like how does this need to be explained? Go enjoy your 1/4th burger buddy
@@cdude665 no... He's saying that they tried to undercut McDonald's, but people only took the numbers at face value, so it didn't work.
*American people
Let's not forget that the 3050 was a *trash, joke card* to begin with when they launched it. It barely matched the performance of the 1660 Ti (a card with a $279 MSRP). They really had the audacity to go take said trash card and make it even more trash.
To put it in perspective, they're basically making a card that's going to fall somewhere between a 1660 and a 1660 Super ($219-229 MSRP).
Honestly, AMD aside, I would even feel more comfortable buying an Intel GPU rather than the original 3050. This "new" 3050 is a joke that shouldn't exist, or at least should be 25% cheaper. For the same price you can get an Arc A580; the difference in performance makes it worth the driver hassle that will come with it.
Yes, in Germany you can get a used 2060 for under 150 euros. Yes, it's used, but it's still cheaper for better performance. The whole 3050 makes zero sense and customers should avoid buying it.
Gotta play the market back in dire times. Support the techs when the corporations are acting up. We snagged a refurb 3050 before the Christmas '23 gouge kicked in (mid Oct) for $209. It has without a doubt filled the void for our decision to wait another year to leave 1080, and freed up a hundo or so for the overall system upgrade, leaving it in incredibly good shape for next-gen cards.
Son's PC is blasting the Skull & Bones beta with said 3050, max everything + RT on at 1080/60. It really does have a place in the market, just not for those feeding the beast. 🤷♂️
I agonized long and hard over upgrading from a 1660 Ti. After an early-year deal on a 4080 prebuilt and an OLED 4K, I'm not looking back for a while, but I'm entertained by the YouTube rants. Well played, Nvidia.
The RTX 3050 has less performance than a GTX 980
Early 2000s a mid-level card was $150. I miss those days.
Even more recent than that. When I got the 7870 gigahertz edition in 2012, it was the fastest non-titan class card on the market. I got mine off Newegg brand new for $380. A flagship card for under 400 bucks.
At the $170 price point, going to Ebay and buying a used graphics card seems a far more attractive option.
I'd agree for the full-height models; they need to be sub $150 to make sense. Just get a used RX 6600 for $130 to $140. But for the LP models, I think the price point is good. That's about what I spent for a brand new 1050 Ti LP when they came out. That's about what the 1650 sold for new, and still continues to sell for on the used market. At least this should be a good margin faster than the 1650, with a little more VRAM, for the same price.
You can get a 1080 for 150 USD, or even an RX 6600 XT for that.
A 2080 Ti is $320 and smashes a 4060 Ti and STILL beats a 3070 Ti in all specs, VRAM, and real-world performance... that shows just how ridiculously their last 2 gens have been priced.
Even if you don’t want to go used, $10 more will get you a new in box RX 6600 last time I checked (Microcenter).
@@jameslake7775 Yep, even Amazon has a Powercolor model for about $190.
Ouch.. The 3050 is already significantly slower than a 1070 or 2060. Last couple generations are going backwards at mid and low tier.
The amount of people I've seen claiming just ridiculous things is just insane. Look at some benchmarks, the 3050 beats the 1070 in every case I've seen. It's slower than a 2060 for sure, but a regular 2060 is a lot faster than a 1070.
@@mttrashcan-bg1ro benchmark sites are not trustworthy, as they're a perfected simulation versus reality, where heat, power, clocks, and the PC build really matter.
Laptop 4090 is a 175w 4080, but they called it 4090 so they could charge more for it.
That's been going on for quite some time. The laptop I bought over a decade ago has a 460M, which is the laptop equivalent of the 420 desktop card.
No. Laptops are always much slower than desktops, so the 4090 matching a 4080 is just fine. The issue is that the price is the same as a 4090. If you want raw performance, go with a desktop PC.
@@KelvinKMS Not the 10 series. Every 10 series laptop card was actually a desktop card at the same tier; the 1070 Max-Q, which was the lower-wattage version, was actually a 1070 Ti die that was downclocked for the wattage cap but had the bigger core to balance the performance.
If the cooling could handle it, they would actually perform within 10-20% of the same desktop tier, at around 50% of the power usage.
@@bradhaines3142 1000 series really was an engineering marvel. desktop and laptop gpu performance parity and the 1080ti was significantly faster than its predecessor (980ti)
Yeah, laptops do have wattage limits and cooling. So 4090 laptop is just saying this is the max you can get because even then it’s questionable and depends on your cooling solution, in most laptops after a few min the card will thermal throttle and perform like a 4080 laptop GPU but with more ram. I went for the RTX 5000 ADA because I need raw compute for short bursts instead of the 4000. That said it is still very impressive that I hit 4070 desktop gaming performance at 145w on a quadro card with non gaming drivers.
This industry needs more competition
I struggle to think of an industry that DOESN'T need more competition. But the thing about competition in capitalism is that to truly compete to the fullest, you have to buy out your competition. So when it gets that competitive, it's self-defeating
Competition is there, the problem is that people STILL give Nvidia their money!
We have it, but then people scream "I need my ray tracing 4K 60..." and you are left with Nvidia. Or it's "I want my DLSS", or this stupid thing or that stupid thing. Whenever people suggest something else, they find another reason to hide behind the Nvidia marketing shield.
@@davet.3901 It's really not. AMD cards have idle power draw issues that make them more expensive than Nvidia cards via your power bill while also offering way less performance. The only cards that even came close to competing were Intel on the extreme low end, if you swallow the "some games randomly won't work on your PC" pill. I pray Intel Battlemage can save this garbage industry, or maybe AMD can do what they did on the CPU side of things.
@@Fine_i_set_the_handle aaaah yes the old tale, weirdly enough this never comes up when Nvidia is the one with the higher idle draw. Besides, you have any proof for that claim? Because the last time I checked the IDLE usage of e.g. 7900 XT/XTX was pretty much on par with the 4080/90.
I just checked a review from 31st of Jan about the 4080 Super and they're pretty much the same.
What's extra frustrating is this is on a budget card to make it seem less budget. Basically taking advantage of people with less cash to make it seem better than it is
Glad to see You back Jay. Hope your health is better my friend. We all love Ya and appreciate all You do for the PC Community! God Bless
"The more you buy, The more you save."
"Look at how heavy it is!"
-Jensen Huang.
swoosh
Jen-sen is the sound effect.
Moore's Law is probably, currently, running at about two times
@@AgentSmith1902 Yeah Jensen saying Moore's Law is dead is the dumbest thing he's ever said. AMD is moving forward a lot each gen with Ryzen CPUs, I know graphics cards aren't CPUs, but computer components clearly have more room left to go forward.
@@mttrashcan-bg1ro It cracks me up whenever I see Niktek's meme video of him hahahaha
first Jay video I've watched in a while... know why? - Because the title actually said what it was about, (and it seemed interesting)
Jay, you could have the most amazing and interesting video, but if you call it "This changes EVERYTHING" or "I had no idea Micro Center actually sold these..." or any other useless clickbait title, I'll never bother to open it, because I can't bother with something I don't care about, and I can't care if I don't know what it is.
I feel like it's been getting better lately. hope you keep to it
I just can't even be assed to care about any of this shit nowadays. My old Skylake rig with 1080ti seems to do the trick. My Coffee Lake with 2070super is fine too. Nothing new in gaming is interesting, because it all seems so poisoned.
I agree, these kind of rants are full of gold.
I also hate that! And in a lot of videos its just the same info over and over again. It's why i unsubscribed
Same
Dont forget:
"YOU WONT BELEIVE WHAT HAPPENS NEXT!!!!!" *pog-face* 😮😮
Lest we forget, there's the 3060 8Gig as well which is 15-20% slower than the 12Gig model.
And a 1080 non-Ti is still better than a 3060 lol, and cheaper to get used from someone.
@@esko911 The performance between the two are negligible at best. so yeah if you don't care about RTX there's no reason to upgrade and like you said it's a lot cheaper to get it used
@@chandlerbing7570 This is part of why I still haven't upgraded.. I've got a 1080ti and it's still able to run pretty well any game at at least medium 1080p. Most times I'm on high or more. Will I upgrade next gen? Maybe. I can definitely say I got my money's worth after 6-7 years. xD
The two 3060s are the only exception in this scenario; they only have a 4GB RAM difference and that's it, so the only benefit the 12GB model gets is faster memory lanes (which does often result in more performance).
@@fourty9933 nope. Same die, but cut down shaders. And the narrow memory bus.
They would sell literal bricks for 1000 bucks. It's on us consumers to look through their lies. COMPANIES ARE NOT YOUR FRIENDS!
Ironically, that's what I consider the recent 40 series to be; they look and are built like bricks, and their official value is probably 20% of what they actually ask for them atm, but the AI craze sells, so there is that.
Your 4090 probably cost $200-300 to build in the first place, yet they ask $1600 for them, and now there is a $300-400 markup because of the so-called China ban. China is on the brink of collapse and most people can barely afford food there; you really think they need 4090s? lol
Hal, i can't do that. Hal?
@@r3tr0c0e3 I'm totally fine with them having a 90 tier card at that kind of price for people who want them, as $1600 isn't that ridiculous when Titans were $1200 a decade ago. However, the 80 tier cards should never breach the $1000 mark. All this markup bullshit shouldn't happen either; something isn't worth more just because it's rare, that's a scam created god knows how long ago that gets so many people.
Eat the rich means nvidia too, yes
The 3060 8GB and 12GB also have a 20-30% gap because of memory bandwidth (128-bit vs 192-bit bus)
and L2 cache (2MB vs 3MB)
It amazes me how the 8GB 3060 is never talked about. It's the real desktop version of the 3050ti and they named it the same as the real 3060.
The 4070 TI super is actually a 4080 gimp. Doesn’t quite have the ring that Jensen would prefer.
Everyone should have a gimp in their life. Jensen treats his customers likes gimps.
Like literally every other Ti or Super. You gamers are literally petulant children just looking for something to throw a fit over.
This is one of many reasons I refuse to use NVIDIA. AMD has their own issues but NVIDIAS reoccurring sleaziness rubs me the wrong way....
NVIDIA - Generally good hardware, but pricey.
AMD - Rubberbands between better and worse than NVIDIA, cheap, but sometimes have weird compatibility issues. (RX 5700 XT was notorious for being defective)
Haven't really had any compatibility issues and I've been using AMD since the ATI X700 pro lol
@@asialsky 5700XT notorious for being defective?
Obligatory 'not on my system'.
@@asialsky Nvidia doesn't offer good hardware value. They intentionally give less VRAM than AMD to force people to upgrade often.
Both companies release bad products just as often though. The recent 5700 non x for example sometimes being slower than a 5600
Big love for going out of your way to confront the green giant. Huge respect for what you boys and girls do. The content is excellent and you really project that you guys care about your audience. Big respect from London; have a spud from all the community for a job well done my man! 👊
Never buy a product without testing or independent reviews from multiple sources.
Never trust these corporations, always seek outside reviews and information.
True but not all reviewers are honest.
Agreed. I always look at reviews and watch people testing stuff like this, 'cause if you don't you will get scammed.
Thanks for this Jay, these companies need to be called out more, this practice needs to stop.
Only because Jay doesn't get nvidia paychecks...
They don't care; it wouldn't surprise me if this was unofficially Ngreedia sponsored, I mean any ad is better than no ad.
In fact I wouldn't even know about this card if randos hadn't started making videos about it lol.
They released a lot of garbage this year; I mean we already have 4 different versions of the so-called same 4070 lmao, with not much difference between them except for the price.
I don't remember the 1060 bottom and top having a 40% price difference. They did release tons of different variants of it, 1060 3GB, 6GB, 1650, 1660 Ti etc, basically the same card, but there was no major price difference.
This is why I went AMD this time. I can't complain with my Sapphire Nitro+ 7900 XT. Plus, I don't have to worry about the power adapter melting or VRAM.
I chose the XFX 7900 Merc Black - uses old adapter and I could keep my old 750W PSU. No regrets whatsoever!
I'm so happy I went with AMD when I had the choice
@@KenOtwell you might wanna reconsider that. You need at least 850 watts for a 7900 XT; you can't forget about sudden energy spikes. I have a 7800X3D, which is super efficient, and even I went with a 1000W PSU just to be safe, and for the future-proofing headroom.
If you're gaming then that's fine. But some of us are also creators and renderers. AMD can't do as well as Nvidia in those areas. Nvidia is one card for all the jobs, while AMD's are really just for one job, and that job is gaming.
Yes, you may be right, but AMD is sneaky. That's why I went with team blue.
And that's exactly why I am and remain TEAM RED... There are 3 (!!!!) different Model-Types: NON-XT, XT, XTX.
Partly strange because of the Numbers, BUT not like with NVIDIA.
At NVIDIA there is: NON-TI, NON-SUPER, NON-TI-SUPER, TI-SUPER, TI, SUPER, Normal and don't forget the "D". WTH and WTF ??
Also, regardless of RAM it's still the exact same GPU, just with less or more RAM.
Still running my EVGA GTX 1080 ti FTW3, love it.
Still running 2 1080tis myself...😊
I recently moved from this same card to a 4060ti, and I gotta say, even though the performance isn't that different, RTX is a much bigger difference than I expected it to be.
Me too, I'm loving my 1080TI!❤❤❤
I just noticed this when I saw that the RTX 3050 6GB has only a 70 watt TDP (good, no need for an additional power connector), while the RTX 3050 8GB was like 130 watts. Initially I thought the shop I was browsing had made a mistake in the product description (after all, different memory variants of the same GPU do happen). Anyway, the only instance in which the RTX 3050 6GB will have any use case (if it is faster than AMD 8000 APUs) is if it is sold in the same form factor as the RTX A2000: low-profile, short length, 2 slots, for very small form factor HTPC builds.
I am always so relaxed by the never-ending heaven loop Jay leaves running in the background of his rant videos.
I really don't think I will be getting a new graphics card now for another 2 or 3 years, the way things are. NVIDIA have really frustrated me over the last few years.
Yeah, same here. And when i do upgrade, AMD here i come.
I probably won’t upgrade for like another 6-8 years. January I just got a brand new 3090/ ryzen 5900x complete build from a local pc builder for $1300. Something about the dude he was building it for put down money and the contract of 90 days ended. Gave me all the parts boxes and receipts
Last March I bought a 4090; I probably won't upgrade for another 2 or 3 years either.😂
@@djkiIIagwith a 4090 you shouldn't upgrade for half a decade
@@samoptimus4228 nah, probably 2 or 3 years. The 4090 is a great card, but it's not the 4K 120fps-in-every-game-with-maxed-out-settings card everyone claims it to be.
Asking Nvidia to stop being tricky is synonymous with asking consumers to stop being tricked.
your pockets aren't big enough kid
@@SaraMorgan-ym6ue Ah yes, the typical "you're too poor" comment.. I've been expecting you. 😂🤣😂🤣
Most average consumers aren’t watching TH-cam videos or navigating forums.
Having deceitful naming conventions to purposely trick people is scummy. This is on Nvidia 100%
@@TheIndulgers sadly though that is what it takes to be a smart consumer. If you can't do basic research before a large purchase, that is 100% on you. I would never even think about buying a GPU without extensive research beforehand. Anything else is pure ignorance unless money is no object.
Just because I have money and I am willing to pay the price, I'm being tricked?
I just bought an XFX 7900 XTX and I've only had it a couple of days, but so far I am loving it! I replaced my 2080 hybrid and I do not regret it! The only con I have come across is that the card is a two-slot and it is over 13.5 inches long, filling up my full-size case.
@@decadeofscattereddreams The only thing that I am having issues with so far is that when I try to play Helldivers 2 it keeps crashing on me.
Try rolling back to an earlier AMD driver like 23.11.1. AMD is famous for their broken "new" drivers. @@harrykaincles
Here's hoping Battlemage is a viable option, I'd like to buy something else if it performs as well for the price
Intel is always pricey in my socialist country. But sure I'm all for competition specially when it bites Jensen's market share
they really gotta optimize and fine tune their drivers
@@Drewkungfoo I bought a 3080ti from EVGA A week before they announced getting out of the game, and also before realizing the XC3 was not the option I should have grabbed. I'll have to wait a while for Battlemage to mature, since I'm not dropping another 800 bucks anytime soon
@@Drewkungfoo Not just the drivers, it's the cards/dies themselves. The A770 for example was supposed to be a 3070 competitor and ended up competing with the 3060/Ti. They'd need to sort out whatever architectural shortcomings Alchemist had for Battlemage as I don't believe Battlemage is a whole new ground up architecture but Celestial is.
You do have AMD as an option?
I agree. 3040 would also be a good option for those slim form factor office PCs that are all over eBay for cheap.
Great video as always, Jay! Hope you’re feeling better 🍀
like the 4GB (G)DDR3 version of R7 240 or the DDR4 version of GT 1030
I had a feeling this was about the 3050 6GB. Had they called the 3050 8GB the 3050 Ti, I don't think the naming would be such an issue... but they didn't, so it should have been the 3040. But NVIDIA is going to do NVIDIA things, so it's not all that surprising. However, I think it's a good product at the price point AS A LOW PROFILE card... considering people are paying $170 to $180 for the LP 1650 on eBay. But the full height cards at the same price are stupid when a $10 increase gets you the RX 6600 which wipes the floor with the 8GB 3050, let alone the cut down 6GB model. The full height models need to be sub $150 to make sense. Regardless... it absolutely should have been named the 3040.
There's literally no justification for buying a 3050; it's such an insane example of how powerful marketing and brand recognition are. It's not like you need other hardware tied to the brand of GPU you're buying, the way you do with CPUs and motherboards. As long as you have the power supply you can put whatever card you want in a build. So if your build can support a 3050 it can support an RX 6600, which from what I remember is between 20 and 40% faster.
@@homelesswizard3161 I know... I said as much in my comment. The 3050 8GB at $200+ is stupid and the full-height 3050 6GB at $180 is stupid. You'll get no argument from me. The only reason I think the 6GB is interesting is as a low-profile card, where AMD has nothing good. Even NVIDIA has mostly abandoned that market, which is why people are left paying excessive prices for 1650's. At the same price, the 3050 6GB should be a good deal faster and has 6GB of VRAM. Hopefully it pushes pricing on the used 1650's down.
@@TheGameBench I don't know what your local market looks like, but if you consider the RX 6500 low profile it blows the 1650 out of the water and is cheaper. And the RX 6400 is also very slightly faster than the 1650 and that very much falls under the "low profile" definition while being even cheaper than the 6500.
@@homelesswizard3161 There is no such thing as an RX 6500 and there are no low profile RX 6500 XT's. The only current low profile AMD GPU is the RX 6400, which is a non-option in anything without a Gen4 slot, and even then... still isn't a good option. It might seem good if all you look at is the avg frame rate, but it falls apart in the 1% and .1% lows just like the 6500 XT. Some games perform better than others in this regard, but they're both not good options and I'd rather have the 1650 with a 128-bit memory bus and 8 PCIe lanes. Especially for the OEM SFF PC crowd which makes up the majority of this market, and most of those people are stuck on Gen3.
All that being said, the discussion was about the 3050 6GB not the 1650, and the 3050 6GB will be faster than either the 6400 or the 1650, for the same money as a used 1650 LP. The 6400 isn't a good card regardless and people should have never been paying $170 for a used 1650. Now they have a better option that has more VRAM and DLSS, which can bring the 6GB model up to the performance of the 8GB model. Also, if you have a Plex server, the 3050 6GB can do 8 transcodes at once, making it very enticing for that use case as well, and I know more than a few people running Plex off a SFF OEM system.
Yeah, except for the fact that the 3050 is a joke performance-wise and definitely does not deserve a Ti whatsoever. The 1050 Ti was like 10% faster than the 960, while the 3050 is 15% SLOWER than the 2060 non-Super.
Makes me miss the 1050 Ti. Sub-$200 card that was bus-powered. Could easily slap that into a used Haswell office system and be gaming for under $500 for the whole setup. Those were great days for getting people started on PC gaming.
They can get as slimy as they want as long as the community likes to buy the slime to play with.
Look, there's an AMD card that is better in rasterization and it's 1/2 the price 🤯
Noooo, AMD bad, no like AMD, Nvidia forever and ever 😂😂😂😂
I've been team AMD forever. As long as people keep hyping their products and buying them, they will continue.
I love Ivan's Ooze!
Also, having the best (that is, the 4090!) kinda lets them control prices. Which allows AMD to keep its relevancy to a point, meaning midrange, good-value cards. Basically they cannot compete with the best, so they tried to undercut Nvidia's midrange. And they've done well. But that is all.
Old AMD GPUs are a great value, like the 5700 XT.
Let's not forget the GTX 970 "4GB", which was actually 3.5GB... which they got sued over... and lost.
It was an absolute disgrace and their excuse was even worse
Technically there was a total of 4GB; it's just that the last 512MB was much slower. So yeah, they can and will get away with anything, from lying about performance to melting connectors and blaming it on user error lol
All these tech channels are bought and paid for by these companies to make these ads, whether positive or negative.
@AnonymousUser-ww6ns The only design flaw going on with that was a missing chromosome or something from the mouth breathers that cannot operate a simple plug.
5:54 3050 released 2022....
That makes it even worse, because a 3050 is fucking obsolete in 2022. I'd give them an excuse if it was 2018-2019, but launching a card that far behind the times in 2022 is a fucking scam.
@@acid1787 It's only obsolete if you exclusively play the newest games at maxed-out settings. Not sure why anyone would subject themselves to that though; new games always come out unfinished for $70 and max settings only look marginally better than medium-high these days for like half the performance.
They forget about DLSS too. It is not a bad GPU just because it had a high price during the mining era. This is a Hababox thing; they didn't get samples and they only go by pure rasterization. In my country the RX 6600 was 170$ more expensive and it is not that much faster if you don't include Tiny Tina's Wonderlands. The RTX 3050 is even faster in newer games. @@bignose1752
Moreover, at that time the 3050 was based on the GA106 die (like the 3060). Later they switched it to a dedicated GA107. And now they've cut the RAM on it.
@@bignose1752 I'd recommend people buy a 1060 or a 1080 at that price instead.
At this point I wouldn't put it past Nvidia to be trying to obfuscate and fuck up the GPU market before pulling out in all but an official way, just to make it more difficult for AMD/Intel when they try to pick up the pieces.
I remember getting decent cards for gaming for 99.99
Yeah, I remember before inflation hit as well. Quit yelling at the sky; prices will not come down. Your groceries aren't, so what the hell makes you think GPUs will? I remember when ground beef was under $3 a pound as well.
I remember my x800xl for 347€ or my 9600xt for 150€
It must be a used, old card, such as a GTX 1080.
Yeah, Voodoo cards were cheap, like $99 @@LCJ73
More like 150, but yeah, much cheaper for sure
I agree. A year ago I bought a 3080 12GB card, which has more VRAM and a wider data bus (equivalent to the 3080 Ti), over a 3080 10GB card. The naming convention needs to be re-thought; it's getting into insane Intel naming territory.
The 3050 8GB was released at the end of January 2022... not somewhere around '18 or '19.
You are right. It was released on January 27 2022.😂
FEWER Jay, NOT less.
You'd figure they would've learned from the Not A 4080 card.
The trouble with companies the size of Nvidia is that they only want your respect when they need you to buy their stuff.
Because gamers are now such a small part of their business, they don't give a damn about gamers. And as AI markets grow, their need for gamers shrinks
It's not that they don't learn, they just expect us not to.
The RTX 4080 desktop is literally a 70 class card. The RTX 4080 has a die size of 389mm^2 while the RTX 3070 has a die size of 392mm^2...
The RTX 4070Ti is a 60 class card, the 4070 is a 50Ti class card, the 4060Ti is a 50 class card and the 4060 is a 30/40 class card...
What is worse is when they sold a sub-mid-range GPU like the GTX 1080, which was in fact a 60-class GPU, for $600-700 when it should have been $350 at launch.
And now they are doing the same with the rtx 4000 series...
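To put that die-size argument into concrete terms, here's a rough back-of-the-envelope sketch. The areas are approximate figures commonly quoted in public spec listings, not official NVIDIA numbers, and the 30-series 70-class die is just used as the baseline the comment above is comparing against:

```python
# Approximate die areas in mm^2 (assumptions from public listings, not official specs).
dies_mm2 = {
    "RTX 3070 (GA104)":    392,  # last-gen 70-class die, used as the baseline
    "RTX 4080 (AD103)":    389,  # sold as an 80-class card
    "RTX 4070 Ti (AD104)": 295,
    "RTX 4060 Ti (AD106)": 188,
}

baseline = dies_mm2["RTX 3070 (GA104)"]
for name, area in dies_mm2.items():
    # Show each die as a fraction of the 30-series 70-class die.
    print(f"{name}: {area} mm^2 ({area / baseline:.0%} of the GA104 baseline)")
```

By that yardstick the 4080 is the only 40-series card whose silicon is even in the same ballpark as a last-gen 70-class die, which is the point being made.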
With the 4080 Super situation they just left the problem to the sellers, who can't sell the 4080 at the original MSRP. Nvidia doesn't have to deal with it, and retailers are left with huge stock of 4080s that will never sell because of the new, cheaper 4080 Super, and they can't send them back because it's a "different model". Nice one, NVIDIA ;)
Tell'em Jay!
There were 3 versions of the GT 710, 3 versions of the GT 730 and 2 vastly different versions of the GT 1030. They have done this for years, it makes life so difficult at the low-end (or for retro machines). You have to use Wikipedia just to know which variant you are buying, if you can
Now I want to see Jay rant on how car companies give a completely different car the same name. Like sometimes it will go from a big-body 4-door to a short compact 2-door sports car or vice versa and have the same damn name.
*Requested Follow Up Video:* What is a list of 15-20 cards, which are actually available now, that are - what you would consider to be - good value for their price points in terms of performance v. cost? It'd be nice to get a genuine rundown on what cards are worth buying (in said price ranges) in order to be able to sort through the clutter of predatory marketing and not get screwed over by it. A solid list of current market cards to buy ranging from $200 and up would be really helpful for people who don't compare hardware specs on spread sheets as part of their job (the way you do).
Mr Jacket needs a date with a Pear of Anguish... Seeing as we're gettin bent over, without the courtesy of dinner first.
@@legendmaster1989 look up who he is referring to by "Mr Jacket" and what the pear of anguish is and you'll understand.
Mr Jacket is set for life on AI chips...
Nvidia actually released at least 6 different SKUs of the GTX 1060: 1060 6GB, 1060 6 GB Rev. 2, 1060 6GB with GDDR5X, 1060 6GB 9Gbps, 1060 5GB, 1060 3gb.
Or 8 if you want to include the mobile 1060 and 1060 max-q.
technically 1650,1660 etc can be considered from the same family, so add that to the pile of trash
I can see an RTX 4670 Ti Super 8GB coming in the same SFF as this 3050 in a couple of years while performing like a 4050, if they ever officially release one. I mean, the 4060 technically doesn't count as one hehehe
@@r3tr0c0e3 I disagree. Different architecture + it's not named 1060, so it kinda misses the whole point of this discussion.
However, I forgot two. Both the 1060 6GB and 1060 3GB were also released with GP104 dies later down the line. So that makes it 8 desktop variants; 9 if you count the unreleased 8GB version, 11 if you count both mobile variants.
Give 'em hell, Jay !!! Sick of nVidia treating us like crap
As long as you continue to give them money....they will continue
legit using this video while I wait for Program Settings to load.
Nvidia will continue to treat us like crap. They don’t give a shit about the gaming market anymore, it’s not even a quarter of their revenue. AI chips is their new income stream. Prepare for darker times.
that throatees is the most interesting content from this whole naming-marketing shenanigan . . . good job
Why would they stop, it's working!
Yep. Also gaming used to be about half their revenue, now it's about a fifth. GPU companies are focused on making GPUs for AI data centres, not for gamers anymore. Cloud gaming may be the most affordable way in future.
Have you noticed a lot of vendors of pre-builts have not even given the option of the 3060 12GB, switching instead to the 8GB?
And this is why I went with the Intel A770. I'm done supporting a company that has tried to do this multiple times. At this point gamers need to turn away and show them some financial harm. Maybe then they'll stop doing this. Until then though they aren't going to care because they know people are still just going to buy it.
How have the drivers been?
Too bad Intel is almost as slimy in the CPU space as Nvidia is in the GPU space.
It feels like Intel is worse in this regard. Look at what happened during the trickle of CPU improvement releases a couple of years back or so. @@jeffb.6642
@@jeffb.6642 lol. Intel is the best value over AMD. You can get a non-Z Intel board with an i3 (which is the old 4-core/8-thread i7) or a non-K i5 (8 to 10 cores) for hundreds less than AMD, and it won't stutter with TPM on Win 11 like the AMD bug.
Financial harm isn't coming to Nvidia. I had been using a 1080 Ti since 2018 and now Nvidia is worth a trillion US dollars. I bought a RTX 4080 for $930 brand new last week instead of waiting for some scenario that will never appear.
I bought an Arc A750 for my friend's beginner build we are doing.
wasted 4 minutes to realize this video is about rtx3050. thanks jay
😂😂 Grandpa worried about the final hours of his lifespan.
Speaking of the GT 1030, they did the same thing: launched a DDR4 VRAM version after the GDDR5 version, gave it slower clocks, and left it named GT 1030 even though the performance was very different between the two cards.
The GTX 970 3.5GB VRAM, the 3060 8GB vs 12GB VRAM and performance differences, the 3080 10GB vs 12GB price and performance differences, the GT 1030 DDR4 vs GDDR5 fiasco, and now this lol
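Since the GT 1030 DDR4 fiasco keeps coming up, here's a quick sketch of how big that gap actually was. The bus width and data rates below are approximate figures from public spec sheets, so treat the exact numbers as assumptions; the point is the ratio:

```python
# Peak memory bandwidth in GB/s = (bus width in bits / 8) * effective data rate in Gbps.
def bandwidth_gbs(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps

gt1030_gddr5 = bandwidth_gbs(64, 6.0)  # ~48 GB/s for the original GDDR5 card
gt1030_ddr4  = bandwidth_gbs(64, 2.1)  # ~16.8 GB/s for the DDR4 variant

print(f"GT 1030 GDDR5: {gt1030_gddr5:.1f} GB/s")
print(f"GT 1030 DDR4:  {gt1030_ddr4:.1f} GB/s")
```

Roughly a third of the memory bandwidth, sold under the exact same name.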
How are you feeling Jay?
I've had a few people ask why I chose a 7900XTX over a 4070Ti/4080 and basically call me stupid for it. This video explains exactly why. Nvidia does nothing but continue to abuse their market leader position to squeeze gamers until they quit PC gaming. I'm not targeting anything that needs a 4090's performance, and most of what I play doesn't benefit from ray tracing, so I have zero reason to support such abusive, monopolistic practices. And even if I did.... I cannot in good conscience pay money to a company that is so incredibly, transparently gloating about their price gouging.
Screw Nvidia. Screw Jensen's stupid leather jackets.
Tell em Jay !!!!!
As you said, the whole reason behind it is that most people (except those that REALLY DIG DEEP before buying) will see that it's a 3050 6GB compared to a 3050 8GB & think "well 2 GB less RAM but I save so much money" & not realise that they're getting a cut down card with so much less CUDA power AS WELL as less RAM.
Why is Nvidia releasing new 3000-series GPUs when they are now on the 4000 series?
Nvidia has taken the HK corporate motto of "Because you suck and we hate you" and ran with it.
Go back to 2008, when they forced BFG out of business.
We need more cards in the $100-$200 price range that can run basic 1080p 60, maybe with low-medium settings.
You used to be able to buy the previous-gen flagship for $200; that's how much I paid for my ATI 4890 when the 5000 series came out.
"Disingenuous!"
"Disingenuine" is not a word!
I just got a 3060ti and called it a day. I know it doesn't have 16gb of vram and it really doesn't matter to me since I just play at 1080p anyway. All the naming conventions by either company intentionally obfuscate their true specs, so I just look up the actual specs and decide based on actual research.
They also kneecapped the 1030 by changing from DDR5 to DDR4, like it needed any help being worse.
I mean it was GDDR5 (not DDR5) and it was labeled GT 1030 DDR4... and every single card sold had the matching specs.
Good work, man ... people need to be taught.
especially those who do not compare GPU performance and specs before buying.
Let's also not forget the 970, which had 3.75GB of VRAM and advertised 4GB, or the 1030 DDR4, which massively underperformed its GDDR5 counterpart.
The first 3.5GB was full speed and the last 512MB was slower.
Not sure why you got upvotes for writing wrong info. It had 4GB, not 3.75gb
As a previous 970 owner, this is incorrect. It did have 4GB, but if you played any game that required more than 3.5GB, you discovered that the last half gig ran at half the speed of the rest of the VRAM (thus killing the performance of the card overall).
Even shadier is the laptop card branding, same exact naming completely different cards. It is crazy that they can get away with it. This is what happens when there’s little to no competition in the market.
Don't forget the slimy trick with the RAM speed on the GTX 970.
Wasn't that advertised as a 4GB card when it was a 3.5 + 0.5GB card, and not to do with speed?
@@MudSluggerBP What happened with the 970 was that instead of cutting an entire block out of the memory sub-system and making the defective dies into a 3gb/192-bit card, they partially disabled one block in a way where two of the memory controllers were fighting for resources.
It still had 256-bits worth of memory controllers that were connected to 4gb of GDDR5 at the advertised speed... but there was a bottleneck elsewhere in the system, in a part of the die not usually discussed, that made it act more like a 224-bit card.
And while the last 0.5gb did run slower, it was still faster than running out of VRAM and having to use system RAM.
I'm kinda inclined to give them the benefit of the doubt on that one, because it really seems like the engineers got too clever with how to disable as little as possible in a defective 980, and marketing didn't know how to explain it.
I owned one (it was the last Nvidia card I put in my main PC), and the drivers really managed it pretty well.
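For anyone who wants rough numbers behind that explanation, here's a small sketch using the commonly cited 7 Gbps GDDR5 figure; these values are assumptions for illustration, not official specs:

```python
# Peak memory bandwidth in GB/s = (bus width in bits / 8) * effective data rate in Gbps.
def bandwidth_gbs(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps

advertised = bandwidth_gbs(256, 7.0)  # ~224 GB/s, the spec-sheet number
fast_3_5gb = bandwidth_gbs(224, 7.0)  # ~196 GB/s effectively serving the first 3.5 GB
slow_0_5gb = bandwidth_gbs(32, 7.0)   # ~28 GB/s serving the last 0.5 GB
pcie3_x16  = 15.75                    # ~GB/s if that data had spilled to system RAM instead

print(advertised, fast_3_5gb, slow_0_5gb, pcie3_x16)
```

Even the slow partition works out to roughly twice the bandwidth of going over PCIe 3.0 x16, which is why the driver could mostly hide it until games actually needed that last half gig.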
I'll not buy anything Nvidia unless they decide to pull their act together, AMD does everything I need, and if they don't get their act together, Intel is improving. And both of them are less slimy than Nvidia.
Sneaky AMD and slimy Nvidia. Sounds about right.
Welcome to capitalism
As long as the customer isn't savvy enough, companies will try to squeeze as much money out of them as possible.
Good thing in PC world we have people like Jay & Steve.
I stopped buying Nvidia cards during the 20 series. Maybe if more people actually voted with their wallets we wouldn't be in quite the mess we are in. Cue the comments about how Nvidia doesn't care about selling cards to consumers and it doesn't matter, etc.
That plus Linux support is exactly why I bought an RX 7900 GRE instead of an RTX 4070, despite the latter's way better power efficiency.
For the people that have to upgrade every year, STOP doing that. Only upgrade if you have to. That's what would have prevented this mess.
Living in China, and AMD cards are more expensive here. No idea why. Otherwise I'd go AMD.
@@backlogbuddies same here in Europe. I really don't get the AMD hype lol. Price/performance doesn't lie.
that makes no sense.
I get the rant, but I'm still over here since release day trying to get my hands on a ROG Strix 4080 Super OC. It's getting on my nerves as well as my builder's. MC is 2 hours away, so we don't have the luxury of just striking it lucky or gambling on one being in stock when we get there, and NE hasn't had any in stock to order... yet they have plenty of literally all the other cards: 90's, 70's, all of them. No 80's. People are selling them on Amazon, though, for close to 2 grand, which is the same price everyone has all the 90's listed at. It's ridiculous. It's obvious the 80's are selling, so why aren't these stores stocked up with more, knowing that people are cleaning them out every chance they get? It's literally the last thing my build needs and here I am absolutely STUCK. Really nerve-wracking.
I work with 3D (Blender, UE5) so I have a 3060 12GB because I really need the extra VRAM. I've looked into upgrading to a better card but the 4060 Ti 16GB is quite deceptive for the price (I live in Brazil btw)... it's only PCIe x8 and the memory bus is narrower than my 3060's. I won't be upgrading anytime soon.
Living in Brazil as well; I am about to build a system, as I am way overdue. The market here is hard to get used to, as many parts have high import duties. It seems to me Intel processors are taxed less than their AMD counterparts, and AMD video cards do not look to be priced competitively either. Unless you have a good muambeiro (grey-market importer) it's hard to put anything together reasonably and avoid the Chinese no-names that flood the market here. Tracking prices and exchange rates and buying when something is less than 25% over the USD price seems to be the best option. I expect it to take a few months before I have all I need.
@@cabrageo I wish you good luck with that. I ended up buying my 3060 near the end of the GPU price craze so I paid a lot for it, but considering I had a 1050 Ti before, it was worth every cent.
I always appreciate that you call out bs.
at this point i have so little trust in what nvidia is selling me that I have to look at reviews for every sku lol. it's so rough
As if Nvidia isn't already making enough money with the AI boom. Big thanks to Jay for keeping "regular" people informed in a way they can understand. His videos helped me a lot, as they must have for a lot of other people I'm sure.
Speaking of questionable: frame generation and upscaling have been around for 15 years in TVs. Funny how you now need their AI cores to use frame gen, while AMD and everyone else use software, and how 2012 Skyrim and Unreal Engine have global illumination, reflections, etc. Hmmmm
$2k for an 80-series card in Australia is absolutely crazy... at this point I don't even want to upgrade ever. I only upgraded to a 3070 last year from a GTX 1080, and the performance difference is barely anything.
And the funny part is that the RTX 4080 is a 70-class card. The 4080 has a die size of 389mm^2; the RTX 3070 has a die size of 392mm^2...
If I was still in the market for a graphics card, I might consider some of these lower price point cards. I agree with the recommendation for the naming scheme - sometimes marketing people should NEVER be allowed to make decisions about product names. That said, I don't plan on any more upgrades soon, as I recently upgraded from a GT 610 to a GTX 1660 Ti. That's plenty of graphics power for me, and the price point was at the top of my budget, marked down from $359 to just $229.
The only thing I buy from Nvidia is stock. Sometimes to hold a bit, sometimes to short. So far so good. Products... nope.
The 75W power limit is for low-profile cards in SFF cases that don't have extra power supply connectors. Until now the GTX 1650 LP was the fastest card for SFF.
I'm sure somebody already pointed it out: the 1060 3GB had the same 192-bit memory interface as the 1060 6GB. So compared to the 1060 3GB the 3050 6GB is even worse, since its 96-bit memory bus is actually narrower than the 128-bit bus of the 3050 8GB.
What frightens me is the pricing. It has increased so much over the last 6 years that the 5xxx series will topple everything; I can't imagine the 5090 will be lower than $3000.
Also, believe it or not, these cards sell REALLY well in third-world countries where having a 60- or even 70-tier card means 6 to 10 months of paychecks...
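Putting rough numbers on that comparison: assuming 14 Gbps GDDR6 on both 3050 variants and 8 Gbps GDDR5 on the 1060 (approximate public figures, not verified specs), the bandwidth picture looks something like this:

```python
# Peak memory bandwidth in GB/s = (bus width in bits / 8) * effective data rate in Gbps.
def bandwidth_gbs(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps

cards = {
    "GTX 1060 3GB/6GB (192-bit GDDR5)": bandwidth_gbs(192, 8.0),   # ~192 GB/s
    "RTX 3050 8GB (128-bit GDDR6)":     bandwidth_gbs(128, 14.0),  # ~224 GB/s
    "RTX 3050 6GB (96-bit GDDR6)":      bandwidth_gbs(96, 14.0),   # ~168 GB/s
}
for name, bw in cards.items():
    print(f"{name}: {bw:.0f} GB/s")
```

So the cut-down 1060 at least kept its full bus, while the 3050 6GB gives up about a quarter of the 8GB card's bandwidth on top of the reduced core count.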
I was one of those dummies who bought the 1060 3GB card. I mainly bought it instead of the 6GB card because of the card length. I had a drive cage in front of the card that would've made fitting the longer cards an issue.
Who TF cares about the 3050 anyways? These aren't gaming cards.
In this current market, Nvidia has almost zero incentive to act in good faith. Ppl will buy their products regardless of how deceptive their advertising/marketing is.
Pretty much all graphics cards don't matter because used 6600 XT's exist for less than $200.
some people want higher resolutions
9:00 heck, I paused this and went to eBay; there's no shortage of 3070 Ti's for 350ish right now, and that card definitely outperforms a 3050.
It's crazy that they are releasing a new 30-series card after the 40 series has already been out and they're just launching the 40 Super series.
The worst thing is they are doing this on the down-low. Most people wouldn't even know about this release if they weren't paying attention to the market.
The 3050 8GB came out in early 2022 on performance level with a 2016 card (GTX 1070)
The 3050 6GB is a 2024 card on performance level with a 2018 card (GTX 1650 Super)
Good job Nvidia.
My first PC was 486DX2 with a separate Math Coprocessor and external, single speed, CDrom.
I bought a sound-blaster long before I needed an external graphics card... Graphics cards weren't even really a thing until the internet moved into smaller towns and cities and even then the games were better optimized than they are today. Quake and EverQuest ran just fine on a Pentium.
My bet is WoW drove one of the biggest PC upgrade revolutions in modern recorded history, and that's when I'd bet graphics cards really started selling mainstream.
I've never bought a new Nvidia card for this very reason. They are slimeballs looking for the easy money grab, and they don't give a crap about their customer base. I don't care if they do perform better. I have owned a few used ones only because I was able to get them either at a decent price or it came with a machine I purchased.
They did the same thing with the 770s way back in the day. I fell victim to it back when I was new to PC building and more naive than I am now. This has always been their M.O.