Nvidia GeForce RTX 3050 6GB, Gaming Benchmarks and Review
- Published Jul 23, 2024
- Support us on Patreon: / hardwareunboxed
Join us on Floatplane: www.floatplane.com/channel/Ha...
Disclosure: As an Amazon Associate we earn from qualifying purchases. We may also earn a commission on some sales made through other store links
Buy relevant products from Amazon, Newegg and others below:
GeForce RTX 4090 - geni.us/puJry
GeForce RTX 4080 - geni.us/wpg4zl
GeForce RTX 4070 Ti - geni.us/AVijBg
GeForce RTX 4070 - geni.us/8dn6Bt
GeForce RTX 4060 Ti 16GB - geni.us/o5Q0O
GeForce RTX 4060 Ti 8GB - geni.us/YxYYX
GeForce RTX 4060 - geni.us/7QKyyLM
GeForce RTX 3070 - geni.us/Kfso1
GeForce RTX 3060 Ti - geni.us/yqtTGn3
GeForce RTX 3060 - geni.us/MQT2VG
GeForce RTX 3050 - geni.us/fF9YeC
Radeon RX 7900 XTX - geni.us/OKTo
Radeon RX 7900 XT - geni.us/iMi32
Radeon RX 7800 XT - geni.us/Jagv
Radeon RX 7700 XT - geni.us/vzzndOB
Radeon RX 7600 - geni.us/j2BgwXv
Radeon RX 6950 XT - geni.us/nasW
Radeon RX 6800 XT - geni.us/yxrJUJm
Radeon RX 6800 - geni.us/Ps1fpex
Radeon RX 6750 XT - geni.us/53sUN7
Radeon RX 6700 XT - geni.us/3b7PJub
Radeon RX 6650 XT - geni.us/8Awx3
Radeon RX 6600 XT - geni.us/aPMwG
Radeon RX 6600 - geni.us/cCrY
00:00 - Welcome to Hardware Unboxed
00:42 - What exactly is the RTX 3050 6GB?
01:14 - RTX 3050 6GB Specifications
06:07 - Remnant II
06:45 - The Last of Us Part I
07:04 - Starfield
07:24 - Resident Evil 4
07:58 - Ratchet and Clank Rift Apart
08:22 - Star Wars Jedi Survivor
08:48 - Forza Motorsport
09:07 - Cyberpunk 2077: Phantom Liberty
09:37 - Hogwarts Legacy
10:08 - Avatar: Frontiers of Pandora
10:34 - Assassin’s Creed Mirage
10:56 - Alan Wake 2
11:17 - Counter Strike 2
11:40 - Average [1080p, Lowest Quality]
12:41 - Cost Per Frame
14:49 - Power Usage [Cyberpunk 2077: Phantom Liberty]
15:17 - Power Usage [Assassin’s Creed Mirage]
15:49 - Temps + Clocks
16:39 - Final Thoughts
GeForce RTX 3050 6GB: Gaming Benchmarks / Review
Disclaimer: Any pricing information shown or mentioned in this video was accurate at the time of video production, and may have since changed
FOLLOW US IN THESE PLACES FOR UPDATES
Twitter - / hardwareunboxed
Facebook - / hardwareunboxed
Instagram - / hardwareunboxed
Outro music by David Vonk/DaJaVo - Science & Technology
In my country it's still the same price as the rx 6600, budget nvidia cards are horrible.
It will get cheaper
@@ASS_ault When?
@@ASS_ault So will the Rx 6600.
And you'll still buy it anyway! LOL
THANKS FOR THE LEATHER!
@@jensenhuangnvidiaCEO ALL HAIL JENSEN HUANG! LET'S GET NVIDIA SHARE PRICE TO 3 TRILLION!
damn, that "look it says 6gb here" was hilarious, like showing some kind of freak
Some moron whined about them using "another AMD box" for a CPU review (even though they said it in the video). Kinda ridiculous to whine about, but oh well...
I know, I already hang my head in shame when I tell someone I have a 3070... 🤣🤣🤣
Nvidia has been doing this for over a decade and they don't seem to plan on changing that unfortunately. The 1060 3GB, 1030 DDR4, and now the 3050 6GB are just the most recent examples
RTX 3060 8GB (which should be an RTX 3050 Ti tbh) is also an abomination of a GPU
It could be seen as misleading and false advertising, which would make it illegal, at least in civilized countries with good consumer laws. Not sure if anyone has tried to report them to consumer authorities in their country.
The most egregious attempt was the 12GB 4080, but that ruffled too many feathers so they backed down.
Bro, who cares, as long as they sold out 😅 It's a smart move by Nvidia, bad move for the buyer tho
IIRC the GTX 750 had 3 or 4 different specs, with even different chips used
If only Nvidia had a passing idea of the concept of integrity and named it a 3040 like they should have.
3030
or at the very least 3050 LP (low power), or something
3000
3040... 3050 Lite... something like that
How many more video cards are they going to release? I never thought I’d say this but there are far too many choices and I feel bad for the first time builders out there.
Its a waste of sand, plastic, and your money.
Maybe the 8GB model but not this.
@@arenzricodexd4409 You for real?
Time
@@arenzricodexd4409nvdia fan boy copium
😢
What a way to sell old generation 🤣
Imagine employees going through the effort of making safety data sheets and specifications, developing a production line, handing out PCB design drawings, making the silicon chip and sending it to cooler manufacturers, designing the coolers, and actually producing a product with boxes, foams, manuals and shit, for trash, then shipping it worldwide to retailers.
And nvidia had to include this card for every driver update😂
What the actual f
😂
@@happybuggy1582 not a big deal actually. Everywhere they just search-and-replace 3060 with 3050. The GPU chip is modular; Nvidia can disable any number of the building blocks to downgrade as low as physically possible.
@@happybuggy1582 To be fair, they already have the silicon, since the card is identical to their laptop 3050 refresh. They just needed to put it on a separate PCB.
This device is proof of how Nvidia is running a planned obsolescence scam.
The RX480 offered the same performance and more memory at the same price 7 years ago.
7 years ago!
Embarrassing.
a bit less performance but yeah, the same performance tier...
You have to factor in the inflation as well.
and for 125e here in spain
@@nikhileshsingh8706 There are no new RX480s for sale, haven't been for over 3 years. If you're factoring in inflation, then you should factor in the fact you could have used a card for 7 years and seen basically no compelling reason to "upgrade" to a 3050 6GB other than power savings. And I say this having bought over 20 1030/1050s for friends and family. Pascal was their last attempt at selling anything below $400 with reasonable specs. I hate the company as a consumer; it's why I own their stock.
@@RFLCPTR A highly tuned, BIOS-modded 480/580 8GB is really going to be very close to the 3050 6GB and even outperform it in some situations, like VRAM-starved games. I mean, a regular 3050 8GB will score 18k in Fire Strike. My overclocked (1350core/2050mem), BIOS-modded RX 570 8GB scores 14.3k. It's really not a big difference and it's pretty ridiculous that both the 3050 8GB and 6GB have less memory bandwidth than an upper midrange Radeon from 2016.
The more you buy, the more leather I have .
The more you buy, the more dividends I receive.
the more you buy, the more the wallet cries
Thanks for the sneak preview of the 5060
Wish this was compared to GTX 1060 / RX 580
RX 580 is dead, it stopped receiving driver updates.
@@DragonOfTheMortalKombat so is the 1060, your point is?
@@ionutbagmuianu884 I'm pretty sure the 10 series is still getting driver updates
Same here, this card is probably meant to replace the GTX 1060 and 1660. At least we get a 1650 Super on the graphs; the 1660 Super still beats this 3050 6GB, and at a cheaper MSRP.
I just don't understand why reviewers don't include the cards that are meant to be replaced, like the GTX 1060/1660 and RX 480, but include cards like the RX 6400 and Intel A580. If you bought the last two cards you already regret it and replaced them, but if you have a GTX 1060 or RX 480 you're still not gonna replace it with a 3050. Almost 8 years have passed and these two GPU manufacturers just refuse to deliver better price/performance/watt cards than the last generation.
There are tons of used RX 6600s on the market, originally bought for mining but only run for 6-8 months; they're a pretty good deal and only cost 60% of a new card's price.
@@DragonOfTheMortalKombat Oh really?:
Adrenalin 24.1.1 (WHQL Recommended)
File Size
629 MB
Release Date
1/23/2024
RX 580 is not even in the Legacy section.
Finally! A card that makes my old 1070Ti look good again!
Yo, still using 1070 ti
I only just now dropped my 1080 and only because I swapped from 1440p to 4k monitor
These people, believe it or not, went to very good universities and are making these decisions
The 1650 is second on the Steam hardware survey, so maybe the 3050 6GB will be a big seller?
@@virtual-adam yeah, what an amazing upgrade. These people are still rocking the 1650 because they're broke; it's not a new purchase, it's been in their rig for a while
@@virtual-adam the problem is the price: the standard 3050 was too expensive for its performance, and this 6GB version, even if cheaper, is also too expensive for its performance.
If you have a 1650 and you are looking for an upgrade, it's infinitely better to buy a used 2060 Super or a new 6600 for $10 more than the 3050 6GB
most gamers are broke.. @@dbunik44
Hot take: that’s why NVIDIA’s pricing is the way it is.
They keep ripping customers off, and yet people keep buying them
I thought Steve would have snubbed this product, but he reviewed it nonetheless, what a champ!
I love the budget stuff :)
@@Hardwareunboxed When you do =), why not make a proper comparison with maybe only one externally powered card (like that RX 6600) for reference, and ALL the other cards sharing that same limitation?!? It's a niche, but for some people it is relevant.
With only 75 W available you're simply limited to really "interesting" options like a used GTX 1650 or 1050 Ti, the new Arc A380 without the extra plug, the Arc A310, the RX 6400... maybe the RTX A2000 (which kinda is the RTX 3050 6GB).
Some people are hungry for the best price/performance solution when the system is limited to PCIe slot power only (75W), because it enables an upgrade to all those cheap office OEM PCs out there, right up to pairing it with a Ryzen 2200/2400 in an older office PC.
So they made a modern "GT 1030" but call it a "GTX 1050 Ti", gotcha... Should be illegal to name them the same when they are nowhere near the same... it should be called a 3030, since it is basically just a new 1030...
Maybe rt 3030 or 3040
3040, it outperforms the 1650, it's more like a 1660 and it does have RT and DLSS. If it was $40 cheaper and called 3040, it would be a compelling card. Especially the low profile version, the fact it's so compact and runs on board power means it can be put in any office prebuilt and you can get reasonable performance. No bad cards, just bad names and bad prices.
It should be the "RTX" 1660 xD
The legendary Gt 1030 only uses 30w maximum…
That’s laptop gpu level
Yet you think this 70w card should be called a 3030….
The GP108 (GeForce GT 1030) was NVidia's last true low-end chip :-(
It's a good GPU
for 79.99$
No AIB will want to sell a GPU at such an MSRP anymore; the minimum bar is $150.
Even that is too much
Exactly. You could maybe push to $99.99
Because Nvidia sells them GPU + GDDR6 bundles for far more. AIBs don't make much on GPUs unless they go super premium, which then don't sell. They are trapped by Nvidia; that's why EVGA is no more. EVGA wanted to be decent and they just couldn't; you have to play the scummy games with Nvidia @@arenzricodexd4409
If I cannot buy a better card for more money, I buy nothing.
I'm sure I can use that money for something better if I'm that poor.
It is always a waste of money.
In my country (Croatia) rx 580 8gb can still be found new for 180 euros, rtx 3050 6gb is around 200 euros and rx 6600 is 230 euros.
And there's precisely 0 reason to buy anything other than the 6600 in that price category
@@pedrolantyer I agree. Had rx 580 for 5 yrs, great gpu but now is obsolete. I'm now on rx 6700xt.
New XFX RX 6600 Swift210 in Serbia is 200 euros. Bought one for sons PC. Great card.
They aren't really doing driver enhancement updates for that card anymore. I'd avoid it now.
And there's no specific reason to buy those 2 pieces of garbage from trashdeon. Get this RTX and be happy with it
Low profile and pcie slot powered is a good thing. I wish we had more options in this niche segment. It will fit in a 2u server chassis, and it can bring lots of prebuilt office pcs back to life.
The 4060 LP would make for the better drop in on some office PCs. Despite the additional required power plug.
@@user-ej9nl1ng9d the 4060 already uses AD107, which would traditionally be the 50-class die
@@user-ej9nl1ng9d how do you do that? Just set the wattage and unplug the power cable and "it just works"? Thanks in advance; I have an RTX A2000 but I may try this! The A2000 sells for more than what a 4060 would cost me, so if I can do that, you best believe I will.
Edit: did an extensive search; people online said that it would not work. I haven't ever tried it, but I am not convinced it'll work until I see it. If it does though, that would really change things.
Wait, you can't just unplug the power cable, since graphics cards can actually detect the presence of 6 or 8 pin power cables. IIRC I tried plugging in a 3060 with just 1 of the 2 8-pin cables (yes, it's a Colorful 3-fan model with 2x instead of the usual 1x 8-pin) and my PC just blanked. My old Zotac 1060 gave warnings when I forgot the power plug and the PC didn't boot.
@@user-ej9nl1ng9d They have GPUs like that already. The RTX A2000 uses the 3060 chip and runs at 75W; there's a new RTX A2000 Ada that came out recently, which confusingly shares the same name but uses the Ada Lovelace chip and is significantly more powerful, still running on just 75W. The A4000 also exists, and I believe that one got an Ada Lovelace revision as well, and that one's performance is mind boggling considering the 75W limitation.
I have an RTX A2000, the 3060 version; it runs games exceedingly well, it's just expensive and has limited supply. I only have one because I managed to find one locally for an exceptionally good deal: $200, and it was only ever used in an office PC. I put it in a cheap prebuilt I got for $100 that has an Intel 10400, and for the price of a Series S I have a surprisingly capable PC that can run just about any game on the market at 60 fps and decent settings/resolution. I'd love to get my hands on the other 75W GPUs, but they're hard to come by, or absurdly expensive. It's insanely impressive to measure the wattage/performance ratio on them.
"Despite the additional required power plug." ... Which rules it out for many.
Thank you for the review of the RTX 3030. Let's see when the RTX 3010 with the name RTX 3030 gets released
The RX 6600 is the king of budget GPUs in this day and age, as it performs nearly the same as a 3060 but for the price of $180-200
And if the 6500 XT had come out with 8GB VRAM for $149 or something, it would have made the 3050 6GB look even worse
All trashdeon cards are obsolete, not king, please don't be ridiculous 💀
@@Thejacketof-huang The benchmarks don't lie.
@@Beginnergamer1823 AMDumb where is trashdeon rx 6500xt 8GB? 🤡🧠🚫💀👉🏻🚽
Rx 580 is king for $100 beat that
Crappy 96-bit memory bus for $170, that is some premium e-waste!
Yep, it's a xx30 class card for xx60 class prices.
Just wait for the 5060 to have a 96-bit bus next year! With 9GB of VRAM!
You should see the $160 msrp AMD 6400! lol
@@haukionkannel9gb vram? In your dreams. You'll get no more than 8gb.
@@krizztykrab2297
A 96-bit bus with 3 GB memory modules (aka 9GB of VRAM) is cheaper than a 128-bit memory bus with 2 GB memory modules (aka 8GB of VRAM)... So because the 96-bit bus is cheaper, that is what we will get!
🤣
I bought a passively cooled Palit RTX 3050 KalmX 6GB card and I am very satisfied. Silence and energy efficiency are the most important to me.
"Having two GPUs with the same designation is confusing." - Nvidia 2022
What a well made video. You covered everything from fps to efficiency and operating temps. I'm subbed!
Any chance you could test the KalmX model?
There is only one RTX 3050 6GB that is worth buying:
that Palit one, the only passively cooled one
Good point. I think a very small ITX build would benefit.
@@idocare6538 The small ITX would need some decent airflow in the case itself in that situation.
@@idocare6538 you can google it, Palit 3050 KalmX
@@idocare6538 just as an FYI, the passively cooled ones require good airflow
Is it? Any graphics card with a decent heatsink can be modded to lose the fan noise. Just print a duct and use better fans. But coil whine remains the key source of noise in all my builds. All of them. Fan noise is solved by now, as long as you've got the know-how and know how to print a simple piece of plastic. For reference, I've got a single RTX 3060 that does not whine. I've been through about 30 cards by now; every other card whines. I only got a single silent card.
For all the others, VRM and coil whine is the main source of noise, and it is quite irritating in nature.
That being said, if the Palit one has no coil whine, it sure will be great to duct for silent and cool operation. No need for a chip cooker inside of my builds. It does require a fan at that TDP, but for reference the fan hardly needs to spin, so you will not hear it. Chances are that you will get a ton of coil whine, simply dwarfing any fan. Love to be proven wrong. Palit surely should send me one, as I would mod it in a day and praise it into the sky, if only it does not whine. Which it probably does.
Unless this costs 120 bucks or less this is a straight up waste of silicon.
Is this naming issue the same as the 4080 12GB? Either way, I am looking to buy a few of them for some old office PCs I'm planning to flip, so I found this video interesting. As I will be selling them on with 1080p/60Hz monitors, it fits the bill.
The Palit KalmX version is still the best passive (silent) card I can get for my silent Ryzen 9 5900X PC. Or do you know of any better silent alternative?
Price is outrageous.
It evolves just backwards
Is it possible that there is a use case for this as a SFF HTPC? Or are there better alternative GPU solutions or maybe even a CPU with iGPU? What is the minimum required specs for playing 4k60fps content with HDR?
Lots of good work there, thank you. I wonder how differently the half height and the silent (KalmX) versions function? I wonder if those two might be more suitable use cases for your viewers. Perhaps something for a follow-up video? And if you do do a follow-up, how about looking at it as a second GPU for creators, giving access to things like RT Voice and Nvidia Broadcast, and helping with streaming - I've seen one streamer have to put in a second GPU because Jedi Survivor kept crashing.
Interesting to see the A580 perform so well; given that you included the RTX 4060 perhaps you might have included the A750? I guess you were time-limited.
Playing "Funny Buggers", love it when you use Aussie slang in your vid's Steve, Great work !!
The new 3040 is so cuteeee, tiny little guy, thanks ScamVidia
Whoa, I thought I had clicked on a video that was 2 years old. That's insane that they are releasing this now.
Can you make a top 10 GPU list which don't need external power? That would be nice.
How does the AMD W7500 compare to these cards? I'm asking because my Plex server can only fit low profile GPUs or a full height single slot GPU, and I want to keep it pretty power efficient.
The AMD Pro W7500 is 16-18% faster than the rtx 3050 6gb (source-techpowerup)
I think the Intel Arc a310 is the most efficient brand new/ cheap gpu right now, it's only 40w max/ $100/ single slot, while the rx 6400 is 53w max/ single slot/ around $150
Desktop users: NVIDIA LOWERED THE 3050 FROM 8GB TO 6GB VRAM 😡
Laptop users: NVIDIA INCREASED THE 3050 FROM 4GB to 6GB VRAM 😌
I think the real marketing error here is that for whatever reason NVIDIA did not mention the biggest advantage of this card (and the only one?): it does not require an additional power connector and it "can game". This makes the card a unique thing on the market. It is ideal for upgrading an older PC (like an office Dell Optiplex etc.), or for a tiny ITX build; with this card you can get away with an external pico/laptop power adapter since the GPU will not need an additional power plug. Despite being slow & overpriced it still has a use case. I mean yeah, technically the RTX A2000 & A4000 are faster and also do not use more power, but those cards cost as much as an entire PC. Also, for the RTX 3050 6GB there is no reason to buy a regular size card; only the LP cards matter here.
Pretty much; it's meant to replace the 1650 which came out what, 5-6 years ago?
It's still not good enough as an upgrade though; 8/128 in a 75W window should have been possible by now (but that would have stepped on the toes of the 4060 8GB and Nvidia can't have that).
I have an unraid server and currently using a 1630 gtx (no power-pin version). This is a good replacement for me. It is a bit more expensive than what I expected though.
I always wonder, are the lower-end AIB cards worth the low price? even if the cards are in the higher tier line?
Good for dedicated streaming pc for the nvenc?
Just imagine how many clueless buyers are gonna overpay for this in their prebuilt systems just because it says "Geforce RTX" on it
If they're buying a prebuilt system, they're not going to care or if they do care, they can easily return it....
Glad you did this, needed to be done
Is the fan loud? When idle?
Minor correction/contention/elaboration: most new RTX 3050 8GBs use GA107 silicon. The full GA107 chip has 2560 CUDA cores, 80 TMUs, and 32ROPs, which is the exact spec of the RTX 3050 8GB. Nvidia prioritises supply for their RTX 3050 refresh laptop GPUs, which also use the full GA107 chip, but surplus GA107s and those which don't meet laptop efficiency requirements but still have all cores working, are used for desktop RTX 3050 8GBs.
GA106 was used for the first production run of RTX 3050 8GBs because GA107 wasn't available at the time, but over the next few months, most RTX 3050 8GB cards switched to using GA107 because it's cheaper to manufacture and usually a bit more efficient.
Some new RTX 3050 8GBs still use cut-down GA106, because some GA106 chips still don't have enough working cores or don't meet the efficiency ratings necessary to be an RTX 3060 or RTX A2000, but this is relatively rare and only done when necessary, similar to how RTX 2060 "KO" GPUs used TU104 chips with a large number of defective cores instead of TU106.
Considering the RTX 4060 is twice the performance of the RTX 3050 6GB, the price should have been $150.
Look at wattage + competition (rx 6400 & arc a380) ($160 msrp or $120 msrp)
It should have been closer to $150 called the 3040 and marketed as a low profile and low consumption card
Would this be a worthwhile upgrade over my current GTX 1050 Ti?
will that be able to run anything other than MS Excel 97?
"This model was quite slow on It's way down to Australia" Yes, because It traveled from Santa Clara on a bus... on Its 96bit bus to be exact...
it's = it is
just to make sure i understood correctly: NVidia launched an RX 580 performance GPU, for RX 580 pricing, 8 years later..........what a deal.
But they had to axe off 2GB VRAM to make it happen. And have AMD provide the frame generation.
So then tell me, what did AMD do with the 6400 & 6500 & 6500xt that this pos beats?
@@thetechrealist In the middle of the crypto boom, when people were having problems getting ANYTHING to attach a monitor to (remember, AMD didn't have integrated graphics at that time) AMD took their low-end laptop dedicated GPUs and stuck them on boards and called it a day. They are products that made sense in that specific context, but today have no place in the market.
@@thetechrealist The 6500/6400 family are by no means great, but even the 6500 XT 8GB still beats out the 3050 6GB while having more VRAM and lower prices; the 4GB has even lower prices, and the 6400 is not even in the same bracket, it's more like a video adapter. But again, not great cards, nobody liked them. And even then they perform better at lower prices (6500 8GB) and everyone hated them, so why like the 3050?
@@andersjjensen AMD is just as guilty of releasing trash, even with that context.
How does it fare in production task like streaming and rendering?
I always wonder if it would be possible to solder more VRAM on those and have it work, or if the GPU is just a bad bin with a defective memory interface.
Wish they added the 1660 Ti or Super to the charts as well.
The 1660 Ti is very similar to the 3050 8GB.
The 1660 Ti would be on par with, or a bit faster than, the 6GB version
It sure beats the 6GB 3050 version, since that is barely 5fps or so faster than the 1650S @@FATBOY_.
As far as I remember it was very close with the RX 5500 XT
after looking
nah the 1660Super was even 7-10fps faster back in 2019
I do have a couple of repurposed Dell PCs; one has a 1050 Ti, the other nothing. A couple of these are not a bad idea, and I will get at least one for my Mom's PC for sure. Naming aside, the specs are public and that is all that matters. A repurposed Dell workstation can be purchased for 70 USD fully functional, which is great value.
Price of the 3050 8GB seems off. It's more expensive than the 4060?
Hi Steve, thanks for the thorough review and price comparison for this budget card. For the comparison charts at the end with cost per frame, please could you in future include the VRAM capacity for ALL cards, as this helps when comparing prices looking up and down the rows. Thanks for the great work as always 🙂
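For anyone curious how those cost-per-frame charts are built: the metric is just price divided by average FPS, so lower is better. A minimal sketch below, which also tags each row with VRAM capacity as suggested; every price, FPS, and VRAM figure here is a made-up placeholder, not a number from the video.

```python
def cost_per_frame(price_usd, avg_fps):
    """Dollars paid per frame of average 1080p performance (lower is better)."""
    return price_usd / avg_fps

# name: (price_usd, avg_fps, vram_gb) -- all hypothetical placeholder values
gpus = {
    "RTX 3050 6GB": (170, 60, 6),
    "RX 6600":      (200, 100, 8),
}

# Sort best value first and print each card with its VRAM in the label
for name, (price, fps, vram) in sorted(
        gpus.items(), key=lambda kv: kv[1][0] / kv[1][1]):
    print(f"{name} [{vram}GB]: ${cost_per_frame(price, fps):.2f} per frame")
```

With these placeholder numbers the RX 6600 row sorts first ($2.00/frame vs roughly $2.83/frame), which is the same story the video's chart tells: a cheaper-but-slower card can still be the worse value.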
"see it says 6gb right there... it really is a 6gb version of the 3050... who wouldve expected that"
indeed, sir, indeed hahaha
shame that the 3060 is missing in the charts
can you review the ASROCK A620M PRO RS motherboard?
I have heard it is really good for its tier
Will you be comparing 6GB and 8GB models with different texture settings at some point?
If AIBs make a low profile version and keep the price down to MSRP, this would be a better upgrade solution for those Dell and HP office PC systems!
Gigabyte already has a low pro version and it's only $6 over MSRP. AIBs know exactly who this is targeting.
@@zodwraith5745 i did some checking around myself and MSI also is releasing a LP version too! Nice!
@@zodwraith5745 Gigabyte's LP version is 60 euros over MSI's Ventus X2 OC or Gigabyte's own Eagle OC where I am. But it's only 40 euros cheaper than the RX 6400 LP
Probably the worst GPU of 2024.
great timing, just looking into the 3050 atm! Thanks!
Why not include the Arc a750?
There is a part of me that is thinking that they only released this product so that they can compare to this 3050 on slides in the future instead of the real one.
This is interesting; I would like to see the performance of this card vs the RTX A2000. That card has 12GB VRAM and also runs exclusively off of PCIe power. I feel like the A2000 would beat the 3050 6GB you are reviewing. I remember Dawid doing a video on it, and the A2000 was able to go blow for blow with a 4060 that had the benefit of the power cord. Only mentioning it because if someone is limited to power draw from the board, then the A2000 should perform better than the card being reviewed here. The only issue is cost, because an A2000 is like $400+ USD used.
Try searching for A2000 6gb, they're less than half of that, at least here in Spain.
You're right about upgrading a "work station" type PC with non-powered cards. That's the only time I personally did it, with my first PC (AMD Sempron 3000/512mb ram). I don't remember what card I bought, but going from integrated gpu to a discrete gpu was night and day. I could play San Andreas and HL2 at a decent framerate with resolution better than 640x800 and I loved every moment of it. I'm sure some kid out there will get a lot of use out of this card.
I am curious what this little card's performance and power usage with a 12100F or 13100F would be vs the 8700G. Use-case wise, I think they would make sense in similar situations: small form factor, low power draw systems. Bad naming aside, the card itself does have its place.
It's wild to me seeing the GTX 1650 4GB and RX 5500XT 8GB both so close to each other, and sometimes higher than newer cards.
Am I crazy for thinking any 30 series card should be faster than any 10 series (or 16 series if you want) except for maybe the flagship (1080) vs the budget (3050) models? I DEFINITELY can't accept a 3050 6GB being on the same chart as a 1650 4GB, but when that 3050 is lower than another 50 class from 2 generations back and with 2/3rd the V-RAM you can't convince me that's okay.
same price to performance as a gt 1030 lul
Does this card have drastically lower performance in a pcie 3.0 system just like the rx 6500 xt? I have an optiplex i7 3770 system and this card sounds interesting.
For the most part it's fine. It has twice the PCIe bandwidth as that of the 6500 XT.
1:17 But why are there 2 of the 3050 8GB with near identical specs, one GA106 and the other GA107? I.e., do they have the same performance?
They could have just named it the 3040 and saved themselves the aggravation....
And priced it at 120$
@@DragonOfTheMortalKombat Absolutely.
They want to maximize margins, and the name allows OEMs to scam customers by selling them a "3050" at a lower cost to the OEM. It's extremely scummy, but Nvidia has been doing this for a while.
Nah, it will sell better as 3050. Many buyers won't know the difference between the 2 variants.
TLDR buy the 6600 instead of this waste of sand
Just FYI, if you are trying to put one of these into a smaller Optiplex it won't fit, since it's a 2-slot card. The PCIe slot is real close to the power supply, which makes it unable to fit.
Gtx 1630 v2
When do we get the 45W RTX 3030 4GB on a 64-bit bus?
This excites me 😏
Hopefully it doubles the performance of the GT 1030 2GB 64-bit.
This is an rt3030
@@a15bionic59 no this is the RTX3010
The GP108 (GeForce GT 1030) was NVidia's last true low-end chip. It's APUs from then on.
dunno about 64 bit bus but atleast 3050 4gb 40W gpu already exists in laptop
The 3050 has a 64bit laptop variant called 2050.
6500 xt 8GB? when it was released
Any plans on Helldivers 2 performance benchmarks ?
would've been an amazing card around *10 YEARS AGO*
Should I wait for this one? I was about to buy an Arc A580 😅
Don't buy this bs get a rx 6600
Buy 2nd hand, save yourself
Buy Rx 6600Xt
@@inGameweTrusted Less power consumption when it's idle, I guess. Would the RX 6500 XT be good for gaming and productivity? I thought the Arc A580 would be good for futureproofing. Is the RX 6500 XT good?
@@walkitlikeitalkit.4692 RX 6600, NOT 6500 XT. RX 6600 uses like 100-120 watts and its much better
I'm actually kind of curious how this would work as an encoder card in a HTPC or PLEX/EMBY server.
You guys are great, I watch most of your videos. I currently have a Ryzen 5 5500 and GTX TITAN Xp; seems like it's bottlenecking it a little bit, but not too bad. Could you do a video on "Do you need 8 cores in 2024, or is 6 cores still enough?" Seems like cache is everything now, like the video you did, like back in the day with Pentium 4 vs Celeron
They have covered this, 6 cores is more than likely enough, cache is more important, you also have transistor count, clock frequency and instructions per clock that matter too, cores aren't everything.
@@technologicalelite8076 Well, next gen consoles are right around the corner and I would say that will retire 6 core processors
@@jamesdenson7616 Mid-Gen Refreshes are right around the corner, not next gen (Not trying to be snarky or anything, just wanted to make that correction as well). The CPU for the PS5 is 3.5Ghz, Xbox Series X is ~3.8Ghz (~3.6Ghz with SMT), both 8-Core, 16-Thread, Zen 2 Architecture, based on a 7nm process (At launch, PS5 later got switched to 6nm for better yields along with a die shrink) According to Wikipedia.
Now your CPU, the Ryzen 5 5500 (Non-X) has a base frequency of 3.6Ghz, with boosts up to 4.2Ghz (When gaming, it usually sits in between, in my experience atleast, but I do not have your CPU, results may vary), Zen 3 Architecture, but also on a 7nm process, and ofcourse 6-Core, 12-Threads.
For transistor count, the Ryzen 5500 has 10.7 billion, the Xbox Series X 15.3 billion, and the PS5 10.6 billion (the consoles share their transistor count with their GPU, so that's another thing to consider). Cache is also important like I said: Xbox Series X at 4MB, PS5 at 8MB, and your Ryzen 5500 at 16MB.
Cache is less of an issue on consoles because their entire memory pool is miles faster, GDDR6, compared to even DDR5. The whole point of larger, closer cache is to reduce accesses to slower system memory, which, once again, isn't an issue for consoles.
Now I can't find any performance comparisons of the Ryzen 5500 vs a console, or even much info on the CPUs in the latest consoles, but what we can use is a bit of logic.
All of the things I've listed can change performance drastically or minimally; it comes down to the combination, the use case, and how developers use it. We've seen some games love cache, some more cores, some clock speed, but usually with diminishing returns past a certain point.
The Xbox Series X and PS5 perform almost exactly the same, yet quite a few things differ despite their similar custom Zen 2 and RDNA hardware. I can't calculate all of it for you, but considering the 5500 has more cache, a higher clock speed, a newer architecture, and (usually) doesn't share resources like RAM, your CPU should perform around the same despite having two fewer cores.
I'm not saying "nobody needs more than 6 cores" or any of that BS, just that for most games, 6 cores will do and is all the game wants or needs. Is this *EVERY* case? Absolutely not. I play Cyberpunk with path tracing on my 7700X and 4080 Super, and ever since the 2.0 update it has been optimized to use 8 cores all at once, and oh boy does it: ~80% usage at least, across all of my threads (and the GPU stays above 95%). It's wild.
I do think the minimum standard or entry point should eventually become 8 cores, especially for multitasking or running other software alongside games, but that's probably not going to happen for a while. The people who consider 6-core CPUs likely won't spend enough on their system to use quality or performance settings that 6-core CPUs can't handle.
You know what this video really makes me wish for?
Some AIB or a Chinese manufacturer just BIOS-modding the RX 6600(M), power-limiting it to 70 W, and selling it as its own thing that doesn't require a PCIe power connector.
I feel like there's a market for it for sure.
It's a silly request in a way, given the 6600M is a 6600 XT chip, not a 6600 non-XT. So a 6600 non-XT at 70 W could hold higher clocks than a power-limited 6600 XT.
Or AMD doing it themselves. It would sell like hot cakes
@@GewelReal Yeah I've never seen AMD make any real effort in the sub 75W space
@@deanchur They made a 75 W workstation variant of the RX 7600 (Radeon Pro W7500) that's cut down so it runs off slot power, but I guess it hasn't been worth making a cheap non-professional consumer version of it.
@@GewelReal The expression is hotcakes. Hotcakes are pancakes.
Hot cakes wouldn't sell because the frosting would melt. That's why you let them cool before frosting.
People on Reddit defending this product and attacking HU like wth!!!!
I wonder how some APUs would measure up. In this low-TDP niche, you might as well go without a discrete GPU.
RTX? I wonder what game that card will actually run ray tracing in.
They did this before with the GTX 1630... This is the very bottom of their last-gen stack; there's no universe where this shouldn't be a 10/20/30-class card costing at most $100... but yay, price _and_ name inflation. Screw nGreedia.
Makes sense if it comes in a single slot low profile size.
Where'd you get the $310 for the 3050 8GB? That seems high.
Why are you omitting the 3060 from the graphs?
100-120usd would be good for it
I want to see idle power consumption and encoding power consumption, and the wattage with 2-4 monitors. This isn't a gamer card. This is a media card and should be treated as such.
It's too powerful to be a media card (compared to the horrible RX 6400; plus it isn't sold in a single-slot version).
Why don't you take a look at the Intel Arc A310, which is $100 and single-slot?
@@thetechrealist The RX 6400 can't encode. The 3050 basically has the full feature set and proven drivers from Nvidia.
Sure, you can go with Intel, but I'm sure not everyone can sleep 100% calmly with that driver support; at least I wouldn't want to right now. In 3 generations? Sure.
@@Skukkix23 Lol, Intel is too big to just give up on the graphics card market.
But if it's worth your peace of mind, I guess the 3050 is better.
Oh, & to halfway answer your question: the idle power draw should be around 10 W or less, given that the GTX 1050/Ti idles at 4 W (source: TechPowerUp).
@@thetechrealist What does it draw with 3 monitors? During video playback?
During encoding?
Are multiple encoding streams a thing (unofficially, anyway)?
Can I remove the fans if I just use it as a display card for 4 monitors?
No one cares if this thing runs Fortnite.
@@Skukkix23 No clue, but maybe you should buy it to test all that out & then return it later if you're unhappy.
System Power - Watts / Frame? Power Economy?
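The "power economy" metrics asked about here are simple ratios: cost per frame is price divided by average FPS, and watts per frame is system power divided by average FPS. A minimal sketch below, using made-up placeholder numbers (not measured values from the review):

```python
# Hypothetical "power economy" calculations for a GPU review.
# cost_per_frame and watts_per_frame are illustrative helpers;
# the price/FPS/wattage figures are placeholders, not real benchmark data.

def cost_per_frame(price_usd: float, avg_fps: float) -> float:
    """Dollars of purchase price per frame of average performance."""
    return price_usd / avg_fps

def watts_per_frame(system_watts: float, avg_fps: float) -> float:
    """Average system power draw divided by average FPS."""
    return system_watts / avg_fps

if __name__ == "__main__":
    # Placeholder numbers for an entry-level card.
    price, fps, watts = 180.0, 60.0, 200.0
    print(f"Cost per frame: ${cost_per_frame(price, fps):.2f}")    # $3.00
    print(f"Watts per frame: {watts_per_frame(watts, fps):.2f} W")  # 3.33 W
```

Lower is better for both metrics; a card that sips power but delivers few frames can still score poorly on watts per frame.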
Thank you for the review, Steve. I am interested in dropping one of these low-profile RTX 3050s into an old SFF case. It looks to be the fastest low-profile graphics card that doesn't need external power.
Is it faster than the RX 5600 XT 6GB?
no
Lol no. It's barely faster than a 1060 lol
Nvidia is literally going for a cash grab, targeting people who know little about graphics cards and computer peripherals and trying to scam them by selling an inferior, BOTTOM OF THE BARREL product with "more" memory. Quite an introduction these folks will have to Nvidia. A shame, as they have some good GPUs from the current generation that could be sold cheaper.
How does this compare to the RTX A2000 6GB I wonder. If I recall, that card was pretty close to a 3050 and has a 6GB version. Is this just a "gaming version" of that card?
There's a Palit one that doesn't need additional power; I think it's a good deal for entry-level 1080p high settings.
More for updating a business PC or a home theater build, but it is NOT a gaming GPU. To determine its place in history, compare it to its actual predecessor, the 1650 (non-Super), with the same power limit; that would show steady improvement across the 1050/1650/3050 over time. Comparing it by price to pandemic-era cards is sort of useless, and the Radeon card was limited by a TDP of just 53 W. As your prior video demonstrated, VRAM matters, so constantly slamming the 6400 for not competing with a 75 W card with half again the VRAM? Just say they aren't equivalent and leave it. Even the 6500 XT had twice that card's TDP. Oh, and you get big bonus points for NOT including crap like the 1030 or 710 that seemed to be pandemic best sellers.
You'd only buy one of the low-power cards for a very small system, or to upgrade a system with just an iGPU and a worthless power supply. Any discussion of value, then, probably needs to include upgrading the power supply for anything more powerful that fits in the case, which covers almost your entire list of cards. I paid almost a hundred US dollars to upgrade the pitiful 180 W power supply in a desktop HP was selling (it was, however, really, really cheap, and a storm blew out my previous system, along with the TV, the microwave, some light bulbs... I needed cheap!)
Steve gets riled up because he thinks "3050" is the name, when "3050 6GB" or "GV-N3050EAGLE OC-6GD" is the actual name of the product.
He did the same thing with the GT 1030 DDR4 and the 5700, where he cried that the 5700 is a 5700G without the graphics (when the G designation literally means... graphics).
Once he realizes he doesn't like the "name", he does whatever possible to make the product look as bad as possible.
"Outrage drives clicks and clicks drive revenue."
@@tim3172 He's right in that, if you look at the listing on Amazon or Newegg, the SKU or product number isn't what a non-savvy person will be aware of; but he also ought to be used to it by now. Somebody's grandma is going to get the wrong stocking stuffer. But expecting Nvidia to care? Steve will give himself a stroke.