Half of the world's supply of RX 6300s is in the hands of YouTubers 😂
Ah just saw your videos too!
Yay now both my tech boys have done an rx 6300 vid
ayyy
You two are way too freaking wholesome! Keep up the good work, Iceberg and Random, y'all both make my day better. Thanks dudes
This has maximum 'we have RDNA2 at home' vibes. If you can't afford a Ryzen 7640HS mini, this is the GPU for you🤣
Kind of a shame that they stopped producing this tier of card for consumers. The ~$60 cards that were available almost 10 years ago are still the same ones they sell today.
Yeah I remember a few of them. Would be good to see the ultra budget market return
Yeah, I'm simply looking for a display adapter that can play 4k video. That's all.
Does an Arc A380 count as a cheap display adapter? Lol. (I know, it's still about double the price.)
@@Gatorade69 iGPUs now cover that tier of consumers. In fact, CPUs without iGPU are becoming rare.
@@aleksazunjic9672 I think that class of graphics card could still be useful for a very old system with a inadequate iGPU. Something to provide better acceleration for general desktop compositing and such.
Now i'm waiting for the "Сan RX 6300 Handle Ray Tracing" video :)
Probably not my ryzen 7840u framework laptop APU beets it's performance and in quake 2 rtx is single digit framerate
😅 that's only something this guy here is mad enough to do 😂
At the very best you would be looking at still images.
Any card can handle ray tracing as long as you don’t care about frame rate😂
Raytraced powerpoint slides?
Honestly speaking, for a 32 Watt TDP this is kinda impressive. Nothing I would wanna daily drive at all, way too slow for today's gaming standards, but keep in mind that 10 years ago similar cards could consume as much as 250 Watts!!! And there's a reason these are put in OEM systems specifically: it's basically a modern-day GT 1030, meaning it's more of a "display adapter" meant for extra display outputs than a "gaming card", something a good chunk of the comment section doesn't quite seem to understand.
Actually no. This is the performance of a GTX 960, which was released almost 10 years ago and had a TDP of 120W
@@rafradeki Still use a 960. Great card.
@@rafradeki "as much as 250w"
@@rafradeki If I hadn't gotten a 3080 when they became available, I would be using this comment to justify not using my GTX 970 any more. GTX 960 performance all in that tiny little card? Time to upgrade.
@@rafradeki Really? Actually NO! The 960 was released in 2015, so it wasn't available 10 yrs ago, making 2013's GTX 700 & R9 200 GPUs the "new models" available 🤦♂
Buying new 10 yrs ago to get a similar level of performance, it would have been an R9 280X at 250W, or a GTX 770 at 230W / GTX 780 at 250W.
Now you can build a 6300 PC with an FX-6300 & RX 6300. :D
Never heard of that card before; if the GT 1010 were available somewhere, that would be a good battle..
I'd wager the RX 6300 beats the GT 1030 hands down, even the "fast" version.
I had a 760 PC with a GTX 760 and an Athlon II 760K.
Now I have a 3090 and a 5900X. At least both have a 9 in them, I guess.
@@Booruvcheek that's a little obvious, the 1030 is old and has slow memory
You can build a 16 PC with a Ryzen 5 1600, 16GB of RAM and a 1660 Super
Would be a nice balanced budget build too
It would be bad, because most motherboards for the FX-6300 only have PCIe 2.0.
I love unknown graphics cards like these, or lesser-known ones like the 5300, 5300 XT, 5500 OEM and 5600 OEM. Basically I love OEM cards. One I still have never seen is an RX 640
Had an RX 640, never could get it to work. Heard they were fussy with older boards; I was trying to run it in a 6th gen i5 pre-built. I ended up throwing it in as a sweetener when I was selling another card
I owned an RX 560 XT, which was only sold in China
Always nice to see a video that makes me feel good about my RX 6400. :P [sadly, my only local used options at the time were more expensive, and out of my budget range, and the only other "new" card I could afford was an RX 550... Oh, and your videos on the RX 6400 are what convinced me that it'd be worth it since it was replacing an ancient HD 6770, so thanks for that :)]
The way I see it, not everyone can afford STOOPID EXPENSIVE GPUs, so if it was the best in your budget & lets you have fun playing games you like, it is an AWESOME! GPU 🙃👎
There is also the issue with getting a high-tier card from a few gens ago: if you have a budget PC, your PSU might not be able to handle it. I just ran into that when building a quick 3600 + 1080 PC for VR; the PSU was a cheap one without the PCIe cables needed.
I also have a friend who "didn't want to spend much", but after switching the GPU, had to get a new PC because the bottleneck was too much.
Just try to use your weak but decent GPU to its fullest and save up for a new build in the future; don't upgrade piece by piece for some time. 😊
Nothing wrong with your RX6400, buddy. It is a modest card but IMO a solid product that was way too expensive on release. It should have been released earlier, too. That card gets a ton of shit everywhere but if you got it for a good price and you like it, then who can judge? Screw haters and enjoy!
25 w... Really cool for its performance
You should see what the Intel Celeron N100 does at 6 watts. It plays GTA 5 at 800p, normal settings, at 38fps avg, peaking at 47fps, which is insane for an iGPU and CPU combined at 6 watts (like an ARM chip). Still, it's fun what this card pushes out at 25W. Sadly they didn't give it 4GB, which isn't that expensive, but I think there's a manufacturer that would maybe slap 4GB on this... like those old GT 730s with 1GB or 2GB of DDR3 or even GDDR5. We'll see
@@ff7soldierff7 There are some people modding the 5600 XT with 24 gigs or 32 on the steam deck so giving this guy 4 gigs should be possible.
@@ff7soldierff7 Those Celerons would be amazing in a desktop socket. If they could price it at like $20, it'd be a great ultra-budget system when paired with a GT 710. You could give it 8GB RAM and the GPU would have its own VRAM, so it'd all do fine.
It's a system on a chip design though, so it doesn't actually need a motherboard. And it probably has no free PCIe lanes.
The 6W limit is artificial. Once you disable it in the BIOS, they pull up to 12W and become unironically good CPUs. The limit exists because the "cooling" on the devices you find them in is a sheet of copper foil.
@@ff7soldierff7 Now pair the 6300 XT with that chip.
your videos are my meditation. thank you
Glad you like them!
I know it's nearly 10 years old, but I bought an XFX Black Edition RX 470 factory sealed for the same price as this RX 6300 at $60.
That's a STEAL!! You can also flash most of those to an RX 570 vBIOS for a good boost in performance... awesome card
5:08 nice drivin
It's GTA...
"Display Adaptors" can game, if you temper your expectations and choose the right titles. And while it might not be as performant as the latest gen of APUs, it's a lot cheaper than a complete internal overhaul of rig.
It's great that you cover every gpu you can. Please keep doing what you're doing!👍
I gamed on an r7 240 for YEARS. It looked exactly like this. I was so happy playing Minecraft at 30 fps.
Would be like upgrading to an R9 285... 🤔
Wild to see a card that looks like this that's more powerful than my R9 280 back in the day.
Well said, my 280x is sweating rn.
Love your videos my dude! Keep it up
Thanks!
You can buy this and a brand new 4500 CPU for the same price as a 5600G. I'd love to see a comparison video of that
The 4500 supports only PCIe 3.0. The 5600G could have an advantage in large-texture games, although if you keep things under 2 GB the 6300 wins.
@@aleksazunjic9672 The 5600G is also PCIe 3.0, which is why I think the 4500 and 6300 would be better. Plus, with a low-end CPU like this, there's a high chance the motherboard you're using is Gen 3 anyway
@@aleksazunjic9672 The 5600G is also PCIe 3.0. The 5600G/5700G and 5500 are all PCIe 3.0
It is also limited to a 32-bit memory bus. I'd be interested to see how badly it does in an older system with a Gen 3 slot. I tried to find one on eBay and only found one for sale. I imagine they won't be easy to find until businesses start liquidating newer systems.
Wow, I thought a 64-bit memory bus was bad enough, but 32-bit...?!
@@TheSpotify95 That's also almost the lowest you can get with 2GB of VRAM. The limit is 3.2GB or something like that (if it works the same as system RAM allocation in the 32-bit Windows XP era).
@@TheSpotify95 Yup, just imagine that the RTX 3050 6GB has a 96-bit bus .-.
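For a rough sense of what a 32-bit bus means in practice, here's a quick back-of-envelope sketch, assuming the commonly listed 16 Gbps GDDR6 for this card (treat the exact data rate as an assumption worth double-checking):

```latex
\[
\text{Bandwidth} \approx \frac{32\ \text{bits}}{8\ \text{bits/byte}} \times 16\ \text{GT/s}
                 = 4\ \text{bytes} \times 16\ \text{GT/s}
                 = 64\ \text{GB/s}
\]
```

For comparison, a 96-bit bus at the same data rate would come out to roughly 192 GB/s by the same arithmetic.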
It is one of THE graphics cards of all time. That's for sure. If it had Windows 7 or XP drivers, I would have loved to see it inside an old system just for the hell of it. I mean you can technically still pair this with a Q6600 or something like that on Win 10.
No. Cards like this only work OK on a modern PCIe 4 interface. All you get on old PCIe 2 is stutters and freezes.
Just realized this thing has the same compute unit count as the 680M iGPU
And they said the 4060 was bad value
😁
It is, but anything that's a cut down RX 6500XT should be on your "do not purchase" list...
@@MLWJ1993 Pretty sure the 6500XT only won by a small margin compared to the older 5500XT, and lost in every case when in PCIe 3.0 mode.
@@TheSpotify95 The 6500XT is still a bad GPU, no sugarcoating that. However just imagine what a cut down bad GPU is like. 😂
This GPU was not sold separately; it's only available on the used market
I might buy this for my old eGPU laptop setup. Wouldn't lose out on much performance when I use it for PS2 emulation (currently have a GT 730)
I like these videos where you test low-end budget friendly cards. However, I would like to see you do more retro gaming stuff with them. Like games from 2000-2010, perhaps even older. Just my thoughts.
Hi, about Baldur's Gate and blurry graphics: it's counterintuitive, but if you use FSR and it's not a pretty high resolution… disable anti-aliasing. It'll be pixelated, but things will look better. Side note, this knowledge is from playing on the Steam Deck, and importantly things might have changed; back then there was just FSR1 in the game.
Including active cooling is pretty optimistic IMO
It's actually kinda impressive that such an efficient and small card can do what it can.
Alright, you finally got one! I have one in a backup PC and it does its job just fine.
I was awake till 4:30 am working yesterday, and I kept playing your videos while doing so because I adore your voice bro 😂💙
🏳️🌈
@@ZA-7 And what's that supposed to mean?
Great video! Would love to see this compared to Vega 8 graphics (2200g/2400g, etc) since they both have 2gb of vram. 2200g has been my daily driver with my NOS CRT monitor for years now. Quiet and cool while the low power offsets the power draw of the CRT.
My old R5 3500U in my laptop had Vega 8; GTA only got 30-40 and Red Dead got about 6-10, so this is much better
This card is probably 3 times more powerful than the card I was stuck with during the GPU shortage (an HD 7750). Thankfully my PC has had a glow-up since then and now rocks an RTX 3060 Ti
0:57 This giant cooler caught me off guard
Cyberpunk runs so well compared to Starfield, which looks worse in general + runs at a lower resolution. Unoptimized mess.
I haven't watched your videos in a while, but I love your content. I love how this video started: "you've heard of these other graphics cards, this one's just like it but worse" 😂
That's really impressive given the size and power usage of dat thing, it's so smol, maybe it can even run older games in 4K
32bit bus lol, it is not impressive
All this bang for 24watts. Nice!
I was most surprised by the FH5 test, I never would have thought it would run with only 2GB.
It's actually really decent for what it is: it can play esports titles, uses so little power, and has a little fan that's kinda cool. You should compare it to a 780M and look at both power usage of CPU+GPU and performance. Anyway, I have a GPU that looks just like that with a different cooler because it's from Asus, but it's so bad that I wasn't expecting this one to actually play modern games so nicely
It would be great if you started adding comparisons with other similar GPUs in those benchmarks
Looks like older cards are going to be more compelling for builders or OEM buy-and-upgraders who are stuck with PCIe 3.0, especially people who prefer indie games over triple-A ones.
Can you add Batman: Arkham City or Origins for testing, as one of the best-looking UE3 games? It would show how GPUs work in older but great games.
That setup would likely outperform my Dell Inspiron "computer-in-a-box" pre-built with a Core 2 Quad Q9400. I swapped out the PSU for a Thermaltake "Smart" 430-watter and added an RX 470 4GB and an SSD, running Windows 10. Maybe it's finally time to upgrade? . . . hmm . . . this just might be compelling enough to persuade me.
Power consumption seems a bit high. I dunno.
You would think in such a tiny, lightweight GPU that's also only x4 that they would size the pins accordingly. It looks funny to see the x16 length but only pinned as x4. It feels like a missed opportunity for an easy way to save material. It doesn't seem like a lot, but at scale, if for every ten boards they made they could make another one out of what would otherwise have been waste, that's a decent amount of money saved in materials by the end of the run.
I don't know if that's how it works with PCBs and saving the fiberglass and other materials and stuff, but you showcased that 4060 a while back that was only pinned to x8, and that was a big card; they had to have done that for a reason.
My guess would be for literal physical stability... x16 length means more contact with longer slots even if you're not using that as electrical contact. A x4 connector has physically less material to keep it in place when inserted.
When you consider that these are mostly intended for business use to drive multi-monitors, the design choice makes sense - it's going in some box that will get knocked, kicked, bumped etc, especially as non-technical users stick a bunch of DisplayPort cables into them. The cost of a small amount of extra PCB (which isn't really "extra" anyway as a shorter slot just means more cut-off waste) is nothing compared to the cost incurred by companies calling IT support for display issues.
So the manufacturers just price up a bit to cover their costs there and it's a non-issue.
Whilst these aren't Radeon Pro cards (and therefore aren't in the high reliability guarantee product lines) I would guess a fair few out there in the wild are using the AMD Software: Pro Edition for the more stable drivers given their intended use case.
Not bad results for a video adapter. Clearly an improvement over an HD x450 or GT x10.
Hey, use it for a compact retro build: slap on MS-DOS, XP, Win 7 and decent storage, fill it with the classics and you've got a nice project
I'd like to see a comparison between this and the RX 6400. The 6400 gets a lot of hate, but there aren't many single-slot cards with no power connectors that you can fit in a pre-built SFF PC. If you keep your expectations in check and aren't playing the latest games at the highest settings, the 6400 isn't terrible
Can confirm! I'm not really fussed about playing games at the highest possible settings - as long as it's comfortable to play I don't mind, and I've had a pretty great time with it with that in mind. Stuff like FMF being added over time really added to its overall value for me too!
Sips power too… 50w max!
The big advantages of buying a newer weak GPU are lower power usage AND better driver support. But IcebergTech said he had to use HP's old drivers... which sucks.
This thing does as well as a GTX 970 in Cyberpunk!! That's kinda crazy to me; knowing how far we've come, this thing is so low power and tiny, like wow
I know we all like those new graphics bricks, but it's good to see a card like this once in a while. Reminds me of when gaming didn't cost as much
Would love a video explaining these terms: TAA, FSR, FXAA, AA, presets etc... I'm not very well versed in graphics options 🥲🙏
Expected slideshows, and I think it runs better than my old GTX 960; seems like a good backup card at least
your videos are so oddly satisfying to me
Am I wrong to be kinda blown away by how it performs? It will 100% work in any OptiPlex out there with no need to upgrade the PSU. A bottom-tier budget computer with modest gaming capabilities for as little money as possible. It would all depend on how cheaply you can score a used Dell.
Surprising amount of performance for something with just 2GB of VRAM. Definitely more of a display adapter but could play retro games or slightly older titles without issue. GTA V, Borderlands 2, etc.
For 25W it's really efficient.
Now we have a new GT 1030 for the next 10 years
If you want gaming performance, you're much better off looking for something older on the used market, unless the power consumption and size is more important than performance (or performance per dollar).
That's actually kinda impressive, that's nice performance for just 25W of power
I daily drove both a GT 1030 and the RX 6400 (gonna get a new GPU in a bit). Both are bad; never touch them unless you need extra displays for your Excel tabs.
All said, it's fairly impressive for 24W
Honestly wouldn't mind picking up one of these for a chiplet Zen 3 home server, like a 5700X or 5900E. On some boards it'll refuse to boot without some form of graphics, and it saves having to load the heavier, proprietary Nvidia drivers for a GT 710 when all you need is an X server once in a blue moon.
You can just use nouveau for that purpose.
I love it ❤, I have great memories of my old "HD 7850" (16 CU) with 1 GB of VRAM, and this "RX 6300" (16 CU) has 2 GB of VRAM (and no need for a big PSU). It's a good card, to me at least 😄
For Cyberpunk you should have tested in the Dogtown area. I can run high settings all day in the city, but in Dogtown it drops to the low 20s
This card is actually quite impressive. It's able to play games like GTA V, Cyberpunk, RDR2 and BG3, and these games offer tons of hours of entertainment. Slap it in a mini-PC build and it'll be a great emulation device that can also play some triple-A games. Or build a system around a 5600G or the latest APUs so there are more options for you. Also, no one cares about Starfield since it's a soulless game anyway.
Redditors and social media are hating on this card, but the average person doesn't even use/engage on those platforms, so maybe there is a market for this card.
This is the tier of GPU I'd have had to settle for when I was a teenager, and back then, with this kind of performance, I wouldn't have known any better and would have thought it was fine. Unreal 1 at 18 fps boys, yea! Those were the days, lol.
Not a bad card, probably good for video editing and photo editing. Hey, I did well with a GTX 460 with 768mb of vram for 10 years up until 3 years ago.
Love your videos brother
The thing with most of these cheap cards is the people buying them are more likely to not care and just play the game at 720p low
This video has indirectly shown me that I should be able to run BG3 on my RX6400.
I always feel like the only reason these are financially viable is store-brand prebuilts. Put "AMD RX 6000 series GPU!" in the marketing and sell it to clueless parents who aren't good with tech. (By store brand I mean non-Dell/HP/Asus etc., more like Costco/Walmart.)
Honestly, it would be good for CSM machines that can do light gaming; there are a few more options for those kinds of computers, however. The RX 6300 is a little faster than a GT 1030, but it can probably play Skyrim all the same.
I'm surprised it can run Starfield at all. Double digits even, hell yeah haha...
I need that for a Dell OptiPlex SFF I just got. Gotta update the BIOS, then tape mod a Q6600 for it. Add 8GB of RAM and have a nice... hell, I dunno yet, lol! 😂
Ah, the tape mod, those were the days 😁
Nice experiment, but PCIe 2.0 will probably start to be a limiting factor.
I did put an 8400 GS in it, but it stopped giving a signal the next day when I booted it up. Popped it out and it's working on onboard video. I believe its system files are damaged, oh, and the HDD is dying too.
Soooooo slow to open anything up..
A lot better than I expected.
Hey, if those Starfield frame times were framerates, that'd be pretty good!
I think it's better if you use Act 3 as a benchmark in BG3. That entire section is an FPS killer, and it would make a much more 'real' performance benchmark to see how a GPU handles crowded areas.
I'm surprised how well this 6300 worked in all of those games. I also wonder how much power consumption would drop if this card were undervolted
They would be a good upgrade for SFF prebuilts with weak PSUs. But all the ones I've seen on eBay come with a high-profile bracket only.
I'm surprised to see it's 2.5 times more powerful than my old HD7770
And that it can actually play modern games.
I remember back then, if you didn't have enough VRAM the games wouldn't even open. I had a Pentium 4 with 64MB of integrated memory, then I had to add a 512MB GeForce 6200 to play things like Half-Life 2: Episode 2, then I had to get a whole new PC with the 7770 to play Resident Evil 5, and now I've gotten a whole new PC again to play Resident Evil 4 lol
1:00 Well it's certainly a long way from an HD 6450, which is what it kind of looks like, but according to TechPowerUp, the RX 6300 is on a par with an R9 280X! That's some power draw difference!
Correct me if I'm wrong, but I think Vulkan wasn't tested with it
Very interesting video. Had no idea these existed!
8:30 - textures at ultra in RDR2, 1080p? I doubt it really worked. IIRC ultra textures could only be enabled on 4 GB VRAM GPUs, and even then it was close to the VRAM limit.
Wait, but what's the point? The Windows 7 driver clearly states 6400 and 6500 XT only. I doubt the 6300 will work on that OS.
I'd be curious to see how well this card could handle multiple simultaneous H264 encoding & decoding tasks, as it would fit perfectly in a 1U rackmount server we have at the place where I work, running our security cameras, and would be a good alternative to software encoding on the CPU.
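For anyone wanting to try that themselves, here's a minimal sketch of how you might stress-test simultaneous hardware H.264 transcodes on Linux via VAAPI. It assumes an ffmpeg build with VAAPI support, the usual /dev/dri/renderD128 render node, and a hypothetical sample clip named camera_sample.mp4 - none of that comes from the video, so adjust it to your own setup:

```python
# Rough sketch (not from the video): launch several parallel H.264 transcodes
# through VAAPI and time them, to gauge how many camera streams the card's
# encoder block can keep up with. Assumes Linux + ffmpeg built with VAAPI.
import subprocess
import time

STREAMS = 4                     # how many simultaneous transcodes to attempt
SAMPLE = "camera_sample.mp4"    # hypothetical test clip; use your own footage

CMD = [
    "ffmpeg", "-y", "-loglevel", "error",
    "-hwaccel", "vaapi",
    "-hwaccel_device", "/dev/dri/renderD128",  # typical render node; may differ
    "-hwaccel_output_format", "vaapi",
    "-i", SAMPLE,
    "-c:v", "h264_vaapi", "-b:v", "4M",        # hardware H.264 encode at 4 Mbps
    "-f", "null", "-",                         # discard output; only throughput matters
]

start = time.time()
procs = [subprocess.Popen(CMD) for _ in range(STREAMS)]
codes = [p.wait() for p in procs]
print(f"{STREAMS} parallel transcodes finished in {time.time() - start:.1f}s, exit codes: {codes}")
```

If all the streams finish faster than real time, the card's encoder should be able to take that camera load off the CPU.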
It's all to do with expectations. The weird thing is that I've been gaming for a very long time. I remember a time when anything above 2-3FPS was considered acceptable and polygons weren't filled with textures, or even were not filled at all and were just wireframes. In slower-paced games 20-30fps seems perfectly playable to me, so long as the game engine doesn't slow down (as a few do) to "keep up" with the graphics. Also most games look pretty good to me even on low/medium settings and I usually prefer to drop a detail level than to use upscaling. As I like to say "I have better things to spend my money on than slightly more detailed backgrounds".
Well, that was true until I started to muck about with LLMs and generative AI. Oh, lord. THEN you need the 24 gigs. :-) It would be nice if a quick image could be generated using something like Stable Diffusion via Automatic1111 at the end of these reviews - just to add something other than gaming. Just a set series of images with a known seed number, so it's always consistent.
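The fixed-seed idea is easy to script with the diffusers library; here's a minimal sketch, with the model ID, prompt and seed all placeholders rather than anything from the video (and note that a 2GB card would need heavy offloading, plus a ROCm or DirectML PyTorch build on AMD hardware):

```python
# Minimal sketch of a repeatable "benchmark image": a fixed seed means every
# review renders the same picture, so results are comparable across GPUs.
# Model ID, prompt and seed are placeholders, not anything from the video.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",   # example SD 1.x checkpoint
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")                  # ROCm builds of PyTorch also expose "cuda"

# The generator's seed is what keeps the output identical run to run.
generator = torch.Generator(device="cuda").manual_seed(1234)

image = pipe(
    "a tiny low-profile graphics card on a workbench",  # placeholder prompt
    num_inference_steps=25,
    generator=generator,
).images[0]
image.save("benchmark_seed_1234.png")
```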
idk man, for 60 bucks it seems pretty okay and you get the benefit of driver support^^
I just got an RX 7900 GRE. I think I’ll stick with that for now. 😂
Compared to the RX 550 this video card seems good as hell; on the Brazilian market this thing would sell a lot. These types of video cards that don't require a good PSU are the ones that sell the most here. This thing is better than a GTX 1050
Is it better than the 780M that is on the newer G series AMD processors?
Hell no. I've got the 8700G. Try playing Alan Wake II on that 6300. The 8000G series can assign as much "VRAM" as you have RAM (minus what the OS needs in the background). If you get 32GB of DDR5 you can play modern AAA games at 1080p with the 8000G series (not the 8500 though)
Nope
Hey this card seems like a pretty decent choice if you just use your pc for working, but also want to do some light gaming here and there. 30fps is honestly fine tbh.
Better than onboard. Great desktop/mild gaming card.
This is basically a 680M iGPU put onto a PCIe board with some memory modules xD
P.S. Funny to see the small square cut in the heatspreader to access the BIOS chip
That’s one hell of a heatsink for an i3 😂
Wonder how much performance one would gain from overclocking this bad boi, because it runs extremely cool!
Great, just what we needed. Newer, pricier, and more power hungry display adapters.
Not a bad performer honestly. The fact you can play GTA V and Fortnite comfortably on this is ridiculous.
So Steve, how many hours of experience do you have driving around Los Santos?
Steve: "Yes" 5:26
Why not Baldur's Gate 3 in Vulkan mode?
Gaming at 25W! That deserves more praise.
This little guy's performance is amazing. I thought it would be like 10fps max.