I need glasses to see. However watching cyberpunk here, I could take off my glasses and see no difference!
Same 😂
Top comment 😂
@@RandomGaminginHD RandomGaming with glasses confirmed 🤯😂
As someone who is blind, I confirm
I think I can see the game better with my glasses off, that's how blurry it is.
A friend of mine used to play GTA 5 Online at 480p windowed on the lowest of lowest settings to achieve 30+ FPS. He was running an i7 3770 with integrated graphics. Don't underestimate what budget gamers will do to play higher-end games. They'll certainly go beyond what you'd call "playable".
I sold my used GTX 1660 to this friend for a pretty low price when I upgraded
Edit: HOLY shit this blew up...
As long as a game launches, it's playable for low-end gamers xD I'm currently in that segment with my 750 Ti. Sadly games are moving to DX12 and I can't play recent titles. Waiting for some more price drops to finally get a 3070 or equivalent so I'm good for the next 5-6 years.
That is literally what this guy's whole YouTube channel is about. I think the highest-end thing I've seen him benchmark is a 3050. He would happily recommend integrated graphics; he loves his Athlon's integrated graphics.
@@AdamantMindset maaaan I was on a 750 Ti for 6 years (2014-2020) and got a whole new rig with a 5700 XT, and today I just bought a 6800 XT
The thing I miss most after getting a better system is the amazing feeling of running a game at near 60 FPS with the settings you configured yourself; it's very rewarding.
Oh yes, I can confirm that. A friend of mine played (~2019) fairly recent games with an HD 4770 (on an old af FX CPU). Even Rocket League looked atrocious. I couldn't bear watching him, but he refused to let me buy him something better. So when his birthday came around, his friends and I bought him a brand new rig with a 2400G, which literally quadrupled his FPS despite being integrated graphics. (And it gave me the chance to overspend on the budget without him knowing.)
We need more cards like this.
Who wouldn't mind a tiny modern card that is basically a sub-75W mobile 3050 Ti/3060? Good enough for all tasks, with a bit of gaming oomph.
Perfect for a tiny PC that is mostly used for graphically undemanding work, but is sometimes used to game a bit. Yes, the new AMD 7000 chips have decent graphics, but it's basically either 6-7 y/o games or eSports at 1080p/30fps.
I'd really like a card capable of 1080p mid-high in Cyberpunk that is only 2 slots, one fan and doesn't require additional power. Basically, a laptop 3060.
Ah, I miss when we had loads of tiny cards on the market. Palit make some cool tiny RTX cards though
2 slots is 2 thicc. ONE slot or bust. Half height, too. Some of us have SFF OptiPlexes. The RX 6400 or T1000 come closest to these requirements, but are more in line with a GTX 1650 in terms of performance. And the T1000, though its feature set is awesome for its size, is too damn expensive. If only AMD would include the damn VCE with the low end of the RX 7000 line (RX 7400?) instead of cutting it out like they did with...
@@RandomGaminginHD Yeston as well. Their design for RTX 3060 / RX 6500 is really nice.
There is the GT 1030, a card 10x better than this.
I once fell victim to a Dell OEM R7 350X low-profile card when shopping for my bedroom TV video-playback PC. It was worse than the DDR3 version of the GT 730, even just for video playback, because the drivers were a mess.
@@hyperturbotechnomike AMD's drivers are pretty good these days
The first GPU I had was the 2GB GDDR5 version from PowerColor. It was a hand-me-down from my brother, and it was all I had until I recently upgraded to an RX 6500 XT. Needless to say, the upgrade was worth it for me. But that old ass card holds a special place in my heart for fulfilling my gaming needs back when I was young, even if only barely.
For me, it was a GTX 460 from Galaxy
I used an R9 270X 4GB until last year, great card :D
My first ever gpu was a radeon 9600pro advantage
@@vasya_cat I had one of those, it had a whopping 512mb ram :D
@@vasya_cat i got a 9600 XT, flex xD
I have Dell variants of this card (the low-profile, AMD reference ones) based on DDR3 memory, and let me tell you a nice little secret: the DDR3 ones are almost always capped at 65W, and almost always have turbo as well; idk about the GDDR5 ones.
Using MSI Afterburner you can actually bypass the power limit up to 75W and bypass the core clock/memory clock limits, and almost all cards overclock wonderfully well.
On my dual R7 250s I managed to make these cards run at 1250MHz core (when turbo-ing; at base clocks it sits around 1175MHz) and got the memory to 1300MHz with no hassle, for around a 33% performance uplift on each card.
On their own they are pretty weak, but running two in games that actually support CrossFire, they are around 45-55% of the performance of a 1650. In fact, the two in CrossFireX ran Crysis on high at 1080p without dropping below 60 FPS anywhere in the entire game, so for retro gaming they are pretty solid if you don't mind hacking away at them to get the performance you want.
I did something similar but with an R7 240; these things OC really well. Mine did 1100MHz memory (800MHz stock) and 1050MHz core (730MHz stock). When running it in Dual Graphics mode with an R7 iGPU (also OC'd to 910MHz, because that was the maximum allowed), some games ran really well considering it was running on shitty hardware lol. Some games had issues with frame timing though, due to the difference between the iGPU and the R7 240. Fun times :)
@@nachogsr5085 I also have an A10 chip, but I decided to go with dual R7 250s due to my slow memory kit (and I couldn't afford to replace it; it was cheaper to grab two graphics cards lol), as it bottlenecked the graphics card really hard when I used Dual Graphics with that memory kit, although it should be fine on A10 systems with DDR3-2133 or higher.
Ditching the iGPU and running on two identical cards did increase the CPU performance a bit too.
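For a rough sense of where that ~33% figure could come from, here's a quick back-of-the-envelope sketch. The stock clocks below are assumptions (OEM DDR3 variants vary); only the overclocked values come from the comment above.

```python
# Back-of-the-envelope scaling for the R7 250 overclock described above.
# Stock clocks are assumptions (OEM DDR3 variants differ); the overclocked
# values are the ones quoted in the comment. Real gains depend on whether a
# game is core- or bandwidth-limited, so treat these as rough upper bounds.

stock_core_mhz, oc_core_mhz = 1050, 1250   # assumed stock boost vs quoted OC
stock_mem_mhz, oc_mem_mhz = 900, 1300      # assumed stock DDR3 clock vs quoted OC

core_gain = oc_core_mhz / stock_core_mhz - 1
mem_gain = oc_mem_mhz / stock_mem_mhz - 1

print(f"core clock gain:   {core_gain:.0%}")   # ~19%
print(f"memory clock gain: {mem_gain:.0%}")    # ~44%
# A bandwidth-starved DDR3 card tends to scale closer to the memory gain than
# the core gain, which is roughly in line with the ~33% uplift quoted above.
```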
I used mine overclocked to 1250MHz for 5 years and it was a tiny beast. Never saw a card that benefited so much from a few more MHz; it was great for playing KF2, Overwatch and GTA V at 1440x900 (with the 2GB GDDR5).
I remember this card. This was a card that I picked up out of desperation after my laptop shit out on me back in 2014. Took it back to Best Buy the next day and bought a GTX 580 on Craigslist for like $80, as well as a PSU for another $30 lol. This card was not good. I just wanted to play SC2 and Dragon Age: Inquisition. It did terribly with both.
Was it GDDR5? And paired with what CPU? I've seen some tests with the RX 580 paired with a Xeon E5-1620 v2 and it really gave some great performance. But I can't say if the video creator was in fact being honest.
As is usually the case, it wasn't anything wrong with the card, just the user's faulty expectations of what it should do. People should spend more time checking what they actually need before rushing out and buying something.
My first GPU was an R7 240... in 2017... Unsurprisingly it didn't perform too well. Then I jumped to a 1050 Ti which was awesome, then a 3070 which is my third and current GPU. Both of my upgrades have been *big* jumps from my previous card. I hope my 3070 lasts me a long while though... I dread buying another GPU due to prices
I went from an R9 270 to a 1050 Ti to an RX 590 to an RTX 3070, and I'm seeing that the 3070 is aging quite well. The 1050 Ti was already struggling at 1080p one year after launch, while the RTX 3070 still tanks everything like a champ at 1440p even two years after launch.
@@HanCurunyr tf it’s been two years!!
@@nazmulfahad3044 Yep, Ampere was announced Sept 1st, 2020 and the 3090, 3080 and 3070 launched Sept 17th, 2020
I also had an R7 240 and jumped to a 1050Ti lol
GPU prices should be at their lowest right now since the crypto crash and next gen cards coming out. I ended up getting an RX 6600 XT because the ARC gpus weren't coming out in time.
The 3070 will carry you through the PS5 age
My first PC gpu was an msi 7770 and I loved it so much, it was a huge upgrade from my laptop's HD 4650.
I used one of those for a while it was great :)
I also had a laptop with a 5650M. It was absolute shit, unable to run Bad Company 2 without stuttering lol. Mobile graphics have come a long way since then.
My first new GPU, which I bought in 2012, was a Sapphire 7770 Vapor-X. That was an awesome GPU :)
The first PC I ever put a GPU in had an HD 2600, back when it was still ATI. Then when I graduated college I built my first PC with dual 6850s. The computer after that had an R9 290X. Love me some AMD, even if I'm unfortunately (my wallet hurts) on team green this build around.
@@vespermoirai975 Dual 6850s? Damn, the micro-stuttering must have been rough lol
Something to consider with older cards: a CRT monitor, set to 480p.
It looks great in person, with plenty of refresh rates to play with and tiny visible scanlines; the only drawback is that text is slightly less readable. But it's not super pixelly like it would be on an LCD, owing to CRT displays not having a fixed resolution.
I was desperate back then to replace my XFX 5450 and was considering getting this as an upgrade. Thankfully my mate sold me his 7790 instead for around a hundred bucks at the time.
I actually used this exact GPU in my first gaming PC. It was totally fine back in the day allowing me to have a ~40fps 1080p experience in GTA 5 online and a very playable minecraft experience, etc. paired with an AMD Phenom II X4 945 and 4GB RAM.
Whoa. Really? I have a Phenom II X4 955 (95W) and 2x2GB DDR2-800. Although I have the R7 250 2GB GDDR3 OEM model. Guessing you may have been running DDR3?
Hey, quick thought: for these older cards you test, can you also include older titles? For example: Fate (the first one in the series, by WildTangent), Morrowind, Oblivion, EverQuest 1 or 2, CS: Source, Banished, Far Cry 3, Resident Evil 4?
I used (and still own) an XFX 2GB GDDR5 and it ran games great at 1440x900. Games like Killing Floor 2, OG Skyrim (the Special Edition destroys the card lmao), Saints Row 4, Overwatch on low and GTA 5 would all run pretty well with the clock set to 1250MHz. It's a good card for 720p gaming; wish there were more of these low-power, low-profile cards nowadays.
Always feels good revisiting those old budget gpus to remember what we went through as low-spec gamers ahah
I used a non-gaming laptop before my PC, so just UHD Graphics. It was bad, about as strong as an OG Xbox, maybe weaker.
Hi! I'm an ex-Intel HD user myself; I used to play on an old Celeron with only 2 threads (at 1.7 GHz) and an old 2GB stick of unknown DDR3 memory, sooo yeah... I understand your pain.. x)
@@julian-vanilla8379 But that isn't a problem with the Intel GPU, it's a problem with the person who bought or uses the laptop for something it was never designed to do. The original Xbox played games very well, despite its relatively modest specs, because the games were written with its capabilities in mind. You can't buy one now and say it was crap just because it can't play modern games in 4K resolution.
Hey Steve :)
Do you have a rough estimate on when the video with all of the benchmarks from the GPUs from the box is coming?
Or will it not be coming at all? o:
i ate the gpus
I'm still using a Gigabyte R7 240 2GB DDR3 in my main gaming PC.
@@coochiedingler8868 lmao
I got an R7 250X 2GB GDDR5 for free last year; the difference against the non-X variant is really big :D
The only use I gave one of these was AMD Dual Graphics with my Kaveri 7850 on an HTPC. It was not half bad back in 2014.
Would be interesting to see if performance would be higher with modded drivers like NimeZ (Amernime); not only do they add support for older cards that AMD stopped supporting a long time ago, but they also added various performance tweaks for DX9/DX11/OpenGL.
Interesting. I might have to check that out. I have the OEM R7 250.
Wow, wow.
You aint ALLOWED to say something bad about the r7 250 for the rest of my lifetime.
I had one and i loved it.
I'll never forget how incredibly fun it was when I got my first custom PC, going from a laptop Radeon HD 7670G to a GTX 950.
I don't even need to budget game anymore and run some reasonable specs, but I love these vids and your content - Thanks for doing all this hard work for us all.
Used to have a Sapphire R7 250X 1GB GDDR5 in my PC back in 2019. I bought it that August for 19 euros + 5.50 shipping, so a total of 24.50e for that GPU. It got me a little into playing some games with an A6 7400K at 4.0GHz, and then I "upgraded" to an Athlon X4 840; it gave me a lot of memories... Completely remade the PC from January 2021 to June 2021 with an RX 580 2048SP 4GB, and now I'm rocking an RX 6700 10GB on the same build as the RX 580. This little Xeon E3 1240 V5 is giving me a lot of value for the 20e I paid for it back in March 2021🤣
I wasn't aware that AMD also made cards with two different memory configurations. There were a couple of Nvidia cards I had a while back that came with either a larger but slower memory configuration or a smaller but faster one, and for those two cards I purposely searched for the smaller but much faster memory models. I think the two cards I'm referring to are the Nvidia GT 730 and GT 1030. Both have low-profile and/or single-slot models. The performance gap was literally the smaller, faster-memory model being 2x faster than the bigger, slower-memory model. I think it was 2GB GDDR3 or DDR4 (yes, DDR4) vs 1GB GDDR5.
Having been a budget gamer for some years (2010-2018), this is playable. Got a 1650 Super and an R5 3500 for my latest rig, and I will miss the blurry-but-running games.
Thanks for adding a clip of the dog at 0:34 :) He's looking great
For years I played NFS Most Wanted on an Intel P4 2.4GHz iGPU. 15-25 FPS. Still a pleasant memory.
I have a 2GB GDDR5 version, and even though the legacy drivers are somewhat new (last one June 2021, I believe), I still get a crash once or twice a month randomly when alt-tabbing/loading a game (glad I have an SSD 😂)
Try NimeZ drivers, maybe they'll help. Maybe not, it really depends in my experience, but if they fix your issues it'll be worth it.
I was in need of a gpu for an unraid server. Just something to do basic transcoding. Thanks for reminding me this gpu even exists. Cheaper than a gt730 and a bit more oomph too.
Great video! I got an XFX R7 250, low profile, 1GB GDDR5, as a part of a bundle.
Tested it, and I was surprised that it is actually a pretty neat card for the price and size. Of course, only if you accept that it is not an R9, HD 7870, etc. :D
I remember my first PC with R7 250X OC with 1 GB of GDDR5...it was decent for what it cost me (something like 20 pounds in 2018)...
No matter what I absolutely enjoy all your content…. Besides, every GPU has its place.
The floating stone attacking you was funny tho, lol.
Haha yeah I thought it was never going to load
Nice, I recommended you review this card a couple of weeks ago. I've been gaming on it. It overclocks pretty well. Mine is a low-profile Dell OEM version with 2GB of GDDR3. They sell on eBay for around $20 USD.
I still have one, but the non-Boost version; keeping it around as an emergency card 😂
I'd argue Cyberpunk is still very immersive like this: you are immersed in the fact that V needs glasses and should not be driving
Yeah poor eyesight edition
M8, I had the LP version of this in my first Dell 7010 SFF rig. I went to bed crying. At the time I got 45-50 FPS in Siege at 720p. The latency was like dial-up on depressants.
In the Cyberpunk car it looked like the 3dfx DOS4GW version of Carmageddon 😂
My eyes hurt haha
This could very well be the rare 64-bit 250. I remember towards the end of GDDR3 there were crazy "cheap" cards being chucked out just to use up chip stocks while giving a semblance of value, and a 64-bit memory bus is a horrendous experience. I remember a Dell-branded 4GB GDDR5 7xxx Radeon: the specs looked super sexy if you ignored the 96-bit bus, and it was far, far worse than the 5970 it replaced... Bandwidth is often overlooked (myself once included). 128-bit is sluggish regardless of how fast the RAM on the card is, 256-bit is much better, and 384-bit is the number to look out for because it can feed the GPU raw data so much better. A narrow bus can not only slow down your graphics performance but also hold up the rest of the system, with more timeslices spent tapping its foot waiting for the hoary old GPU to deliver its goods. GPU makers know that if they chuck in the latest memory modules and the latest reference GPU, stick a fancy fan on the thing and sell it in a snazzy box, people will buy it, then find that the creaky old 128-bit interface makes the thing perform like a 20-year-old pig... It's why my next purchase is going to be a Ryzen 7 with integrated graphics, plus a ton of RAM and a fairly whizzy mobo, which will make card chasing a thing of the past for me.
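To put rough numbers on that bus-width point: peak theoretical VRAM bandwidth is just the bus width in bytes times the effective transfer rate. A minimal sketch with illustrative memory speeds (the specific clocks below are assumed typical values, not figures from the video or from any particular card):

```python
# Peak theoretical VRAM bandwidth = (bus width in bytes) * (effective transfer rate).
# The configurations below use illustrative memory speeds, not measured figures.

def bandwidth_gb_s(bus_width_bits: int, effective_mtps: int) -> float:
    """Return peak theoretical bandwidth in GB/s."""
    return bus_width_bits / 8 * effective_mtps / 1000

configs = [
    ("64-bit DDR3 @ 1800 MT/s",    64, 1800),
    ("128-bit DDR3 @ 1800 MT/s",  128, 1800),
    ("128-bit GDDR5 @ 4600 MT/s", 128, 4600),
    ("256-bit GDDR5 @ 7000 MT/s", 256, 7000),
    ("384-bit GDDR5 @ 7000 MT/s", 384, 7000),
]

for name, width, rate in configs:
    print(f"{name:27} -> {bandwidth_gb_s(width, rate):6.1f} GB/s")
```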
If you ever get the curiosity for these cards back, you could check out the 250E. The card didn't increase avg FPS much, but it was way better when it came to choppiness and stuttering. The wife played through the Dragon Age series and The Witcher 3 on it, so it did its job, and it was surprisingly different from the regular 250.
I had this card! It was a huge upgrade for me, lol. You can't even fathom the performance of my previous dedicated graphics card.
Compared to a Geforce 9400GT or Radeon 6450 1GB GDDR3?
Another great RandomGamingInHd video (with voiceover)
getting the card to even run cyberpunk 2077 is crazy, way beyond crazy.
It's totally unplayable, not that crazy.
I'd love to see you benchmark older games that are more relevant to the card's horsepower. NOBODY is buying or using old ass low-powered cards to play Cyberpunk (at least nobody sane). The thing that's awesome about the PC platform is you can play old games at higher resolutions and frame rates, and this is my fave use case for PC gaming. Would love to see more classic/older game benchmarks depending on the card!
Imagine when we look back at the GTX1630 in 8 odd years. That can’t even play games at 1080p on launch and this actually could lol
Watching the benchmarks is literally confusing and damaging my eyes and my head. I'm wearing my glasses, yet my brain is perplexed that the image is still blurry.
I'm amazed it ran as well as it did. Was expecting it to crash as soon as it opened Cyberpunk.
The Cyberpunk footage looks like V forgot their glasses, which you wouldn't think would be an issue with the whole "cybernetic eyes" thing
OMG, this was my first graphics card, the XFX passive model. This card was a real trooper in my old Dell Precision with an Intel Core 2 Duo.
When doing cards like this you should also test a few games from its era, to see if it was any good at the time
Used to play GTA V a lot with a really low-profile XFX 2GB GDDR5 R7 250. Boosted it to 1200MHz and could play GTA Online quite comfortably on a 1440x900 monitor; that card gave me lots of fun.
I always enjoy these videos but I especially enjoy them when the dog makes a guest appearance!
Red Dead Disaster 2 genuinely made me chuckle
😁
The cyberpunk 2077 gameplay: This is how it looks when you forgot your glasses
I have an old R7 260X from my old rig. I'm keeping it around as a known-good card, so I can test whether my current video card still works.
That view of Cyberpunk 2077... That is exactly how I see the world without my glasses.
Hey, a big suggestion for your collection of benchmark games, why not throw Frostpunk into the mix?
I used to have an R7 250 2GB and I remember barely being able to play Battlefield 4 at medium settings. Thank God I have an RTX 3070 Ti today, so that's no longer an issue.
The Spiderman benchmark looked like an animated Van Gogh painting.
They should just have a setting called "without glasses" to enable for low end cards
My first gpu for gaming was an R7 250 2GB DDR3! I completed AC unity on it with a framerate of 10-15 fps
That's absolute torture because that game even now looks amazing.
@@corruptedpoison1 I now have a 3080 and 5950x and replayed it recently lol, a bit smoother. I also played fallout 4 on it with around the same fps
The R7 250 was a rebadged HD 7770, wasn't it?
Nimez drivers likely would have fixed FH5. I got it to run great on my old 290X with it (obviously the 290X is another universe ahead performance wise though)
And it still lives on in laptops as the Radeon 625. Also, the NimeZ modded drivers do help it. For TF2 I get mid-50s to above 60 FPS.
Really? Haha I’ll have to find one
265*
@@RandomGaminginHD The 625 is on 14nm, but the older Radeon 530 is also on 28nm. They can come in a 4GB variant. Also, the GDDR5 version has 320 cores, not 384, so look for DDR3 models.
@@abhimaanmayadam5713 The Radeon 625, according to AMD, is GCN 4 though. That's the same gen as Polaris (RX 480, 580, etc), so I'd expect performance to be way better than this, all else being equal. Since the memory interface is only 64-bit, you'd be running into a memory bandwidth bottleneck pretty quickly, so I'd expect the GDDR5 variant to be faster than the DDR3 version despite having fewer cores. That might depend somewhat on the game though, as very old games might not be memory bandwidth limited (though a 64-bit DDR3 bus was already pretty mediocre by about 2008).
The Radeon 530 is GCN 3 based (same as the R9 380 and 380X, but NOT the 370s and 390s), which explains why it's 28nm.
@@SterkeYerke5555 But the similarities and performance are there, despite it being made on a newer process.
Apparently, you could pair one of these with an A10-78xx series APU for some CrossFire-ish Dual Graphics shenanigans. I wonder if it would actually improve anything or slow such systems down even more... On a side note, I recently added an A8-7680 to my collection as the last FM2+ CPU to be released, and I wonder if it could be compatible with any dual-graphics solution. Detailed information about this CPU seems to be unobtainium.
That's what one of my HTPCs has: a 7890K and an R7 250 in CrossFire in a slim case. Works fine for that and old games.
They do work, but the APU is most likely going to slow down an R7 250 unless you are using the A10 7890K with memory well over 2133MHz, which is a pretty solid combo if you ask me. Memory will be your main bottleneck on these systems, so DDR3-2133 is the bare minimum; anything higher, as long as your chip supports it, will make some big improvements to the graphics.
For anything below an A10 7890K, or with slower memory kits, use an R7 240: no bottlenecks, and the same performance as an R7 250 with a chip bottlenecking it.
Or do as I did and ditch the iGPU for two 250s instead. They were cheaper than replacing 16GB of DDR3-1866 (and yes, I have the FM2+ board that has the dual x16 PCIe slots), run well and overclock well, but are slightly loud (they are tiny Dell reference ones with a teeny tiny fan, yet surprisingly well built for Dell standards). I am running them in a GIGABYTE Sniper A88X board with an AMD A10 7870K overclocked to 4.5GHz with a custom boost of 4.7GHz.
Didn't get rid of the system, as it was my machine from when I was a kid; instead I finished it and sometimes I fire it up just for the nostalgia. Good ol' times.
I had an A10 7800 on the FM2+ socket with an R7 250 GDDR3 Boost card. I used Dual Graphics CrossFire with it, and used that setup for a little under a year. The key to 'performance' was to get the highest possible RAM speed you could afford. I settled for 2133MHz G.Skill Sniper and was able to complete Far Cry 3 and Watch Dogs in a *crispy* 720p 60 FPS playthrough on $250, with a used case and power supply from my old system. I had to upgrade the CPU cooler to a 92mm Arctic tower cooler, because using all of that *"power"* with a stock cooler would cause the system to severely thermal throttle. Overall, most of the games I played ran well enough, and better than the aging Xbox 360 I had. GZDoom modding was what kept me on that dual graphics setup for a while, because it ran so well for the price I paid. I was using Windows 7 at the time, and I upgraded when I switched to Windows 10 due to performance falling off a cliff.
Upgraded to a GTX 960 Ti and used that until 2020. Never looked back, and now I'm on a PNY RTX 3070 LHR.
This is a rebrand of the HD 7750, right? I used to have the 1GB GDDR5 version. It was the mainstream graphics card of its era and was quite popular.
That transition to the graphics card when you said it was dying got me dying, IDK why I was expecting something along the lines of screaming banshee 🤣🤣🤣
I've never seen you be this savage. Usually I see you so optimistic about older hardware, like "yeah, if you're willing to turn down the settings a little bit then this piece of hardware can still make a capable gaming PC even today!", but this time it was just "I'd try 1080p but that's too optimistic", "Red Dead Disaster", etc. The Red Dead comment especially made me laugh so loud, because I'm pretty sure it was you who made the video absolutely gutting either that game or The Witcher 3 of all of its graphical fidelity through ini files and such, making it run on even the lowest end of hardware. Idk, but this was hilarious. Amazing content. Haven't laughed this much in a while 😂
Haha thanks. I’m done with this card
That Cyberpunk footage through the R7 250 would look good... in 1999.
What breed is your dog? Reminds me of Sykes from Midsomer Murders.
My first graphics card was an Nvidia 6600 GT, and I played Crysis and Half-Life 2 at low settings. I've had an R9 280, a GTX 560 and an RX 580 (which even today is a good card), and now I have an RTX 3070.
I had a VisionTek R7 240 2GB that I threw into a pre-built and OC'd. Had I known the 250 existed, I would've used that instead because apparently, I could have used that with my A10-7800 APU in crossfire.
“1080p? You must be having a laugh.” Hahaha that really made me laugh. Great video as per usual.
Will you include a little OC in the future? Would be interesting with those old GPUs.
I basically had one of these in my first laptop in form of overclocking its R7 M265 to slightly beyond this card at 1080/1180 core/mem
I finished Outlast and Bad Company with an R7 240 and a 768p monitor back in the day lol
Sick rock collection bro.
I'd love to see what my first graphics card could do on modern titles; the first PC I ever built had a PowerColor R9 290 Turbo Duo.
Man, I love old GPUs. I am getting an AMD A8-3870K tomorrow, which also has an HD 6550D onboard. One of the first APUs, so I will be testing that. It used to be my brother's gaming PC till I upgraded him to a 3400G because of the insane GPU prices back then. Sadly his PSU died yesterday, so now I have a 3rd gen Intel build in progress with no PSU :/
Doesn't the name reflect that they used to give these away with a 3 pack of Boost bars?
I had the "pleasure" of using an OEM R7 240 for a while last year whilst between cards. A version with a crippled 64-bit memory bus at that. Aside from sounding like a vacuum cleaner even at idle, it was not a pleasant experience. WoW was just about playable on it though.
The R7 250X is the same as the HD 7770 - about 50% faster than the R7 250, 1GB GDDR5. The R7 250 (GDDR5 variant) will run ok for the more popular multiplayer titles. I owned the 1GB GDDR5 variant, OC, from Sapphire. Same fan shroud etc. I liked how it managed to run (somewhat) older games.
That FSR Cyberpunk footage looks similar to what I see without glasses.
Oh man, that was the time of the entry-level cards back then... But hey, if you only need a display output then this should do the job 😀
I'm wondering if the R9 360 will perform better (?) than the r7 250
Unequivocally, the R9 360 will wipe the floor with the R7 250. It's a GCN 2.0 card to start (the 250 is GCN 1.0), and has double the shader count.
My first GPU was a Radeon 6770 1GB with a Phenom II X4 965 in 2011. Played Skyrim at 1080p! Good times.
I'm surprised that Cyberpunk 2077 actually ran at a somewhat playable framerate.
I bought a HD 8570, which has the exact same GPU and memory as the R7 250, but with lower clock speeds. I used it in a secondary PC for a time and played Dark Souls Remastered with my room mate, I had to use the lowest settings at 1280x720 to get a solid 60fps in that game.
Can we expect a video on that Minisforum with the Intel GPU in the background there? ETA Prime whacked HoloISO on it and it looked quite good!
Yeah did a video review already. I use it as my capturing and editing system now. It’s fantastic
@@RandomGaminginHD thanks, must have missed that one!
nice weather over there :)
I think you could do Hybrid Crossfire with the Kaveri APUs like the A8-7600, or maybe that was just the R7 240
I used to have one. Top Tip. If you're lucky, they overclock like beasts!
Please let me ask you to do another video with this video card!!
Can you PLEASE (yes it's for real) test it vs the new RDNA2 (2CUs) integrated graphics inside Ryzen 7000 CPUs??
Yes it sounds crazy on paper, but it might not be that much... and I'd love to see it, since that's the jump I'm going to do. This black friday!!
Thanks!!!
Unless I overlooked an important detail, even the 2400G in another video RandomGaminginHD did was mostly faster than this GPU. This card was only better in the 1% and 0.1% FPS dips, where the APU struggled a bit more. Sadly the video about the 2400G lacked any info about the memory settings apart from it being DDR4-3200.
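A rough way to see why the 2400G can keep up: its Vega iGPU shares dual-channel DDR4 with the CPU, which at DDR4-3200 already offers more raw bandwidth than a DDR3 R7 250's dedicated memory. A small sketch, where the R7 250 memory speeds are assumed typical values rather than figures from the video:

```python
# Rough bandwidth comparison: Vega iGPU in a 2400G (sharing dual-channel
# DDR4-3200 with the CPU) vs a discrete R7 250. The R7 250 memory configs
# below are assumed typical values for the DDR3 and GDDR5 variants.

def bandwidth_gb_s(bus_width_bits: int, effective_mtps: int) -> float:
    return bus_width_bits / 8 * effective_mtps / 1000

# Dual-channel DDR4 = 2 x 64-bit channels = 128 bits, shared with the CPU.
apu_shared   = bandwidth_gb_s(128, 3200)   # ~51.2 GB/s, minus whatever the CPU uses
r7_250_ddr3  = bandwidth_gb_s(128, 1800)   # ~28.8 GB/s, GPU-exclusive (assumed)
r7_250_gddr5 = bandwidth_gb_s(128, 4600)   # ~73.6 GB/s, GPU-exclusive (assumed)

print(f"2400G shared DDR4-3200 : {apu_shared:5.1f} GB/s (shared with CPU)")
print(f"R7 250 DDR3  (assumed) : {r7_250_ddr3:5.1f} GB/s")
print(f"R7 250 GDDR5 (assumed) : {r7_250_gddr5:5.1f} GB/s")
```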
My friend, I notice you never turn off shadows. I find turning them off gets a fair increase in FPS in most games, on the cards I have tested (HD 6570, 750 Ti, 1030 and so on). Maybe an idea, but by no means telling you what to do.
Still have one from XFX. They are still good for a nice Retro-PC with Windows XP because of available drivers.
So glad that my first GPU was the 2GB 650ti. Apparently, Spiderman will run on it. But with some graphical errors - and that's with the 650ti Boost.
I know that with it however - I just need to look for GT 1030 results to get a good estimate of performance in anything modern. And at least it's better than the R5 340X - nearly half the memory bandwidth of the 250
GT1030 got trash driver support from Nvidia because "not a gaming card". Would target RX550 as the benchmark of entry level cards in current titles.
If this is the Oland-based 250 (and if you can), re-test this card in Dual Graphics mode. This is one of the cards you could pair with the FM2 APUs (the 7850K, for example) for a little boost in GPU power.
As for the memory interface, yes: the GDDR5 version was faster, but overall the 250 cards (Oland or Cape Verde) were a downgrade from the HD7750 they replaced, as both that and the HD7770 were condensed into the R7 250X rebrand, which was hotter, but cheaper than both at USD99 at launch.
It fulfills a perfectly fine job as a proper paperweight.
Sewing machine oil or some 5w motor oil would clear up the fan noise FYI.
I can assure you that 21fps would have been considered as a LUXURY of the highest order back in the N64 days. Playing perfect dark multiplayer with bots at about 15fps took real dedication.
DDR3 card... I didn't know Radeon pulled stunts like that.
2013 launch..
Haha, I actually bought one of these back in like 2019, and paired it with a 1st gen i5, sold to some guy who wanted to use it for office work for cheap
My first real "GPU" was an S3 ViRGE 3D, played NFS 3 @320x240 with 10-15 fps, good times!
My first "Gpu" was an HD 5450 and it was slow, BUT once i managed to OC it to 900mhz it managed to run CS GO at 30fps until like 2016.