@@sushimshah2896 aCTuAlLy....... No, honestly: the ~15% difference we saw there was due to the sticks only having 4 chips on each module (instead of 8), which adds extra latency in some scenarios (Actually Hardcore Overclocking has a good video about it). You can think of it like single rank vs dual rank, and in that case we basically had half rank.
@@anhiirr remove the "." key from your keyboard, thank you. If you're recommending the Ryzen DRAM Calculator, you have no clue about proper RAM overclocking.
Since dedicated GPUs are so expensive, I'm going to try to build a new system using the 5600G, then add a dedicated GPU when prices drop. I'm also going to overclock the thing, though.
Great comparison! Thank you very much for the information. I am going to build a PC with the 5600G because I think that, with 3600 memory, it's the sweet spot of price-to-performance for some decent 1080p gaming.
@@dalewilson908 you should consider the i5 12400/12400F, which Intel just released. The 12400F is about $200 and on par with the R5 5600X if you're going to buy a dedicated GPU straight away; otherwise the 12400, with its integrated graphics, is the better choice.
Hi! What's the difference between the 3600 and the 5600? And the X vs the G? I'm a bit stuck moving forward because I'm so lost on whether to get a 3600 or 5600 and what other parts should go with it.
@@cocomu9955 The 5600X is much more powerful than the 3600. As for the difference between the 5600G and 5600X: the "G" means it's an APU, i.e. it has graphics built into the processor, so with the 5600G you can play less demanding games on the CPU alone. The 5600X is about 10% faster in CPU performance, though. Hope this helped
Just from my own experience: I upgraded a Core i3 10100 and ASRock B460M Pro mobo combo to an i5 10600 and ASUS B560 Plus mobo combo. (Yes, I've seen Hardware Unboxed's video on ASUS B560 mobos; this one has a heatsink on the VRMs. Not sure how good it is, I hope to take a closer look at it in the not too distant future.) With the memory tuning available on the B560 platform, my son is able to run Destiny 2 at 60 fps on lowest settings. He's locked it at 40 and gets better frame consistency that way. He could not get the game to load at all on the older setup. Both setups were/are running DDR4-3600 memory with CL18 timings. He may not have had Windows 10 set to performance mode on the i3 setup, I don't know. Now, with about a $600 computer setup, he can run the games he wanted to run: not at the speed and resolution he'd like, but they will run.

Fortunately I've convinced him that the cost of dGPUs is prohibitive at this time. Yes, I can afford one, but I refuse to give in to economic piracy. Some folks may not have this latitude/flexibility, and that is truly unfortunate, because those folks may have no choice but to work from home and require a desktop computer with substantial resources. Hopefully people in this situation have an employer that will defray the cost of such a system. This is the kind of situation that shows capitalism at its worst. The flip side, imo: while computer gaming and social media are enjoyable, we need to get back to better interpersonal relations. I'm not saying it's easy, but it's not impossible either. The human body responds very well to moderate physical activity; I know this from personal experience.
One thing I've discovered monkeying around with these is that power consumption only goes up very slightly if you overclock the RAM. It did increase significantly when overclocking the CPU or GPU, though. Mind you, it's a very power-efficient chip, so this isn't a major concern, but it will make the fans louder.
Some of the best RAM I have found for this CPU in 2023 is TeamGroup's T-Force Vulcan DDR4. You can get 16GB of 3600MHz CL18 for $37, or 4000MHz for $44; the 32GB versions are $61 and $72. I think 32GB is a little overkill here, though: unless you mostly play CPU-intensive games like Minecraft, the 32GB will bring little overall performance boost. And for 7 dollars more, you should definitely get the 4000MHz instead of the 3600MHz if you go with 16GB.
It depends on the mobo how much RAM is reserved for the iGPU. On mine there's no way to adjust it. But it's all one pool of RAM, and graphics programs can allocate more. Some programs lie to you, treating the 2GB reservation as if it were dGPU VRAM.
You did not mention it, but to use 4000MHz RAM at a 1:1 ratio with these CPUs you need to overclock the infinity fabric to 2000MHz; otherwise it drops into 2:1 mode. That you did not mention this really makes me doubt the claim, especially since you have to be lucky to be able to overclock the fabric that high, and I'm reasonably sure you'd know that. Otherwise I like the content of the video.
This isn't so much of an issue on the APUs, as they are a monolithic single-die design and can handle higher fabric clocks than the non-APU parts. This was also the case with the 4000-series APUs, though I'm sure a lot of people wouldn't know that since you couldn't buy them retail. So it doesn't surprise me that it holds 1:1 at 4000 MT/s. The part of his testing that really bothers me is that he only tested primary timings; while those are important, it's the secondary and tertiary timings that really impact performance.
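For anyone following along, the 1:1 rule being discussed above is just arithmetic: DDR memory transfers twice per memory-clock cycle, so a synchronous (1:1) fabric needs fclk equal to half the MT/s rating. A minimal sketch of that relationship (the function name is my own, not from the video):

```python
def fclk_for_1to1(transfer_rate_mts):
    """DDR transfers twice per memory-clock cycle, so a 1:1 (synchronous)
    infinity-fabric setup needs fclk = half the MT/s rating."""
    return transfer_rate_mts // 2

# DDR4-4000 needs fclk = 2000 MHz to stay 1:1, which is more than many
# chiplet-based Zen 3 parts can hold; they fall back to 2:1 instead.
print(fclk_for_1to1(4000))  # 2000
print(fclk_for_1to1(3600))  # 1800
```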
Curious whether the secondary and tertiary timings were manually set during these tests, left at Auto to run wild, or at least monitored, and if so what they were, as a few of them can have a really large impact on speed.
ARE YOU SURE you bought the good GT 1030, the GDDR5 model, and not the HALF-AS-FAST DDR3 variant? Your results are strange. I will get my 5600G on Monday (bought it for 120€), then I will compare it to a Quadro 2000 and a GT 1030 GDDR5, as well as a 2200G and 3400G.
@@Pram6969 Can't test yet, because the 5600G has to be RMA'd: it does not boot or POST, even on a BIOS-updated mainboard :( So it will probably take 2 weeks before I have another 5600G at hand. Sorry
That's one of the most aesthetically pleasing thumbnails I've seen. Wow!
N your profile too
It is awesome to see reviews of the APUs. GPUs are way too expensive these days
my RX 460 just went kaput this morning, thank god I just changed to a 5600G
@@AbuGuroza Kaputt is german.
@@fynkozari9271 oh thanks
@@fynkozari9271 Kaput means something else in German, and they got the word from French. Kaput in this use case is purely English.
@@auturgicflosculator2183 I can speak german.
Vega is just the last version of GCN which started back with the 7970. Really has stood the test of time. My uv & oc'ed Radeon VII really has held up.
Looking healthy, Eber! Much love :)
I’m currently trying to build a ryzen apu system, and this video really clarified a few doubts I had about the impact of memory speeds on performance. I have to say, I’m still a bit disappointed that you didn’t show how memory speed affects cpu bound tasks, but you still earned a sub today 😊
Great timing! I'm benchmarking a 4350G right now with some cheap DDR4-4000 and am about to start playing with the timings, it's good to have some data to refer to here 🙂
I'd be interested to see the 4000G series tested with different memory timings, since the 4000G has a much better memory controller than the 5000
"Cheap" DDR4 4000 LOL
@@selohcin yeah, a random unheard-of brand ("V-Color") was selling 16gb kits of DDR4-4000 for £75 on Amazon a couple of weeks ago
@@selohcin you can easily get 16GB 4000mhz for 90 bucks or less
You're very articulate and easy to listen to. A good presentation with good info, unlike some others I see who talk so fast and throw so much info at you that it's hard to even keep up with what they're saying before they move on to something else. Thank you!
Useful video; it saves me money to see that an upgrade from my current 3000 MHz memory to 3600 MHz would not bring very much. My Ryzen 3 2200G in Apr 2019 was a cost-effective solution at $97, with its max of DDR4-2933. The 5600G is NOT a cost-effective solution for non-gaming users. I'll consider upgrading to a 4600G or 5600G in 1 to 2 years, when the (2nd-hand) prices drop by, say, $100 to significantly below $200. I think I have a good chance after the introduction of AM5 and DDR5 in 2022/23 :)
considering that increasing memory bandwidth results in an almost linear performance uplift, i think the reason they still use vega is that ddr4 memory can't keep up with a higher-performance igpu. things might change when ddr5 hits the market and becomes common; then maybe we'll start to see higher-performance navi igpus.
I think the argument is that they could use a newer architecture like RDNA 1 or RDNA2 without the RT tech, using fewer CUs to get the same performance as Vega does with 7 or 8 CUs. But faster memory would be required to feed more CUs... or on-board memory!
It would seem the Steam Deck is proof of that, as its RDNA2 iGPU is paired with LPDDR5.
Could the 3D cache (or whatever it's called) make up for some memory bandwidth, like Infinity Cache does?
Well said. In fact, the newly introduced Ryzen 6000 for laptops uses RDNA2 for the graphics part and supports/requires DDR5 memory.
this man just predicted the future
the sound and the lighting of your videos is great, it's pleasant to watch
I was under the impression that the older APUs were more affected by the timings than by the speed.
It's nice to see the direction AMD is taking with these new ones. They've become more GPU than APU, since GDDR RAM has much worse timings and latency but stupidly fast clocks; that pretty much describes GDDR5 as well.
I just wish they'd use more CUs, as there's no replacement for displacement!
and RDNA2 instead of vega graphics.
@@carisi2k11 Polaris was 28nm, and barely broke 1200MHz. The 64-CU Vega 10 chip was 14nm, and barely exceeded 1400MHz; it also had a 4096-bit bus, which ought to have made up for the speed, but didn't. Vega 20 was 7nm, but with 60 CUs you should have gotten more out of it. But the on-chip Vega ditched the HBM2 controller years ago; I am running Vega 8 at 2100MHz, for hours, without a problem. I looked at the clocks on RDNA 1: the Vega cores since Zen 2 have been faster than the discrete RDNA 1 cards. I think they completely re-engineered Vega in the move to TSMC's 7nm process.
For the timing section, I think you mean a "meaningful" change rather than a "measurable" change, as your numbers showed that you did measure a small change.
I wonder how these CPUs and integrated graphics work when it comes to video rendering tasks. Would be nice if you could cover that aspect too when talking about integrated graphics and graphic cards, especially for casual gamers who do the occasional video rendering as well for work. Love how Eber signs off with saying you should spend responsibly ;) nice touch.
Agreed. This is the second video on TH-cam now that has analyzed memory speeds and timing on the new AMD APUs that has only tested gaming performance. Please show rendering and general productivity benchmark differences as well!
I understand your question. From watching Gamers Nexus, Hardware Unboxed and AHOC, I'm left with the opinion that integrated graphics can do video rendering... it's just going to take a long time to perform the task. A 1660 would be faster than the iGPUs. Just my $.02.
@@GB-zi6qr Thanks. I thought it might already be on par with a 1550 - 1660. 🤔
@@rubikazariah don't bet on it. I'm thinking dram isn't the same as vram. Please correct me if I'm wrong.
@@rubikazariah On par with probably a GT 1030 or GTX 750, but the 1050 and up are better cards than the Vega iGPU lineup.
Great video and a lot of time and effort put into it! Thanks 💪
Definitely subbing after this video
Excellent and informative! Thanks again for another great video
I have read your mind. I just built a PC yesterday, and got this CPU and paired it with 16GB DDR4 3200. It plays everything I play just fine. Nothing major, but I only emulate on the PC and it works flawlessly
Thanks for the work Eber 👍
November 2022. Got a 5700G at work. I'm in IT. Absolutely love it. Btw, most of my IT team runs AMD at home. And yes, I found this very helpful. Thx
That thumbnail is awesome
The aesthetics of this video are so pleasing!
This is exactly what I needed. Thanks!
Eber, ty much for the detailed overview. I was wondering if dual-rank can be a factor? Ty.
yes, but if your mem clock is high enough, dual rank doesn't really matter.
Yes, it definitely does, since running dual channel RAM will effectively double the memory bandwidth.
A dual channel 2x8GB 2400MHz kit will easily beat a single channel 1x16GB 3200MHz stick in these scenarios
dual channel /= dual rank
@@AnantGupta02 Sorry my bad
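For what it's worth, the bandwidth side of this exchange is easy to sanity-check: each DDR4 channel is 64 bits (8 bytes) wide, so theoretical peak bandwidth scales with channel count as well as transfer rate. A rough sketch (my own helper, theoretical peaks only, real-world gains are smaller):

```python
def ddr4_peak_bandwidth_gbs(transfer_rate_mts, channels=2):
    """Theoretical peak: MT/s x 8 bytes per 64-bit channel x channel count."""
    return transfer_rate_mts * 1e6 * 8 * channels / 1e9

# Dual-channel 2400 (38.4 GB/s) beats single-channel 3200 (25.6 GB/s),
# which is the point made in the comment above.
print(ddr4_peak_bandwidth_gbs(2400, 2))  # 38.4
print(ddr4_peak_bandwidth_gbs(3200, 1))  # 25.6
```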
you won't notice the difference; you won't notice a performance difference between a dual rank DIMM and single rank.
the difference shows up in numbers, in test charts, in micro measurements and ultra speeds that you won't notice.
even the very same new pc with 2x16 single rank 3200MHz RAM will give you different values in Forza Horizon when you run the test four times,
and Cinebench will give you different results running it twice a day for five days
(even on the same new pc with no usage other than the tests).
a lot of youtubers show benchmark results for Blender or Tomb Raider...
*how many times was the test run before posting the result?*
of course, this channel is reliable and trusted, real! not like other fake anonymous channels with no proof that they actually ran the tests (benchmarks, pc benchmarks, testing games....)
Thank you for testing this specific scenario of the integrated graphics!
Awesome explanation, awesome video, big thanks for it!
i like this Eber dude, so pleasing~
Thank you so much for the video!
Thanks for the great info! I would like to see the 5600G (or the 5650G) paired with 3200 speed RAM, but with a variety of applications. 🤓
Hardware Unboxed did something like that recently: RAM scaling for the 5600G and 5700G with half a dozen different RAM kits.
@@marcc5768 cool thanks. Plan on getting the 5600G as the 5700G is out of my PC build budget! 🤓
@@marcc5768 I read a post on line, where several top brands of RAM were tested. The ones tested were all 3200 speed, which, for my PC budget, is what I will go for!
@@j.lietka9406 try to get CL14 or CL16 for best results. CL14 might cost a bit more, but it's worth it for the performance. Don't go beyond CL16.
@@marcc5768 ok thank you!
I bought a kit of Samsung B-die 3200 CL14 and clocked it to 4000 CL17, and it's stable. Samsung B-die is still the best RAM you can buy in 2021.
Hi, what motherboard do you have? Thanks!
Before the video I figured 3200 average timings.
After the video it was still 3200 average timings. Amazing!
thanks a lot for this, very clear and informative. I decided to order a 5600G and 2x8GB Crucial Ballistix 3600 CL16. Not trying to go all out, but wanted a middle-of-the-road kind of thing while watching the budget. Either way, it's going to be leaps and bounds above my current 1090T with 8GB G.Skill 1600.
Of course it helps, thank you!
5000g chips and motherboards going on sale rn, just picked up a 5600g and motherboard combo for $310ish for my little brother for Christmas :)
Thank you. Very instructive video.
thanks for all these info! Great!
Late to the party, but I have a 2x 32 kit OC’d to 4000 MT/s. I’ve been curious as to how this memory would perform on the 5700G. Thank you, fellas!
Thank you for sharing. Might I suggest that you link the "Ryzen Memory Timing" video that you reference several times in the video description?
Informative! Thank you.
This is really appreciated since I'm considering going to a even smaller ITX case without a dedicated GPU.
Get the $125 G.Skill Flare X, bump the voltage to 1.45V and the frequency to 3600, and you'll be able to keep the 14-14-14-34 timings; it's much cheaper than most other good kits.
Patriot Viper Steel has a 4400 CL19 16GB kit for $115 that will underclock with very tight timings... Samsung B-die ICs are the best memory modules available currently.
@@ComputersAre8ad how can I know which brands' modules have these Samsung ICs?
Great info. Thank you.
I really needed this video. I am on the fence with getting the 5700G or waiting for the possible 6800G/6900G successors.
Edit: In the end decided on waiting for Q1 2022 for the Rembrandt APUs. They are going to be much better in comparison to the 5700G. The Steam Deck was a good indication of this including the leaked roadmap.
Are they really coming in q1?
Rembrandt is canceled; wait until Q3-Q4 2022 for Raphael
Yes, the 6XXXG series will likely see a huge improvement in the graphics segment because of the new RDNA 2 architecture and DDR5, and it should have Navi graphics, which are much better than Vega. I am also waiting for that series to get started with my PC building!
@Viztiz what a joke, Ryzen 6000 will pair with DDR5 at speeds up to 12000; it will probably reach mid-range GPUs like the GTX 1660, and that's why AMD stopped making low- and mid-range GPUs.
@@emersonmatheus8611 don't choke on your own hope lol
Amazing video thanks so much. Concise, to the point and with great testing and information. 10/10 video man. This is the first video I saw on your channel too. Subbed!
Thanks, was sweating this after snagging a 5600G for a build for a friend (already have 2x16 Ballistix 3600 matched for the 3600X that was previously on the AsRock B450M-HDV R4.0 board). Sounds like I'm already set up with one of the best options, and if the board allows for overclocking the CPU and iGPU, this thing should SCREAM.
very informative & well presented, thank you!
it's very helpful for deciding on my new build
really good info thank you
Awesome explanation, sir. Thank you.
very good comparison between mem speed and timing, exactly what I was looking for :-)
Thanks for this well done video.
this was very helpful.
this video helped, thank you.
I love this dude's voice, and calm demeanor.
Thx for ur work
Important video. Thanks.
nice video, nice work!
It would be cool to see how 2x16 dual channel dual rank 3600 CL14 memory frame rates would chart since the latency in nanoseconds is so low.
Timings are less important than the speed. It doesn't matter if your timings are lower when you can't move enough data to make it matter.
This is the most perfect 5600G 5700G video that I've always wanted. Thank you.
I did a test today (January 2023) with a game. Latency is by far more important for the Ryzen 5700G than raw clock speed. With the RAM speed set to 2733MHz and timings of 20-19-19, I get between 65 and 112 fps, with all other settings on auto; it never passed that. With the same RAM (one stick rated for 2400 MT/s, the other for 3000 MT/s) left on auto, it runs at 2400MHz with timings of 17-17-17 according to the UEFI (ASUS Prime B450M-A II), and I get 145 fps most of the time, up to 165 fps, sometimes peaking at 220-245 fps. Looks like CAS latency is by far more important in my 5700G system (RX 6600) than the RAM clock speed.
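The observation above lines up with how absolute latency works: first-word latency in nanoseconds is CL cycles at the actual memory clock, which is half the MT/s rating. A quick check (my own calculation, ignoring secondary timings), showing the auto 2400 CL17 config does have slightly lower absolute latency than 2733 CL20:

```python
def cas_latency_ns(cl, transfer_rate_mts):
    """First-word latency in ns: CL cycles at the memory clock (MT/s / 2)."""
    return cl / (transfer_rate_mts / 2) * 1000

# 2400 CL17 vs 2733 CL20, the two configs compared in the comment above.
print(round(cas_latency_ns(17, 2400), 2))  # 14.17
print(round(cas_latency_ns(20, 2733), 2))  # 14.64
```

The gap is small, so absolute CAS latency alone probably doesn't explain the full fps difference reported; auto secondary/tertiary timings likely contribute too.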
Excellent video, still useful in Summer ‘23. I’m about to swap out 2x8GB Corsair Vengeance DDR4 3600mhz CL18 for 2x16GB G Skill Ripjaws CL16, and I was wondering what to do next - tighter timings or higher frequencies? This video makes me think I should leave timings alone and go for higher RAM/fclk clocks (and try to hold on to CL16, but not too hard).
Sensational quality and a sharp video... what camera and lens were used? Regards
👌👌👌👌🔥🔥🔥
This is great Eber!!!
Whoahh, I had no idea the 5600G did not support PCIe Gen 4. I was going to get a B550 board, but now I know there's no point in paying up for that!
nice and useful!
Here in 2023, the GPU storm is not as intense, but it's still not an attractive time (for me) to buy. I bought a new R5 5600G for $118 US, 32 gb DDR4-3200 for $80, and a good 1 TB NVME for $90. I did buy a robust upgrade-worthy B550 MB. We'll see if upgrades ever make sense. Now I gotta watch the memory running video...
Thanks man, your video just saved me the trouble of trying the RAM from my mining rigs myself; now I know to just take my TeamGroup T-Force Xtreem 4000 RAM and forget the rest for this particular machine. I sort of suspected as much, since most dedicated GPUs use GDDR5/GDDR6 running at crazy speeds. Take my Gigabyte RTX 3090 Master that I used for mining: that puppy's memory was running at 104-110°C because Gigabyte used terrible thermal pads. Replacing them with some Thermalright Extreme Odyssey II did the trick and lowered it to less than 50°C, and the memory was running at 9600MHz in OC mode. From that I kind of understood the importance of speed, and with the iGPU using system memory as video memory, it was obvious that RAM speeds would be important.
This year's Black Friday was sick!! The 5700G dropped as low as $260!!
I'm glad you did a chunk of time for just the iGPU. I just tested a 3100 + GTX 1650 (non-super) and the 5600G from your benches performs better/on par with the aforementioned combo, but here's the rub: I paid 128 + 214 for that cpu + GPU combo due to shit prices of 2021. So, for just the 5600G I get more cores and threads and better/same performance in games, for about $82 less. Yes please.
Less power consumption (not sure) + less stuff in the case (better temps, most probably)
And it enables small mini pc cases like the ASRock X300W. Not everybody wants a huge largely empty case or can afford/justify a graphics card to go in it. Little space and little money is the way to go these days.
Super video 👍👍
Update after a month: the 5700G is a monster for work. I never go over 2% CPU usage; sometimes I think I should've gone way lower-end. Mind you, I don't game at work. I'm yet to find a way to slow the clock below 3 GHz for those quiet moments. I guess I'm saying it's a great CPU for power users and could be good for gaming as well. Just amazing.
just lower the TDP to 35W
@@Puretea4711 Not sure how actually.
Don't bother, just fit a better aftermarket cooler on it and turn the fan way down. I mean you might want to throw some harder work at it at some stage. My 5600G hardly gets above 5% for normal tasks, but I can throw modelling software at it and it will suddenly become a lot more active.
Thanks, now I know 😍😍😍
Secondary timings have just as much of an effect on the performance as the primary. And tweaking primaries while leaving secondaries and tertiaries on auto might change their values for the worse in terms of performance.
True, just as we saw lately in the laptop space
Intel pushes their non-K line... oh well, you can't run 3200MHz RAM without a Z-series chipset, hehe. AMD pushes these APUs... "oh, you'll want a more expensive kit of RAM than a plug-and-play CL16 3600 with Zen 3." Zen 3 with a CL16 3600 XMP or 3200 CL14 XMP kit instantly performs at a great level out of the box, without any PBO/OC. For an APU? Use the RAM calculator and Clock Tuner for Ryzen XD. But you'll still need a newer-ish chipset like B550 to support the newer RDNA2 APUs. Oh, how they've played us all. Then you get two RAM slots max on ITX, so you need a 32GB dual-rank B-die kit to hit 4000+MHz at tight timings... just to maximize a "cheaper" CPU. Which is exactly why I'm waiting for AM4 to be truly EOL before I attempt my RDNA2 APU build and the dual-rank B-die RAM to go with it. A 5800X/5900X makes off perfectly fine with 3600 CL16, versus the newer B-die Ripjaws V 3600MHz CL14 kit that you'd push to 4000 CL16 with RAM calculator settings. I'm not going to buy a kit that's $100 more than my already-$200-range 32GB Ripjaws just to hit 3733 CL14 on dual rank. It's no different from people on Zen+ wanting to jump to Zen 3 on a B450 or X470 while only having 2400-3000MHz RAM, then getting the new Zen 3 chip with a 3080 or 6800 and wondering why their frames aren't the same as the benchmarks on YT or the web.
@@sushimshah2896 aCTuAlLy.......
No, honestly: the ~15% difference we saw there was due to the sticks only having 4 chips per module (instead of 8), which adds extra latency in some scenarios (Actually Hardcore Overclocking has a good video about it). You can think of it like single rank vs. dual rank, and in that case we basically had half rank.
@@anhiirr
remove the "." key from your keyboard, thank you
If you're recommending the Ryzen Dram Calculator you have no clue about proper RAM overclocking.
@@anhiirr these are still Vega APUs, not RDNA 2
Since a dedicated GPU is so expensive I'm going to try to build a new system using the 5600g then add a dedicated GPU when prices drop. I'm also going to overclock the thing though.
This is what I've been looking for all this time 😮
Great comparison! Thank you very much for the information. I am going to build a pc with the 5600g because I think that with the 3600 memory it is the sweetspot between price-value for some decent 1080p gaming.
I'm going to go for it too :) Have you built a PC yet?
Thanks for your comment, just what I've been looking for. I'm on board with the 5600G and the 3600 memory.
@@dalewilson908 You should consider the i5-12400/12400F, which Intel just released. The 12400F is about $200 and on par with the R5 5600X, so get that if you're going to buy a dedicated GPU straight away; otherwise the 12400 is the better choice.
Hi! What’s the difference between the 3600 and the 5600? And the X vs the G? I am a bit stuck moving forward because I am so lost on whether to get the 3600 or 5600 and what other parts should go with it.
@@cocomu9955 The 5600X is much more powerful than the 3600, and the difference between the 5600G and 5600X is that the "G" stands for APU, meaning it has a graphics processor built into the CPU, so with the 5600G you can play less demanding games without a graphics card. But the 5600X is about 10% faster in performance. Hope this helped.
Interesting… on another note, have you ever consider getting into audiobook narration?
His voice would be nice on scripted podcasts like "stuff you should know"
thanks guys
I literally just purchased a new PC yesterday with 5600G and corsair 3200 RAM so will be putting it to the test soon.
How is it?
Still lovin' the 3400g
Just from my own experience: I upgraded a Core i3-10100 and ASRock B460M Pro mobo combo to an i5-10600 and ASUS B560-Plus mobo combo. (Yes, I've seen Hardware Unboxed's video on ASUS B560 mobos; this one has a heatsink on the VRMs. Not sure how good it is, but I hope to take a closer look in the not-too-distant future.) With the memory tuning available on the B560 platform, my son is able to run Destiny 2 at 60 fps on the lowest settings. He's locked it at 40 and gets better frame consistency that way. He could not get the game to load on the older setup. Both setups were/are running DDR4-3600 memory with CL18 timings. He may not have had Windows 10 set to performance mode with the i3 setup, I don't know. Now, with about a $600 computer setup, he can run the games he wanted to run, maybe not at the speed and resolution he'd like, but they will run. Fortunately I've convinced him that the cost of dGPUs is prohibitive at this time. Yes, I can afford one, but I refuse to give in to economic piracy.
Some folks may not have this latitude/flexibility, and that is truly unfortunate, because they may have no choice but to work from home and require a desktop computer with substantial resources. Hopefully people in this situation have an employer that will defray the cost of such a system. This is the kind of situation that shows capitalism at its worst. On the flip side, imo, while computer gaming and social media are enjoyable, we need to get back to better interpersonal relations. I'm not saying it's easy, but it's not impossible either.
The human body responds very well to moderate physical activity, I know this from personal experience.
Well said.
Make sure to update the description to include the affiliate links to buy the new processors :)
Short answer: 3200MHz CL16
You're welcome.
The "X" series can also use Gen 4 NVMe; the "G" series is limited to Gen 3 NVMe M.2. The 5800X runs really hot as well.
Thank you for a simple and well presented vid which actually explains what we, the average Joe, needs to understand.
Recently I upgraded my RAM and gave my 2200G another chance to shine
One thing I've discovered monkeying around with these is that power consumption only goes up very slightly if you overclock the RAM. It increases significantly if you overclock the CPU or GPU, though. Mind you, the chip is very power efficient, so this isn't a major concern, but it will make the fans louder.
Did you use a DDR4 or GDDR5 GT 1030?
How about editing sir? Or OBS studio
oh my god, this video has lots of knowledge
Some of the best RAM I've found for this CPU in 2023 is TeamGroup's T-Force Vulcan DDR4. You can get 16GB of 3600MHz CL18 for $37, or 4000MHz for $44; the 32GB versions are $61 and $72. I think 32GB is a little overkill here, since unless you're playing mostly CPU-intensive games like Minecraft, 32GB will bring little overall performance boost. And for $7 more, you should definitely get the 4000MHz instead of the 3600MHz if you go with 16GB.
what is the size of the memory allocation for the iGPU: 2GB, 4GB, 8GB or Auto?
I want to say it's suggested to keep it at "Auto". Performance was usually the same or better with Auto than with preallocating any amount.
According to Doom Eternal's readout when I played it, there's about 8GB of VRAM, but in R6 Siege it says only 2. So I assume it's on Auto.
It depends on the mobo, how much is reserved for the iGPU. On mine you have no way to adjust it.
But there's only one pool of RAM; graphics programs can allocate more. Some programs lie to you, treating the 2GB reservation as if it were dGPU VRAM.
11-16GB of shared video memory in games...
@@christiankulmann3325 But is the game using that much? It depends on the game obviously.
You did not mention it, but to use 4000MHz RAM at a 1:1 ratio with these CPUs, you need to overclock the Infinity Fabric to 2000MHz; otherwise it drops into 2:1 mode. The fact that you did not mention it really makes me doubt that claim, especially since you have to be lucky to be able to overclock it that high, and I'm reasonably sure you'd know that. Otherwise I like the content of the video.
This isn't so much of an issue on the APUs, since they're a monolithic single-die design and can handle higher fabric clocks than the non-APUs. This was also the case with the 4000-series APUs, though I'm sure a lot of people wouldn't know that since you couldn't buy them at retail. So it doesn't surprise me that it holds 1:1 at 4000 MT/s. The part of his testing that really bothers me is that he only tested primary timings; while those are important, it's the secondary and tertiary timings that really impact performance.
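For reference, the 1:1 ("coupled") relationship discussed in this thread can be sketched as below. The formula is just the DDR double-data-rate relation (memory clock = MT/s ÷ 2), nothing AMD-specific:

```python
# In 1:1 (coupled) mode, the Infinity Fabric clock (fclk, in MHz) must
# match the memory clock (memclk). DDR transfers twice per clock, so
# memclk = MT/s / 2; anything above fclk's stable ceiling forces 2:1 mode.
def fclk_for_coupled_mode(mt_per_s: int) -> int:
    return mt_per_s // 2

print(fclk_for_coupled_mode(4000))  # 2000 MHz fabric needed -> an overclock
print(fclk_for_coupled_mode(3600))  # 1800 MHz -> commonly attainable
```

So DDR4-4000 in 1:1 mode does imply a 2000 MHz fabric overclock, which (per the thread) the monolithic APUs tolerate better than the chiplet CPUs.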
please do a review about the 4650g as well. thanks :)
Thank you
APU FTW
One thing to remember: GDDR timings are really loose. For GPU performance, timings matter way less than raw transfer speed!
Curious if the secondary and tertiary timings were manually set during these tests, left at Auto to run wild or if they were monitored and if so what they were as a few of them can really have a large impact in speed.
ARE YOU SURE you bought the good GT 1030, the GDDR5 model, and not the HALF-AS-FAST DDR4 variant?
Your results are strange. I will get my 5600G on Monday, which I bought for 120€, then I will compare it to
Quadro 2000
GT 1030 GDDR5
As well as
2200G and 3400G
What's the result?
@@Pram6969 Can't test, because the 5600G has to be RMA'd; it doesn't boot or POST even on a BIOS-updated mainboard :(
So it will probably take 2 weeks before I have another 5600G at hand. Sorry.
This is THE definitive RAM video for the 5700G. Way better than Hardware Unboxed.
Due to slow RAM speeds, a Navi-architecture iGPU would be bandwidth-limited. So let's wait for DDR5.