When I first built my system with DDR4-3200, I forgot to make the changes in the BIOS. When I realized that and did it, the most noticeable difference was in general system operations, not in game speed.
DDR4 CL14 running 14-14-14-14-34. Cut the time to engage the connection and run it in quad channel. People want faster and faster RAM speeds, but AMD offered us QUAD channel on Threadripper three years ago. EPYC and Threadripper Pro run 8-channel RAM.
@@MJ-uk6lu At the Threadripper launch, no YouTube channel talked about quad channel doubling RAM transfer speeds. Again at the TR PRO launch, no YouTube channel talked about 8-channel RAM being 4x the speed of dual channel. I had to go to the Asus and AMD websites and then search the tech terms they used. For that matter, no one talked about AMD's Gen4 CPU lanes vs Intel's Gen3 lanes. Faster dual channel may help, but it kills the low latency. WRX80 is the new 'WINNER'.
One setup where I noticed an overall difference was going from 2666 to 3200 MHz on an i5-10400 on a Z490 board. The Z board was only $20 more than a decent B460. Everything just feels snappier, and certain games like Overwatch netted me another 8-10 FPS average. That's a big difference.
I'm just getting back into building computers since the switch from AGP to PCIe (just been getting stock laptops through work...bleh...) and really did not have the time to parse out 12+ years of technological progress. Your videos have been an amazing way to catch up on the times and figure out what is worth spending money on these days. Thank you!
11:52 About your final thoughts on server hardware being okay with "slower RAM": don't forget they are usually quad channel (or more), which vastly offsets the slower clocked RAM. The consumer desktop space, being only dual channel, has to rely on faster clocks / lower latencies. Server builds prioritise slower, more stable, lower-power configurations.
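As a back-of-the-envelope check on the channel-count point, theoretical peak bandwidth scales with channel count times transfer rate. This is a simplified sketch (real-world throughput is well below theoretical peak), but the ratio between the configurations holds:

```python
# Rough theoretical peak bandwidth: channels * MT/s * 8 bytes per transfer
# (each DDR4 channel is 64 bits wide). Real throughput is lower, but the
# ratio between configurations is what matters here.
def peak_bandwidth_gbs(channels: int, mt_per_s: int) -> float:
    """Theoretical peak in GB/s for a DDR4 setup."""
    return channels * mt_per_s * 8 / 1000

desktop_dual = peak_bandwidth_gbs(2, 3600)   # fast consumer kit
server_quad = peak_bandwidth_gbs(4, 2666)    # slower server DIMMs
print(f"Dual-channel DDR4-3600: {desktop_dual:.1f} GB/s")  # 57.6 GB/s
print(f"Quad-channel DDR4-2666: {server_quad:.1f} GB/s")   # 85.3 GB/s
```

So a server on "slow" DDR4-2666 in quad channel still has roughly 50% more aggregate bandwidth than a desktop running premium DDR4-3600 in dual channel, which is the commenter's point.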
Here are the differences between your regular PC gamers and the RAM cultists/gurus/junkies. PC gamer: buys almost the cheapest RAM he can get, because it doesn't improve his FPS enough to warrant the more expensive kits. RAM guru: spends more money on his RAM than on his CPU, hoping to get that one golden-bin Samsung B-die that can do CL15-15-15 at 4400 MHz. He would skip sex if he could shave just one more nanosecond off his RAM latency. He spends countless hours, weeks, and months living inside his PC's BIOS, tweaking every setting there is and leaving no stone unturned, hoping to find that one setting that will shave another 0.5 ns off his RAM latency. He remembers all his RAM's RTL/IOL numbers by heart, and his most used words are "please, please POST," begging and praying to his motherboard after putting in a new set of RTL/IOL numbers. RAM is not just about FPS: the kit you grab from the bargain-bin rack, even at really high speed, has timings way too loose to tell the difference. Super RAM like Samsung B-die is not just about speed but about tight, very low latency: something like CL16-16-16 at 4400 MHz, super-tuned to 35 ns latency or lower. I promise you, it will blow your mind!
Having your RAM in dual rank nets a much bigger difference in games on my 5600X, as covered by so many videos. I got 20+ more average FPS in the Shadow of the Tomb Raider bench going from a single-rank kit to a dual-rank one (tested with an RTX 3070, 1080p, highest preset + TAA).
Just running 4 sticks of single rank gives the same results as 2 sticks of dual rank. I tested 4 single-rank 8 GB DDR4-3200 CL16 sticks vs a 32 GB kit of 2 dual-rank DDR4-3200 CL16 sticks, and all memory and game tests were the same. ZERO difference.
This was an excellent comparison that will save people money. I especially liked hearing that buying more RAM would be a more effective way to improve performance. When I do my next build, I will use this information. Thanks!
I've been a professional tester of CPUs, motherboards, memory, and GPUs for 20 years, working for a leading PC magazine, beginning in the early '90s. I was always told by vendors that the new generation of RAM would be the big game changer. It never was. There have been slight improvements between the generations (DRAM to SDRAM to DDR to DDR2 and so on), but never in a way that would let us recommend our readers upgrade because of memory speed. The differences within the same generation of RAM (e.g. DDR4) are even more insignificant. All this memory hype reminds me of HiFi enthusiasts who spend a fortune on voodoo cables. It only improves your system if you firmly believe in it.
There is some truth to cable quality for HiFi: pure copper over copper-clad aluminum or plain cheap aluminum wire, and lower gauge over higher gauge (12 gauge vs 14 gauge).
Great stuff! I have 64GB of 3600MHz and I've been running it at 2133 for ages as a 3D artist. Thought I'd try running it at 3200 and 3600 and saw no difference in my workflow except sometimes I would BSoD during meetings or while working 👌 Back to 2133 for me
I'd love a timing-tuning video for older systems like DDR3. Often on locked-down or OEM boards it's the only thing you can tune, as even BCLK is locked (looking at you, Dell and HP). And maybe RAM speed/timings' effect on APUs, to cater to the budget and international crowd. :)
Nice. Gonna upgrade from a 2600X to a 5600X sometime next year and was thinking about a RAM upgrade from my 16 GB 3200/CL16 RAM to 16 GB of 3600 RAM to maybe push the new Ryzen. Well, when I see these "dramatic" differences, I'm better off saving my money and just keeping the existing RAM, which works like a charm anyway. :)
I had heard that AMD 5000 chips relied heavily on RAM speed to achieve optimal performance, but I guess that was only applicable to the older generations when upgrading from speeds below 3000 MHz. Thank you for this video - it cleared things up for me. I suppose I mostly wasted my money on these 3600 MHz CL16 RAM sticks I bought, but I guess it doesn't hurt to have them, though. Now, I'll probably just run them at 3200 MHz instead of 3600 for the sake of stability when I finish my build.
About the stability bit, it's true. I cannot run my "expensive" XLR8 3600 MHz kit at 3600; Windows keeps bugging out, and I had to change it to 3200 MHz. Don't know if this overpriced RAM will ever stabilize.
@@luckydepressedguy8981 The sweet spot for AMD, even though the recommendation is 3600 MHz, is actually 3200 MHz. I didn't even bother wasting the extra 100 bucks on the 3600 MHz version of my RAM sticks.
Yea, I've done this before, but people need to see it again. I may do a follow up using Zen 2 on a B550 just because. Or not... maybe Intel? It doesn't really matter however.
The CPU's caching algorithms, branch prediction, speculation and so on are what produce this kind of result. A CPU core is effectively unaware of the RAM's existence. All it needs is for the specific data block to be available in its local cache (L1 data and instruction caches); it takes around ~1 nanosecond for the core to fetch data from there. If a core can't find the data in its L1 cache, it waits for the L2 cache to send the data over to L1; between the data arriving from L2 and then accessing it, the total is around ~5 nanoseconds. If the data block is not in L2 either, the core has to wait for it to come from L3 to L2 to L1 and then access it (around 10-15 nanoseconds). And if the block still isn't found even in L3, then... oh boy, this is the longest wait of all, because the data now has to be fetched from "thousands of miles away": the RAM, which takes about 40-80 nanoseconds. This is where all that "RAM latency" stuff starts to matter.

Fun fact: for modern CPUs, the "cache hit rate" (the rate at which a core finds the data block it needs in its L1 cache) is said to be near ~95%. That means for 95% of your application's total running time, a core finds all the data it needs in its local cache (L1), and only 5% of the time does it have to wait for data to arrive from the other memories (L2, L3, L4, and RAM). Now consider a theoretical CPU that always finds its data in cache, i.e. a 100% hit rate: RAM latency wouldn't matter at all anymore. But reality isn't like that.
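The hierarchy described above can be rolled into the classic average memory access time (AMAT) calculation. The latencies and hit rates below are the comment's rough, illustrative figures, not measured values, but they show why shaving a few nanoseconds off RAM barely moves the average:

```python
# Average Memory Access Time sketch. All latencies (ns) and hit rates are
# illustrative assumptions taken loosely from the comment above.
def amat(l1=1.0, l2=5.0, l3=15.0, ram=60.0,
         h1=0.95, h2=0.80, h3=0.80):
    """AMAT = L1 + miss1*(L2 + miss2*(L3 + miss3*RAM)); miss penalties nest."""
    return l1 + (1 - h1) * (l2 + (1 - h2) * (l3 + (1 - h3) * ram))

base = amat(ram=60.0)  # RAM at ~60 ns
fast = amat(ram=50.0)  # RAM 10 ns faster
print(f"{base:.2f} ns vs {fast:.2f} ns")  # 1.52 ns vs 1.50 ns
```

Cutting RAM latency by a full 10 ns changes the average access time by about 0.02 ns in this model, because the caches absorb the overwhelming majority of accesses first.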
I think the reason you don't see AMD Ryzen CPUs suffer real-world performance loss, despite having the highest memory latency, is not that their architecture has efficient caching algorithms; it's that these CPUs have way, way more cache than any Intel CPU, which probably keeps the cache hit rate higher most of the time. Intel, on the other hand, doesn't have the advantage of a large cache (for now), so Intel has to rely on superior caching algorithms and branch prediction; that's why, despite being low on cache memory, they are still competitive. But in the end, as data blocks keep getting bigger (future, even heavier AAA games), superior caching mechanisms alone won't be enough and you will eventually need more cache memory as well, which is where AMD has a better chance in the future competition than Intel. At least, that's what my current speculation says.
Beautiful test bench. One of my projects is to turn an old case into an open air test bench. I'll send a link to my Twitter feed when I can show something.
This guy's got no idea. Higher-speed RAM kits can be clocked down with the timings tightened. I bought 4 sticks of 8 GB DDR4-4000, then tightened them down to CL14-14-14-28 at 3600 or CL16-16-16-32 at 3800, depending on the games I want to play or the work I'm doing. Running on a 5950X, I got around a 15%-20% boost in most games.
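The trade-off being described (lower frequency, tighter timings) can be compared via first-word CAS latency in nanoseconds: CL cycles divided by the memory clock, which is half the MT/s rate for DDR. The 3200 CL16 baseline below is an illustrative common kit, not one from the comment:

```python
# First-word (CAS) latency in ns. The memory clock in MHz is half the MT/s
# transfer rate, since DDR transfers twice per clock cycle.
def cas_latency_ns(cl: int, mt_per_s: int) -> float:
    return cl / (mt_per_s / 2) * 1000

print(f"{cas_latency_ns(14, 3600):.2f} ns")  # 7.78 ns, tuned CL14 @ 3600
print(f"{cas_latency_ns(16, 3800):.2f} ns")  # 8.42 ns, tuned CL16 @ 3800
print(f"{cas_latency_ns(16, 3200):.2f} ns")  # 10.00 ns, common 3200 CL16 kit
```

By this measure the tuned profiles genuinely are lower latency than a typical 3200 CL16 kit, which is consistent with the claim that tightened timings, not raw frequency alone, are where the gains come from.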
This shows there's no need to overclock my RAM for gaming: higher voltage, barely any difference in performance. Instead, I reduced the clocks on my stock 3200 MHz kit without changing the voltage ^^ Thanks for this video, it helped me pick a sensible approach ♥
Really glad you made this video! This is what we need to shut up those who think they know and really don't. I am one of the surprised ones, because there's always someone posting that you need FASTER RAM. Yet personally I never saw why. So I'm really glad that at least some testing was done to show that more RAM speed, for the most part, is mostly hype and mental comfort.
Why do people pay more for faster RAM? Because it makes a (small) difference in a proper system that does not have some issues. Server farms are an entirely different matter; inadequate analogy.
Good advice. Just ordered a 2x8 GB DDR4-3600 CL16 kit to replace the 2400 sticks in my rig. The cost finally came down enough that I thought I'd try it; 3600 wasn't in the budget when I built it. Would have liked to see some comparisons against 2400 MHz RAM too.
RAM speed, primary, secondary, and tertiary timings don't amount to a hill of beans for most users in most applications. Almost all the time, something else is affecting performance to the point where RAM speed is inconsequential. Using mildly slower RAM will go unnoticed in real-world use but will have a noticeable effect on your real-world wallet.
When I first stumbled upon your channel, I thought to myself, "that guy over there talks like he wants to sell me a fridge or something." After watching some of your videos, I came to the conclusion that it's one of the best channels for an average Joe like me, who doesn't have much money but still likes to enjoy things and wants to get the best he can for the buck. I'm now a big fan of yours. So please, good sir, keep selling me all the fridges you have :D!
Yeah, got myself a 64 GB 3200 MHz RAM kit for productivity. I'll probably check whether DOCP on/off makes any difference for my kind of workloads, but in any case it's pretty much academic: my PC is already anywhere from 2x-5x faster (depending on the task) than my colleagues', so I doubt adding maybe 5% will make a difference even if it does materialise. Productivity is not that different from gaming in that respect: if it's not a whole tier faster, it doesn't matter. The question isn't whether something runs in 2 or 3 seconds; the question is usually whether it runs in 30-150 seconds, or long enough for me to get up and get a coffee in the meantime.
Wish I saw this 2 weeks ago. Had 32GB(2x16) 3600CL18 in my PC. Decided I'd need at least 64 for my video work after seeing the 32 fill up pretty quick on small projects. Figured I may as well go all out and bought 64GB(2x32) 3600 CL16 instead. Zero difference between CL 18 and 16. Aside from the price.
Sir, you're the only one I can actually spend 16 minutes of my life on a video for. BUT IT IS WORTH IT in the end, unlike others. I can actually listen to the way you talk; I literally can't focus with others.
I am not shocked, I was expecting maybe a bit more difference, but even the most convincing videos on RAM speed showed fairly minimal differences after jumping through quite a number of hoops and using the most convincing benchmarks and tests. That being said, I just obtained 32GB of 3600 CL16 for my next build (was running 16GB of 3200 CL16), mainly cuz it was only about 10 or 15usd more expensive than the 3200 RAM, maybe it'll be utilized better in the future? dunno, but I got it anyway
I did a little research myself before my second-to-last build and found the same thing. Even with some newer RAM you sacrifice performance. Glad I knew this; a lot of people don't. I chose 32 GB of Trident Z 3200 and it works great for everything I do without a skip!!! Great video!!!
You, my friend... come here... let me give you a big hug... After going through so many vids looking for an actual side-by-side comparison of 3200 vs 3600... I don't care if CL16 vs CL18 is not that much of a difference... and I'm sure some others are doing the same, but I couldn't find one but you... thank you... Don't show me 3200 vs 3600 where the CL is different; keep everything on the same playing field except the MHz... Thank you, sir... and what's up with the peanut butter?
Always depends on the game. In my case the higher ram speed brings me more minimum fps in World of Warcraft and in raid this is very important. In terms of price, there isn't that big a difference. If you then make the effort and adjust the timings and sub-timings, you can get good performance out as long as you are within the CPU limit.
@Tech Deals I've noticed higher 0.1% lows going from 3200 XMP to 3600 with tuned timings on Zen 2. Zen 3 is less dependent on RAM speed because of the architectural differences.
Having watched a shit ton of RAM videos now, THIS IS BY FAR the most helpful!!! Since you are using the Trident Z Neo to begin with, the gold standard for Ryzen CPUs (I am boldly stating this), it really shows what little difference it can make, and that for 99% of people it won't make a difference.

There is one question, however, and this is a big one that is tough to track. The Trident Z Neo are said to come with different chips: some with the "lesser" Hynix, others, usually the pricier, faster ones, with B-die. Plus, some are dual rank, others single; from what I read, it's hard to find the truth. Now, in this test you filled all the slots, emulating dual rank. Generalizing the test, though, it would be great to see whether there is a difference between dual rank and single rank with only two slots (since the SFF market is growing, and some choose two sticks over four to leave headroom for upgrades later on...), and whether one can find a difference between the Hynix and the B-die within the Trident Z Neo family. To clarify, the Trident Z Neo are some of the very few sticks out there specifically designed for AMD platforms, or so they claim... that's why it would be very interesting to see an even more in-depth test of these sticks on AMD platforms.
I recently upgraded my old rig to a Ryzen 7 5800X a few weeks ago, and picked up a dual-channel 16 GB G.Skill Ripjaws 3600 MHz CL16 kit, which was only $17 more than the 3200 MHz, thinking the 3600 would perform better, plus an MSI B550 MPG Gaming Carbon WiFi board and a WD SN850 M.2 (7000 MB/s read). And, for now, an XFX RX 580 8GB.
You are testing how FPS in games responds to higher RAM speed. What might be a better investigation is rendering speed at the different RAM speeds. I just bought a Windows 10 Pro key and it worked fine!
Thank you for this, I thought I was about to be told I bought the wrong RAM. I'm only interested in running stock, no overclocking. I kept seeing people say that Ryzen likes faster ram, but figured since the motherboard and CPU both say 3200, that's what I should buy. I think I've also read that faster RAM would be throttled, but I wasn't sure. Glad to see it wouldn't have made much of a difference either way.
Got a new MB, new CPU, and new RAM for a pair of systems. Got some very solid 3200 Corsair dirt cheap on one system and 3600 when the price difference was pretty small. Honestly don't regret either purchase.
Essentially, you might gain 2-5 FPS on a good day going from 3200 to 3600 xD. But with AMD, if you want your memory to run 1:1 with the Infinity Fabric, 3600 MHz will do that, while anything faster gives no further gains and anything slower might actually hurt you ever so slightly. With Intel, you can just get the slower RAM regardless lol
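The 1:1 ratio mentioned here refers to the memory clock (half the MT/s rate) matching AMD's Infinity Fabric clock (FCLK). A minimal sketch, assuming a typical ~1800 MHz FCLK ceiling (this varies per chip sample):

```python
# Zen 2/3 run best when the memory clock (MT/s / 2) matches the Infinity
# Fabric clock (FCLK) 1:1. Beyond the FCLK ceiling, roughly 1800-1900 MHz on
# most samples (an assumption, it varies per chip), the board falls back to
# a 2:1 divider and latency rises.
def fabric_mode(ram_mt_s: int, fclk_ceiling_mhz: int = 1800) -> str:
    memclk = ram_mt_s // 2
    return "1:1 (coupled)" if memclk <= fclk_ceiling_mhz else "2:1 (decoupled)"

for speed in (3200, 3600, 4000):
    print(f"DDR4-{speed}: {fabric_mode(speed)}")
# DDR4-3200: 1:1, DDR4-3600: 1:1, DDR4-4000: 2:1 on a typical sample
```

That is why DDR4-3600 is often quoted as the practical ceiling for Ryzen: it is the fastest common speed that still keeps the fabric coupled 1:1 on most chips.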
The server comparison doesn't make sense: servers are all about efficiency, while gamers often just care about raw speed. Raw speed matters less in servers, where heat matters much more.
@@lost_places_global9008 I gave away my PC because I rarely used it for gaming. It had an i7-2600 (non-K), so it was good enough for me, but it started to show its age. I'll build a new system when the market normalizes. It doesn't need to be beefy or anything; it just needs to be there if I feel the urge to play a game.
@@peugeot908hdifab Yeah, I bought 3600 MHz CL16, since I have an i5-9600K and an RTX 2060 Gaming Pro on a motherboard that supports up to 4400 MHz RAM.
Okay, do the same but in 4K. The CPU bottlenecks the GPU at 1080p, so instead run the same tests from this video at 4K and see if you get more performance, knowing that your CPU is not bottlenecking the RTX 3080.
It's not a bit of a stretch, it's nonsense. Also, having unexplained drops in performance due to faster RAM essentially invalidates all results (for the purposes of extrapolating them to any other system).
I have 16 GB of 3200 MHz C16 RAM and am debating whether I should upgrade to 32 GB (2x16) of 3600 MHz C16. I know my motherboard is dual channel, so I would only want 2 sticks. I just upgraded from a 3600X to a 5700X, and have an X570 motherboard. I am a multi-tasker: sometimes training in RuneScape (3) and playing COD at the same time, while also having Opera GX and Discord open and voice chatting in there. I like to make sure I'm getting good use of my money before upgrading, and it seems like a good idea to me, but I would love your thoughts! The same kit you have in the video is $96 right now on Amazon, which is pretty tempting! Great video as well, as always, and happy new year!
So what I don't understand is: why do many tech channels and people online say that for Ryzen you should get 3600 instead of 3200? I always see comparisons where the difference is like 2% or so, but the price difference is about 10-20% more.
@@madmax2069 Hi Max. Go back to the launch of the TR 1900X. An 8-core Ryzen vs an 8-core Threadripper with quad-channel memory was a perfect opportunity to show 4 vs 2 channels. YouTube never did this; only the fact that a 16-core 1950X could be installed got attention. That X370 boards had one x16 slot while TR had x16/x8/x16/x8 was ignored too. Within two years AMD dropped the 8/12/16-core TR CPUs. 2020: oh hell, a quad M.2 PCIe card won't work on an X570 board. Oh hell, one M.2 is on CPU lanes and one M.2 is on shared chipset lanes. Oh hell, a new M.2 Gen4 drive runs at 3500/3500 on Intel and 7000/6600 on AMD. And when did you ever see 4x8 GB quad-channel 3200C14 vs dual-channel 3200C14 on YouTube? It's just crazy that servers run 8-channel memory.
From my understanding, RAM speed helps in scenarios where a lack of GPU memory forces the system to move assets from system RAM to GPU VRAM, but that is a very specific case, and you could just save the money and use it to get a better GPU, or at least a version with more memory.
And the peanut butter wasn't mentioned once; I was curious whether it's nutty or smooth. This video, and most that you upload, are right up my alley, mainly because I'm about to upgrade and I find myself on this channel more often than not. Cheers, man.
Thanks for this video! I am about to purchase a 5600X and was wondering about the RAM speeds and how they would fare, now I know I can just get what is cheaper.
Even today, many or most of us still have the same systems and are looking at upgrades like GPUs or CPUs. Good to know that I do not need faster RAM over my current 3200.
Byte Size Techs:
Dual vs Quad Rank RAM - 2 DIMMs vs 4 DIMMs - Which Is Better? - th-cam.com/video/GThdz0kPBms/w-d-xo.html
Is It OK to Mix RAM Kits? - th-cam.com/video/qxxrnsUXcko/w-d-xo.html
Does 32GB of Ram REALLY Make a Difference in Your PC ??? - th-cam.com/video/JupBvOfoHjA/w-d-xo.html
Is 16 GB of RAM Enough for Gaming in 2020? - th-cam.com/video/WimRwGv0XNA/w-d-xo.html
I have not run into a game yet that uses more than 16 GB of RAM, but if you're doing things like video editing, then getting 32 GB might make a difference. The price increase from 2x8 GB to 2x16 GB isn't too much, so you might as well get 32 GB even if you're not going to use it. Maybe a future game will use it?
AMD/DRC Gamer R9 ram and Samsung B-die on a quad channel Threadripper run mixed kits without a hitch.
I wonder if that changes at the higher end of the cpus. Like 5900x and 5950x
@@VadimZverev XMP 2 would probably be better. Plain XMP runs all kinds of voltages you don't want; motherboard vendors love to crank the voltage.
Any techs out there? Will my newer-revision (5.23 or 4.23 rev, if it matters) 3000 MHz quad-channel Corsair Dominator Platinums (all from the same package) work with my X99 platform and its Xeon 2679 v4, on my ASUS E-WS USB 3.1 mobo? They are 1.35 V; I know they say 1.2 V is better, so will they just underclock to the Xeon's highest rating, as in 2400 or 2666 MHz, give or take? Also, if so, can I lower the voltage to the standard X99 Broadwell 1.2 V for RAM, or even drop the 1.35 rating to 1.25 or anything lower if it's going to run slower anyway? Or might it not run at all? That would be terrible, as it was over $200 used after tax... barely used, but used... eBay. :)
"RAM speed is only going to improve performance when RAM speed is the limit to performance." Well said. Thanks for this video. 👍
Yes, but where do the worse(!!!) results come from? Something is obviously not right with these tests.
@@coolcat23 The test is garbage. It puts comparable speeds (3200 vs 3600) against each other in heavily GPU-bound titles. It should compare 3600 to regular 2666 and put them against each other in CPU-heavy AAA titles, like Cyberpunk or RDR2.
@@steelin666 Not sure about those titles. More like late game Stellaris, Civilization etc.
@@DFDark2 Those games are hard on the CPU, but not so much on the GPU. What is ideally needed for establishing RAM utilization is a game that puts the GPU to full work while loading up a lot of graphics data. Huge 3D open-world games are perfect for that.
Unless you want to measure something like turn time in Civ6 but that's a different, very game-specific test, while most gamers are mostly interested in FPS gains.
Either way, RAM frequency can give significant performance gains in many games. This test just happened to mostly benchmark games that aren't as responsive to RAM speed as some others.
@@steelin666 oh god. another "ddr speed matter" enthusiast despite overflowing proof on internet. smh.
Tests like this are all I need from a tech channel.
The test is inherently flawed.
The 1% lows being greatly better because of better RAM is not a strange result, and it's far more prevalent on weaker CPUs, when you have the slightest bottleneck. I'm running a 2600 with a 1080, and since I run games at optimised, minimum settings, I see the difference between 3200 B-die at stock and 3200 overclocked. It's not only big; the minimums are sometimes 2x.
My FPS in heavy titles is incredibly consistent.
RAM speed does not limit performance. The CPU might limit performance, and you raise that ceiling with faster RAM; and by faster RAM I don't mean frequency only, but tight timings as well.
Me the whole time: What's that peanut butter doing there?😂
The strangest thing someone has ever bought through his referral link.
Gives an extra 30 FPS at 4K with ray tracing at ultra.
@@no.no.4680 time for a challenge
What are you all talking about! That’s the new CPU Thermal Paste he got there... it’s just peanut butter coloured! That’s why tech’s benchmarks are so crispy and tasty!
He's going to overclock it in the next video 🤣
I'll take full system stability over a 5%-10% performance increase that I'll never notice when actually using my computer. Run 32GB (4x8GB) at 3200 CL16 and have zero complaints.
That's the full honest truth right there!
That's what I did. I bought 16 GB of 3600 CL18 and there was no real-world difference. In fact, in benchmarks the 3600 ran slower.
You're not even talking about 5%. In some tests on Ryzen you get slightly higher FPS with 3200, in others with 3600, but it's never more than 1.5%; it's supposedly a little more on Intel, but I doubt it's more than 2%. The next AM5 socket motherboards may show a more dramatic difference, but at present the sweet spot is somewhere between the two (3466, apparently). There's no real difference, though, so the cheaper, more stable 3200 is the better option.
Does that explain my games crashing and errors in Memtest when XMP is enabled at 3600 MHz? Would be nice if the RAM worked as advertised lol
@@paulm2467 I'm really glad you said this. I'm building my first PC and went with 3200 because it seemed like it would be the most stable instead of trying to push for 3600. Makes me feel justified XD
To be clear: RAM speed does make a difference, just not big enough to be worth it.
Actually, tuning the subtimings gets you the biggest improvement in performance or fps in gaming, way more than just setting the frequency higher. It's a little time-consuming, but worth it in the end.
@@sorinpopa1442 I remember seeing a video showing the fps difference, and going from the lowest DDR4 RAM to the highest was only a 5-10fps increase 🤷🏻♂️
@@mikeramos91 Depends on the game too. In some you only get a 2-3 fps difference at max; in a best-case scenario up to 15 or more, especially if you play at lower res, 1080p or so. Not huge, yeah, but tuning your RAM subtimings is basically free performance.
@@sorinpopa1442 I increased the MHz one step at a time until the computer wouldn't boot. Clearing the CMOS via the jumper didn't fix it; removing and putting back the battery didn't either. I had to remove a stick of RAM, and I assume it then reworked its way through the process of recognizing it again. Later I put the other stick back, set XMP, and everything works the same as before I tried that stunt. It is a 3200 MHz kit; stepping it up to 3333 seemed stable. Now I'm squeamish about trying the timings. Default is CL16.
@@jasonlisonbee Very painful process. Glad I have a Gigabyte motherboard which goes back to default settings whenever a RAM OC is invalid. It took me a while to OC 3000MHz RAM to 3600MHz on a first-gen Ryzen. Fun times.
I love this channel since it's so straightforward and blunt.
More channels need to talk about the fact that you'll only see some differences on a chart, about not buying what you don't need, etc. Great stuff!
I appreciate that he takes a more pro-consumer approach where he talks about more common use cases instead of just talking over benchmarks.
Absolutely. It's such a grounded channel all in all.
The testing was "blunt" indeed. Completely unreliable. Getting worse(!!!) performance from faster RAM is not explained by stating that often RAM is not the bottleneck in a system. Rely on these results at your own peril.
Used to be so informative until the talk show with his wife who doesn't contribute anything. I unsubbed. Let's see if it's back to being informative and not gibberish for an hour.
You unsubbed, so... why are you still here lol? You said it yourself you don't want to watch this, and yet you're still here. Just to complain?
Short answer: save your money and buy the slower RAM, but spend more money to buy more RAM.
you can always download ram
@@tsuna111 dude do you think everyone can just download ram wifi is expensive
@@kyleforgeard4148 I downloaded RAM from a friend to my USB Flash drive stick, and then transferred it to my PC, it works.
Okay, buy lots of fast RAM. Check!
or just buy ram that has a cas latency lower or equal to 16
Got a 4400 CL19 kit trimmed down to 4200 CL16, running since 2017. Keeps my 8700K competitive in games against 2020 chips.
please share aida64 memory bandwidth and latency score if possible. I do want to see how that looks like :D
I add the link to my channel description next to ram specs cause youtube won't let me share in the comments.
@@club4ghz okay thanks!
@@club4ghz saw it. No doubt that system will be relevant for a long time lol
I've got an 8700k, but my RAM is only DDR4 3000 lol
It just occurred to me that Tech Deals is probably going to write off that peanut butter as a business expense and I respect him for that.
When I first built my system with DDR4 3200 I forgot to make the changes in the BIOS. When I realized that and did it, the most noticeable difference was in system operations in general, not in game speed.
CAS latency/CL is probably a bigger factor than frequency on fast modern CPUs nowadays, but even then don't expect much for most things.
DDR4 CL14 running 14-14-14-14-34. It cuts the time to make the connection, and I run it in quad channel. People want faster and faster RAM speeds, but AMD offered us QUAD channel on the TR three years ago. EPYC and Threadripper Pro run 8-channel RAM.
@@maxhughes5687 I'd happily take quad channel over two faster modules anyday.
The secondary and tertiary timings make almost as much of a difference, if not more, than just the primary four timings.
@@maxhughes5687 People don't know what they want
@@MJ-uk6lu At the Threadripper launch, no YouTube channel talked about quad channel doubling RAM transfer speeds. Again at the TR PRO launch, no channel talked about 8-channel RAM being 4X the speed of dual channel. I had to go to the Asus and AMD websites and then search the tech terms they used. For that matter, no one talked about AMD's Gen4 CPU lanes vs Intel's Gen3 lanes. Faster dual channel may help, but it kills the low latency. WRX80 is the new 'WINNER'.
One setup where I noticed an overall difference was going from 2666 to 3200MHz on an i5 10400 on a Z490 board. The Z board was only $20 more than a decent B460. Everything just feels snappier, and certain games like Overwatch netted me another 8-10 FPS avg. That's a big difference.
I'm just getting back into building computers since the switch from AGP to PCIe (just been getting stock laptops through work...bleh...) and really did not have the time to parse out 12+ years of technological progress. Your videos have been an amazing way to catch up on the times and figure out what is worth spending money on these days. Thank you!
11:52 About your final thoughts on server hardware being okay with "slower RAM": don't forget they are usually quad channel (or more), which largely offsets the lower clocks. The consumer desktop space, being only dual channel, has to rely on faster clocks / lower latencies. Server builds prioritise slower, more stable, lower-power configurations.
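To put rough numbers on the channel trade-off, here's a back-of-the-envelope sketch (theoretical peaks only; sustained real-world bandwidth is lower, and the server speed grades here are illustrative):

```python
# Theoretical peak DDR4 bandwidth: transfer rate (MT/s) x 8 bytes per
# transfer (64-bit bus per channel) x number of channels.
def peak_bandwidth_gbps(mt_per_s: int, channels: int) -> float:
    return mt_per_s * 8 * channels / 1000  # GB/s (decimal)

desktop_dual = peak_bandwidth_gbps(3600, channels=2)   # fast desktop kit
server_quad  = peak_bandwidth_gbps(2666, channels=4)   # slower server RAM
server_octa  = peak_bandwidth_gbps(2666, channels=8)   # EPYC / TR Pro

print(f"DDR4-3600 dual channel:  {desktop_dual:.1f} GB/s")  # 57.6 GB/s
print(f"DDR4-2666 quad channel:  {server_quad:.1f} GB/s")   # 85.3 GB/s
print(f"DDR4-2666 eight channel: {server_octa:.1f} GB/s")   # 170.6 GB/s
```

Four channels of slower DDR4-2666 still comfortably out-transfer two channels of DDR4-3600, which is why servers can afford the slower, more stable sticks.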
Here are the differences between your regular PC gamers and the RAM cultists/gurus/junkies.
PC gamer: buys almost the cheapest RAM he can get, because it doesn't improve his fps enough to warrant the more expensive kits.
RAM guru: spends more money on his RAM than his CPU, hoping to get that one golden-bin Samsung B-die that can do CL 15-15-15 at 4400 MHz.
He would skip sex if he could shave just one more nanosecond off his RAM latency.
He would spend countless hours, weeks, and months living inside his PC's BIOS, tweaking every setting there is and leaving no stone unturned, if he could just find that one setting that would shave another 0.5 nanoseconds off his RAM latency.
He remembers all his RAM's RTL/IOL numbers by heart, and his most-used words are "please, please POST", begging and praying to his motherboard after putting in a new set of RTL/IOL numbers.
RAM isn't just about fps; even at really high speed, the kit you buy from the bargain-bin rack has timings way too loose to tell the difference. Super RAM like
Samsung B-die isn't just about speed but tight, very low latency, something like CL 16-16-16 at 4400MHz, super-tuned to 35 nanoseconds of latency or lower. I promise you, it will blow your mind!
@@forthosewhodare7325 you sound like the latter
Having your RAM in dual rank nets a much bigger difference in games on my 5600X as covered by so many videos. I got 20+ more average FPS in Shadow of the Tomb Raider bench going from a single rank kit to a dual one (tested with an RTX 3070, 1080p highest preset + TAA)
You can also use 4 single-rank sticks for the same effect.
Just running 4 sticks of single rank gives the same results as 2 sticks of dual rank. I tested 4 single-rank 8GB DDR4 3200 CL16 sticks vs 2 dual-rank DDR4 3200 CL16 sticks (32GB), and all memory and game tests were the same. ZERO difference.
This was an excellent comparison that will provide cost benefits. I especially liked hearing that it would be more effective to buy more ram to improve performance. When I do my next build I will use this information. Thanks
I've been a professional tester of CPUs, motherboards, memory and GPUs for 20 years, working for a leading PC magazine, beginning in the early '90s. I was always told by vendors that the new generation of RAM would be the big game changer. It never was. There have been slight improvements between the generations (DRAM to SDRAM to DDR to DDR2 and so on), but never in a way that let us recommend our readers upgrade because of memory speed. The differences within the same generation of RAM (e.g. DDR4) are even more insignificant. All this memory hype reminds me of HiFi enthusiasts who spend a fortune on voodoo cables. It only improves your system if you firmly believe in it.
There is some truth to cable quality for HiFi: pure copper over copper-clad aluminum or cheap plain aluminum wire, and a thicker, lower gauge over a higher one (12 gauge vs 14 gauge).
The peanut butter is a secret advertisement for all Tech Deals fans, just like how food product placement works in movies.
Finally a straightforward no BS explanation, thanks man.
Glad it helped!
Great stuff! I have 64GB of 3600MHz and I've been running it at 2133 for ages as a 3D artist. Thought I'd try running it at 3200 and 3600 and saw no difference in my workflow except sometimes I would BSoD during meetings or while working 👌 Back to 2133 for me
I'd love a timing-tuning video for older systems like DDR3. Often on locked-down OEM boards it's the only thing you can tune, as even BCLK is locked (looking at you, Dell and HP).
And maybe RAM speed/timings' effect on APUs, to cater to the budget and international crowd. :)
Dell systems with ECC are going to be locked down for stability, not for kids to tweak. They have to be built like tanks and never go down.
This dude's comedic timing is EFFORTLESS! 🤣 I'm subscribing right now!
nice. Gonna upgrade from a 2600X to 5600X sometime next year and was thinking about a RAM upgrade from my 16GB 3200/CL16 RAM to 16GB 3600 RAM to maybe push the new Ryzen. Well, when I see these "dramatic" differences I'm better off saving my money and just keep on using the existing ram, which works like a charm anyway. :)
I had heard that AMD 5000 chips relied heavily on RAM speed to achieve optimal performance, but I guess that was only applicable to the older generations when upgrading from speeds below 3000 MHz. Thank you for this video - it cleared things up for me. I suppose I mostly wasted my money on these 3600 MHz CL16 RAM sticks I bought, but I guess it doesn't hurt to have them, though. Now, I'll probably just run them at 3200 MHz instead of 3600 for the sake of stability when I finish my build.
Yeah that’s good news
About the stability bit, it's true. I cannot run my "expensive" XLR8 3600 MHz kit at 3600; Windows kept bugging out and I had to drop it to 3200 MHz. Don't know if I'll ever stabilize these overpriced sticks.
@@luckydepressedguy8981 Were you able to stabilize them at 3600MHz?
@@luckydepressedguy8981 Even though 3600MHz is the recommendation, the sweet spot for AMD is actually 3200MHz. I didn't even bother wasting the extra 100 bucks on the 3600MHz version of my RAM sticks.
Nah, run them at 4133.
I've got my memory sticks running @ 3733 with low CL16 timings. THAT actually makes a bit of a difference.
What about tRCD, tRP and tRAS?
@@sabishiihito : I've got those cranked down quite low too, TYVM. 🙄
Your findings are consistent: I recall another vid of yours covering the same thing in the past, with the same results.😊
Yea, I've done this before, but people need to see it again. I may do a follow up using Zen 2 on a B550 just because. Or not... maybe Intel? It doesn't really matter however.
@@TechDeals agree that it does not. Thank you for doing one of these again.
I very much appreciated the splits you made so I could go to final thoughts. That deserves a thumbs up.
Is the ram compatible with the peanut butter? 😂
My gold star count from Tech is over 9000!
I see what you did there.
Look at that, AMD improved Ryzen that much that the 'speed' of RAM does not matter much any more. Wonderful!
The CPU's caching algorithm, branch prediction, speculation and so on are what produce this kind of result. A CPU core is unaware of the RAM's existence; all it needs is for the specific data block to be available in its local cache (the L1 data and instruction caches), from which a fetch takes around ~1 nanosecond. If a core can't find data in its L1 cache, it waits for the L2 cache to send the data over to L1; the total time for the data to arrive from L2 and be accessed is around ~5 nanoseconds. If that data block is not in L2 either, the core has to wait for it to come from L3 to L2 to L1 before accessing it (around 10-15 nanoseconds). And if the block still isn't found even in the L3 cache, then... oh boy, this is going to be the longest wait of all, because the block now has to be fetched from what might as well be thousands of miles away: the RAM, which takes about 40-80 nanoseconds. This is where all that "RAM latency" business starts to matter.
Fun fact: for modern CPUs, the "cache hit rate" (the rate at which a core finds the data block it needs in its L1 cache) is said to be near ~95%. That means for 95% of your application's running time, a CPU core finds all the data it needs in its local L1 cache, and only 5% of the time does it have to wait for data to arrive from the other memories (L2, L3, L4 and RAM).
Now consider a theoretical CPU that always finds data in its cache, i.e. a 100% cache hit rate. RAM latency then wouldn't matter at all. But reality isn't like that.
I think the reason you don't see AMD Ryzen CPUs suffer a real-world performance loss, despite having the highest memory latency, is not that their architecture has especially efficient caching algorithms; it's that these CPUs have far more cache than any Intel CPU, which probably keeps the hit rate higher most of the time. Intel, on the other hand, doesn't have the advantage of a large cache (for now), so it has to rely on superior caching algorithms and branch prediction, which is why it stays competitive despite having less cache. But in the end, as data blocks keep getting heavier (future AAA games), superior caching mechanisms alone won't be enough and you'll eventually need more cache memory as well, which is where AMD has the better chances in the future competition. At least my speculative theory says so.
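The waiting-time argument above can be turned into a simple average memory access time (AMAT) sketch. The hit rates and latencies below are the rough illustrative figures from this comment, not measured values:

```python
# Average memory access time (AMAT), using the rough latencies quoted above.
# Each level is only consulted when every level before it missed.
def amat(levels):
    """levels: list of (hit_rate_at_this_level, latency_ns); the last
    level must catch everything that reaches it (hit rate 1.0)."""
    total, p_reach = 0.0, 1.0
    for hit_rate, latency_ns in levels:
        total += p_reach * hit_rate * latency_ns
        p_reach *= (1.0 - hit_rate)
    return total

# ~95% L1 hit rate; misses fall through to L2, L3, then RAM.
fast_ram = amat([(0.95, 1), (0.8, 5), (0.8, 15), (1.0, 40)])
slow_ram = amat([(0.95, 1), (0.8, 5), (0.8, 15), (1.0, 80)])

print(f"AMAT with 40 ns RAM: {fast_ram:.2f} ns")
print(f"AMAT with 80 ns RAM: {slow_ram:.2f} ns")
```

With a ~95% L1 hit rate, doubling RAM latency from 40 ns to 80 ns only moves the average from roughly 1.35 ns to 1.43 ns in this toy model, which is why cache hit rate dominates RAM speed.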
@@deletevil Sure makes those multiple 8-core dies on a Threadripper look good.
Beautiful test bench.
One of my projects is to turn an old case into an open air test bench.
I'll send a link to my Twitter feed when I can show something.
This guy's got no idea. Higher-speed RAM kits can be clocked down and their timings tightened. I bought 4 sticks of 8GB DDR4 4000, then lowered the timings to CL 14-14-14-28 at 3600 or CL 16-16-16-32 at 3800, depending on the games I wish to play or work on. Running on a 5950X, I got around a 15%-20% boost in most games.
Thank you sir
This shows there's no need to overclock my RAM for gaming: higher voltage, hardly any difference in performance. Instead I reduced the clocks on my stock 3200MHz kit without changing the voltage ^^ Thanks for this video, it helped me pick a good approach ♥
Really glad you made this video! This is what we need to shut up those who think they know and really don't. I am one of those surprised, because there's always someone posting that you need FASTER RAM, yet personally I never saw why. So I'm really glad that at least some testing has been done to show that more RAM speed is, for the most part, hype and mental comforting.
You're so great for providing this video so fast after the launch!!! thanks!!!!
I’m so glad I watched this video before spending a crap ton of money on high speed ram for my Ryzen 7 5800X3D.
Very eye-opening, and it begs the question: why would you ever pay an extra $30 for 400MHz? Good stuff.
People like bigger numbers :)
Why do people pay more for faster RAM? Because it makes a (small) difference in a proper system that does not have some issues. Server farms are an entirely different matter; inadequate analogy.
Good video. You just added some clarity to the big question "does RAM speed matter for the new 3rd-gen Ryzen", and I saved $300-$400.
Extremely helpful and very surprising results! Thank you!
Once again you have answered the question on my mind. Thank you, and thanks to Jiff for sponsoring this TH-cam video.
Bought a 3600MHz 8GBx4 kit yesterday, upgrading from a 2400MHz 8GBx2 kit. Funny how you upload the answers to whatever's on my mind on any given day, lol
Good advice. Just ordered a kit of 2x8 DDR4 3600 CL16 to replace the 2400 sticks in my rig. Cost finally came down enough that I thought I'd try it. 3600 wasn't in the budget when I built it. Would have liked to see some comparisons between 2400 MHz ram.
What was your cpu?
RAM speed, primary, secondary, and tertiary timings don't amount to a hill of beans for most users in most applications. Almost all the time, something else is affecting performance to the point where RAM speed is inconsequential. Using mildly slower RAM will go unnoticed in real-world use but will have a noticeable effect on your real-world wallet.
That's true, but if you tell everyone the answer, they won't watch the video! :)
@@TechDeals They won't read my comment, so no worries.
When I first stumbled upon your channel I thought to myself, "that guy over there talks like he wants to sell me a fridge or something". After watching some of your videos I came to the conclusion that it's one of the best channels for an average Joe like me, who doesn't have many finances but still likes to enjoy things and wants to get the best he can for the buck.
I'm now a big fan of yours. So please good Sir, keep selling me all the fridges you have :D!
Yeah, got myself a 64GB 3200MHz RAM kit for productivity. I'll probably check whether DOCP on/off makes any difference for the kind of workloads I have, but in any case it's pretty much academic: my PC is already anywhere from 2x-5x faster (depending on the task) than my colleagues', so I doubt adding maybe 5% would be noticeable even if it does materialise. Productivity is not that different from gaming in that respect: if it's not a whole tier faster, it doesn't matter. The question isn't whether it runs for 2 or 3 seconds; the question is usually whether it runs for 30-150 seconds or long enough for me to get up and get a coffee in the meantime.
Faster memory only has a real effect with APUs... and there the difference is quite noticeable!
great vid Tech! i'm gonna keep my DDR4-3200mhz ram for a long while!
No one says whether it's dual or quad channel RAM. No one says at what latency.
@@maxhughes5687 Agreed, testing only higher RAM frequency is not a good test on Ryzen. The differences appear when you go dual rank and lower the latency.
Thanks!
Wish I saw this 2 weeks ago. Had 32GB(2x16) 3600CL18 in my PC. Decided I'd need at least 64 for my video work after seeing the 32 fill up pretty quick on small projects. Figured I may as well go all out and bought 64GB(2x32) 3600 CL16 instead. Zero difference between CL 18 and 16. Aside from the price.
Spreads peanut butter all over ram during first 15 seconds of the video, proceeds to watch the rest of the video 🫠
Do you think peanut butter can be a good replacement for thermal paste
the real question is, is that peanut butter a better thermal paste than mx-4?
Sir, you're the only one I can actually spend 16 minutes of my life on a video for. BUT IT IS WORTH IT in the end, unlike the others. I can actually listen to the way you talk; I literally can't focus with others.
I am not skipping ads for you, good sir.
I appreciate that
Thanks for reassuring me, I bought a 3200 kit with the intent of getting a 3600 kit. Woosh.
You regret it?
@@lost_places_global9008 only for a day :P
@@wiltak_ I bought 3600 cl16
Only 6€ more and definitely better than 3200mhz cl16
@@lost_places_global9008 oh yeah for sure, no point in me regretting it now though. I doubt I'd notice much difference.
@@wiltak_ yeah; some say I shouldn’t buy a 3600mhz cl16 for my intel i5-9600K
I don’t get their points tho.
I am not shocked, I was expecting maybe a bit more difference, but even the most convincing videos on RAM speed showed fairly minimal differences after jumping through quite a number of hoops and using the most convincing benchmarks and tests.
That being said, I just obtained 32GB of 3600 CL16 for my next build (was running 16GB of 3200 CL16), mainly cuz it was only about 10 or 15usd more expensive than the 3200 RAM, maybe it'll be utilized better in the future? dunno, but I got it anyway
The case for dropping a 5600X onto X370 and buying a 7700 XT or a 4060 keeps getting stronger. The TH-cam algorithm doing its magic.
I did a little research myself before my 2nd-to-last build and found the same thing. Even with some newer RAM you sacrifice performance. Glad I knew this; a lot of people don't. I chose 32 GB of Trident Z 3200 and it works great for everything I do without a skip!!! Great video!!!
I bet the real reason you have a jar of peanut butter in front of you is that it matches the colour scheme of the noctua fan 😁
So DDR5 is probably not even going to be recommended for gaming at launch, like PCIe Gen 4, huh
Not until it's optimized.
You are the 1st person I've seen compare fast RAM for productivity use.
Funny I just bought some 3600 memory for my B550 board. Guess I'm about to find out if my deal was a deal!
did it work?
U my friend... come here... let me give u a big hug... after goin through so many vids looking for an actual side-by-side comparison of 3200 vs 3600... idc if CL16 vs CL18 is not that much of a difference... I'm sure some others are doing the same but I couldn't find one besides you... thank u... don't show me 3200 vs 3600 where the CL is different... keep everything on the same playing field except the MHz... thank you sir... n wassup with the peanut butter
thanks for this! perfect timing i been looking for ram for my first build 👍🏽
kingston fury beast 2x16gb 3600MHZ on ryzen 5600x has been a nightmare, freezing all the time with or without OC
Always depends on the game. In my case the higher ram speed brings me more minimum fps in World of Warcraft and in raid this is very important.
In terms of price, there isn't that big a difference.
If you then make the effort and adjust the timings and sub-timings, you can get good performance out, as long as you are within the CPU limit.
@Tech Deals I've noticed higher 0.1% lows going from 3200 XMP to 3600 with tuned timings on Zen 2. Zen 3 is less dependent on RAM speed because of the architectural differences.
Great video as always.
Did you change the memory fabric multiplier for each RAM speed? 1600MHz vs 1800MHz?
Having watched a shit ton of RAM videos now, THIS IS BY FAR the most helpful!!! Since you are using the Trident Z Neo, the gold standard for Ryzen CPUs (I am boldly stating this), it really shows what little difference it can make and that for 99% of people it won't make a difference.
There is one question, however, and it's a big one that is tough to track down. The Trident Z Neo kits are said to use different chips: some the "lesser" Hynix, others (usually the pricier, faster ones) Samsung B-die. Also, from what I read, some are dual rank and others single; it's hard to find the truth. Now, in this test you of course filled all the slots, emulating dual rank. But generalizing the test, it would be great to see whether there is a difference between dual rank and single rank with only two slots populated (since the SFF market is growing, and some choose two sticks over four to leave headroom for upgrades later on), and whether one can find a difference between the Hynix and B-die versions within the Trident Z Neo family.
To clarify, the Trident Z Neo is one of the very few kits out there specifically designed for AMD platforms, or so they claim... that's why it would be very interesting to see an even more in-depth test of these sticks on AMD platforms.
Hell, I had no issue running 3600MHz ram on an a520 board with a 3200G.
I recently upgraded my old rig a few weeks ago to a Ryzen 7 5800x, and picked up G.skill ripjaws dual ch. 16GB 3600mhz CL16 which was only $17 more than the 3200mhz thinking the 3600 would perform better, and an MSI B550 MPG gaming carbon wifi board, WD M.2 SN850 7000mb/s read. And for now an XFX RX 580 8GB
Very good video. 3200 is the sweet spot. I'd like to see a video for 2666 Vs 3200 on intel.
You are testing how graphics fps scales with higher RAM speed. What might be a better investigation is rendering speed between the different RAM speeds.
There is a blender test in there. :)
Love the 21 likes on the video before it even premieres lol
Wonder what the peanut butter is for 🤔🤣🤣🤣🤣🤣🤣🤣🤣🤣
Thank you for this, I thought I was about to be told I bought the wrong RAM. I'm only interested in running stock, no overclocking. I kept seeing people say that Ryzen likes faster ram, but figured since the motherboard and CPU both say 3200, that's what I should buy. I think I've also read that faster RAM would be throttled, but I wasn't sure. Glad to see it wouldn't have made much of a difference either way.
Got a new MB, new CPU, and new RAM for a pair of systems. Got some very solid 3200 Corsair dirt cheap on one system and 3600 when the price difference was pretty small. Honestly don't regret either purchase.
Essentially, you might gain 2-5 FPS on a good day going from 3200 to 3600 xD. But with AMD, if you want your memory to run 1:1 with your Infinity Fabric, 3600MHz will do that; anything faster gives no further gains, and anything slower might actually hurt you ever so slightly. With Intel, you can just get the slower RAM regardless lol
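For reference, here's how the 1:1 ratio works out. DDR is double data rate, so the memory clock is half the DDR rating, and the ~1800 MHz FCLK ceiling used below is a typical figure, not a guarantee for every chip:

```python
# DDR4-XXXX names the transfer rate in MT/s; the actual memory clock (MCLK)
# is half that, because DDR transfers twice per clock. On Zen 2/3 you want
# the Infinity Fabric clock (FCLK) matching MCLK 1:1, and FCLK typically
# tops out around 1800-1900 MHz.
def mclk_for(ddr_rating: int) -> int:
    return ddr_rating // 2

for kit in (3200, 3600, 4000):
    mclk = mclk_for(kit)
    coupled = mclk <= 1800  # rough FCLK ceiling; varies by sample
    print(f"DDR4-{kit}: MCLK {mclk} MHz -> 1:1 FCLK {'OK' if coupled else 'unlikely'}")
```

This is why DDR4-3600 (MCLK 1800 MHz) is often quoted as the practical ceiling for Zen: beyond it the fabric usually has to drop to a 2:1 divider, which adds latency.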
What about at 1440p and 4K? There doesn't seem to be much of a difference.
The server comparison doesn't make sense: servers are all about efficiency, and gamers often just care about raw speed, which doesn't matter in servers, where heat matters much more.
Well, that's an underwhelming difference. 3200 it is! Thanks!
Is your build too old?
@@lost_places_global9008 I gave away my PC because I rarely used it for gaming. It had an i7 2600 (non-K), so it was good enough for me, but it started to show its age. I'll be building a new system when the market normalizes. It doesn't need to be beefy or anything, it just needs to be there if I feel the urge to play a game.
@@peugeot908hdifab Yeah I bought 3600mhz with cl16
Since I have a i5-9600k and a rtx 2060gaming pro on a motherboard that supports up to 4400mhz for RAM.
@@peugeot908hdifab The market will take about two years probably until it turns normal.. so prepare yourself.
Okay, now do the same but in 4K. The CPU bottlenecks the GPU at 1080p, so instead run the same tests from this video at 4K and see if you get more performance, knowing that your CPU is not bottlenecking the RTX 3080.
What about timings, CL14 vs CL16? Or maybe 3600 CL16 vs 3200 CL14?
2 vs 4 sticks, single rank vs dual rank sticks
@@miniweeddeerz1820 Yep, subtimings are important. Both of my systems, Intel and Ryzen, run B-die 3200 14-14-14 memory kits.
Comparing the server space to a gaming PC, with regard to memory is a bit of a stretch in our opinion. Thanks for your other input though.
It's not a bit of a stretch, it's nonsense. Also, having unexplained drops in performance due to faster RAM essentially invalidates all results (for the purposes of extrapolating them to any other system).
Building my first pc and I can’t even explain how much you have helped me. Thank you for everything!
Great to see benchmarks for a rig using the same GPU & CPU that I have - Very helpful on RAM speeds, thank you!
I have 16 GB of 3200MHz C16 RAM, and am debating whether I should upgrade to 32GB (2x16) of 3600MHz C16. I know my motherboard is dual channel, so I would only want 2 sticks. I just upgraded from a 3600X to a 5700X, and have an X570 motherboard. I am a multi-tasker, sometimes training in RuneScape (3) and playing COD at the same time, while also having Opera GX and Discord open and voice chatting in there. I'm one to make sure I'm getting good use of my money before upgrading, and it seems like a good idea to me, but I would love your thoughts! The same kit you have in the video is $96 right now on Amazon, which is pretty tempting! Great video as well, as always, and happy new year!
Anyone else want a peanut butter sandwich or is it just me?
Thx for the final clear view Tech Deals! 64GB 3200 kit it will be instead of an 3600 kit for my new build around a 5900X..
32GB 3200 MHz CL16 HyperX on my 5950X also. Got a nice deal for it and it seems fine.
Peanut butter benchmarks when?
So what I don't understand is: why do many tech channels and people online say that for Ryzen you should get 3600 instead of 3200? I always see comparisons where the difference is like 2% or so, but the price difference is about 10-20% more.
Going from single to dual is like night and day. Increasing RAM speeds gets you diminishing returns.
And YouTube never covered the advantage of going from dual channel to quad channel on a Threadripper CPU.
@@maxhughes5687 are you sure about that, seems like you didn't realize search too well.
@@madmax2069 Hi max. Go back to the launch of the TR 1900X. An 8-core Ryzen vs an 8-core TR with quad-channel memory was a perfect opportunity to show 4 vs 2 channels; YouTube never did this. Only the fact that a 16-core 1950X could be installed got covered. That X370 boards had one x16 slot while TR had x16/x8/x16/x8 was ignored too. In two years AMD dropped the 8/12/16-core TR CPUs. 2020: oh hell, a quad M.2 PCIe card won't work on an X570 board. Oh hell, one M.2 is on CPU lanes and one M.2 is on shared chipset lanes. Oh hell, new Gen4 M.2 drives run at 3500/3500 on Intel and 7000/6600 on AMD. And when did you ever see 4x8GB quad-channel 3200C14 vs dual-channel 3200C14 on YouTube? It's just crazy that servers run 8-channel memory.
From my understanding, RAM speed helps in scenarios where a lack of GPU memory forces the system to move assets from system RAM to the GPU's VRAM, but that is a very specific case, and you could just save the money and use it to get a better GPU, or at least a version with more memory.
Might be a dumb question, but would the results be about the same at higher resolution? 1440 and 4K?
The higher the resolution, the more GPU-bound the task will be. So it basically makes even less difference as you increase the res.
And the peanut butter wasn't mentioned once. I was curious whether it's crunchy or smooth.
This video, and most that you upload, are right up my alley, mainly because I'm about to upgrade and I find myself on this channel more often than not. Cheers man.
Smooth! :)
Thanks for the info Tech appreciate ya!
Thanks for this video! I am about to purchase a 5600X and was wondering about the RAM speeds and how they would fare, now I know I can just get what is cheaper.
RAM speed does make a small difference... why use a Ryzen 5 5600X for these tests?
Even today, many if not most of us still have the same systems and are looking at upgrades like GPUs or CPUs. Good to know that I do not need faster RAM over my current 3200.