If you can budget for it and you're going AM5, go for the 7600X. AMD said 6000MHz is the sweet spot for RAM, which is what I went for on my build in January '23.
If this test was 3200mhz vs 7200mhz the differences would be much larger and more relevant to the average gamer since it was the most affordable for a while and recommended by youtubers.
Firstly, if the majority of people own 3200 DDR4, then the majority of DDR5 owners are on something like 6000. Secondly, if you do that comparison you may as well do it with a 3070 or a 6800 XT... and a 13600K.
Nah, 7200 is starting to be cheap and there's a lot of options. Even the 48 GB kits can be found at 7200/7600 for a decent price. 6-12 months ago they were rare...@@maegnificant
Great work as always, Steve. I had to replace the motherboard and CPU in my 7-year-old PC, but when I asked whether I should upgrade to a motherboard that uses DDR5, the staff at the local Centrecom told me not to worry and to stick with DDR4 (as well as getting a 12th gen Intel CPU instead of a 13th gen). I'm still baffled by that to this day, and I can't afford to change it at this rate, especially since I upgraded my RAM from 16GB to 36GB 3600MHz not too long ago. This video did quell my fears a bit though: while DDR5 is the better option, it seems DDR4 still has some life left.
I'm on a 12700K (5.0 P / 3.9 E) with a 4070 and 32GB of DDR4 3466 with manually tuned timings. I bought this CPU when it was new for the purpose of getting another round of usage from my old DDR4 kit. I play at 1440p and I'm still entirely CPU limited, since I mainly play competitive shooters, simulation games and MMOs. My ancient E-die kit can do 3600, but with loose timings; going down to 3466 and manually tightening the timings got me a ~15% increase in SotTR, which should put me fairly close to what this CPU can do with DDR5. Would be interesting to see what the difference looks like between a manually tuned good DDR4 kit and a DDR5 one.
Imagine a 7950x3d, rtx 4090, ddr5-7200, and a $350 AIO build just to max out the advantages of playing in 1080p. Most likely on a god-awful TN monitor. No productivity work since it is 1080p.
Im hoping to see you guys discuss AM5 more. I’m planning to build a PC around the 7800X3D in the hopes that it will be “future proofed” like how AM4 is/was. I still don’t know what mobo and ram to get. I need some guidance 😅
Hello, I don't think you need to future-proof, because in the future you can buy better RAM and better mobos for less money. Nice mobos with a lot of features would be the B650-A ROG Strix or the X670 Steel Legend. For RAM, 6000 MT/s CL30 is the sweet spot for Ryzen CPUs.
Some advice when looking for good AM5 motherboards is to check the Hardware Unboxed channel on YouTube. For RAM, I guess 6000 CL30 is a good spot to be at. Of course you can go faster if you have the money.
If you care about future-proofing, I guess cost matters to you. The sweet spot is 6000 CL30; the cost is only a $20 difference vs the cheapest kits, and going faster doesn't give any reasonable performance gain. Don't trust the guy about the ROG Strix or Steel Legend mobos: expensive as hell for little to no benefit. Almost every AM5 motherboard can unlock the full potential of the CPU; the VRMs won't throttle, it's not Intel. You shouldn't spend more than $150 if you want the best cost/performance, $200 max if you really need extra features. If you can't say what those features are, you simply don't need them. What's the point of future-proofing then? Save some money...
@@JackJohnson-br4qr He's saying he wants to "future proof" his PC and he will buy a 7800X3D; $100 more on the mobo won't make any difference if he's going to buy an $800+ GPU anyway in the worst of scenarios. I won't sit here explaining why future-proofing doesn't exist, but a $150 mobo for the best gaming CPU is kinda cheap.
I'm running a Samsung B-die kit at 4200 CL16-16-16 and my 13700K is flying. You need a tuned B-die kit in order to match fast DDR5. Standard DDR4 kits like that Corsair Dominator 4000 CL19-23-23 kit are way too slow.
He did use a B-die 4000MHz kit: the Ripjaws V 4000MHz CL16 1.4V kit. I'm using the same kit, but with tuned subtimings and tertiary timings as well, running it at 4133MHz CL15 (15-15-15-32) 1.55V on my i5 13600K.
Spot-on review as always Steve. I have a 4080/Z490/10900K/32GB DDR4 @ 4000 setup and I couldn't be happier. I built my rig specifically to get 1440p @ 144Hz and I can get this easily: buttery smooth gameplay with really good RTX graphics. As I say, couldn't be happier. Great vids by the way. Love and respect all the way from Pomland! 👍
@@Tpecep Why should he do that, if he is getting what he wants? Personally, moving to 13/14th gen Intel is a waste of money as it is a dead-end platform. Sure, they will be fast enough for some time, but why spend the money on a new platform with little to no upgrade potential? His 10900K is a fine CPU and he can comfortably wait until Intel 15th gen or later if he so desires. I don't see the 10900K being a bottleneck at 1440p/144fps for awhile in most games that require that high a number.
@@Tpecep He's looking to game at 1440p/144Hz. He said it directly in his comment. Very few titles are CPU bound these days, and his 10900K will be able to hit 144Hz in 95+% of them. So really, you're saying he should spend $400+ (and hours of time) on a 1-2% practical uplift. Context and use cases are important.
@@badkoolaid22 the 13600k is only 280 on amazon and it is 25% faster than the 10900k. Games like Spiderman Remastered are indeed bottlenecked by the 10900k and even the 11900k
@ude992 That is a decent upgrade; if I remember correctly, the 14700K also has more cores than the 12700K. Granted, they are E-cores, but the performance upgrade from the P-cores is pretty significant. If you were going to sit on the 14700K for at least 3-4 years, I would say that even though it is a dead-end platform, you would enjoy a sizable increase in performance, especially if you upgrade your GPU. You will probably be GPU bound in some games with the 3080 running on a 14700K.
If you are happy with your system, don't go rushing out to upgrade to DDR5. If you are building a new system, you might as well go with DDR5: while most of the tested titles didn't see much improvement, there was a definite advantage with DDR5, and there's no point going backwards on a new build.
@@Hardwareunboxed Thats good to hear, I've ordered gigabyte z790 gaming x ax, 13600k & GSkill 7200. I really hope they work, because I'm from India, my friend is buying it from US, so no option of replacing the parts.
Presumably, RGB is mostly purchased by accident or when rum glitterballing is worth enduring in exchange for securing a lower price compared to an identical component without the Saturday Night Fever effect. The only other reason to use it is if we've forgotten where we put the Ipcress File. It would be much simpler if manufacturers made all of their stuff incapable of any pointless light show.
I think tracking 0.1 or 0.2% lows is important for ram testing. It's also not really enough to say that "well, it doesn't matter for newer games anyway because you'll be GPU limited". Good CPU and RAM will still improve the consistency of frame times even if the gpu is your main limiting factor.
Anyone who likes cost-effective gaming has an X3D, be it a 5800X3D or a 7800X3D, which would make for a good DDR4 vs DDR5 comparison, even though the 7800X3D would naturally be faster by about 15%. @@nonenothing4412
I know the channel focuses more on out-of-the-box setups, but on a 4090 with a 13700K and tuned DDR4 4000 CL16, plus a tuned Windows install and a tuned CPU (just from free guides), I get 91 fps average in Hogwarts Legacy (running around at full speed through the whole of Hogsmeade) with 59 fps lows. Some games of course don't benefit that much, but more YouTubers should dive into tuning, because that's a free 19.7% (vs the X3D) and 30% (vs the 14900K) more at 1080p than either of these CPUs gets at stock, and I paid less than both of them cost (well, the 14900K wasn't out back when the 13700K launched, but you know what I mean).
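Working backwards from those percentages is just arithmetic on the numbers quoted in the comment above; the "implied stock" figures below are derived values, not results taken from the video:

```python
# Rough sketch: invert the quoted uplift percentages to see what stock averages they imply.
tuned = 91                              # fps quoted above for the tuned 13700K setup
implied_x3d_stock = tuned / 1.197       # if 91 fps is 19.7% above stock X3D -> ~76 fps
implied_14900k_stock = tuned / 1.30     # if 91 fps is 30% above stock 14900K -> ~70 fps
print(round(implied_x3d_stock), round(implied_14900k_stock))  # 76 70
```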
But what kind of timings are used for both DDR4 and DDR5 kits besides the useless CL? Subtimings such as secondary and tertiary timings can make a lot of real world difference. Properly tuned DDR4 can be faster than 7200-7800Mhz XMP, while tuned DDR5 can edge ahead as well. If XMP was used for both then there is a lot of performance left on the table for both setups.
What's your test setup? The issue is that you can really only viably test this with an Intel CPU, because those support both DDR4 and DDR5. Ryzen CPUs support either DDR4 or DDR5, and the chipsets and boards are just too different to make testing viable. Cheers. Good luck! Happy winter time. :)
3:17 I have a 6900 XT SE with a 5800X, yet I get 200 fps at 1440p/144Hz playing Assassin's Creed with a mere 16-20ms of input lag. I wonder what the input lag is here, since so much more data can be processed.
Where I'm at, the price gap between DDR4 and DDR5 boards, and the memory itself, is pretty vast. My wife got me a 13700K and a DDR4 mobo for Xmas last year, and just the difference in price between the DDR4 and DDR5 versions of the mobo was ~$400. I would've opted for the DDR5, but honestly, after some tweaking in the BIOS the CPU performs pretty well with DDR4. Upgraded from a 9900KF and am running a 4080. Definitely feel a decent amount of performance increase from 9th gen to 13th gen, even with DDR4.
I'm currently playing Far Cry 6 with a 4090 and I don't know how you got those high frame rates. I'm guessing that's without DXR and the HD texture pack?
I'm a 5800X3D user and with how good this CPU is I'll probably stay with DDR4 for the next few generations, but it was nice to see how much I'm "losing" by doing so
Another reason for AM4 owners to finally leave the old platform. If someone has an old PC with a Ryzen 5 2600 or even a 3600 and is considering upgrading to a 5800X3D, please don't do it, unless you invested in a fast 32GB RAM kit recently. If you only have 16GB, or an old 2666 CL19 kit or even slower, you will benefit more from switching to AM5, performance-wise and cost-wise. To unlock the full potential of the 5800X3D you'll need at least 32GB of 3200 CL16, not to mention at least a 4080-class GPU. A new, bigger and faster DDR4 kit plus the 5800X3D will cost you $373. But even faster 32GB DDR5-6000 CL30 + a Ryzen 5 7600 + a reasonably good B650 motherboard will cost you only $390. So only about $20 more for roughly the same or even slightly more performance, and still with an upgrade path for quite a few more years. I think it's worth a few bucks extra; a no-brainer at this point. I know I said it last time, but I'll keep saying it as long as there's a single person in the comment section saying they just upgraded to a 5800X3D and it was the best decision they ever made.
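For what it's worth, here is the arithmetic on the two totals quoted above (only the combo totals are given in the comment; the per-component split is not itemized there):

```python
# Quick sanity check on the quoted build totals, nothing more.
am4_combo = 373   # quoted total: 5800X3D + new 32GB DDR4 kit
am5_combo = 390   # quoted total: Ryzen 5 7600 + 32GB DDR5-6000 CL30 + B650 board
delta = am5_combo - am4_combo
print(delta, f"{delta / am4_combo:.1%}")  # 17, ~4.6% more for the AM5 route
```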
@@elon8818 The additional cache is literally meant to mitigate the bandwidth limitation of RAM. You do not get the uplift on X3D CPUs that you do on standard CPUs with regard to faster RAM. This makes the 5800X3D a brilliant update to a platform that already has decent memory speeds like 3200 or higher. Most people do not need (nor would they even notice) the best performance.
This video isn't about you or your PC build at all. It's addressing Intel CPUs that are compatible with both types of RAM. There is no option for AM4 so it's completely irrelevant. We don't even know for sure that DDR5 would significantly improve 5800x3D performance because it's impossible to test.
@h0stile420 You're right, I forgot to add the mobo. Now it's correct. 😉
@@elon8818 I get your point, and generally agree that alternatives should be considered. But I'm not sure I agree fully with your points and reasoning.
I haven't seen anywhere that 5800x3d is sensitive to RAM latency or bandwidth - certainly not on HUB th-cam.com/video/9sKBu8HIExI/w-d-xo.html
Needing a 4080 - I disagree wildly. It varies a lot from game to game how much they benefit from 5800x3d, and the same can be said for any CPU upgrade - why would you not need to upgrade the GPU to take full advantage of Ryzen 7000?
There is also the motherboard cost; you can't leave that out of the pricing of an upgrade to Ryzen 7000.
I do think you are right on the 16GB to 32GB jump. If you need new RAM anyway, it's a good time to consider the cost of jumping up to DDR5.
The other one is the motherboard - if you are on AM4, maybe on a mATX board, consider whether you will need more NVMe slots in the near future, or want faster PCIe for your SSDs - and consider the drawbacks of staying on an older motherboard.
Nice graphs! Was hoping DDR4 3200 would have been included, as most people didn't buy 4000MHz; only the ones building high end did.
You can just find a DDR4 specific video to see 4000 vs 3600 vs 3200
You simply won't get very far with 3200, let's just say buying 3200 == you don't care about RAM enough in the first place. Just don't bother.
Can't similar be said for DDR5, or are most people running 7200mHz DDR5?
I got DDR4-3200 when the average person got 2666 back in 2017. I've now used my DDR4 for about 6 years; the highest DDR4 speed was 3400MHz back then. @@OMGJL
@@OMGJL like you and your guitar haha. why try to poverty shame some man when you live in the third world?
As a sim racer I just wanna say THANK YOU for including ACC in these videos. Keep it in these benchmarks ❤
@bujfvjg7222 Can’t have everything 😂
I've literally looked up DDR4 4000 vs DDR5 7200 the other night and now, just a few hours later, you come up with this video. Don't know whether i should find that funny or scary lol.
However, you guys have forgotten to disclose something very important for this test: was the DDR4 kit running at Gear 1 or Gear 2?
Gear 1
@@Hardwareunboxed Cheers big ears mate.
I just went with a whole new architecture, not just faster and cheaper. I'm on DDR5 5200 @ CL36 and damn, I'm so happy with it. If you have the extra 20 bucks like the vid said, get off the old stuff... 4 DIMMs @ 128GB, doing productivity too, not just gaming.
@@Hardwareunboxed Thanks. I have slightly different performance results in some games, but I'm lucky that my 14700K is able to run 4400 G1 16-16-16, which is most likely a bit better than HU's 4000 and generally gives an extra 3 to 7 fps depending on the game. But overall, as I'm at 4K, once you go above 1440p the difference between DDR4 and DDR5 is very small. That said, I agree that high-speed Gear 1 DDR4 is only a real option for those who already have good DDR4.
What do Gear 1 and Gear 2 mean? What's the difference? And is it only relevant for DDR4?
It would be nice if you calculated and overlaid the access times for both, to make it easier to compare the kits' specs. And maybe link the kits...
Pointless. Nothing we're gonna notice while gaming directly anyway.
Yes. Why was the DDR4 faster in some titles? I guess because of access times, but it's hard to guess when they don't share the basic info.
@@jungervin8765 I think the differences here are down to how the memory controllers in this CPU perform. The tested DDR4 kit should outperform the DDR5 kit, if the memory controllers performed equally.
@@jungervin8765 You can guess all you want. This video is basically about whether you need DDR5, and their final thoughts cover it.
@@philipp594 Kind of depends if latency or bandwidth is the bottleneck. You can have a CL40 DDR5 kit that still delivers better performance. It's completely game dependent. But i agree that a latency benchmark would be nice to have.
Thank you for this, while I already upgraded my DDR4 LGA 1700, this video would have made that decision much easier had someone created it previously. Revisiting these types of situation is a must if the intent is to inform consumers of their options...
Thanks guys!
The stack of RAM at the end of the video is simply epic!
I imagine Steve and Tim play Jenga with it in their off time.
That’s like $5,000 in memory right there 😅
Would be nice to have included a broader range of DDR4 and DDR5 speeds to compare performance/value within the respective types. It would also be nice to see, with the updated BIOSes for 7800X3D motherboards, how those go with different DDR5 speeds, and how AMD compares to Intel now on memory speed/performance.
You are adding in too many variables to get any informative result. He used a 14th gen intel because it can still run both memory types.
@@ReivecS Including AMD would be too many variables, yes, but using speeds higher than 3600 and 6000 MT/s isn't representative of the options people in this market would actually consider.
@@Vegemeister1 I feel like you are asking for a lot of extra work for a result we all already know the answer to. 3600/6000 are the sweet spots; going higher has massive diminishing returns. Better to spend that time on another video with actually useful information.
@@ReivecS We know DDR5 would be faster, of course, but what I would prefer is a test of the sweet-spot speeds *only*, which is the same amount of work. Then we'd know exactly how much faster DDR5 is, and have something to point people to when they ask if DDR5 is worth the extra cost.
The DDR4 in this video costs almost twice as much as sweet spot DDR5, which makes it a realistic option for practically nobody.
@@Vegemeister1 Did you even watch the video? He reminded you of the benchmark they did 2 years ago at 9:57, where they tested 3600 CL14 DDR4 vs 6000 CL36 DDR5 to represent what the average Joe would be buying.
Guess what? It's 4% faster on average, so there's your something to point people towards. I'll even give you a hand: if we assume someone's buying the shitty bottom-of-the-chain CL18, the difference between CL14 and CL18 is another 4% in the worst-case scenario possible, so you're looking at an 8% difference in the worst case, and 4% most of the time.
Accept the obvious fact, stop trying to gaslight people and move on with your life. And if you're going to unironically throw out that "but 0.1% lows" take: if your game is stuttering enough for those 0.1% figures to make a difference, you're either playing a shitty port or you've got other issues to worry about in your build.
I would like to see this exact same test done on games that are released about 2 years from now to see how those games perform on ddr4
Would be interesting to include a standard DDR5 kit like 6000 CL30 to see whether cheaper DDR5 still competes.
A 6000 CL30 kit can give more fps than a 7200MHz one, so always look at the CL timings, not just the speed.
The advantage of DDR4 is in latency: a maxed-out DDR4 configuration (~4133-4266) on 13th and 14th gen will get about 10-15% better memory latency than a maxed-out DDR5 config at ~7400-7800; Gear 1 really makes the difference here. The DDR4 config suffers heavily in bandwidth though: despite B-die timings allowing close to the theoretical max for a given frequency, it just can't make up for the ~80% deficit in effective clocks. I would be interested in seeing a tuned bench for both configs, as I think DDR5 may pull ahead even further, but DDR4 would still be very performant.
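To put rough numbers on that: below is a minimal sketch using first-word (CAS-only) latency and peak dual-channel bandwidth. The CL17/CL36 timings for the "maxed out" kits are assumptions for illustration, and real loaded latency also depends on the IMC gear and subtimings, so this is only a proxy for the gap described above:

```python
def first_word_ns(data_rate_mt, cl):
    # First-word (CAS) latency in ns: CL clock cycles at a real clock of data_rate/2 MHz.
    return 2000 * cl / data_rate_mt

def peak_bw_gbs(data_rate_mt, channels=2, bytes_per_transfer=8):
    # Peak theoretical bandwidth for 64-bit channels, in GB/s.
    return data_rate_mt * bytes_per_transfer * channels / 1000

print(first_word_ns(4266, 17), peak_bw_gbs(4266))  # ~7.97 ns, ~68.3 GB/s  (assumed maxed DDR4)
print(first_word_ns(7600, 36), peak_bw_gbs(7600))  # ~9.47 ns, ~121.6 GB/s (assumed maxed DDR5)
```

The bandwidth ratio (~1.78x) lines up with the ~80% effective-clock deficit mentioned, and the CAS gap is in the same ballpark as the latency advantage claimed.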
"Hardware Out of the Box" is all you'll ever get from this channel. i2Hard is an extremely good Russian channel that does a lot of overclock testing. Although I still haven't seen a proper 4133+ vs 7800+ comparison on any channel or forum.
@@jjlw2378 You're right. This channel focuses almost exclusively on "out of the box" performance, sometimers to a fault. Beyond XMP they basically do no overclocking tests, even a small basic overclock that can be achieved in more than 50% of samples is worth looking at in some parts as it can shift the rankings.
@@connectingupthedots Yup, in actual latency on the accurate tests like the Intel IMC GUI or PYPrime (NOT AIDA), I had a ~60% latency improvement going from DDR5 Hynix M-die XMP to just about as dialed-in an overclock as possible while staying stable (the frequency OC was only 5600 -> 6666). I was pretty limited in frequency by my Z690 board, but as I said, it was an enormous OC regardless.
@@jjlw2378 Good luck running 7800+ with any stability at all. Ask Buildzoid (Actually Hardcore Overclocking) what is wrong with such a high memory speed on Raptor Lake (or even the Refresh).
@@connectingupthedots "This channel focuses almost exclusively on "out of the box" performance, sometimers to a fault"
The channel is literally called 'Hardware Unboxed'. Please think on that for a moment. Just a moment, and it'll start to make sense.
i wonder what the difference would be with a heavily tuned kit of ddr4 3600 vs a ddr5 6000 cl30 kit.
and then a heavily tuned 3600 vs heavily tuned 6000
This!
I want to know how much CL timings matter as well as speeds across these DDR4 and DDR5 comparisons...
Like, is getting CL16 vs 18 really that big of a difference for DDR4? If so, is there a diminishing-returns point, and is it tied to the speed?
Also, what about these same spec comparisons on the AMD side? Does core count benefit from certain specs, even just looking at AM5 + DDR5 ranges?
Most ddr5 using XMP > maxed out 4200 b die with tuned timings
If you're talking about advertised kit specs then the difference can be huge, C16 can be B-die and the C18 is whatever. And the C18 kit is going to be something like 18-22-22 while the B-die is flat 16-16-16.@@Artificial.Unintelligence
That is too dependent on the silicon lottery, and HUB doesn't like that.
@@naamadossantossilva4736 how is it? Any b die can do 3600 with tight timings, same with Hynix A/M die @ 6000
Do you have any plans to revisit PCI-E bandwidth differences with some of these newer memory-heavy games that have come out (Hogwarts, Survivor, Last of Us, etc)? It doesn't seem unreasonable for 5800x3D to be put on a PCI-E 3.0 motherboard with a 40 series card.
I'd like to see that as well
me too
this is what I have
I wonder what the difference is when it comes to an iGPU. Probably really tough to quantify with an iGPU that's actually worth testing, as there's no decent iGPU attached to a memory controller capable of both DDR4 and DDR5.
I set my iGPU to 160MB; the poor thing defaults to 64MB.
LOL
iGPU is only good for encoding.
Intel iGPU - it is too weak to show a difference between DDR4 and DDR5.
Even the Raptor Lake iGPU is half as powerful as the 5700G's iGPU, plus it can also use the CPU's L3 cache to aid its own memory bandwidth needs. I suppose it (the top-end RPL iGPU) can barely saturate dual-channel DDR4-3200 memory (JEDEC timings).
AMD iGPU - only available with one memory standard, so no fair comparison (same iGPU, different memory) could be made.
It's a moot point. An iGPU is not suitable for gaming. Of course it will be better with faster RAM, but it will never be good enough for the latest games.
@@volodumurkalunyak4651 Congrats, you win the weird award of "replying to a comment by plagiarising that comment" 😆
E2A: Award rescinded because you edited in some value.
The takeaway is still: DDR4 is still fine for now. Move up to 32 gig if you haven't already. If building a new rig, though, making sure it is DDR5 is going to be one of the basic checkboxes to meet in order to make sure it has any future-proofing/upgrade potential.
I've been waiting for this test for months! ;-)
What's the integrated memory controller setting? Gear1 for DDR4 and Gear2 for DDR5? I've noticed that a slower RAM on Gear1 performs better in some games than much higher frequency RAM on Gear2 due to lower latency.
edit: Nevermind, HUB commented that they've used Gear 1 for DDR4, so best case scenario for DDR4.
Cheers.
intel's DDR5 memory controller doesn't support gear1.
@@ActuallyHardcoreOverclocking How about AMD Zen 4? Does it support Gear 1?
@@ActuallyHardcoreOverclocking That's rather obvious, the fastest I've seen Intel's IMC go is 2000MHz, so it was running 4000MHz RAM in Gear1, but everything above 1800MHz (so 3600MHz RAM) is a silicon lottery. DDR5 starts at 4800MHz, so Gear2 is a must.
My point was, if that 4000MHz DDR4 is running in Gear2 then it's at a disadvantage, because even much cheaper 3600MHz CL16 RAM running in Gear1 should outperform it in most games.
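For anyone wondering what the gear modes mean numerically, here's a minimal sketch (the memory clock is half the MT/s data rate; in Gear 1 the controller matches it, in Gear 2 it runs at half):

```python
def imc_clock_mhz(data_rate_mt, gear):
    # Gear 1: memory controller runs at the memory clock (data rate / 2).
    # Gear 2: memory controller runs at half the memory clock.
    return data_rate_mt / 2 / gear

print(imc_clock_mhz(4000, 1))  # 2000.0 MHz - the ~2000MHz Gear 1 ceiling mentioned above
print(imc_clock_mhz(3600, 1))  # 1800.0 MHz - the more typical "safe" Gear 1 point
print(imc_clock_mhz(7200, 2))  # 1800.0 MHz - DDR5-7200 running in Gear 2
```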
Was thinking the same thing, this should have been said in the beginning of the video
I was looking for an updated video like this all day yesterday... Perfect timing mister Steve
Would be interesting to see if quad channel makes any difference. Of course you would need an HEDT system for that.
We all have hedt system, bro 😂
Would be nice if you let us know what RAM modules you tested in this video. This doesn't really tell me anything
10:30 How long did it take you to build that tower? lol
AHahhAH :DDDDD
I'd love if you could include a benchmark for late game performance of heavily CPU-bound games like Civilization VI or Paradox grand strategy games. I don't really care personally how many FPS I can get on max graphics in some AAA game. I want to know how long will I have to wait after I press end turn on turn 300 in CivVI.
Great info thanks for putting in the time to test it all
Excellent analysis, as always! Now wondering if there would be any perceivable difference at all when testing with more mid-range parts, like the i5 13600 and RTX 3070/ RX 6800 🤔
It just depends on the framerate. At 120 fps and under there's no difference; over 120 fps there is a difference, but you have to be very astute to notice it.
Congrats on 1 million, well deserved.
Is it just me or did he fail to mention what GPU he was using in the test setup?
Good info, and I think it's also important to note that in order to get 7200-speed RAM to run, you likely need a 13700K or better and a mobo that will handle it. Most people's systems are running 6000 CL30 or slower, and I suspect those are going to be much more similar to fast DDR4, or even slower.
A month ago I was looking into a new system. At first I was going for a Ryzen 7 5800X3D config, but then I chose the new platform with the 7700. The RAM was $100 (for 32GB 6000MHz CL30), so not that much more expensive than a DDR4 kit.
The 5800X3D sells for a premium. It's an amazing upgrade but very expensive for a new machine. A new Ryzen 5700X machine makes sense because it's considerably cheaper; otherwise it's the Ryzen 7700X all the way. Or you can save quite a lot by going with the Ryzen 7700: you can use the boxed cooler and a cheaper B650 motherboard and still get almost the same performance.
You were lucky to get $100 for that kit; it's ~$140 in Europe now, and prices have been climbing really fast since last week :/
Sometimes 6000MHz doesn't work well on Ryzen (silicon lottery of the memory controller).
If that happens, try manually setting 5200MHz (and keep the memory timings).
This should solve most problems.
If you're lucky, 6000MHz will work out of the box ;)
@@stanley3647 i just set them at 6000 in the bios and they work at 6000....unless task manager is a liar 🤣
@@edisirio506 It is a silicon lottery ;) My old 5600X didn't want to cooperate with "cheap" 3600MHz (CL18) modules, so I had to drop the speed to 3200 CL16, but I later replaced them with more premium 3600 (CL16) sticks and they work fine at the advertised speed.
AM5 is more sensitive due to the higher speeds of DDR5. Some combinations just don't work.
i always know when i watch a hardware unboxed video im gonna get some legit info . i respect it man keep doing a good job
Long story short:
DDR5 is a marginal/fair improvement at 1080p and 1440p, but in most cases ties at 4K.
However, DDR5 has more stability issues, and you have to be very careful with which board, CPU and CL timings/brand you choose, as in many cases you may not be able to get the full speeds, the full memory, or the ability to use all of your DIMM slots.
DDR5 definitely still feels like it's on the 'bleeding edge' unless you are paying a high premium for top-end motherboards rather than the more reasonable $100-$200 range.
4K is irrelevant since the GPU is the bottleneck.
That's not how it works: if your GPU is maxed out at 1080p then faster RAM doesn't help. It's not about resolution but about how much your GPU is being utilized.
CPU-limited situations will benefit from faster RAM no matter the resolution.
It's all about which hardware limits your performance; in a CPU-limited scenario, having faster RAM does matter most of the time.
lmfao so every b650 user is a beta tester? stop coping, DDR5 is perfectly stable even for budget builds. It just takes a bit longer for booting sometimes
It's really only Intel where you have to worry about these other things. AM5 is literally so noob proof nowadays that you just buy 6000MHz CL30, enable Expo and call it a day. Buy 2 sticks though, 4 sticks is another story
Great video and benchmarks as usual. What was the latency used for each kit? Latency will heavily impact gaming results so it would have been good to lower the ddr4 speed and decrease the latency. I just completed extensive benchmarks on ddr5-6000 vs 8000 … there is a performance increase but it’s definitely not significant. Video will be posted tomorrow.
DDR4 4000 CL16 vs DDR5 7200 CL34
@@tonan8888 Thanks, I must have missed that. Would be interesting to see those results at 3600 CL14.
Would be nice to have a comparison for productivity work
Do you think you could do a follow up video comparing 3200 cl16 vs 6000 cl30?
Yep
No Microsoft Flight Simulator benchmark ? I expect that to enjoy a lot of memory bandwidth. Maybe similar with Factorio.
Edit: saw in the slides with the previous DDR5-6000 vs DDR4-3600 that it's not the case with MSFS, so I guess I stand corrected.
But, man, did DOOM Eternal get a nice bump. I wonder how difficult it is to get DOOM and DOOM Eternal to run at 1000 FPS nowadays. I remember there was a contest 5 years or so ago, and a 9700K overclocked to 7 GHz or so (can't properly remember) managed to hit 1000 FPS (but not on average) at 720p while looking at a wall in darkness or something. A well-tuned 14900K with killer memory and a 4090 I expect could get to a 1000 FPS average, maybe. Anyway, I'm getting way off topic.
Thanks for including 4K
Is it possible to simulate Zen 3 performance on Zen 4 to see what difference the DDR5 memory makes, or did you already test this?
Cool to see the numbers, thumbs up!
Thanks for this Steve. I had a 32GB 4000MHz CL16 DDR4 kit I was going to use when building my 12th gen Intel system. DDR5 RAM was through-the-roof expensive at that time, so I went the DDR4 route, and I can see it still holds strong.
You can pick up a kit of 16gb ddr4 3200 for $35. Until ddr5 gets that low, I'm sticking with am4.
Waiting for a Ryzen comparison. I bought a 4800 kit since it was the cheapest, and need to know when to upgrade it.
There can't be a fair comparison on AMD without external factors affecting the tests; it's only possible on Intel right now. That's because you can't test the same CPU on both DDR4 and DDR5 when testing AMD, since no single AMD CPU supports both. So I don't think they will be doing this test for Ryzen.
How timings and frequency affect not just games, but also apps like Excel, rendering, etc.
@@pinktuna3693 I think they mean DDR5-4800 vs 7200.
Non-X3D Ryzen is memory sensitive (more so than Intel) for sure, that's what we know.
You should consider adding rendered results and benchmarks for the number one played game on Steam, Counterstrike 2. It is arguably among the most competitive first person shooter games, and capturing that audience will help your channel even more.
arent ppl getting like 1000fps on cs2 already
@@ChaseHub No. I can hit 1000fps in practice mode without players or bots in some sections of Mirage, but it will be anywhere from 250-700 during games. On Ancient, it's tough to stay above 400fps.
What is this? Where's the test setup info, the RAM timings?? Can't believe how popular this channel is with this kind of incompetence when it comes to CPU and RAM tests.
Would be interesting to see power consumption at a locked framerate between these two setups.
Appreciate this review, thank you!
It’s nice to know at 4K, my DDR4 3600, Ryzen 7 5800x3D and RX 7900 XT combo will serve me well for a while
It's surprising that after all this time DDR5 really hasn't shown itself to be very essential. But DDR5-7200 does offer a nice boost.
i feel like the x3d method is the obvious direction for gaming cpus. peak performance with low power consumption and no expensive memory requirement
I was thinking about upgrading my CPU from a 12400 into something a little beefier... but I think I'm gonna hold off because I'm still on DDR4. Seems pointless to just upgrade the CPU knowing I'm limiting its performance. It will be a bit of a wait, but I'm sure Micro Center will have a nice mobo + CPU + RAM bundle sometime post-LGA1700.
Yeah, at this point just don't bother. Wait for the new gen and change everything together. The 12400 is such a good little chip that I can't imagine you're having a bad experience with it.
@@dat_21 I don't!! It's a great little chip. I'm probably shorting myself like 10 to 20 fps on average, but that much fps is not worth the upgrade. I might even wait to see if AMD does even better things on the CPU side.
I wonder how my DDR4 32GB 4400MHz CL17 kit would compare against a decent DDR5 kit?
You should tune that kit's subtimings, you'd get a lot more performance. It should be roughly trading blows with 6800-7200MHz DDR5.
I'm still on a 12600K DDR4 system (with the RAM sticks I took out of my old 7700K system) with a 4070 Ti GPU, and I don't feel particularly hampered anywhere, aside from the occasional stutter here and there (not sure if it's because of the memory or not). I'll upgrade to DDR5 when I replace half my setup, and I don't feel like I need to do that for at least another 3-4 years.
Please, title your videos accurately: Premium "DDR4 vs. Premium DDR5". I built a system last week. The difference between a reasonably priced DDR4 and DDR5 32 GB kits from the same company (Kingston Fury) was 60€ compared to 110€. On the motherboard front I was looking at 90€ for an AM4 board, while the AM5 variant from the same company in the same tier cost 120€. The combined package came to 150€ on AM4 compared to 230€ on AM5. I had zero issues finding plenty of AM4 compatible products, be it boards, CPUs, or RAM. (I understand that this might be different for an intel platform, but the language in the video was general and not specifying that this was specific to just the intel platform.) As such at least for the European market I find this piece highly misleading.
Same thing for the LATAM market. Here DDR5 is almost double the price of a DDR4 in the same "tier".
I wish you had compared a 14900K with DDR4-4000 vs a 5800X3D on the same kit at 3600 with low timings. Interested in seeing how they compare without Intel's DDR5 advantage.
@@tilapiadave3234 And here you are again, spreading what? Oh, yeah, your opinion. I'm betting everyone that has a 5800x3D would totally disagree with you. 🤦♂
If you are willing to put in the time to tighten the timings and figure out the VCCSA and VDDQ voltages to run the DDR4 at 4400 C16 at gear 1 (which, I'll admit, is quite a bit of work), the matchup becomes competitive. I have a 5800X3D and a lucky 14700K that is able to do 4400 at 1.30 VCCSA (which is relatively low), where I overclocked both the CPU and the RAM. I also tried to tweak the 5800X3D, but there is much less headroom; essentially, going over 4000 MT/s on the 5800X3D means faster loading screens and little more, while on the Intel side every RAM speed increase corresponds to an FPS increase if you can keep latency low. For example in CS:GO, 4000 C15 provided 940 fps, 4133 C15 around 965 fps, then 4266 C16 995 fps, 4300 C15 1010 fps, and finally 4400 C16 1030 fps. So the jump on the 14700K from 4000 C15 to 4400 C16 was around 90 fps. But there is kind of a hard wall after 4400; only a few motherboards are able to run 2x16GB of DDR4 above 4400 at gear 1.
That said, at 4400 16-16-16 gear 1 the Intel chip trades blows with the X3D and tends to mostly overtake it, yet it takes considerable effort (read: an investment of time to figure out voltages and then run memory tests) and a favorable "silicon lottery" with regard to the IMC, motherboard, and even the RAM kit.
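For anyone wondering why 4400 at gear 1 is such a silicon lottery, the rough arithmetic below may help: in gear 1 the memory controller runs at the same clock as the memory itself (half the transfer rate), while gear 2 halves the controller clock and costs latency. This is only a sketch with illustrative numbers, not data from the video.

```python
# Rough sketch of Intel gear modes (my assumptions, not details from the video).
# DDR transfers twice per clock, so the real memory clock is half the MT/s figure.
# Gear 1 runs the memory controller (IMC) at the memory clock; gear 2 halves it.

def imc_clock_mhz(transfer_rate_mts: int, gear: int) -> float:
    mem_clock = transfer_rate_mts / 2   # e.g. DDR4-4400 -> 2200 MHz memory clock
    return mem_clock / gear             # gear 1 -> 2200 MHz IMC, gear 2 -> 1100 MHz

for rate in (4000, 4400):
    for gear in (1, 2):
        print(f"DDR4-{rate} gear {gear}: IMC at {imc_clock_mhz(rate, gear):.0f} MHz")
```

Pushing the IMC to 2200 MHz in gear 1 is the part that depends on luck with the chip; gear 2 is easy to run but gives up the latency advantage described above.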
Another great use case for keeping your DDR4 on a previous-gen Ryzen and getting a better GPU with the difference.
Testing RAM in GPU-bound scenarios with high graphics settings can't show a proper difference between the kits. Could you test games at lower graphics settings to put more stress on the CPU, since that's when RAM speed starts to matter more in the FPS averages and lows, to better illustrate the performance difference?
But it's the logical way to test it. If you're buying a new 14900K, a $600 CPU, are you playing the latest AAA games at low settings?
Just a question: do you think higher settings put less stress on the CPU than low settings? They put more stress on everything; that's why a 4090 is being used, because that's a realistic approach.
Testing below 1080p is unrealistic, almost no one does it these days unless they're really trying to prove a point
@@imo098765 Since RAM is being tested, I think it's only reasonable to try to eliminate as many other factors as possible to gauge an accurate performance difference in games between the kits.
Niche test scenario, but I really would like to see DDR4-4000 C15 super tuned (manual secondary/tertiary timings) vs DDR5-7200 CL32 super tuned.
Any chance of more UE5 games being tested in the future? Curious to see how games are going to scale on that engine; it seems to be lighter on VRAM at least.
Where is the price comparison graph for ddr4 vs ddr5?
You benchmark stuff fast, Steve, and you can bet I'll come to watch your content just as fast!
what about games starting, games loading a map, games closing?
I wish y'all had included Factorio and/or Stellaris since those are intensely memory intensive, and usually see huge perf improvement on L3 CPU caches - so better memory latency might improve things significantly in late game where even L3 can't keep up and it has to go back and forth on RAM all the time to do the calculations
That would be more of a CPU benchmark than a RAM one. Also, do you need that much FPS in games like that?
@@FaridRudiansyah Actually-CPU-limited games usually don't have their performance measured in FPS.
For 1080p, which is better: a Ryzen 5 5600X with a 3060 12GB, or a Ryzen 5 7600 with a 3060 12GB? Both with 16GB of RAM, at 3200MHz and 5200MHz respectively.
If you can budget for it, dude, and you're going AM5, go for the 7600X. AMD said 6000MHz was the sweet spot for RAM, which is what I went for on my build in January '23.
If this test were 3200MHz vs 7200MHz, the differences would be much larger and more relevant to the average gamer, since 3200 was the most affordable option for a while and the one recommended by YouTubers.
I'd argue that 3600MHz vs 7200MHz is ideal; that's right in line with DDR4-to-DDR5 scaling for latency and speed.
No one uses 7200mhz ddr5 either
nah, compare 3200mhz vs 6000mhz
Firstly, if the majority of people own DDR4-3200, then the majority of people who own DDR5 are on something like 6000; and secondly, if you do that comparison you may as well do it with a 3070 or a 6800 XT... and a 13600K.
Nah, 7200 is starting to get cheap and there are a lot of options. Even the 48 GB kits can be found at 7200/7600 for a decent price; 6-12 months ago they were rare... @@maegnificant
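To put the pairings suggested above into rough numbers: first-word CAS latency in nanoseconds is the CL divided by half the transfer rate, and peak dual-channel bandwidth is the transfer rate times 16 bytes. The sketch below uses illustrative kit specs, not the kits tested in the video.

```python
# Back-of-the-envelope RAM comparison (illustrative kits, not the ones tested).
def cas_latency_ns(transfer_rate_mts: int, cl: int) -> float:
    return cl / (transfer_rate_mts / 2) * 1000        # CL cycles at the memory clock, in ns

def peak_bandwidth_gbs(transfer_rate_mts: int, channels: int = 2) -> float:
    return transfer_rate_mts * 8 * channels / 1000    # 8 bytes per transfer, per channel

kits = [("DDR4-3200 CL16", 3200, 16), ("DDR4-3600 CL16", 3600, 16),
        ("DDR5-6000 CL30", 6000, 30), ("DDR5-7200 CL34", 7200, 34)]

for name, rate, cl in kits:
    print(f"{name}: ~{cas_latency_ns(rate, cl):.1f} ns CAS, ~{peak_bandwidth_gbs(rate):.0f} GB/s peak")
```

The absolute latencies land in the same ballpark across all four, which is why the bigger DDR5 wins tend to show up in bandwidth-hungry titles rather than across the board.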
Thanks for the benchmarks!
Great work as always, Steve.
I had to replace my 7-year-old PC with a new motherboard and CPU, but when I asked if I should upgrade to a motherboard that uses DDR5, the staff at the local Centrecom told me not to worry and to stick with DDR4 (as well as to get a 12th-gen Intel CPU instead of a 13th-gen one). To this day I'm still baffled by that, as I can't afford to switch at this rate, especially since I upgraded my RAM from 16GB to 36GB at 3600MHz not too long ago.
Although this video did quell my fears a bit: DDR5 is the better option, but it seems DDR4 still has some life left.
What components did you use for the tests? You can't do a true like-for-like comparison given that you need at least a different motherboard.
Another GOAT video from HU let's go.
AAAAAAHHHHHHH
NPC comment right there.
It doesn't even make sense
@@j22563AAAAAAAAHHHHHHH
I thought it was more LAMA than GOAT. Well, ha yeah, I guess it could even be more HORSE than LAMA or GOAT combined. Baa, baa, baa...
do i have to explain i was screaming like a goat
I'm on a 12700K (5.0 P / 3.9 E) with a 4070 and 32GB of DDR4-3466 with manually tuned timings. I bought this CPU when it was new for the purpose of getting another round of usage out of my old DDR4 kit. I play at 1440p and I'm still entirely CPU limited, since I mainly play competitive shooters, simulation games, and MMOs. My ancient E-die kit can do 3600 but with loose timings; going down to 3466 and manually tightening the timings got me a ~15% increase in SotTR, which should put me fairly close to what this CPU can do with DDR5. Would be interesting to see what the difference looks like between a manually tuned good DDR4 kit and a DDR5 one.
Imagine a 7950x3d, rtx 4090, ddr5-7200, and a $350 AIO build just to max out the advantages of playing in 1080p. Most likely on a god-awful TN monitor. No productivity work since it is 1080p.
How many frames does the digital RGB add vs. standard analog RGB?
I've gotta know.
I'm hoping to see you guys discuss AM5 more. I'm planning to build a PC around the 7800X3D in the hope that it will be "future proofed" like AM4 is/was. I still don't know what mobo and RAM to get. I need some guidance 😅
Hello, I don't think you need future proofing, because in the future you can buy better RAM and better mobos for less money.
Nice mobos with a lot of features would be the B650-A ROG Strix or the X670 Steel Legend.
For RAM, 6000 MT/s CL30 is the sweet spot for Ryzen CPUs.
Some advice: for good AM5 motherboards, check the Hardware Unboxed channel on YouTube. For RAM, I guess 6000 CL30 is a good spot to be at. Of course you can go faster if you have the money.
If you care about future proofing, cost matters to you, I guess. The sweet spot is 6000 CL30; it's only about a $20 difference versus the cheapest kits, and going faster didn't bring any reasonable performance gain. Don't trust the guy about the ROG Strix or Steel Legend mobos; they're expensive as hell for little to no benefit. Almost every AM5 motherboard can unlock the full potential of the CPU, the VRM won't throttle, it's not Intel. You shouldn't spend more than $150 if you want the best cost/performance, $200 max if you really need extra features. If you can't say what those features are, you simply don't need them. What's the point of future proofing then? Save some money...
@@JackJohnson-br4qr He is saying he wants to "future proof" his PC and he will buy a 7800X3D; $100 more on the mobo won't make any difference if he's going to buy an $800+ GPU anyway in the worst of scenarios.
I won't sit here and explain why future proofing doesn't exist, but a $150 mobo for the best gaming CPU is kinda cheap.
@@nelsonmejiaslozada9362 Why is a $150 mobo bad for the 7800X3D? Tell me.
Is there any reason why DDR4 is more expensive? What market is paying more for outdated RAM?
I'm running a Samsung B-die kit at 4200 CL16-16-16 and my 13700K is flying. You need a tuned B-die kit in order to match fast DDR5. Standard DDR4 kits like that Corsair Dominator 4000 CL19-23-23 kit are way too slow.
How much does it cost? And how hard do you have to win the silicon lottery to get such results?
He did use a B-die 4000MHz kit; he had the Ripjaws V 4000MHz CL16 1.4V kit. I'm using the same kit, but with tuned subtimings and tertiary timings as well, running it at 4133MHz CL15 (15-15-15-32) at 1.55V on my i5-13600K.
How much will productivity workloads differ?
Spot-on review as always, Steve. I have a 4080/Z490/10900K/32 GB DDR4 @ 4000 setup and I couldn't be happier. I built my rig specifically to get 1440p @ 144Hz and I can get this easily: buttery smooth gameplay with really good RTX graphics. As I say, couldn't be happier. Great vids by the way. Love and respect all the way from Pomland! 👍
@@Tpecep Why should he do that, if he is getting what he wants? Personally, moving to 13/14th gen Intel is a waste of money as it is a dead-end platform. Sure, they will be fast enough for some time, but why spend the money on a new platform with little to no upgrade potential?
His 10900K is a fine CPU and he can comfortably wait until Intel 15th gen or later if he so desires. I don't see the 10900K being a bottleneck at 1440p/144fps for awhile in most games that require that high a number.
@@Tpecep He's looking to game at 1440p/144Hz; he said it directly in his comment. Very few titles are CPU bound these days, and his 10900K will be able to hit 144Hz in 95+% of them. So really, you're saying he should spend $400+ (and hours of time) on a 1-2% practical uplift. Context and use cases are important.
@@badkoolaid22 the 13600k is only 280 on amazon and it is 25% faster than the 10900k. Games like Spiderman Remastered are indeed bottlenecked by the 10900k and even the 11900k
@@davidandrew6855 If I've got a 12700K, is it worth upgrading to a 14700K? Still using a 3080, playing at 1440p with DDR4.
@ude992 That is a decent upgrade. If I remember correctly, the 14700K also has more cores than the 12700K (granted, they are E-cores), but the performance uplift from the P-cores is pretty significant too.
If you were going to sit on the 14700K for at least 3-4 years, I would say that even though it is a dead-end platform, you would enjoy a sizable increase in performance, especially if you upgrade your GPU. You will probably be GPU bound in some games with the 3080 running on a 14700K.
I wish you'd include popular titles like LoL, PoE, D4, etc in your benchmarks as well.
I know it would be a lot of testing, but it would also be nice to see 11 vs 10 with the RAM, DDR4 vs DDR5.
If you are happy with your system, don't go rushing out to upgrade to DDR5. But if you are building a new system, you might as well go with DDR5. While most of the tested titles didn't see much improvement with DDR5, there was a definite advantage, and there's no point going backwards if you're building new.
Would be nice if you had also included "budget" DDR5-6000, but I'm sure it's not worth overpaying for 7200.
Which motherboard was used for the DDR5-7200 kit? I mean, not many boards work with 7200, you know.
All boards we've tested work with 7200.
@@Hardwareunboxed That's good to hear. I've ordered a Gigabyte Z790 Gaming X AX, a 13600K, and a G.Skill 7200 kit. I really hope they work, because I'm from India and my friend is buying the parts from the US, so there's no option of replacing them.
I guess RAM producers aren't very happy with videos like this.
Presumably, RGB is mostly purchased by accident or when rum glitterballing is worth enduring in exchange for securing a lower price compared to an identical component without the Saturday Night Fever effect. The only other reason to use it is if we've forgotten where we put the Ipcress File.
It would be much simpler if manufacturers made all of their stuff incapable of any pointless light show.
So: no improvement at 4K, a 7~15 improvement at 1080p, and 2~7 at 1440p, and only in some games? Lol, not worth it.
The GPU is the bottleneck.
@@mrnicktoyou I take it you say this because there was a difference at 1080p but none at 4K?
I think tracking 0.1% or 0.2% lows is important for RAM testing.
It's also not really enough to say "well, it doesn't matter for newer games anyway because you'll be GPU limited"; a good CPU and RAM will still improve the consistency of frame times even if the GPU is your main limiting factor.
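For context on what the lows actually capture, here is a minimal sketch of one common way to compute 1% and 0.1% low FPS from recorded frame times (my own illustration, not necessarily HUB's exact methodology):

```python
# Minimal sketch of 1% / 0.1% low FPS from frame times (not necessarily HUB's exact method).
def percentile_low_fps(frame_times_ms: list[float], percent: float) -> float:
    """Average FPS across the slowest `percent` of all frames."""
    worst = sorted(frame_times_ms, reverse=True)        # slowest frames first
    count = max(1, round(len(worst) * percent / 100))
    avg_ms = sum(worst[:count]) / count
    return 1000.0 / avg_ms

# Mostly ~145 fps with a handful of stutters sprinkled in (made-up numbers).
frame_times = [6.9] * 990 + [15.0] * 5 + [25.0] * 5

print(f"average fps : {1000 * len(frame_times) / sum(frame_times):.0f}")
print(f"1% low fps  : {percentile_low_fps(frame_times, 1):.0f}")
print(f"0.1% low fps: {percentile_low_fps(frame_times, 0.1):.0f}")
```

Two kits can post the same average while these lows diverge, which is exactly the frame-time consistency point above.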
Wow, you've taken the approach that assumes people still game on Intel CPUs. Amazing, I didn't know that was a thing.
Lmao
Anyone who likes cost-effective gaming has an X3D, be it a 5800X3D or a 7800X3D, which would make a good DDR4 vs DDR5 comparison, even though the 7800X3D would naturally be faster by about 15%. @@nonenothing4412
Using different AMD CPUs to look at memory performance would be an epic fail.
I agree AMD CPU's have superior performance which isn't so affected by RAM vagaries🤣 @@Hardwareunboxed
Hey, you forgot the price of the AM5 motherboard. I'm starting to think you're a troll.
@@eggnoc Sorry, my mistake; I meant to add it. I've corrected it.
What was the DDR5 memory used in this test? Link?
first
Grats. I'm jelly
What timings are the memory kits running?
I know the channel focuses more on out-of-the-box setups, but on a 4090 with a 13700K and tuned DDR4-4000 CL16, plus a tuned Windows install and a tuned CPU (all from free guides), I get a 91 fps average in Hogwarts Legacy (running at full speed through all of Hogsmeade) with 59 fps lows. Some games of course don't benefit that much, but more YouTubers should dive into tuning, because that's a free 19.7% (over the X3D) and 30% (over the 14900K) more at 1080p than these CPUs get at stock, and I paid less than either of them costs (well, the 14900K wasn't out back when the 13700K launched, of course, but you know what I mean).
Nice test. I really do miss at least a few basic productivity benchmarks.
Would love to see some productivity benchmarks
But what kind of timings were used for both the DDR4 and DDR5 kits, besides the (on its own, useless) CL?
Subtimings such as secondary and tertiary timings can make a lot of real-world difference. Properly tuned DDR4 can be faster than 7200-7800MHz XMP, while tuned DDR5 can edge ahead as well. If XMP was used for both, then there is a lot of performance left on the table for both setups.
What's your test setup? The issue is that you can really only viably test this with an Intel CPU, because those support both DDR4 and DDR5. Ryzen CPUs only support one or the other, and the chipsets and boards are just too different to make testing viable. Cheers, good luck! Happy winter time. :)
3:17 I have a 6900 XT SE with a 5800X and get 200 fps at 1440p/144Hz playing Assassin's Creed with a mere 16-20ms of input lag. I wonder what the input lag is here, since so much more data can be processed.
Where I'm at, the price gap between DDR4 and DDR5 boards and the memory itself is pretty vast. My wife got me a 13700K and a DDR4 mobo for Xmas last year, and just the difference in price between the DDR4 and DDR5 versions of the mobo was ~$400. I would've opted for DDR5, but honestly, after some tweaking in the BIOS the CPU performs pretty well with DDR4. Upgraded from a 9900KF and am running a 4080. I definitely feel a decent amount of performance increase going from 9th gen to 13th gen, even with DDR4.
Grats on the 1 mil, about time!
I'm currently playing Far Cry 6 with a 4090 and I don't know how you got those high frame rates. I'm guessing that's without DXR and without the HD texture pack?
Where is Factorio in these benchmarks? :(
I've got a 5800X3D and a 7800 XT. I'm happy with it and don't think I'll be upgrading for a long while.
Why not do the tests with the ram at the same speed?
What GPU did he use for the benchmark?