To be super pedantic, power-limiting is not under-clocking. Under-clocking actually changes the boost behaviour, while power-limiting just truncates the top end of the curve.
My new PC is a seasonal computer. It raises the temp in my room by about 5F. That's lovely in the winter, but during the summer I could really only leave it on for an hour or two at a time before the room was uncomfortably hot. Between my 240 Hz 2K monitor, my RTX 3070 Ti and my i7-12700K, my PC has begun to function more like a space heater than a gaming rig. When I build a new system in 4-5 generations, power/performance is going to be top of my mind. Most of the games I play are retro graphics anyways...
This excessive overclocking reminds me of how a lot of dumb corporate managers try to increase productivity by a few percent by overworking their workers, making them a lot more inefficient at their job, more tired and, as a result, a lot more miserable and easier to 'break'
I am so clueless about any PC stuff and every year it just seems to get more complicated lol. Don't know if I can ever catch up. I do recognize that buying pre-built is paying too much for less, but dang if I would ever get my head around all of this stuff.
Love the video, dead on point on everything! However, I don't think power efficiency was simply not considered. Rather, it was a "silent killer," a feature everybody understood and just... didn't mention. People just picked a mid tier 500 watter and had certainty that it will last them through their PC's lifetime, as long as the GPU remains a mid-tier one. Now this is not the case - you will see the interest in methods and ratings for efficiency itself explode, not just because of energy cost itself, but the fact that most systems DON'T run a 1000W 80+ Platinum triple-verified top brand PSU. They run a 500 watter that came with the damn box. And they won't buy a new supply just for the CPU. I'm extremely glad all the new chips are showing excellence in actual efficiency, so far. With dynamic clocks based entirely on the cooling setup, almost every "benchmark" of this generation is rendered null and void simply because no sane consumer will HAVE such a system as in benches. Whoever isn't doing efficiency benchmarks had better get on it quick.
When playing a game like BF4, my Ryzen 5 3550H uses 7W; the highest I've seen it hit was 15W. The problem is that the performance gain from a higher TDP nowadays is so negligible, yet vendors set their TDPs so high just for those clocks
You somehow always manage to amaze me, great new video 👍 I'm also of the opinion that power consumption has got out of hand. This is why I decided that for future upgrades (currently having a 5800X and RTX 3090) I'm no longer buying the fastest available CPUs/GPUs but will also take a close look at efficiency. It looks like I'll skip the new gen and instead wait for the next one; that way it should be possible to upgrade to a faster setup and even lower the power consumption. Btw, I reduced the power consumption of my 3090 in summer because otherwise it would get too warm in my working room, but in autumn/winter I reset it to its default power target to heat my room while I play instead of using the gas heating 😉
The Ecomode makes this a bit better I think, especially since it boosts to a configurable maximum. But anything higher than 80°C is a no-go in my book because the efficiency drops hard at some point. Boosting for a short time is fine, but boost should not be the norm.
It's not even the temperature that's the problem. Obviously the process node from TSMC does not scale favourably compared to Intel when given more power, which is evident from just dropping down to 105W eco mode and only losing a few % performance while shedding at least 20°C.
As someone who runs an overclocked i5-9600K (default: 4.3 GHz all-core boost, 4.6 single core), I can say they aren't really pushed very close to their limits. When I tried an overclock with an undervolt, I was able to get a stable all-core boost frequency of 4.7 GHz. With a relatively heavy but mostly harmless overvolt, I found 1 core that failed to get past 4.9 GHz, but the average frequency is 5.1 GHz. This is an overclock of roughly 18% and while mine is good, it's not extraordinary. If Intel was really pushing the 9000 series, a 20% overclock with an air cooler and $150 motherboard would be unheard of. I think for a few generations, casual overclocking is going to be a matter of undervolting. How much performance can I keep when I reduce the voltage by 10, 20, 50, or even 150 millivolts? Can I reduce the power draw by 10% and keep 95% of the performance? 96%? 97%? Where is the tipping point?
I predict that we'll see 550+ watt 4-slot AIB 7900 XTX cards that perform close to the 4090. I bet AMD just doesn't want to deal with this power-PR nonsense at the moment. For God's sake, they clocked the card to only 2.3 GHz when everyone was expecting them to hit at least 3 GHz.
@@KonradVarga just speculation. A 3rd 8-pin connector would give it a theoretical maximum of an extra 150 watts. So maybe I went overboard; 500W is more like it
With AMD CPUs you can adjust the frequency/voltage curve by an offset. What that allows is better clocks at the same temperatures. And if you power-limit PBO, you will have stock performance at massively reduced thermals.
I just wanna mention that the way you've set up your GPU power adapter, it is more likely to fail according to tests done by some tech youtubers. They almost never fail when bent up or down, but bending them left/right seems to be making them melt. Generally this may only happen to people who get bad batches, but it could happen to you. I would recommend rerouting the adapter to bend down instead of right!
If we don't stop, we will eventually hit the ac circuit limit of 1500W. And unless it becomes common for computers to be given their own dedicated circuit, it also has to handle monitors and lights. It may also have to share with multiple office or bedroom appliances.
From the overclocking trend to undervolting, this is the world we live in. Ryzen for some reason ships with so much extra power headroom. I tried undervolting my 5600X and got lower overall temperatures and more performance just by undervolting. Went from 88W to 48W while maintaining base clock speed by setting the power scheme to 99%. Got more performance because of how low the temperatures are, so my CPU can finally breathe (from almost 88C to 68C).
@@marceloalejandro8228 The undervolting settings I used were from the Optimum Tech video called "Ryzen 5000 Undervolting with PBO2", and in Windows I lowered the minimum and maximum processor state to 99%
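A quick sketch of the "99% maximum processor state" trick mentioned above, scripted with Windows' powercfg instead of clicking through the power plan UI. PROCTHROTTLEMAX and PROCTHROTTLEMIN are the standard aliases for the maximum/minimum processor state settings; blocking boost this way is a well-known behaviour on Intel chips, while Ryzen may react differently, so treat it as something to experiment with rather than a guaranteed result.

```python
# Sketch: cap the Windows "processor state" at 99% from a script.
# On many systems a sub-100% maximum state prevents boost clocks, which is
# where most of the extra voltage and heat comes from.
import subprocess

def set_processor_state(min_pct: int, max_pct: int) -> None:
    """Apply min/max processor state to the currently active power plan (AC)."""
    for alias, value in (("PROCTHROTTLEMIN", min_pct), ("PROCTHROTTLEMAX", max_pct)):
        subprocess.run(["powercfg", "/setacvalueindex", "SCHEME_CURRENT",
                        "SUB_PROCESSOR", alias, str(value)], check=True)
    subprocess.run(["powercfg", "/setactive", "SCHEME_CURRENT"], check=True)

set_processor_state(5, 99)   # 99% max state: no boost, much lower volts and heat
```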
Intel’s 8th to 10th gen weren’t repurposed server CPUs. They were new consumer dies that were just lengthened to add the extra cores. Intel’s server CPUs used a different core layout, different memory controllers, no integrated graphics, etc.
He's talking about LGA2011 CPUs, even showing a 6950x during that segment. HEDT 2011 CPUs absolutely were repurposed server chips, same as the w3175x was a repurposed Xeon 8280.
Welcome to 2kliksphilip, where we act like AMD is all high and mighty for creating Zen, which was super scalable, while ignoring the practically impossible scalability Intel managed with Skylake, consistently staying ahead of AMD in performance for its entire run. Then once performance became even, Intel was consistently managing better price to performance than AMD was. Because they've managed to stay that way for 3 generations now, Philip needs to make a video saying price/performance isn't important anymore since AMD has staggered behind badly in that regard, and it's efficiency which is key since that's the only metric they've managed to stay ahead at (although even then, undervolting Intel shows exceedingly competitive efficiency regardless). I don't hate AMD, AM4 will probably go down as one of the most legendary sockets of all time, but the AMD bias/shilling on this channel is unreal. I'm completely baffled how he ever managed to get review samples from Nvidia with how shockingly blatant the AMD bias is. If Intel makes good CPUs they're only mentioned in passing, only to then return to the main subject at hand, which is how holy and beloved our saviors AMD are.
Processor coolers are so weird. They were making computers in the 80s or something and were bottlenecked by the temperature of the chip. Then one guy said we should make a tiny little fan that sucks air away from the chip and put it inside the case. First computers didn't have moving parts, and then they did. Like a little mechanical machine inside your electronics. Also, it's literally a free way to increase your processor speed, in a sense
I undervolted my RTX 3080 so that it only draws 250W flat out now, with an imperceptible difference in performance - great for small form factor cases. But I discovered that Intel locked down the 12400F even more than just locking the multiplier, because I can't undervolt it at all. A -30mV offset does absolutely nothing, and -50mV cuts the performance by like 40%, same as setting the power limit to 45W with stock voltage.
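For reference, a 250 W cap on an NVIDIA card can also be applied from the command line as a plain power limit. Note that this is not the same as the V/F-curve undervolt described above (that still needs a tool like MSI Afterburner); below is a hedged sketch wrapping nvidia-smi, which needs admin rights and a limit within the card's allowed range.

```python
# Sketch: hard power cap for an NVIDIA GPU via nvidia-smi.
# This caps board power; it does not change the voltage/frequency curve.
import subprocess

def set_gpu_power_limit(watts: int) -> None:
    """Set the board power limit in watts (requires admin/root)."""
    subprocess.run(["nvidia-smi", "-pl", str(watts)], check=True)

def show_gpu_power_state() -> None:
    """Print current power draw, default limit and enforced limit."""
    subprocess.run(["nvidia-smi", "-q", "-d", "POWER"], check=True)

if __name__ == "__main__":
    set_gpu_power_limit(250)   # e.g. cap an RTX 3080 at 250 W
    show_gpu_power_state()
```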
Buying a new computer now comes with the additional charge of paying for fast internet for giant game patches and being able to afford the electricity bill of constantly running a small water heater.
The thermal issues with the new Ryzen 7000 are because they made the heatspreader too damn thick. They did that to keep the mounting height the same as AM4 so people could keep their coolers, but they really should have made it as thin as possible
Systems are much, much less efficient now as well. I remember back in 2010 I had a laptop I would use at school all day, 8 hours going from class to class, taking notes and browsing the internet. It would sit at about 15% CPU usage on that cheap processor with those loads for 8 hours, and I would usually get home with at least 30% battery life left. Now I would love to find a laptop that would last half that time under that load without completely draining its battery. It really feels like technology is regressing.
Tbh, current software is also super bloated. I think newer CPUs are much more power efficient, they just have a lot more things to do in the background constantly. My laptop, an Acer Swift X, lasts about 5 hours on battery on Windows when idle, while on Linux that goes up to 24 hours idle. The entire computer uses 2.5-3 watts on a 54Wh battery when idle, so I guess you just gotta look for more efficient laptops
I have to say my new 13900K is using a bit less wattage than my old 10900K in daily usage/gaming. If I stream/record and play GTA with 80-99% GPU usage, my CPU only runs at around 100-120W where my old 10900K with the same settings/GPU sucked 150-200W; the 13900K only heavily uses 1-2 threads for this workload. Same results in many different daily scenarios, as the CPU hardly ever gets fully utilized. When doing productivity-based workloads though... oh well, let's just say I am glad I have a 1000W Platinum PSU
I undervolted my first good processor, the i7 6700k, and I currently undervolt my Ryzen 5600x. I'm gonna keep doing it, not only to save money but because I mostly play low-graphics or old games, aside from currently Doom Eternal and some day Elden Ring. I don't think I'll ever need such performance.
One thing I despise about many current techtubers is that they test CPUs with unrealistic stress tests and then concentrate only on that to evaluate a CPU's efficiency, temp range and power usage, plus that sweet sweet clickbait title/thumbnail. In fact ~95% of people won't stress top-tier CPUs such as the 12900K/13900K/7950X at 100% on a daily basis. They need a better testing methodology for today's CPUs, which are way smarter at managing their power/clockspeed/voltage limits. Prime95 and even Blender tests do NOT reflect real usage statistics; sure, they can be supplementary, but you need at least 2 different workloads to get a reasonably accurate idea of how much power those CPUs will draw.
Reviewers should account for power usage by including it in the price when comparing price to performance between CPUs. I find it weird that that hasn't happened yet.
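As a rough illustration of what folding running cost into price-to-performance could look like; every number below (price, wattage, hours, electricity rate) is an assumption for the sake of the example, not a measurement.

```python
# Toy calculation: effective cost of ownership = sticker price + lifetime energy.
def lifetime_energy_cost(avg_watts: float, hours_per_day: float,
                         years: float, price_per_kwh: float) -> float:
    kwh = avg_watts / 1000 * hours_per_day * 365 * years
    return kwh * price_per_kwh

sticker_price = 450.0                                   # assumed CPU price
energy = lifetime_energy_cost(avg_watts=90, hours_per_day=4,
                              years=5, price_per_kwh=0.35)
print(f"Sticker: {sticker_price:.0f}, energy over 5 years: {energy:.0f}, "
      f"effective: {sticker_price + energy:.0f}")
# 90 W for 4 h/day over 5 years at 0.35/kWh is ~657 kWh, roughly 230 in electricity.
```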
I am amazed that even though technology is evolving every year, power consumption is increasing instead of decreasing. How the hell is that an improvement? The 4090 has a 450W default TDP and can easily be pushed past 500W. This is stupid. Companies need to understand that we don't game at 8K. Even though they are trying to maximize the potential for gaming at 4K, they should not do this at the cost of higher power consumption. Every generation we are seeing massive jumps in power consumption. It's like the architecture and technology implemented are not helping, but the GPUs and CPUs are instead forced to work faster by feeding them more power.
My biggest problem is how much power my CPU uses when idle. I have a 3600X and it uses over 30 watts at idle, which is insane; my laptop only uses like 1 watt when idle. When I undervolt it it uses less, but then it is always at a locked frequency. I wish there was an option to change the frequency and voltage only when doing heavier tasks; I don't want my CPU to use 30 watts when I'm on Twitter.
Overclocking made sense in Core 2 Quad times: buy a new cooler, change 1 number in the BIOS, gain a stable 50% performance boost - but mainly because CPUs weren't doing it themselves. Nowadays it's pointless, and in fact sometimes things are boosting into crashing themselves (mostly graphics cards), so it's not necessarily better. Games or hardware, I don't want to spend hours installing mods to fix a game or its UI, or figure out new numbers and abbreviations in the BIOS; it should just work out of the box, tuned for the best value and longevity for the money.
0:00 "Something strange is happening in the world known as earth. At a time when we're increasingly being told to save the earth, our corporations want to destroy more of it than ever before! Then again, corporations have always been a bit... hungry."
I like to undervolt to keep the same performance at lower power draw, or even do a small overclock plus undervolt to gain performance without more power. The original Ryzen 1000 series boost was actually bad. I think something went wrong and they released it with boost that doesn't work properly. It was fixed in the Zen 1 laptop chips.
First thing I did with my 12700 was to lower the max draw to 105W (from 125W sustained and Unlimited peak for short durations). For my workloads I've lost about 200 MHz or so (from 4.5GHz short duration to ~3.9 GHz long to now ~3.7GHz). It's barely noticeable. What was noticeable however is how much more efficient the 12700 is when idling compared to my previous Zen 2 system. Those chiplets suck up power like no tomorrow just to keep idling.
You should try to bring clock speeds down for maximum power savings. For the same work, 4 GHz consumes roughly 4x more electricity than 2 GHz, and 16x more than 1 GHz. If you can bring clock speeds down even a little, it has quite a small impact on performance while the power drops off much faster than the clocks do.
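The 4x/16x numbers above come from a simple rule of thumb: dynamic power scales roughly with voltage squared times frequency, and voltage has to rise with frequency, so energy for a fixed amount of work grows roughly with the square of the clock. The sketch below is a toy model under exactly those assumptions (it ignores leakage and the flat voltage floor real chips have at low clocks), so treat the numbers as illustrative.

```python
# Toy model: energy per fixed amount of work vs clock speed,
# assuming P ~ C*V^2*f and V roughly proportional to f.
def energy_for_fixed_work(freq_ghz: float) -> float:
    voltage = freq_ghz                 # crude V-proportional-to-f assumption
    power = voltage**2 * freq_ghz      # ~ f^3
    time = 1.0 / freq_ghz              # same work finishes faster at higher clocks
    return power * time                # ~ f^2

for f in (1.0, 2.0, 4.0):
    print(f"{f:.0f} GHz -> {energy_for_fixed_work(f):.0f}x the energy per unit of work")
# 1 GHz -> 1x, 2 GHz -> 4x, 4 GHz -> 16x: quadratic rather than exponential,
# but still a big win from even modest clock reductions.
```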
Overclocking is no longer a trend. I see so many more threads nowadays on how best to undervolt for minimal performance loss. I am currently using my 7950X with the 65W eco mode. I lose 4 fps in my favourite game.
I mean to be fair in games even the 13900K doesn't use that much power. My 12600K generally uses 40-60 watts in games even though its power limit is 150.
@@lycanthoss Intel usually uses less power during gaming & idling than AMD does. It's just all-core workloads that kind of blow the power budget.
@@MLWJ1993 the 12600k always uses more power than the amd 5600x and 5800x in the comparisons I could find; like 10% more at idle and up to almost double under load.
@@henrym5908 Interesting, the charts from Igor's Lab suggest the 12600K equals the 5600X's power consumption (on average), but the 12700K & 12900K use less power than the 5600X in their tested games (at all resolutions, actually).
Idle I see power consumption of ~5W on 12600K vs 9W on 5600X on Tweakers.
Although if we compare the two it doesn't make a huge difference until we look at all-core workloads, where Alder Lake pushes for much more power.
Don't forget that AMD has an advantage over Intel due to their lithography process, which is smaller, so less power consumption - but you can still make it power hungry.
Fortunately other popular gaming devices, like handhelds, are pushing in the other direction. I know they don't have a choice, but it's still mindblowing to me what level of performance I'm getting with my Steam Deck for only 15W TOTAL CPU + GPU (tops out at around 25W when we account for display consumption and other internals).
25W? That's less than my CPU + GPU at idle.
@@LutraLovegood I know right 😄 My desktop also uses more than that idle.
And some laptops that are designed for long battery life also do that (although with some vendors and laptop models, the U-series APUs are still allowed to boost up to 30W if the cooling system is good enough)
@@LutraLovegood I have a thin and light laptop with an r7 6800u and it is insane the level of performance it gets for the power draw. the cpu performs between a desktop 3700x and 5700x and has an extremely competent igpu capable of comfortably playing new aaa games at 1080p medium to low, all in a 25w package. cpus really don't need to be drawing 300w
You've seen a british man try to chow down a steam deck, now you see him trying to bite through a cpu.
I wish tech press would focus on power efficiency more, gamers nexus are already starting to test this sort of thing which is great, but it's not really done as a standard test which sucks.
edit: realised hardware unboxed also seem to be starting to test power efficiency too :D
yeah ARM chips are the future and gamers are not going to let them win
@@koko-fv2zr I'm hoping RISC-V overtakes x86 in the next decade or two
Apple is already doing this.
Tbf power consumption is more of an overheating problem than anything else. The more power it consumes, the hotter it gets.
@@koko-fv2zr ARM chips have nothing inherently more power-efficient about them. They just happened to have found that niche at the moment.
Power envelope for desktop CPUs was a kind of prisoner's dilemma. As soon as either brand moved up, the other brand had to move up or lose. I'm impressed it stayed almost the same from 2010 to 2020, honestly. The "good" news is that they can only really do this once, and then they have to improve IPC/efficiency/process/etc to once again realize gains each generation. And they've now done it. This means the next chipset generations will have to have architecture or process improvements to raise performance.
Good point about the prisoner's dilemma. But I think there is no good news about it. After they both "defected", the CPU power consumption for desktop will probably stay high forever. It will perhaps never go back to the more sensible 100W.
It would probably only work if they all signed a mutual agreement.
@@cube2fox you can always limit how much power the CPU uses, it's just that now there is more headroom for people who don't care. Personally both my CPU and GPU are undervolted, and they perform better than stock and quieter.
@@Azzy_Mazzy Yeah that's good. Although few will know how or choose to do this given that it isn't the default.
@@cube2fox I know with Ryzen 7000 CPUs you can just enable Eco Mode and they run way cooler, idk if Intel has something like that
"I'm impressed it stayed almost the same from 2010 to 2020, honestly."
It's because AMD couldn't compete with Intel until after they got Zen going, so it was basically a decade+ of Intel being able to do whatever they wanted with no risk. Once AMD had good CPUs again, Intel had to get off their ass and start pushing their CPUs to compete again. There's nothing to be impressed by, except maybe Intel's anti-competitive tactics that held AMD back I suppose.
Yeah it has gotten a little out of hand.
Also for GPUs; the RTX 4090 really should have been a 350 watt card but Nvidia wanted that last 2% of performance.
Before 12th gen, Intel’s official power draw was reasonable. But they allowed Z-series motherboard vendors to set unlimited power limits by default, which got progressively worse as they added more cores.
A little? I'm annoyed by this development since the release of (so far) useless tech like raytracing.
@@suoyidl2654 Ray tracing isn't useless tech, especially on high-end cards where you can max it out and still have good performance. I would say ray tracing was useless when it was first introduced in the 20 series, because only the 2080 Ti could run it reasonably and it didn't make a noticeable difference in visual quality in games like Battlefield 5 and Shadow of the Tomb Raider, and it still halved the frame rate (plus DLSS 1 was terrible).
But now, ray tracing is a step up in visual fidelity in many games such as Cyberpunk, Control, Metro Exodus EE, Dying Light 2, Spider-Man, etc.
Well, the 4090 doesn't draw that much in usual gaming scenarios in my experience.
@@simptrix007 Do you cap the fps? Because most people don't.
@@lynackhilou4865 When in a blind test users are unable to tell ray traced from rasterization unless they're trained to do so and concentrate, I'd call it useless.
While you have alluded to this I think it's important to remember that power consumption on its own doesn't tell you if a given CPU is more efficient than another CPU.
For example if you were to compare a CPU with a 60W power limit and a CPU with a 100W power limit, but the 100W CPU is able to complete its tasks twice as fast, then it is actually more efficient. Efficiency is measured in the amount of Wh required to perform a given amount of work; power consumption is only a part of this metric (see the worked example below).
It's also worth considering that a faster CPU can allow you to get more work done in the same amount of time and if you're using your PC to make money then it's possible that those time savings will end up paying for the inefficiency of the CPU.
Regarding the low amount of power pulled by a single core: it has less to do with diminishing returns in terms of performance and more to do with the fact that there's a limited amount of power you can push into a single core before you can't cool it due to thermal density. This was less of an issue in the past because the cores weren't as dense as they are today due to how much the transistors have shrunk.
Finally you have to consider how big of an impact the difference in CPU efficiency will really have on the overall amount of power all of your devices consume (this also heavily depends on your usage as you have to have a workload that will actually stress the CPU before you start seeing those very high power consumption numbers, gaming in particular is known to not stress CPUs all that much). 100W sounds like a lot especially when it comes to CPUs but its impact on the number of Wh you end up consuming could be so small that it's within the margin of error.
Of course I'm not saying that it's not worth it to consider reducing power limits and ideally also undervolting CPUs/GPUs especially since drawing less power also makes cooling a lot easier and could save you a significant amount of money in cooler costs.
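To put numbers on the 60 W vs 100 W comparison above, here is a minimal worked example; the wattages and runtimes are invented purely to illustrate energy-per-task versus instantaneous power.

```python
# Energy per completed task (Wh) is what efficiency actually measures.
def energy_wh(watts: float, seconds: float) -> float:
    return watts * seconds / 3600

slow = energy_wh(watts=60, seconds=200)    # 60 W CPU takes 200 s for the job
fast = energy_wh(watts=100, seconds=100)   # 100 W CPU finishes the same job in 100 s

print(f"60 W CPU:  {slow:.2f} Wh per job")   # 3.33 Wh
print(f"100 W CPU: {fast:.2f} Wh per job")   # 2.78 Wh - higher power, less energy
```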
Just yesterday I actually put my 3700X in Eco-Mode, and noticed it was still clocking itself to 4.3 GHz on single cores in games, 100 MHz lower than normal. The only time it really noticeably throttled itself was at full core loads, where the processor capped all its cores to 3.8 GHz. But as a result, my thermals never reached higher than 58 degrees, as compared to PBO mode where my CPU would eventually reach 86 degrees under synthetic load, or even standard mode where it would reach 71 degrees.
I admittedly didn't check the power consumption stats, but I imagine they were also significantly lowered. And then you uploaded this video, letting the world know that it's okay to power limit your CPU if your power draw and thermals are significantly lowered, with a near-single-digit impact on performance. You really laid out why it's more beneficial than keeping it stock. I'd add that if you purposefully play eSports games at 480 Hz and high resolutions, then maybe it can be argued you'll need that extra performance for some games. But that's a small percentage within the enthusiast PC gaming community who mainly use 120-165 Hz, which is already a small portion of the PC community as a whole. The vast majority of people will pretty much only benefit from power limiting/underclocking their processors.
I'm just glad that power consumption is getting more and more attention.
Only like a solid half decade late but yea ig
I bought a 5900X as my CPU and underclocked it to use about 40W. It's still about 10x faster than my old processor, and I now have a CPU that barely needs cooling. With my (older - I'll upgrade when GPU prices go down to the level of sanity) GTX 980 frame-limited during gaming, I'm genuinely unsure if my gaming computer even uses its fans. The quality of life upgrade was massive, and turning the computer on without any sound besides the monitors turning on still makes me smile.
I'm glad to see a video like this. I just upgraded my computer of 6 years, and honestly for playing the games I do like FF14, or Age of Empires 2 -- I wouldn't mind more energy efficient parts.
I'm at a weird point where I want to upgrade more, because building computers is fun! But I don't need performance; I could easily see myself going down the micro-build, more efficient route. It'd be really neat to build a computer that has the same performance as my 6 year old rig, but is physically half the size or puts out way less heat.
Congrats on being featured in the newest LTT video 👍 Been following you for years by now and I was very surprised to see the LTT staff mentioning your videos
AMD or Intel could just say "Imagine this scenario: a hot summer, in your room, all windows closed, you're boiling" or something more PR. Also say "Spend less on bills, and more on games" or something, idk. That would sell more than "hurr durr spend 300€ on motherboards"
That would sell a lot of AMD GPUs. Don't die of heatstroke while gaming, get a Radeon instead.
And less heat = lower fan speed = less noise. It's a win-win-win.
No it wouldn’t
Probably the most insidious part of this race-to-the-bottom is all the hidden ecosystem costs. Spend more on a cooler, spend more on case/ fans, spend more on power supply... the current dearth of affordable motherboards is partly due to VRM costs. All taken into account you may be spending $100 or $200 more on a build because team purple decided to jump off the performance per watt cliff for 4% higher performance.
As someone who has a 5L case and is very limited on cooling, when I looked at Intel's slides, what piqued my interest was the 65W matching the previous gen's performance at 240W, because in my case, power efficiency is king. Things aren't as ridiculous as they seem with the high power draws on display, because while they technically allow you to suck 400W or whatever, you don't need that. You still get much better performance than the previous generation on the same small cooler. Performance per watt is still increasing.
@@2kliksphilip Testing both at 65 W is definitely good, but so is showing what power level the older gen draws compared to the new one. Both make sense really, especially since the performance gained per extra watt isn't constant: for example, 65 W might give X multicore performance and 66 W X+Y, but 67 W might only give X+1.5Y rather than X+2Y.
@@2kliksphilip very true, and this should be normalised. Maybe the standard should be a graph that tests performance every 5 watts (so 10, 15, 20 and so on), at least for the high-power CPUs, and then plots it. If we just test at 65W, many low-power ones won't even show up, so maybe they could test per watt instead.
I've been looking at SFF cases and been pondering going with Intel next time I build; I'm fucking shocked how high wattage has gotten.
Like, I went from a 4790K which idles at like 3 watts, to a 5800X which idles at 45 watts. This can't continue.
@@quantum5661 I agree. Efficiency should be more important, although 45W idle is very high. I bet you could drop that with undervolting and one of the auto frequency modes like PBO
@@quantum5661 and if you want efficiency, don't go with Intel. They used to be the efficiency kings but that changed like 2 years ago. Intel has gotten desperate trying to keep up with AMD's performance, so they pump in lots of watts to get good performance. Also, AMD's socket is more likely to last much longer, so you have an upgrade path later on with AMD's motherboards
This is very important to me as someone living in a hot climate in a thermally constrained room! Before I lived here, I had more consistent A/C so I didn't need to worry as much about thermals. Now I almost only ever use laptops and game consoles because, despite enjoying the customizability of pc towers and pc gaming, heat output is almost always higher than the other options, which just doesn't make sense in my situation. The recent trends in desktop computing haven't been reassuring either, with the ever higher TDPs. Glad this is getting more attention so someday I can return to computer towers!
In summer it will be impossible to run a computer without air conditioning in the room, especially here in South America.
The heat accumulates in the room. It is cheaper to cool the room using an exhaust fan (50 watts) rather than an air conditioner (1 horsepower, ~1000 watts). A person emits 100 watts of body heat, but a PC can emit 5x that, so it is as though the room is crowded with people.
I'm feeling it here in Brazil... 30°C every day. Without AC I don't even open games, I just use the PC normally because I can't stand such heat, and I can't use the AC most of the time because energy bills are at an all-time high right now
I love the little small print in the thumbnail
When tech reviewers compare CPUs' price to performance, they should include the cost of average power draw for average usage over the average lifetime of the CPU, just to help people understand what they will really pay to buy and run that CPU. And of course it would give hardware manufacturers a big incentive to prioritise power consumption.
Nah
I don't see the power trend dying out.
People were always willing to spend extra money for little performance gains.
It's not gonna stop just because their power bill has an additional 0.
I feel Philip is good at predicting trends in the hardware market be it about the companies or users.
@@2kliksphilip The rough side of being able to analyze data.
Don't let people without that ability get in your way or ruin the fun you have with it.
A good tip is to undervolt your CPU. You'll maybe see a 5-10 percent performance loss, but a massive drop in power consumption.
The CPUs are very efficient, it's just that the manufacturers crank them to the max (more or less)
Excellent perspective and points.
@@2kliksphilip Of course! This is especially useful for me as I'm looking to upgrade after a long time out of the market. I would not have easily come across the idea of intentionally throttling performance to increase efficiency. I'll need to take this into account while doing cost-effectiveness assessments, as well (unless you're going to do that anyway 😅, in which case I can wait). It seems like there's already enough data out there from benchmarks to see which power level is most GHz/watt and then we can just divide price by that value.
I noticed the upscaling on the graphs, thank you mr Klik
Glad to see you talking about this.
I completely agree with this. One of my favorite PC builds was the LTT energy saving PC; I wish someone would do one like that at different price points every year, like the Jean-Baptiste Show does a 500€ PC build every year
i havent owned a good computer in my entire life but i still find these videos extremely interesting!
I was running my Ryzen 3600 at 4500 MHz all-core, up from the base clock of 3.6 GHz. So there was overclocking headroom in the low-mid range; the high end was saturated. I think this distinction should be pointed out. Now we are seeing even the lower end ones being close to max.
Hi daddy Philip
I agree, but is there a part in the video?
I've been undervolting gpus since 2016 and cpus since 2017 and getting 20-30% lower power and temperatures for a loss of 0-4% performance.
You can just leave the default settings, but instead tweak the "Processor Power Management" in Windows 10/11. Set "Processor performance increase threshold" to 95%, "Processor performance decrease threshold" to 94%, and "Processor performance core parking min cores" to 100%. You will get the lowest power usage without limiting performance at peak load. (Notice that I disabled core parking. This is not a mistake; it is better to have all cores on standby and throttled down than to let them get parked.)
This seems very interesting. I don't know enough about this, so I'm boosting this comment up so people more knowledgeable than me can comment on it. Thank you
clever
@@earthling_parth there is a forum thread titled "Windows power plan settings explorer utility" on Guru3D; a lot of discussion about these settings there.
this basically disables turbo boost, which makes it run at base frequency; you are basically running the 13900K at 3.0 GHz
@@puccionicolas7763 these 3 settings are not turbo settings.
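For anyone who would rather script the settings from the comment above than dig through the power plan UI, here is a hedged sketch using powercfg. The aliases PERFINCTHRESHOLD, PERFDECTHRESHOLD and CPMINCORES are my assumption of the names behind "processor performance increase/decrease threshold" and "core parking min cores"; they are hidden settings, so confirm them with `powercfg /qh` on your own machine before relying on this.

```python
# Sketch: apply the suggested Processor Power Management values via powercfg.
import subprocess

def set_proc_setting(alias: str, value: int) -> None:
    """Write one processor power setting for both AC and DC (battery) modes."""
    for cmd in ("/setacvalueindex", "/setdcvalueindex"):
        subprocess.run(["powercfg", cmd, "SCHEME_CURRENT", "SUB_PROCESSOR",
                        alias, str(value)], check=True)

set_proc_setting("PERFINCTHRESHOLD", 95)   # performance increase threshold 95%
set_proc_setting("PERFDECTHRESHOLD", 94)   # performance decrease threshold 94%
set_proc_setting("CPMINCORES", 100)        # core parking min cores 100% (parking off)
subprocess.run(["powercfg", "/setactive", "SCHEME_CURRENT"], check=True)  # re-apply plan
```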
And people say Bulldozer was too hot when it was 125w
The most I REALLY cared about power consumption was "Will I have to buy a bigger power supply?", so yeah... I think we kinda are part of the problem
This whole thing is so overblown.
99.9 percent of people will not be running their cpu at 100% all day
If you set a 13900K desktop to Windows power saver, over 7 hours it uses an average of 15 watts with three browser windows (Twitch/YouTube, news, Discord)
Set it to max power and play CoD/OW - 70 watts average over 4 hours.
Everyone needs to calm tf down.
@@2kliksphilip But people judge the 7950X and 13900K for exactly that (because they don't understand, and you would need to explicitly explain the power draw of typical use). When you actually use it, it uses a fraction of the power they would expect.
When the 13900K and 7950X produce the best frames, you most certainly will "get the 13900k" - well, gamers will
This pearl clutching over power draw is just clickbait - And not just from you, the entire community is just gorging itself on this.
I don't hate the game, just what the game does to general community knowledge.
(for either companies processor)
@@2kliksphilip Yes, but when you are using them for PRODUCTIVITY, the cost of power is meaningless compared to the time it saves you by doing work faster. I don't get paid to play CS at 800 FPS, but I do get paid if I can compile code faster. Like when Disney makes a new movie with tons of CG, do you think they choose a render farm that is the most power efficient, or one that is the fastest? It's obviously the latter, because time = money, power = far less money.
What about an experimental episode where you port old BSP maps (Quake 3/GoldSrc) to Unreal's Lumen or even to Blender Cycles (offline)? 🙏
I've already tried this, and the results were just fantastic 🤩
It would be a follow-up to your great comparison between CS:GO and the new Half-Life: Alyx engine
The tinkering hasn't necessarily gone away, per se. It used to be that we'd overclock our CPUs; now we undervolt them. Same goes for GPUs. Personally, however, I do feel that this is a good thing if you look at the bigger picture. Most people don't overclock or really tweak their CPUs and GPUs, so they benefit from the companies that design these chips doing this from the factory, with safety parameters in place to make sure nothing is damaged.
Undercoating can Aldo describe stabilu
@@timeTegus are you okay mate? seriously i'm genuinely concerned, i hope you're not having a stroke
CPUs are pre-overclocked these days. It's cool though that consumers don't have to overclock their CPUs anymore if they just want to eke out the most juice.
If you're good with your power plans, you can underclock your CPU in seconds.
@@JessicaFEREM that can make the system unstable
@@timeTegus Overclocking also can make the system unstable. People used to ride the line of stability and then dial it back a little for performance.
I undervolted my R5 3600 and still got a fair bit of performance lift from overclocking. Processors are weird sometimes.
Nothing weird about this. Most processors have some headroom for pushing the voltage-frequency curve. So you have the option to just decrease the voltage, just increase the frequency, or do a bit of both.
That is expected behavior. Your chip wasn't power limited, so by undervolting you caused it to run cooler, which in turn makes it boost higher.
Undervolting can yield impressive results on that front with no or very little loss in performance
I'm perfectly happy with more performance even at greater power consumption, since we still have the option of power limiting any CPU
100% agreed. I even underclock and undervolt my phone's SOC and memory for better battery life (ROG phone 5).
One of the reasons i like the new m1/m2 apple devices. they have really good performance but are also very efficient. Maybe its "only" 70% the performance of the top of the line intel CPU but at 10% of the electricity
This is why people need to start tuning their CPUs and set a power limit
Perhaps benchmarks should be done at standard power levels, showing how efficient each chip is when given the same amount of power to work with.
The issue here would be finding common testing ground, I imagine. Different generations can have wildly different efficiency points. Maybe a processor could be 20% faster while only consuming 10% more power, which would make it more efficient overall, but how do you test that efficiently? One could just run dozens of time-consuming tests for every possible power level, each with dozens of benchmarks.
While I agree this would be great, and the manufacturers definitely already do this internally, the non-existence of automated testing systems makes it impractical for consumers and reviewers (at the moment), I fear.
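The sweep itself is scriptable, at least on one platform. A rough sketch under stated assumptions: a Linux machine with Intel RAPL exposed through the powercap sysfs interface (path below is the usual package domain, but check yours, and writing it needs root), and a benchmark of your own that prints a score. "my_benchmark" is a placeholder, not a real tool.

```python
# Hedged sketch of an iso-power efficiency sweep: set a package power limit,
# run a benchmark, record points-per-watt. Swap the placeholder benchmark for
# Cinebench, Blender, a game benchmark, etc.
import subprocess
from pathlib import Path

RAPL_PL1 = Path("/sys/class/powercap/intel-rapl:0/constraint_0_power_limit_uw")

def run_benchmark() -> float:
    # Placeholder command -- replace with something that emits a numeric score.
    out = subprocess.run(["my_benchmark", "--quiet"],
                         capture_output=True, text=True, check=True)
    return float(out.stdout.strip())

results = []
for watts in range(65, 254, 20):                  # 65 W .. 245 W in 20 W steps
    RAPL_PL1.write_text(str(watts * 1_000_000))   # RAPL takes microwatts
    score = run_benchmark()
    results.append((watts, score, score / watts))

for watts, score, eff in results:
    print(f"{watts:>4} W  score={score:>8.1f}  points/W={eff:.2f}")
```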
Idea: bring back the turbo button! Back in the day, turning the turbo button off slowed the CPU down (often by disabling its onboard cache or lowering the clock) so old software would still run at the right speed... Let's bring it back in the form of a button that unlocks the CPU's higher power limits on demand. If I'm just surfing the net, I may want my CPU limited to 65 watts. But if I fire up a game, let me punch my turbo button, unlocking my 170 watt limit and letting the CPU clock up.
just change the Windows power plan; you can even add a little shortcut to your taskbar and run it using WINDOWS KEY + 1, 2, 3, etc.
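One possible way to get that "turbo button" feel on Windows is a tiny script that flips between two power plans with powercfg. The GUIDs below are the well-known defaults for Power saver and High performance, but OEM images sometimes differ, so verify them on your machine with "powercfg /list" before relying on this sketch.

```python
# Hedged sketch: toggle between Power saver and High performance power plans.
# Pin the script to the taskbar and hit Win+<number> to flip modes.
import subprocess

POWER_SAVER      = "a1841308-3541-4fab-bc81-7f665f2c6c0d"  # verify with powercfg /list
HIGH_PERFORMANCE = "8c5e7fda-e8bf-4a96-9a85-a6e23a8c635c"  # verify with powercfg /list

def active_scheme() -> str:
    out = subprocess.run(["powercfg", "/getactivescheme"],
                         capture_output=True, text=True, check=True)
    return out.stdout

def toggle() -> None:
    # If we're currently on Power saver, jump to High performance, and vice versa.
    target = HIGH_PERFORMANCE if POWER_SAVER in active_scheme() else POWER_SAVER
    subprocess.run(["powercfg", "/setactive", target], check=True)

if __name__ == "__main__":
    toggle()
```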
Moving into my first house, I am practically paranoid about energy consumption, simply because I live in a very HOT and HUMID climate for like 8 months out of the year, and things that use power make heat, which my A/C uses more electricity to remove. Rather than building my own router out of an old PC, I'm getting an off-the-shelf router that's ARM based and uses barely any energy. A space heater on full power (in the US) uses 1800 watts, so even just 18 appliances using 100 watts each is like having a space heater running in your house while it's burning hot outside and your A/C struggles to keep up. But a computer using over 1000 watts on top of everything else? Nope, I will pass.
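The rough arithmetic behind that point: every watt a PC dissipates indoors is heat the air conditioner has to pump back out. The numbers below are illustrative assumptions, not figures from the comment (a COP of 3 is my own guess for a typical A/C unit).

```python
# Back-of-the-envelope: electricity used by the PC plus the extra A/C electricity
# needed to remove the heat it dumps into the room. All inputs are example values.
pc_draw_w   = 1000      # gaming PC + monitor under load (assumed)
hours_a_day = 4         # assumed daily gaming time
ac_cop      = 3.0       # assumed: watts of heat moved per watt the A/C consumes

pc_kwh = pc_draw_w * hours_a_day / 1000
ac_kwh = pc_kwh / ac_cop                 # extra electricity just to remove that heat
total  = pc_kwh + ac_kwh

print(f"PC energy:        {pc_kwh:.2f} kWh/day")
print(f"Extra A/C energy: {ac_kwh:.2f} kWh/day")
print(f"Total:            {total:.2f} kWh/day (~{total*30:.0f} kWh/month)")
```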
The 13900k is insanely energy efficient while extremely performant. Just not at stock.
The 4090 is insanely energy efficient while extremely performant. Just not at stock.
Both of these shipped extremely overclocked, well into diminishing returns. They both should have their power draw limited. It's time for reviewers to push energy efficiency, and it's time for these manufacturers to make it easy to switch between "performance" and "efficient" modes for the typical user, and to show the impacts of it.
To be super pedantic, power-limiting is not under-clocking.
Under-clocking actually changes the boost behaviour, while power-limiting just truncates the top end of the curve.
My NH-D15S cools my 13900k just fine; I also put a contact frame on it to shave off some more heat.
My new PC is a seasonal computer. It raises the temperature in my room by about 5°F. That's lovely in the winter, but during the summer I could really only leave it on for an hour or two at a time before the room got uncomfortably hot. Between my 240Hz 2K monitor, my RTX 3070 Ti and my i7-12700K, my PC has begun to function more like a space heater than a gaming rig. When I build a new system in 4-5 generations, power/performance is going to be top of mind. Most of the games I play have retro graphics anyway...
This excessive overclocking reminds me of how a lot of dumb corporate managers try to increase productivity by a few percent by overworking their workers, making them a lot less efficient at their jobs, more tired and, as a result, a lot more miserable and easier to 'break'
@@youtubeisgarbage900 What does this have to do with communism, zombie.
@@youtubeisgarbage900 Is everyone who thinks overworking people to death is bad - is a communist in your book?
@@youtubeisgarbage900 I can add one more: Japan
Yup. This is the reason why I have not upgraded and hope for the next generation to be more power sensible. Both for GPUs and CPUs.
GPUs 10 years ago: d r i n k
CPUs 10 years ago: chilling
CPUs and GPUs today: *d r i n k*
I am so clueless about any PC stuff and every year it just seems to get more complicated lol. Don't know if I can ever catch up. I do recognize that buying pre-built is paying too much for less, but dang if I could ever get my head around all of this stuff.
Just be patient and don't overthink things. Set a budget and a goal, then start looking at reviews of products that meet your budget
I prefer to undervolt rather than underclock; reduces my power consumption and heat output while simultaneously providing more performance!
Love the video, dead on point on everything! However, I don't think power efficiency was simply not considered.
Rather, it was a "silent killer," a feature everybody understood and just... didn't mention. People just picked a mid-tier 500-watter and had certainty that it would last them through their PC's lifetime, as long as the GPU remained a mid-tier one.
Now this is not the case - you will see interest in methods and ratings for efficiency itself explode, not just because of energy cost, but because most systems DON'T run a 1000W 80+ Platinum triple-verified top-brand PSU. They run the 500-watter that came with the damn box. And they won't buy a new supply just for the CPU.
I'm extremely glad all the new chips are showing excellent actual efficiency, so far. With dynamic clocks based entirely on the cooling setup, almost every "benchmark" of this generation is rendered null and void simply because no sane consumer will HAVE a system like the ones in the benches. Whoever isn't doing efficiency benchmarks had better get on it quick.
Very good overview
When playing a game like BF4, my Ryzen 5 3550H uses 7W; the highest I've seen it hit was 15W. The problem is that the performance gain from a higher TDP nowadays is so negligible, yet vendors set their TDPs so high just for those clocks
You somehow always manage to amaze me, great new video 👍
I'm also of the opinion that consumption has got out of hand. This is why I decided that for future upgrades (currently having a 5800X and RTX 3090) I am no longer buying the fastest available CPUs/GPUs, but will also take a close look at efficiency. It looks like I'll skip the new gen and instead wait for the next one; this way it should be possible to upgrade to a faster setup and even lower the power consumption.
Btw. I reduced the power consumption of my 3090 in summer because otherwise it would get too warm in my working room, but in autumn/winter I reset it to its default power target to heat my room while I play instead of using the gas heating 😉
Steve's thumbnail face at the end is just *chef's kiss*
The Ecomode makes this a bit better I think, especially since it boosts to a configurable maximum.
But anything higher than 80°C is a no-go in my book because the efficiency drops hard at some point.
Boosting for a short time is fine, but boost should not be the norm.
It's not even the temperature that's the problem; obviously the process node from TSMC does not scale favourably with extra power compared to Intel, which is evident from just dropping down to 105W eco mode and only losing a few % of performance and at least 20°C in temperatures.
i like the music in your videos
I just bought a 750 watt PSU. I think I might've gone a little too low on that one.
As someone who runs an overclocked i5-9600K (default: 4.3GHz all-core boost, 4.6GHz single core), I can say they aren't really pushed very close to their limits. When I tried an overclock with an undervolt, I was able to get a stable all-core boost frequency of 4.7GHz. With a relatively heavy but mostly harmless overvolt, I found one core that failed to get past 4.9GHz, but the average frequency was 5.1GHz. This is an overclock of roughly 18%, and while my chip is good, it's not extraordinary. If Intel was really pushing the 9000 series, a 20% overclock on an air cooler and a $150 motherboard would be unheard of.
I think for a few generations, casual overclocking is going to be a matter of undervolting. How much performance can I keep when I reduce the voltage by 10, 20, 50, or even 150 millivolts? Can I reduce the power draw by 10% and keep 95% of the performance? 96%? 97%? Where is the tipping point?
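One way to find that tipping point is to log a few measurements at each undervolt step and compare performance retained against power saved. A minimal sketch of the bookkeeping; the sample numbers below are made-up placeholders purely to show the shape of the calculation, so substitute readings from your own chip.

```python
# Hedged sketch: chart performance retained, power saved, and perf/W gain for a
# set of undervolt offsets. Replace the placeholder readings with your own.
stock = {"offset_mv": 0, "watts": 200.0, "score": 1000.0}   # example baseline
samples = [                                                  # example readings
    {"offset_mv": -20,  "watts": 185.0, "score": 995.0},
    {"offset_mv": -50,  "watts": 165.0, "score": 985.0},
    {"offset_mv": -100, "watts": 145.0, "score": 960.0},
    {"offset_mv": -150, "watts": 135.0, "score": 900.0},
]

for s in samples:
    perf_kept   = 100 * s["score"] / stock["score"]
    power_saved = 100 * (1 - s["watts"] / stock["watts"])
    eff_gain    = 100 * ((s["score"] / s["watts"]) / (stock["score"] / stock["watts"]) - 1)
    print(f'{s["offset_mv"]:>5} mV: {perf_kept:5.1f}% perf, '
          f'{power_saved:4.1f}% less power, {eff_gain:+5.1f}% perf/W')
```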
I predict that we'll see 550+watt 4-slot AIB 7900 XTX cards that perform close to the 4090.
I bet AMD just doesn't want to deal with this power-PR nonsense ATM. For god's sake, they clocked the card to only 2.3GHz when everyone was expecting them to hit at least 3GHz.
if not better
it's slightly bigger than the RX 6900 XT
and where did you get the 550+W TDP figure from?
@@KonradVarga just speculation. A 3rd 8-pin connector would give it a theoretical maximum of an extra 150 watts. So maybe I went overboard; 500W is more like it
I hope one day we will get an ARM-based GPU
With AMD CPUs you can adjust the frequency-to-voltage curve by an offset. That allows better clocks at the same temperatures. But if you power limit PBO, you will get stock performance at massively reduced thermals.
I just want to mention that the way you've set up your GPU power adapter, it is more likely to fail according to tests done by some tech youtubers. They almost never fail when bent up or down, but bending them left/right seems to make them melt. Generally this may only happen to people who get bad batches, but it could happen to you. I would recommend rerouting the adapter to bend down instead of to the right!
If we don't stop, we will eventually hit the AC circuit limit of around 1500W. And unless it becomes common for computers to be given their own dedicated circuit, that circuit also has to handle monitors and lights. It may also have to be shared with multiple office or bedroom appliances.
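Where that ~1500W figure roughly comes from, assuming a common US 15 A / 120 V branch circuit and the usual 80% rule of thumb for continuous loads:

```python
# Rough numbers behind the circuit ceiling (US assumptions: 15 A breaker, 120 V,
# 80% continuous-load rule of thumb). Appliance figures are example values.
breaker_amps   = 15
mains_volts    = 120
continuous_pct = 0.80

circuit_watts    = breaker_amps * mains_volts          # 1800 W absolute
continuous_watts = circuit_watts * continuous_pct      # ~1440 W sustained

pc, monitor, lights, misc = 1000, 60, 30, 100          # example loads
headroom = continuous_watts - (pc + monitor + lights + misc)
print(f"Continuous budget: {continuous_watts:.0f} W, headroom left: {headroom:.0f} W")
```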
From the overclocking trend to undervolting, this is the world we live in. Ryzen for some reason is given so much voltage out of the box. I tried undervolting my 5600x and got lower overall temperatures and more performance just by undervolting. Went from 88W to 48W while maintaining base clock speed, using a 99% power scheme. Got more performance because of how low the temperatures are, so my CPU can finally breathe (from almost 88°C down to 68°C).
What settings are you using?
@@marceloalejandro8228 The undervolting settings I used are from the Optimum Tech video called "Ryzen 5000 Undervolting with PBO2", and in Windows I lowered the minimum and maximum processor state to 99%
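The PBO2 curve-optimizer half of that lives in the BIOS and can't be scripted, but the Windows half can. A minimal sketch, assuming PROCTHROTTLEMIN and PROCTHROTTLEMAX are the built-in powercfg aliases for minimum/maximum processor state (confirm on your machine with "powercfg /aliases"):

```python
# Hedged sketch: cap minimum and maximum processor state at 99% in the active
# power plan, both plugged in and on battery.
import subprocess

def set_processor_state(alias: str, percent: int) -> None:
    for flag in ("/setacvalueindex", "/setdcvalueindex"):   # AC + DC values
        subprocess.run(
            ["powercfg", flag, "SCHEME_CURRENT", "SUB_PROCESSOR", alias, str(percent)],
            check=True,
        )

set_processor_state("PROCTHROTTLEMIN", 99)
set_processor_state("PROCTHROTTLEMAX", 99)
subprocess.run(["powercfg", "/setactive", "SCHEME_CURRENT"], check=True)
```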
Intel’s 8th to 10th gen weren’t repurposed server CPUs. They were new consumer dies that were just lengthened to add the extra cores. Intel’s server CPUs used a different core layout, different memory controllers, no integrated graphics, etc.
He's talking about LGA2011 CPUs, even showing a 6950x during that segment.
HEDT 2011 CPUs absolutely were repurposed server chips, same as the w3175x was a repurposed Xeon 8280.
@@popcorny007 ...which is before the 8th generation, like Rachit said.
ring-bus moment
Welcome to 2kliksphilip, where we act like AMD is all high and mighty for creating Zen, which was super scalable, while ignoring the practically impossible scalability Intel managed with Skylake, consistently staying ahead of AMD in performance for its entire run. Then once performance became even, Intel consistently managed better price to performance than AMD. Because they've stayed that way for 3 generations now, Philip needs to make a video saying price/performance isn't important anymore, since AMD has fallen badly behind in that regard, and that efficiency is key since that's the only metric they've managed to stay ahead in (although even then, undervolting Intel shows exceedingly competitive efficiency regardless).
I don't hate AMD; AM4 will probably go down as one of the most legendary sockets of all time, but the AMD bias/shilling on this channel is unreal. I'm completely baffled how he ever managed to get review samples from Nvidia with how shockingly blatant the AMD bias is. If Intel makes good CPUs, they're only mentioned in passing, only to then return to the main subject at hand, which is how holy and beloved our saviours AMD are.
1:52 I think you accidentally put two 5900X in the graph here.
I both undervolted and power limited my 12900K to 125W when I got it at the start of the year. You've got me thinking about a 13900KS now.
One unmentioned aspect is the unwanted heating of the room in summer.
I heard someone on the internet saying that "Europe's problem is the world's problem, but the world's problem isn't Europe's problem".
Processor coolers are so weird. They were making computers in the 80s or something and were bottlenecked by the temperature of the chip. Then one guy said we should make a tiny little fan that sucks air away from the chip and put it inside the case. First computers didn't have moving parts, and then they did, like a little mechanical machine inside your electronics. Also, it's literally a free way to increase your processor speed, in a sense.
I undervolted my RTX3080 so that it only draws 250W flat out now with an imperceptible difference in performance - great for small form factor cases
But I discovered that Intel locked down the 12400F even more than just locking the multiplier... because I can't undervolt it at all.
-30mv offset does absolutely nothing, and -50mv cuts the performance by like 40%, same as setting the power limit to 45W with stock voltage.
I'm happy with my 65W; the *900-branded stuff has always been the "extreme enthusiast" tier with zero regard for efficiency
my guy, u make good videos
Buying a new computer now comes with the additional cost of paying for fast internet for giant game patches, and being able to afford the electricity bill of constantly running a small water heater.
The power issues with the new Ryzen 7xxx are because they made the heatspreader too damn thick. They did that to keep the mounting height the same as AM4 so people could keep their coolers, but they really should have made it as thin as possible
gem
Thanks Philip, useful video. Also, why the hell is your energy kWh price 49.25p? I thought the price cap put it to an average of 34p?
The CPU and GPU markets need to get out of their 60s muscle car era and enter their 2020s hybrid hatchback era.
Systems are much much less efficient now as well. I remember back in 2010. I had a laptop I would use at school all day. 8 hours going from class to class. Taking notes and browsing the internet. It would sip about 15% CPU usage on that cheap processor with those loads for 8 hours and I would get home usually with at least 30% battery life left. Now, I would love to find a laptop that would last half that time and load without completely draining its battery. It really feels like technology is regressing.
tbh, current software is also super bloated
i think newer cpus are much more power efficient, they just have a lot more things to do in the background constantly
my laptop which is an acer swift x lasts about 5 hours on battery on windows when idle, while on linux that goes up to 24 hours on idle
the entire computer uses 2.5-3watts on a 54wh battery when idle, so i guess you just gotta look for more efficient laptops
I have to say my new 13900k is using a bit less wattage than my old 10900k in daily usage/gaming
If I stream/record and play GTA with 80-99% GPU usage, my CPU only runs at around 100-120W where my old 10900k with the same settings/gpu sucked 150-200W
The 13900k only uses 1-2 threads for this workload. Same results in many different daily scenarios, as the CPU hardly ever gets fully utilized. When doing productivity-based workloads though... oh well... let's just say I am glad I have a 1000W Platinum PSU
i see what you did there with the benchmark graphs
I undervolted my first good processor, the i7 6700k, and I currently undervolt my Ryzen 5600x. I'm gonna keep doing it, not only to save money but because I mostly play low-graphics or old games, aside from currently Doom Eternal and some day Elden Ring. I don't think I'll ever need such performance.
One thing I despise about many current techtubers is that they test CPUs with unrealistic stress tests and then concentrate only on that to evaluate a CPU's efficiency, temperature range and power usage, plus that sweet sweet clickbait title/thumbnail. In fact, ~95% of people won't stress top-tier CPUs such as the 12900k/13900k/7950x at 100% on a daily basis. They need a better testing methodology for today's CPUs, which are way smarter at managing their power/clockspeed/voltage limits. Prime95 and even Blender tests do NOT reflect real usage; sure, they can be supplementary, but you need at least 2 different workloads to get a reasonably accurate idea of how much power these CPUs will draw.
Performance per watt is the future in CPUs/GPUs
I remember being very excited about buying a NH-D15 2 years ago but now I'm slightly cursing myself for not going with an AIO.
Reviewers should account for power usage by including it in the price when comparing price to performance between CPUs. I find it weird that that hasn't happened yet.
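A sketch of what that comparison could look like: fold estimated electricity cost over the CPU's lifetime into the price-to-performance figure. All inputs below are example assumptions (the 34p/kWh comes from the price-cap figure mentioned further up; hours, lifespan, prices and power draws are placeholders), so swap in your own.

```python
# Hedged sketch: lifetime cost = purchase price + electricity over N years,
# then a "performance per pound" figure based on that total.
def lifetime_cost(price_gbp: float, avg_watts: float,
                  hours_per_day: float = 4, years: float = 4,
                  pence_per_kwh: float = 34) -> float:
    kwh = avg_watts / 1000 * hours_per_day * 365 * years
    return price_gbp + kwh * pence_per_kwh / 100

cpus = {
    # name: (purchase price GBP, average gaming power draw W, relative performance)
    "CPU A": (450, 90, 1.00),    # example values, not real products
    "CPU B": (600, 150, 1.10),
}
for name, (price, watts, perf) in cpus.items():
    total = lifetime_cost(price, watts)
    print(f"{name}: £{total:.0f} lifetime cost, {perf / total * 1000:.2f} perf per £1000")
```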
I am amazed that even though technology evolves every year, power consumption keeps increasing instead of decreasing. How is that an improvement? The 4090 has a 450W default TDP and easily exceeds 500W. This is stupid. Companies need to understand that we don't game at 8K. Even though they are trying to maximize the potential for gaming at 4K, they should not do it at the cost of higher power consumption.
Every generation we see massive jumps in power consumption. It's as if the architecture and technology aren't what's improving; the GPUs and CPUs are instead being forced to work faster by feeding them more power.
My biggest problem is how much power my CPU uses when idle. I have a 3600X and it uses over 30 watts at idle, which is insane; my laptop only uses like 1 watt when idle. When I undervolt it, it uses less, but then it's always at a locked frequency. I wish there was an option to change the frequency and voltage only when doing heavier tasks; I don't want my CPU using 30 watts while I'm on Twitter
Overclocking made sense in Core 2 Quad times: buy a new cooler, change 1 number in the BIOS, gain a stable 50% performance boost, mainly because CPUs weren't doing it themselves. Nowadays it's pointless, and in fact sometimes things boost themselves into crashing (mostly graphics cards), so it's not necessarily better.
Same with games or other products: I don't want to spend hours installing mods to fix the game or its UI, or figuring out new numbers and abbreviations in the BIOS. It should just work out of the box, tuned for the best value and longevity for the money.
it makes me sad when i think about the future of power consumption
I agree, power efficiency should be given equal importance as of price/performance ratio.
0:00
"Something strange is happening in the world known as earth.
At a time when we're increasingly being told to save the earth, our corporations want to destroy more of it than ever before!
Then again, corporations have always been a bit... hungry."
i wouldn't say i underclock but i do undervolt my gpu and cpu in order to lower my temperatures while not sacrificing any performance
I like to do undervolting to keep same performance at lower power draw or even small overclock and undervolt to gain performance without more power.
The original Ryzen 1000 series boost was actually bad. I think something went wrong and they released it with boost that doesn't work properly. It was fixed in the Zen 1 laptop chips.
First thing I did with my 12700 was to lower the max draw to 105W (from 125W sustained and Unlimited peak for short durations). For my workloads I've lost about 200 MHz or so (from 4.5GHz short duration to ~3.9 GHz long to now ~3.7GHz). It's barely noticeable.
What was noticeable however is how much more efficient the 12700 is when idling compared to my previous Zen 2 system. Those chiplets suck up power like no tomorrow just to keep idling.
You should try bringing clock speeds down for maximum power savings. 4GHz consumes roughly 4x more electricity than 2GHz for the same work, and 16x more than 1GHz. If you bring clock speeds down even a little, it has quite a small impact on performance while making the chip disproportionately less power hungry.
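The first-order model behind that claim: dynamic CMOS power is roughly proportional to C·V²·f, and if voltage has to rise roughly in proportion to frequency, power scales with about f³ while the energy for a fixed amount of work scales with about f² (the work finishes faster). Real chips deviate because of voltage floors, leakage and memory-bound workloads, so treat this as a back-of-the-envelope illustration only.

```python
# Hedged sketch of the scaling: power ~ f^3 and energy-per-task ~ f^2 under the
# simplifying assumption that voltage scales linearly with frequency.
def relative_power(f_ratio: float) -> float:
    return f_ratio ** 3          # P proportional to f * V^2, with V proportional to f

def relative_energy_per_task(f_ratio: float) -> float:
    return f_ratio ** 2          # power ~ f^3, but runtime ~ 1/f

for ghz in (1, 2, 4):
    r = ghz / 1.0                # relative to a 1 GHz baseline
    print(f"{ghz} GHz: ~{relative_power(r):>4.0f}x power, "
          f"~{relative_energy_per_task(r):>3.0f}x energy per task vs 1 GHz")
```

With those assumptions, 4GHz vs 2GHz works out to 4x the energy for the same task, matching the figure in the comment above.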