OK, I know I'm quite late and I might sound dumb, but isn't 0.765 kWh just for one hour? Shouldn't you multiply it by 3 first, so it's 3 hours of his PC time per day?
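For anyone checking the math: energy is just average power times time, so if the 0.765 kWh figure was measured over a single hour (an assumption; the video's exact methodology may differ), three hours a day works out like this:

```python
# Energy (kWh) = average power (kW) x time (h).
# Assuming 0.765 kWh was for one hour (i.e. ~765 W average draw):
hourly_kwh = 0.765
hours_per_day = 3
daily_kwh = hourly_kwh * hours_per_day
print(round(daily_kwh, 3))  # 2.295 kWh for a 3-hour session
```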
If the people paying the bills (usually parents) had any idea why their electricity bills were so high... They think it's the aircon, the heating, or the kettle, and they're putting low-wattage lights around the house, etc. Little do they know that the gaming PC they bought for the brat locked in the bedroom gaming all day is using more power than they could imagine. I pay my own electricity, and I refuse to upgrade my system because quite frankly the power usage is insane.
@@DarkDragon-jo4ke What are the specs of your PC? I'm wondering if I should stick with my old AM4 setup (5900X + RTX 3060) or buy the latest models, but I'm afraid of the power consumption =)
Not gonna lie, I used to keep my gaming PC idling 24/7 when I wasn't using it. INCLUDING WHEN I WENT ON VACATION. I don't do that nowadays, but BOY was I stupid!
Fun fact: there are places where people leave their cars running all day long. Yakutsk, a city in Yakutia, Siberia, Russia, has 7 months of -71°C winter. Drivers leave their cars running, even at night, because if they don't, the engines and the oil will freeze and they won't turn back on. The people who can afford heated garages (not as common as you'd think, given the cost of heating a garage through winters that cold) can at least turn their engines off when they get home. Very informative video, thanks for sharing!
@@TarenGarond That's heavily exaggerated. The far northern parts of Russia do get pretty cold from November to mid-May, but that's northern Siberia, and there aren't really any people up that way, just animals and other wildlife. Most people live around Moscow; it gets snowy, and Russia does get cold, but only the far northern parts of Siberia drop below the average winter temperature, and that's near the Pole, it's the Arctic. There's very little sunlight from December until early March, but I don't know anyone that far up. I lived in Ukraine for a little while when I was younger, and I remember it snowed a lot. I had family in Russia too, but we all moved when I turned 12.
That temperature is wrong. The record is -64°C. From a quick Google, they had about 10-15 days this year with temps around -50 to -55°C. Yes, that is freakishly cold, but it's not minus 71, and nowhere near months of it.
Honestly, I love efficiency, and there is another hidden cost in using a PC: wear and tear. Using a PC over long periods will invariably lead to a failure somewhere eventually. It might be as simple as a fan dying or a drive giving up. I realized my high-end gaming PC with a multi-monitor setup was total overkill for just relaxing, browsing the web, chatting on Discord, watching YouTube videos, etc. A great alternative is a Raspberry Pi 4B, a $50-$70 single-board computer. It can easily do all of those things at just a few watts, basically for almost no cost. When I don't need all the capabilities of a big gaming rig, I can turn it off and switch to the Pi. Silent and super efficient.
Turning it on and off is worse for your computer. I leave my PC on forever and have components that have lasted a decade. I've done this for more than a decade, and the only components I've seen fail were the really bad ones (due to a fabrication defect or something); the fact is you're more likely to retire your good components before they stop working. I know it sounds stupid to leave a PC on forever, but I have my reasons, and here we have clean energy.
@@gutexp It may well be worse for your computer, but generally in life I hate wasting stuff, including energy, water, and gas. I also use Sleep, Hibernate, and Shut Down all the time, and my PCs are fine. I even turn off my home-built TrueNAS server if I don't use it for an extended period of time, LOL. But I understand your point.
@@Ludak021 Does this old chestnut have any scientifically obtained proof? Genuine curiosity here; I want to avoid the energy and pollution costs of manufacturing new parts, not just the energy used on my end. This isn't the 90s, when all PSUs were cheap junk that sent sudden 5V and 12V jolts to shoddy capacitors in underdesigned filter circuits. Most chips run on 3.3V or even 1.5V input now. SSDs use a shockingly high amount of electricity compared to spinning rust (as I found out when deciding between m.2 and SATA for my laptop), but they also don't have the initial load spike on power-up, and no mechanical wear from spin-up/down. I suppose there's also the factor of modern CPUs, drivers, and OSes being able to send components into lower-power states during idle, which wasn't a thing back in the 90s and 00s either, so that further complicates the question on the "leaving it on" side.
My PC doesn't like to sleep; I come back to it being non-responsive, crashed, or unusable, so sleep and hibernate are off the table. When I know I'm going to be away from it, I turn it off. But I know many people who just leave it on all day and night and spend maybe a couple of hours a day actually using it, while the rest of the time it's sitting there running, warmed up, and ready to go. I'm sure there's a botnet out there that just loves their computers.
This is something to think about when considering reusing an old PC as a server. I live in Sweden, and we've had some hefty electric bills the last six months, so there's absolutely money to save over a year, and adding up all the different users amounts to a lot of wasted energy and money. The router is one thing most people don't turn off, but there's money (and security) to be gained by setting the router's wireless to not operate when you know it won't be in use.
I use a modest gaming pc when gaming, and when working, browsing the web and doing other basic computing tasks I use an Intel NUC! I also leave the NUC powered on for roughly 12 hours per day as it has Sonarr and Radarr installed on it, as I want to ensure it downloads all the latest releases. The NUC uses between 5-10 watts when idle and between 10-20 watts under load! For comparison, my gaming pc uses roughly 150 watts under load when gaming and about 40 watts when idle.
Thanks for this information, this is not easy to find. I wish I had this kind of data when I lived with my father and he complained that the computer was making the power bill skyrocket (while he usually fell asleep with his CRT TV on all night until I noticed at 4-5am, and now THAT'S a power hungry device).
Here in Mexico City, if you go past 125 kWh per month, you get reclassified to a different rate, which costs 5 times more per kWh. For 125 kWh you can expect to pay about 5 USD per month, but if you get reclassified, it goes up to 150 USD per month. So much for owning a gamer PC here… Fuck Latin America.
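Using the commenter's rough figures (about $5 for 125 kWh on the subsidized rate, and roughly five times that per kWh once reclassified), the marginal cost of a gaming PC's extra draw can be sketched like this; the 100 kWh/month is an assumed number, not from the comment, and the real tariff structure is more complicated:

```python
# Figures from the comment above, approximate.
base_rate = 5.0 / 125        # ~$0.04 per kWh on the subsidized rate
dac_rate = base_rate * 5     # ~$0.20 per kWh once reclassified
extra_kwh_per_month = 100    # assumed extra monthly usage from a gaming PC
print(round(extra_kwh_per_month * dac_rate, 2))  # 20.0 -> ~$20/month extra
```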
@@Krzeszny1995PL The PSU isn't the only part affected; the entire system is. JayzTwoCents has a video showing CPU voltage spiking to 1.5+ volts on startup, and other components experience similar spikes. From the hardware's perspective, sleep mode isn't much different; the only difference is that the RAM stays powered and uses a few watts, since it would lose its data otherwise. These power cycles aren't much of a problem in the short term, but when you keep your stuff for many years like I do (I'm still using a Core 2 Quad from 16 years ago in one of my systems), it's worth considering.
Thanks for the informative video, and love from Turkey. I generally idle my computer over lunch because I'm lazy, but now I know better! As a footnote, leaving your computer idling on the main menu of a game you paused will passively strain your system just the same as when you're gaming, and it's much worse to leave it idling like that. In most main menus the game is still being rendered in the background, as evidenced by the heat and fan noise coming from your computer. The heat will dry out your thermal paste over time (causing thermal throttling on the CPU and worse performance on the GPU), and the electricity wasted will double. So don't do what I used to do, which is leave the game open while doing something else; just save, quit, and turn off your computer. Your computer will thank you :)
On the game menu, CPU usage should be very low, since there's no need to run the game logic and physics. Also, to reduce GPU usage, turn on VSync. If you pause the game, it might suspend the game logic and physics as well. Some games keep rendering the background and draw the menu on top as a semi-transparent layer, which still means work for the GPU.
@@louistournas120 Yes. Though I don't use VSync, because it limits FPS to 60; I use a high-refresh monitor, so instead I limit my FPS to the monitor's refresh rate using RivaTuner. Uncapped frame rates cause more work for the GPU, produce coil whine, degrade the GPU's thermal paste faster, and waste more energy. Good point about the game logic not running in the background, but most games also have memory leak issues, so when you come back an hour or so later, the game will be laggier.
Fun fact: in places where it is extremely cold, people leave their cars running so they don't freeze up. You might see this in the northern territories.
I was looking into something like this today for a project. In the USA, where the majority of our power production is from natural gas (2019 numbers), each kWh of energy generated emits around 0.91 pounds (413 grams) of CO2.
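Using that ~413 g/kWh figure, a rough yearly CO2 estimate for a gaming PC can be sketched like this; the 400 W average draw and 3 hours/day are illustrative assumptions, not from the video or the comment:

```python
# Rough CO2 estimate using ~413 g CO2 per kWh (US natural-gas
# generation, 2019, per the comment above).
co2_grams_per_kwh = 413
pc_watts = 400           # assumed average draw while gaming
hours_per_day = 3        # assumed daily gaming time
kwh_per_day = pc_watts / 1000 * hours_per_day          # 1.2 kWh/day
co2_kg_per_year = kwh_per_day * co2_grams_per_kwh * 365 / 1000
print(round(co2_kg_per_year, 1))  # ~180.9 kg of CO2 per year
```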
Subbed. I'm seeing a lot of high-end PCs being built with no consideration for power usage; I'd much rather see a build that is both powerful and energy efficient.
Try "Wolfgang's Channel"; he recently had a video about a power-efficient server. But you likely want a compromise if you want a PC and aren't OK with a tablet. I have my PC on for long stretches (home office, games, ...) and I'd say it costs about $20 per month. The price doubled in the last year where I live, due to the war in Ukraine and expensive natural gas; actually the price tripled, but half of the old price was the distribution fee... hell, with taxes included, the Czech Republic reached Solomon Islands prices.
I enjoyed this, Nick; it's a good global comparison and insightful. You might consider doing this home project on a larger scale: have you thought about presenting it to a large organization, after estimating the energy bill savings if idle time is avoided via programmed remote mass shutdowns? Very interesting.
Yeah, I have the same power meter; I bought mine from AliExpress. Even LTT uses the same one. Apparently it's a pretty good power meter. Just don't leave it plugged into a high-power load (around 1,000W+) for long periods of time, as you risk burning it out and causing a fire.
I ordered that same measuring device from AliExpress for 15€. I guess they made a mistake, since I got two of them in the package. A really cool thing to measure power use with. My decent 6-core, 16GB, 3050, 700€ gaming laptop uses about 15-20 watts idling connected to my 55-inch LG OLED with the main screen turned off. YouTube uses 25-30W. Max usage takes it to 110 watts or so; WoW Classic with FPS limited to 45, at 1440p and 3/10 graphical fidelity, uses around 35 watts. My TV with power saving maxed uses around 37-45 watts; HDR on without power saving takes it to 110+ watts. My desktop just idles at 100+ watts. That's why I love laptops now; gonna save a pretty penny.
Power bills have gone up in the UK, and I rent. I'm working from home and we're trying to keep the bills down. I do have a housemate, but I'm not sure how much of the energy bill I should feel bad about; there's probably a bias, because I'm home most of the day. I'll have to get one of these!
Loving this content, man. Personally I run a 5800X with a 6800 XT, and I turn my PC off every time I'm done using it, for this reason. I want to know: what's the reason for leaving a computer idling? For quick access, really? Get yourself SSDs; the boot process is so much quicker. Use it and turn it off!
@@armschoolc32 Wait a second, are you seriously complaining about 15-20 seconds and trying to start an argument just so you can tickle your balls over saving some energy yearly? ... Pathetic.
I have data recorded every 5 seconds for a year showing my office, bedroom, and living room PC usage. Having that data over the year saved me 25% on my electric bill. Gamers in particular need to pay more attention to wattage and start to see it as a "bad" thing. My PC idles at 95-115W, very similar spec to yours. However, running a game in full 4K with default settings and no VSync immediately raises that to 550-600W. Enable VSync, and that drops under 400W. Lower the res to 1440p, and all the fans stop and it comes down to 320W. When I run 3DMark benchmarks like Time Spy, I can see each phase of the test on the mains power chart. I got so upset with this that I spent £3000 on solar panels, batteries, and an inverter to solar-power the whole thing. Now I can game "guilt" free AND save on the electric bill.
If the power supply will only pull as much power as needed from the wall, why does having a higher capacity power supply generate a larger electricity bill? If you don't use the whole thing, are you charged as if you are?
Having a larger power supply generally shouldn't increase power consumption. The power supply rating is simply a limit on how many watts it can deliver at one time. I have an 850W power supply. If I had a 650W power supply with the same efficiency rating, it MAY pull slightly less power, somewhere between 0% and 2% less: www.wepc.com/tips/power-supply-ratings-exactly-what-do-they-mean/
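To put the efficiency point in concrete terms: the draw at the wall is the DC load divided by the PSU's efficiency at that load, not the PSU's rated capacity. The efficiency figures below are illustrative (roughly 80 PLUS Gold territory), not measured values:

```python
def wall_draw(dc_load_watts, efficiency):
    """Power pulled from the outlet for a given DC load."""
    return dc_load_watts / efficiency

# The same 300 W load on two hypothetical PSUs with different efficiency:
print(round(wall_draw(300, 0.90), 1))  # 333.3 W at 90% efficiency
print(round(wall_draw(300, 0.88), 1))  # 340.9 W at 88% efficiency
```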
Very good. I learned not to leave my PC on waaay back, when the motherboard started melting. Granted, it was a cheap one, and the PSU might not have been much good either. If I hadn't been in, the house would have gone up in flames, which would definitely have wasted a lot of energy.
Love the video! I actually just used one of your other videos to undervolt my graphics card. I'm only rocking a 1660 Super, but I cut the wattage down by nearly 40% and my temps by 30%, while only losing about 6% of my FPS. I still hate how much energy desktops use compared to a laptop for just browsing the web or simple document work. My desktop still uses about 48-70 watts versus my laptop's 10-20 watts for those simple activities. I wonder why desktops can't go into some super-high-efficiency mode when you're doing stuff that doesn't stress the system at all?
If you have a gaming laptop, it uses an extremely complicated method of switching video processing from your GPU to the more efficient onboard video. This is a massive PITA to deal with, but it is more efficient. Desktops don't do this; they're always using your GPU, which draws more power as a baseline, even if you're undervolting. One thing you CAN do is tell Windows not to use hardware acceleration for certain applications; I think this is under the Gaming settings in Win11. That means the GPU stays idle instead of consuming more power. The other option, of course, is to use an SoC that combines graphics and CPU on one chip; AMD calls these "APUs." It's much slower for gaming but still usable, and more power (and of course cost) efficient. In that case you use the video outputs on your motherboard instead of a separate GPU.
Nvidia needs to get their stuff in order. In Germany and many other EU countries, power costs >35 US cents per kWh. Real money. My 1660 Super alone consumes 15-20 watts just idling on the desktop, so I pay over $50 for the privilege of having a dGPU. My entire Ryzen 5700G system with 32 gigs, 3 SSDs, and the APU draws only 25-30 watts idling; the card alone almost DOUBLES that. Nvidia needs to stop wasting power; there is NO REASON these cards suck up this much wattage just sitting around. I'm sure if you pull out that 3090, you'll halve your idle draw as well.
That's a pretty terrible price, damn. For me it currently costs a bit over 13 €cents/kWh. My PC with a 3600X and 5700 XT pulls 95W at idle. It could be lower, but I find it acceptable.
Modern video game consoles also have some solid power settings. On the Xbox Series S/X you don't even lose the "Quick Resume" functionality if you turn off idle power draw features. The PS4 will lose its ability to pick back up a running game if you disable Rest Mode, though. Not really a big deal, and you can always toggle it on and off as needed if you don't want to keep going through the motions of launching/loading specific games (The Last of Us Part II, for example, takes quite a while to load from launch, so you can leave Rest Mode on until you beat it). Not sure about the PS5, as I don't own one.
A few things to consider: the PC in question had a fairly high-power GPU (350-watt 3080 Ti) and an AIO pump, which is going to use much more power than an air cooler, especially when idling. According to Steam's hardware survey, most gamers rarely get a card above an Nvidia xx70 series, and those cap out at around 250 watts under max load and tend to average closer to 200-220 watts. Leaving your PC to idle is definitely a foolish thing, IMO; running your hardware for longer periods reduces its lifespan faster (especially the fans), and the tiny setbacks of extra power supply strain (minimal) and slightly slower boot-up times (if you're on an SSD, it really shouldn't be an issue) are minuscule compared to the cost of running your PC at idle for so long. I definitely wish Windows had more power management options built in, as things like Windows Update and Windows antivirus running at random can really ramp up power consumption for your storage and CPU.
Coming to this video helped me a lot. I'm upgrading my GPU and learning about wattage and costs. Now I've found the sweet spot where my monthly bill might increase by 10 euros, but thankfully not more.
I have an idea for a "hybrid powered PC" which would run off a small uninterruptible power supply (UPS) when idle. During higher-load scenarios, the UPS would be charged from the grid while running the PC at the same time. I'm also trying to figure out how to get the UPS to supplement grid power to the PC during unusual demand spikes (Turbo Boost, etc.). To borrow your car analogy, idling this style of PC would be more comparable to leaving a Prius in [READY] mode (running), where it only starts the engine when required to charge the battery.
The answer is simple. You take a secondary power source, like solar panels or a wind turbine, plus a connection to the grid. You then build a relay with sensors and other needed utilities, connect both the secondary and main power to the relay's inputs, and connect the relay's output to the input of a UPS. The UPS powers the computer, and the relay keeps the UPS charged. During minimal use, the relay keeps the secondary power source flowing as long as it keeps up with demand; when demand outstrips capacity, the relay switches to the main grid. All the while, the computer is happily chugging along with stable power from the UPS.
That is not at all comparable, and it's just worse and more expensive. Hybrid cars save so much fuel because the engine runs far less; that's a massive difference. What good does it do that the PC is running off a battery? It would kind of work out if you charged the battery at night and then used it during the day, but the savings are still tiny. It's easy: every time you leave your desk, just put the PC into sleep mode.
@@gerble36 That's still ridiculous. If you have a house, put solar panels on your roof; that alleviates some of the cost, but it needs to amortize first. Using a single solar panel or a small wind turbine to power a computer is ridiculously inefficient. Small-scale power generation at that level just doesn't make sense from a numbers perspective.
The computer still uses the same amount of energy either way; there are no energy savings in this situation. And charging a battery instead of running off a direct line actually reduces efficiency, requiring even more energy.
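As a back-of-the-envelope illustration of that last point, running the PC through a charge/discharge cycle adds losses rather than saving energy; the 90% round-trip efficiency here is an assumed figure, and real batteries and inverters vary:

```python
pc_kwh = 1.0                   # energy the PC actually consumes
round_trip_efficiency = 0.90   # assumed battery charge/discharge efficiency
grid_kwh = pc_kwh / round_trip_efficiency
print(round(grid_kwh, 3))  # 1.111 kWh drawn from the grid per 1 kWh used
```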
This also made me rethink my work-from-home situation. Here I was, thinking I was being efficient and preventing e-waste by using a 2009 iMac as a secondary screen (Target Display Mode)... only to find out its average power draw is 100-200 watts. Looking up new Energy Star certified monitors, I realized most modern high-efficiency work monitors use about 8-15 watts when on. I don't even really use the "computer" part of the iMac, so I'm actually being wasteful energy-wise (it also generates a ton of heat). No-brainer: time to upgrade to a more power-efficient monitor and sell the iMac to somebody who needs an actual computer, to recoup some of the cost.
I used to be the idiot who left his PC "idling" overnight, went to work the next morning, came back home to the PC at some point, and did a gaming session of roughly 2-6 hours. It took me a year to realize how much that wasted power had cost me on the yearly bill, and it isn't even funny anymore. So, to all the other idiots still idling around the globe: turn OFF your PC whenever you leave it for more than roughly an hour. Seriously. On top of all that, it isn't just the idling; you're shaving running hours off the lifespan of your components. This may not have a direct impact, but in the long run it will shave off years, because electronic devices simply wear out over time with usage.
Another comment, separate from my other one pointing out the weird mistake. You are correct that turning off the computer at night instead of idling saves money, of course. However, there is evidence to suggest that leaving the computer on is *better* for the "health" of the computer:

1) Most computer failures happen on boot-up. Fewer boot-ups, fewer chances for things to go wrong. Booting up a PC takes a bigger toll on it than leaving it idle.

2) If you use sleep or hibernate, you are using RAM or the HDD/SSD respectively, which adds usage to those parts every night, again leading to more degradation compared to leaving it idle.

3) Cold temps. Having the computer warm up also takes a toll on it; this kind of goes along with point 1, but whatever.

I'm not saying this is a universal truth. For example, an exception to #3 would be a small room with poor ventilation, where you should turn it off, because the heat builds on itself and the extra heat damages the computer; in a big or well-ventilated room, that doesn't matter. BUT there is a lot of evidence to suggest that (assuming your computer is well set up in all respects) leaving it on when not in use is *better* for it.

As for the environmental effect, it's negligible. Individuals using electricity are NOT the cause of global issues; it's fossil fuels and LARGE corporations' massively power-hungry machines. Those are still necessary, but *they* need to be looking at ways to ease environmental impacts, not us individuals.

Lastly, you might be thinking the cost of that nightly use isn't worth it. Well, if the computer breaks 3 years earlier because of all those boot-ups, the thousands of dollars to replace it may outweigh it, lol. But generally speaking, yes, if you can't afford it, it's probably a better idea to just use sleep (if you have an HDD) or hibernate (if you have an SSD).
That distinction between storage drives is because an SSD has a much longer lifespan than an HDD: sleep uses RAM, so an HDD is safe, while hibernate writes to the primary drive, which should be an SSD (assuming you have one). *TL;DR* Contrary to this video, if you can afford it, and if your PC setup is proper, leaving your computer idle (obviously turn off monitors and such; we're just talking about the tower here) has actually been shown to be BETTER for the lifespan of the computer and its individual parts. So the research, given those assumptions, says you should leave it on.
Greetings from Europe. 7 USD cents per kWh is something we can only dream of. Where I live prices are 12 times higher, and still going up. Yes, that's right. 12 times higher. I'm not kidding you. We are just under 0.9 USD per kWh (autumn 2022). Makes you think twice before leaving all your appliances on standby. And there's the environmental impact as well.
I am in BC as well. But I am using an AMD 3200G APU. No discrete GPU, so I just sip power. So my yearly cost for computer use is probably what I would pay for one beer in a nightclub.
Do not trust sleeping devices. A Windows 10 machine will usually wake up to check for updates, and if updates are found, it may not go back to sleep until you tell it to update. A sleeping mini-PC draws about 3W; an awake one draws 20-30W. TVs will also switch themselves on to download the adverts and spam they throw at you, or to download the TV guide and retune the aerial. Not all standby devices pull as little power as you'd expect.
Basically, if I ran my PC 24/7 at moderately high workloads, I'd be spending about $0.06 per work hour of my income. For me, that is acceptable. For comparison, my rent is $4.50 per work hour and my car loan is $1.80 per work hour. So even if, say, I built a new $3,000 computer every year and included 24/7 power consumption, it's barely $2 per work hour. I rebuild my PC once every 3-5 years and turn it off most of the time, so my entertainment costs a lot less per year/work hour. The car analogy is a bit different, though: if you idle for 30 seconds, you've already wasted more than double the gas it takes to start the engine; it's like living in Hawaii for electricity prices. Idling is actually one way I see electric cars as much better; so many people idle for a long time.
Power (and thermal) cycling is what usually kills electronics. I pretty much never power my PC off (since c. 2012, now on my 3rd build) and I've never had a _single_ hardware failure in that time, not even an HDD. The 2012 build (P8Z77 / 2500K, later 3770K), according to the SMART data on a couple of WD Greens running in RAID 0, had been running for 6.5 of its 7.5 years before I upgraded. It drew c. 45W at the wall; the current one draws under 40W.
It also depends on the game you're playing and the graphics settings. If you're playing Yakuza, LoL, or Rocket League, these games are not demanding. But say you're playing Red Dead Redemption 2, Alan Wake 2, Baldur's Gate 3, or other badly optimized games: these can cause your PC to shut down if you have a PSU below 450W. Simply put, the GPU demands too much power.
If I use my PC, it draws about 1 kWh every 3 hours or so. A kWh is now 37 cents, and I use it for 3-9 hours a day, so call it up to about a buck ten a day. By this calculation, my gaming PC costs me about $300 a year to run.
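Reproducing that estimate with the commenter's own numbers (1 kWh per 3 hours, $0.37/kWh), the daily figure checks out, though a full year at the high end would land closer to $400 than $300:

```python
price_per_kwh = 0.37   # dollars, from the comment
kwh_per_hour = 1 / 3   # ~1 kWh every 3 hours
for hours in (3, 9):
    daily_cost = hours * kwh_per_hour * price_per_kwh
    print(hours, round(daily_cost, 2))   # 3 -> 0.37, 9 -> 1.11
print(round(1.11 * 365, 2))  # 405.15 -> a full year at the 9-hour end
```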
Great video. Here in Denmark we pay, for example, 0.44 USD per kWh, and even more in winter, when the big solar parks don't provide as much power as usual. This of course depends on many things; often it's less than 0.44 USD. Actually, the electricity itself costs around 0.075 USD, BUT on top of that come all kinds of taxes (delivery tax, sales tax, grid tax, and other charges), and then the price ends up around 0.44 USD per kWh.

It also depends a lot on the time of day. Peak time from 17-22 is the most expensive (around 0.44 USD), but from 12-16 the price can be around 0.22 USD, since there's an overproduction of power; in that case you only pay the taxes and get the power itself for free. So prices go up and down a lot. The main issue for many people is that you're not at home when the power is cheap, so ordinary families do most of their gaming, cooking, and TV from 17-22. The same goes for your electric car, which you can't charge in the middle of the day when it's away from home. (Some electric cars can even be used as a house battery: you charge the car at night and, when it's home, it supplies the house while power is expensive, so you don't have to buy during peak hours.)

Personally, when the war in Ukraine started, prices skyrocketed and my bill was 2-3 times higher than normal for some months, then dropped back to a more normal level. But since all the unrest in the world can affect prices, I bought a hybrid solar power system, so now I make my own electricity, and in summer I produce about 99% of all the power I use myself. From 7 in the morning to 20 in the evening it can power my gaming computer (and house) with solar, and in winter I can charge the battery when it's cheap at night, etc. I think the electricity prices/taxes are crazy, BUT in, say, 10 years I think prices will be lower, when there are more wind and solar parks than today.
At idle my gamer PC runs at about 150W (that's with the PC, a 4K monitor, and USB devices; about 550W when gaming). Tip: if you turn off HDR on the screen, you save about 50W, and you can also reduce the game's FPS and the monitor's refresh rate so the computer doesn't have to use quite as much power. Honestly, most games do NOT require more than 60 FPS for a fluid, great gaming experience. There are also options in Windows 11 under power/graphics where you can select which games should run with "reduced power" and turn off HDR and such things, but honestly I don't really know how well it works. You can also reduce FPS in games in the Nvidia and AMD software; just remember to reduce your monitor's Hz too, for best results.

P.S. My 10-year-old laptop uses only 45W when surfing the net; the gaming PC uses 150W. Think about that.

P.P.S. I'm considering getting one or two small 500W wind turbines for the garden in winter, to charge the battery. Power is very expensive in my country.
This is something I've been thinking about lately concerning my old PC I use as a plex server. I'm trying to think of some way to automate the sleep of my machine when I don't need it to be awake. I need to do more research on this, but it seems like Plex could create a feature to wake the machine when you ping it.
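One hedged sketch of the wake half of that idea: most wired NICs support Wake-on-LAN, so the Plex box could sleep when idle and be woken with a "magic packet" from another device before streaming. This assumes WoL is enabled in the server's BIOS/NIC settings, and the MAC address shown is a placeholder:

```python
import socket

def magic_packet(mac):
    """Standard WoL magic packet: 6 bytes of 0xFF, then the MAC repeated 16 times."""
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    return b"\xff" * 6 + mac_bytes * 16

def wake(mac, broadcast="255.255.255.255", port=9):
    """Broadcast the magic packet on the local network."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(magic_packet(mac), (broadcast, port))

# wake("AA:BB:CC:DD:EE:FF")  # placeholder MAC; substitute the server's real one
print(len(magic_packet("AA:BB:CC:DD:EE:FF")))  # 102 bytes
```

Fully automating the sleep side (e.g. suspending the machine after N minutes with no active Plex streams) would still need a separate watchdog script on the server itself.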
I'm thinking of buying a laptop or a handheld PC for lower power usage, but at the same time I could limit the refresh rate, undervolt the GPU, cap FPS, and choose more efficient components to reduce power draw on the desktop. That's a tough one :c
A huge part of that is your 48" screen. Turn the brightness down to 50% and the refresh rate down to 144 Hz. Then, with your CPU, look into lowering the voltage, etc.
Actually fantastic tech channel. You have really interesting premises and you follow through in a natural way that makes for a really entertaining video. Keep it up.
It would be interesting to compare costs now that we have a global energy crisis. Here in London, for example, costs can be 5x higher! ...Do most of your general computing on a NUC :) I also advise running everything from a power strip with individually switched sockets, and turning off chargers, power bricks, etc. when not in use.
The list at 7:00 makes me so mad as a German. Apparently we're a small island nation, because our great politicians have been brain-dead since 2000. How can you have energy costs this high and still try to be a leading industrial country? Same with fuel and pretty much everything else.
Thanks for this; quality content right there. Did you see Germany at #10 on the list of most expensive countries? Only small island nations are more expensive 🤔
I know. We are shutting down nuclear power plants just so we can buy nuclear power from France and Finland. It's stupid.
I'm not trying to discredit your research; I really appreciate the effort you put into it. But would a 5000-series CPU show the same results? Let's be honest, those are the ones people are looking at.
Good point. Time for me to upgrade? Depending on the CPU, it could be more or less. My R7 3700X has the same TDP (65W) as the R5 5600X (of course, real-world power draw will be somewhat different). The R7 5800X is rated at 105W. My power consumption is skewed one way by the GPU and arguably the other way by the CPU. I still think it's quite efficient compared to the 5800X.
@@TechIlliterate Let's take into account that that's the TDP, which usually doesn't really indicate the power consumption of the CPU. According to AMD, the R5 3600 and R7 3700 CPUs have a 65 W TDP, but in reality they use about 80 W under heavy load (using the built-in boost). If PBO is enabled, they can easily use 100 W. To be fair, this may not be AMD's fault or a false statement from them; it might just be related to the "issue" of AM4 motherboards forcing more voltage than what is set and reported to the CPU. So a configuration that claims to provide 1.2 V may well be feeding 1.3 V to the CPU. Many sites report in their tests that the 5600X uses less power than the 2600, 3600, 3700 and 3700X, at about 62-67 W under heavy load (varying a bit with the workload).
The reason I watched this video is that I got hold of my friend to see if he could upgrade my computer for me. He didn't have any spare parts lying around, but what he DOES have is a Lenovo ThinkCentre. It's supposedly quite big, and it's got 128 GB of RAM and dual CPUs at 4.2 GHz for a total of 32 cores, with a 1500-watt power supply. He also told me that it usually runs at around 600 watts, and even then I still thought it was too much. My current PC runs at around 375.

Look, I'm not a power-hungry guy. I had an electrical fire once, so I tend to unplug and put away things I'm not using anyway, and as someone who lives in the Toronto area, I thought this was very insightful for the power bill. Thank you for this video.

P.S. Does anyone know any other ways I can conserve power to help me make a decision? I don't need all of this power, as I said before, but as someone interested in operating systems or even game development, I thought it might be useful to have so I don't need to go out and get it later. It would be really nice to force the computer to run at around a quarter of its power and then bring it back up when I decide I want to develop, something very similar to the power-saving mode on an Android phone. I also plan to use Linux, which is very power-efficient, last I checked.
Interesting. How much does that equate to in watts? If I were to buy an inverter, what size would I need to run a rig like yours in gaming mode for 3 hrs? Any advice appreciated.
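For anyone who wants to run the numbers themselves, here's a rough sketch. The 0.765 kWh over a 3-hour session is the figure discussed in this thread; the 50% inverter headroom, ~85% inverter efficiency and 12 V battery are my own illustrative assumptions, not measurements:

```python
# Rough inverter/battery sizing for a 3-hour gaming session.
energy_kwh = 0.765   # energy used over the session (from the video's test)
hours = 3

avg_watts = energy_kwh * 1000 / hours      # average draw: 255 W
inverter_watts = avg_watts * 1.5           # ~50% headroom for load spikes
battery_wh = energy_kwh * 1000 / 0.85      # ~85% inverter efficiency assumed
battery_ah_12v = battery_wh / 12           # capacity needed at 12 V

print(f"Average draw:   {avg_watts:.0f} W")
print(f"Inverter size:  {inverter_watts:.0f} W continuous (minimum)")
print(f"Battery needed: {battery_wh:.0f} Wh (~{battery_ah_12v:.0f} Ah at 12 V)")
```

Note that for lead-acid batteries you'd also want to stay above roughly 50% depth of discharge, which about doubles the battery size.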
there is a town in Siberia where people have to leave their cars running all year, because otherwise they would freeze in 20 min and you can't start them anymore -.- XD
Heheh, OTR truckers do it all the time, leaving their trucks idling overnight as they sleep in their cabs. ALL cruise ships run 24/7 because they cannot shut off the systems onboard without damaging the vessel and causing mold to form all over the interior.
I leave my PC on 24/7, and while doing that it uses under 50 W, tested. How is yours using over 100? That's definitely not idle! 5600X / 1080 Ti / SSD / HDD / EK X360 Extreme kit (both CPU+GPU) / 7 Noctua fans
Gone are the good old days of cheap electricity where I had my PC running flight sims on 3 screens with a ton of added hardware. Never really measured power consumption in those days but it sure was a lot higher than your 3 hour test because of the 3 screens and close to 100% CPU/GPU usage pretty much all the time. As of next month I'm gonna pay 89 euro cents per kWh (currently already paying half of that). Really glad I barely use my PC these days and have moved my gaming over to a Nintendo Switch & Switch Lite which only uses a few watts, especially in handheld mode. Desperate times call for desperate measures 🙂
You can game on power-efficient PC hardware, albeit low-end. An RX 6400 or GTX 1650 draws 75 W from the wall; combine one with a 35 W CPU like an Intel T or S series, an older i3, or a newer Pentium (54 W) and it won't draw a lot of power. Games are cheaper on PC for me personally, while Nintendo charges in dollars, euros, pounds and yen, which is expensive.
@@main_tak_becus6689 Mid-tier hardware is pretty decent on efficiency as well. Total system power on my rig is probably around 250-300 watts after going with a 5900X. My 6700 pushing 1440p by itself averages around 100-110 watts, +130 watts at 4K. The 5900X is around 50-70 watts according to Adrenalin.
Any update? We need new calculations with the new price per kWh. In Poland we pay about $0.40/kWh, and we are not an island :-) And what's more, companies without private tariffs have to pay around $0.80/kWh.
Idling computer? My PC makes me money when I'm not using it: high power usage, but higher income to offset the parts and power needed to make a few bucks. A 3090 can make about 8-10 bucks a day running for 21 hours; it paid for the entire cost of the computer in a few (6-ish) months. I mostly use solar and battery banks to supply the home 9 months out of the year. Winter, however... well, there's always one toasty warm room! lol
My computer has 12 hard drives, an Nvidia 1660, an AMD Threadripper 2950X and a 1000 W power supply. Using my solar-scam monitoring software, it shows that I am using more power for my computer than my air conditioner. I'm upgrading to a 4090 and a 1300-watt power supply.
Thanks for watching. 🙏 Don't forget to LIKE, it helps a lot!
The last message was golden.
How much does your PC cost?
*Edit: in USD
So 130 W idle, with monitors on or off?
OK, I know that I am quite late and I might sound dumb, but isn't 0.765 kWh just for one hour? Shouldn't you, like... multiply it by 3 first so it's 3 hours of his PC time per day?
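Fair question, and it comes down to units. Whether the video's 0.765 kWh was a per-hour figure or the total for the whole 3-hour session is exactly the thing to check; the sketch below just shows the difference between the two readings (the 255 W is simply the wattage that makes the 3-hour-total interpretation come out to 0.765 kWh):

```python
# kWh = (watts / 1000) * hours, so a kWh figure already includes the time.
watts = 255
hours = 3
session_kwh = watts / 1000 * hours        # 0.765 kWh for the whole session

# If 0.765 kWh were the PER-HOUR figure, the 3-hour session would be 3x that:
per_hour_interpretation = 0.765 * hours   # 2.295 kWh

print(session_kwh, per_hour_interpretation)
```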
Which power supply should I buy that consumes less electricity?
If only the people paying the bills (usually parents) had any idea why their electricity bills were so high. They think it's the aircon, or the heating, or the kettle, and they're putting low-wattage lights around the house, etc. Little do they know that the gaming PC they bought for the brat locked in the bedroom gaming all day is using more power than they could imagine. I pay my own electricity, and I refuse to upgrade my system because, quite frankly, the power usage is insane.
Before I upgraded my PC I was paying a $180 electricity bill; now that I've upgraded to a $3400 PC, I pay $280-$300.
@@DarkDragon-jo4ke What are the specs of your PC? I'm wondering if I should stay with my old AM4 setup (5900X + RTX 3060) or buy the latest models, but I'm afraid of the consumption =)
@@Lightblurs facts
An even higher indirect cost is turning the brat into an even brattier brat! (Is "brattier" a word? 🤔)
@@Lightblurs Unless you are lagging, there is no reason to upgrade your PC, dude. My motherboard is 12 years old and I play whatever I want just fine.
Not gonna lie, I used to keep my gaming PC idling 24/7 when I wasn't using it. INCLUDING WHEN I WENT ON VACATION. I don't do that nowadays, but BOY was I stupid!
What were your bills?
Vacation too 💀
@@phcore2550That's the most important part, the key 🗝️ everyone's looking for but at the same time the one thing he will never tell us…😂
Why?
This is exactly the video I was looking for in 2022, now that electricity in the UK is 52p/kWh
the price for Ukraine
Fun fact: there are places where people leave their cars running all day long. Yakutsk, a city in Yakutia, Siberia, Russia, has 7 months of -71°C winter. Drivers leave their cars running, even at night, because if they don't, the engines and the oils will freeze and they won't turn back on. The people who can afford heated garages (not as common as you'd think, heating a garage through winters that cold) can at least turn their engines off when they get home.
Very informative video, thanks for sharing!
That is a fun fact. Thanks!
7 months of -71°C? That is a massively extreme exaggeration!
@@TarenGarond That's heavily exaggerated. The far northern parts of Russia do get pretty cold from November to mid-May, but that's northern Siberia; there aren't really any people up that way, just animals and other wildlife. Most people live around Moscow. It gets snowy and Russia does get cold, but only the far northern parts, way up in Siberia, drop below the average winter temperature, and that's basically the Pole, the Arctic. There's very little sunlight from December until early March, but I don't know anyone that far up. I lived in Ukraine for a little while when I was younger, and I remember it snowed a lot. I had family in Russia too, but we all moved when I turned 12.
You know there are engine block heaters for that, right?
That temperature is wrong. The record is -64°C. From a quick Google, they had about 10-15 days this year with temps around -50 to -55°C. Yes, that is freakishly cold, but it's not minus 71, and nowhere near months of it.
Honestly, I love efficiency, and there is another hidden cost in using a PC: wear and tear. Using a PC over long periods will invariably end in a failure somewhere eventually; it might be as simple as a fan dying or a drive giving up, etc. I realized my high-end gaming PC with a multi-monitor setup was total overkill for just relaxing, browsing the web, chatting on Discord, watching YouTube videos, etc. A great alternative is a Raspberry Pi 4B, a $50-$70 single-board computer. It can easily do all of those things on just a few watts, basically for almost no cost. When I don't need all the capabilities of a big gaming rig, I can turn it off and switch to the Pi. Silent and super efficient.
Same reason why I got a ThinkCentre Tiny for $30.
Turning your PC off and on all the time will increase the chance of failures.
It is worse for your computer to keep turning it on and off. I leave my PC on forever, and I have components that have been in there for a decade. I've done this for more than a decade, and the only components I've had fail were really bad ones (due to a fabrication problem or something). The fact is, you're more likely to grow to despise your good components before they stop working.
I know it sounds stupid to leave a PC on forever, but I have my reasons for doing this, and here we have clean energy.
@@gutexp It may well be worse for your computer, but generally in life I hate wasting stuff, including energy, water and gas. I also use Sleep, Hibernate and Shut Down all the time, and my PCs are fine. I even turn off my home-built TrueNAS server if I don't use it for an extended period of time, LOL. But I understand your point.
@@Ludak021 Does this old chestnut have any scientifically obtained proof? Genuine curiosity here; I want to avoid the energy and pollution costs of manufacturing new parts, not just the energy used on my end.
This isn't the 90s where all PSUs were cheap junk which sent sudden 5V and 12V jolts to shitty capacitors in underdesigned filter circuits. Most chips run on 3.3V or even 1.5V input now.
SSDs use a shockingly high amount of electricity compared to spinning rust (as I found out when deciding on my laptop's M.2 vs SATA), but they also don't have the initial load spike on power-up, and no mechanical wear from spin-up/down.
I suppose there's also the factor of modern CPUs, drivers, and OSes being able to send components into lower-power states at idle, which wasn't a thing back in the 90s and 00s either, so that further complicates the question on the "leaving it on" side.
This channel is underrated. Good work dude 👍
My PC doesn't like to sleep; I come back to it being non-responsive, crashed or unusable, so sleep and hibernate are off the table. When I know I'm going to be away from it, I turn it off. But I know many people who just leave theirs on all day and night and spend maybe a couple of hours a day actually using it, while the rest of the time it sits there running, warmed up, and ready to go. I'm sure there is a botnet out there that just loves their computers.
PC master race be like^
This is something to think about when considering reusing an old PC as a server. I live in Sweden, and we've had some hefty electric bills the last six months, so there is absolutely money to be saved over a year; summed across all the different users, that's a lot of wasted energy and money. The router is one thing most people never turn off, but there is money (and safety) to be gained by setting the router's wireless to not operate when you know it won't be in use.
That was an excellent video, buddy. I don't play games on PC, but it helps to be aware of how much it adds to your electricity bill.
I use a modest gaming PC when gaming; when working, browsing the web or doing other basic computing tasks, I use an Intel NUC! I also leave the NUC powered on for roughly 12 hours per day, as it has Sonarr and Radarr installed and I want to ensure it downloads all the latest releases.
The NUC uses between 5-10 watts when idle and between 10-20 watts under load!
For comparison, my gaming pc uses roughly 150 watts under load when gaming and about 40 watts when idle.
What specs do you have? Just out of curiosity.
Thanks for this information, this is not easy to find. I wish I had this kind of data when I lived with my father and he complained that the computer was making the power bill skyrocket (while he usually fell asleep with his CRT TV on all night until I noticed at 4-5am, and now THAT'S a power hungry device).
Some CRT TVs have a timer. I have two that do. You can choose 30 min, 1 h, 2 h or 3 h and it will shut down automatically.
@@louistournas120 IKR. But my father never used it.
I remember using both the timer and the alarm functions on my TV as a kid.
your dad is a good fella
Here in Mexico City, if you go past 125 kWh per month, you get reclassified to a different rate, which costs 5 times more per kWh.
For 125 kWh you can expect to pay about 5 USD per month, but if you get reclassified, it goes up to 150 USD per month.
So much for owning a gamer PC here... Fuck Latin America.
That's 4 kWh per day. If you have a job, I don't think you are at risk of being reclassified with any kind of PC.
I am past 150 but less than 250, and I pay about 20 dollars every 2 months.
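A quick sketch of how a gaming PC stacks up against that 125 kWh/month threshold. The threshold is from the comment above; the 250 W average draw and 3 hours/day of gaming are illustrative assumptions:

```python
# Monthly PC energy use vs. Mexico City's 125 kWh reclassification threshold.
pc_watts = 250          # assumed average draw while gaming
hours_per_day = 3       # assumed gaming time
days = 30

pc_kwh_month = pc_watts / 1000 * hours_per_day * days   # 22.5 kWh/month
threshold = 125

print(f"PC uses {pc_kwh_month} kWh of the {threshold} kWh budget "
      f"({pc_kwh_month / threshold:.0%})")
```

So a PC alone likely won't push you over, but combined with a fridge, AC and everything else it eats a meaningful chunk of the budget.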
With NVMe drives that boot up in under 15 seconds, there are very few reasons to leave a computer on when not in use. Great video.
Except the power on/off phase is the most stressful part for electronics. Orders of magnitude higher in stress. Has been since the beginning.
Honestly? I think it's much cheaper to replace a PSU than to leave a PC idling. And what about sleep mode?
@@Krzeszny1995PL The PSU isn't the only part affected; the entire system is. JayzTwoCents has a video showing CPU voltage spiking to 1.5+ volts on startup, and other components experience similar spikes. From the hardware's perspective, sleep mode isn't much different; the only difference is the RAM stays powered and uses a few watts, since it would lose its data otherwise. These power cycles aren't much of a problem in the short term, but when you keep your stuff for many years like I do (I'm still using a Core 2 Quad from 16 years ago in one of my systems), it's worth considering.
Thanks for the informative video and love from Turkey. I generally leave my computer idling over lunch because I'm lazy, but now I'll know better!
As a footnote, leaving your computer idle on the main menu of a paused game will passively strain your system just the same as when you're actually gaming, and it's much worse to leave it idling like that. In most main menus the game is still being rendered in the background, as evidenced by the heat and fan noise coming from your computer. The heat will dry out your thermal paste over time (causing thermal throttling on the CPU and worse performance on the GPU), and the electricity wasted will double. So don't do what I used to do, which is leaving the game open while doing something else; just save, quit and turn off your computer. Your computer will thank you :)
On the game menu, the CPU usage should be very low since there is no need to run the game logic and physics.
Also, to reduce GPU usage, turn on vsync.
If you pause the game, the game might suspend the game logic and physics as well. Some games render the background and render the menu on top as a semi-transparent thing which means work for the GPU.
@@louistournas120 Yes, though I don't use VSync because it limits FPS to 60 and I use a high-refresh monitor. In that case I limit my FPS to my monitor's refresh rate using RivaTuner. Uncapped frame rates cause more work for the GPU and result in coil-whine noise, reduce the longevity of the GPU's thermal paste, and waste more energy. Good point about game logic not running in the background, but most games also have memory-leak issues, so after you come back an hour or so later, the game will be laggier.
Fun fact: in places where it is extremely cold, people leave their cars running so the car doesn't freeze up. You might see this in the northern territories.
Yes I used to live in Northern Canada. Usually a block heater was enough...usually.
Meanwhile, here in the Netherlands, our electricity price is about €0.70 per kWh... 😢
Son, what are you talking about? Germany has the highest electricity price and it is 41 cents per kWh? WHAT ARE YOU TALKING ABOUT SON???
@@JohnBobb 41 cents? I wish I had that at the moment.
I was looking into something like this today for a project. In the USA, where the majority of our power production is from natural gas (2019 numbers), each kWh of power generated emits around 0.91 pounds (413 grams) of CO2.
😨 Wow! I'd like to know the CO2 numbers here. Fortunately more than 90% of ours is Hydro electric.
Not bad anyway, because in Poland 75% of energy is from coal, so xd
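Using the ~413 g/kWh figure from the comment above, the footprint of a gaming habit is easy to estimate. The 0.765 kWh session is the video's number; treating it as a daily habit is my assumption:

```python
# Estimated CO2 from PC gaming on a natural-gas-heavy grid (~413 g CO2/kWh).
grams_co2_per_kwh = 413
session_kwh = 0.765                            # one 3-hour gaming session

daily_g = session_kwh * grams_co2_per_kwh      # ~316 g per session
yearly_kg = daily_g * 365 / 1000               # ~115 kg if you game daily

print(f"{daily_g:.0f} g CO2 per session, ~{yearly_kg:.0f} kg per year")
```

On a coal-heavy or hydro-heavy grid, just swap in the local grams-per-kWh figure.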
In Yakutsk they leave the cars running all winter, non-stop, until spring!
Subbed. I'm seeing a lot of high-end PCs being built with no consideration for power usage; I would much rather see a build that is powerful and energy-efficient.
Try "Wolfgang's Channel"; he recently had a video about a power-efficient server. But you likely want a compromise if you want a PC and you're not OK with a tablet. I have my PC on for a long time each day (home office, games, ...) and I'd say it costs about $20 per month. The price doubled in the last year where I live due to the war in Ukraine and expensive natural gas; actually it tripled, but half of the old price used to be a distribution fee. Hell, with taxes included, the Czech Republic has reached Solomon Islands prices.
I enjoyed this, Nick; it's a good global comparison and insightful. You might consider doing this home project on a larger scale. Have you thought about presenting this to a large organization, after counting and estimating the energy-bill savings they'd get from programmed remote mass shutdowns instead of leaving machines idle?
Very interesting.
Four hours on the computer? Rookie numbers.
Maybe he has responsibilities or other things to do than just sit there and play games all day long.
@@jambee5603 Have you never heard of jokes?
Yeah, I have the same power meter. I bought mine from AliExpress; even LTT uses the same one. Apparently it's a pretty good power meter. Just do not leave it plugged into a high-power load (around 1,000 W+) for long periods of time, as you risk burning it out and causing a fire.
Thanks for the info. Don't need that in my life
What power meter is this? Do you have the exact model or a link? I know it's an old post, but who knows xD, maybe one of you will see this.
I ordered the same measuring device from AliExpress for €15. I guess they made a mistake, since I got two of them in the package. A really cool thing to measure power use with. My decent 6-core, 16 GB, 3050 gaming laptop (€700) uses about 15-20 watts idling connected to my 55-inch LG OLED with the main screen turned off. YouTube uses 25-30 W. Max usage makes it go to 110 watts or so; WoW Classic with FPS limited to 45, at 1440p and 3/10 graphical fidelity, uses around 35 watts. My TV with power saving maxed uses around 37-45 watts; HDR on without power saving makes it go to 110 watts and more. My desktop just idles at 100+ watts.
That's why I love laptops now; gonna save a pretty penny.
Power bills have gone up in the UK and I rent. I'm working from home and we're trying to keep the bills down.
I do have another housemate, but I'm not sure how much of the energy bill I should feel bad about; there's probably a bias because I'm home most of the day.
I will have to get one of these!
Loving this content, man. Personally I run a 5800X with a 6800 XT, and I turn my PC off every time I'm done using it, for this exact reason. I want to know: what's the reason for leaving a computer idling? For quick access, really? Get yourself SSDs; the boot process is so much quicker. Use it and turn it off!
I always turn my PC off for the night. I never got why people leave it on; it doesn't take long to turn on.
It's 15-20 seconds vs instant.
How fast can you have your system booted and ready to use (in a standby state, all your stuff loaded and idle)?
@@armschoolc32 Wait a second, are you seriously complaining about 15-20 seconds and trying to start an argument just so you can tickle your balls over saving some energy yearly? ...pathetic
@@armschoolc32 well ?
I have data recorded every 5 seconds for a year showing my office, bedroom and living room PC usages. Having that data over the year saved me 25% on my electric bill.
Gamers in particular need to pay more attention to the wattage and start to see it as a "bad" thing.
My PC idles at 95-115 W, with a very similar spec to yours.
However, running a game at full 4K with default settings and no VSync immediately raises that to 550-600 W. If I enable VSync, it drops under 400 W; lower the res to 1440p and all the fans stop and it comes down to 320 W.
When I run 3DMark benchmarks like Time Spy, I can see each phase of the test on the mains power chart.
I got so upset with this that I spent £3000 on solar panels, batteries and an inverter to solar-power the whole thing. Now I can game "guilt"-free AND save on the electric bill.
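Out of curiosity, here's the payback math on £3000 of solar gear, using the ~550 W gaming draw from the comment above. The 4 hours/day of gaming and the 34p/kWh price are my own assumptions, not the commenter's:

```python
# Rough payback period for a £3000 solar setup powering a gaming rig.
system_cost_gbp = 3000
gaming_watts = 550          # 4K, no VSync (from the comment)
hours_per_day = 4           # assumed daily gaming time
price_per_kwh = 0.34        # assumed UK price, GBP

daily_kwh = gaming_watts / 1000 * hours_per_day          # 2.2 kWh/day
yearly_saving = daily_kwh * 365 * price_per_kwh          # ~£273/year
payback_years = system_cost_gbp / yearly_saving          # ~11 years

print(f"~£{yearly_saving:.0f}/year saved, payback in ~{payback_years:.0f} years")
```

So on gaming alone the payback is long; powering the rest of the house off the panels is what shortens it.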
Thx for the video. Good info... I now leave my PC running idle 24/7 🤦
If the power supply will only pull as much power as needed from the wall, why does having a higher capacity power supply generate a larger electricity bill? If you don't use the whole thing, are you charged as if you are?
Having a larger power supply generally shouldn't increase power consumption. The power supply's rating is simply a limit on how many watts it can deliver at one time.
I have an 850 W power supply. If I had a 650 W power supply with the same efficiency rating, it MAY pull slightly less power, somewhere between 0% and 2% less.
www.wepc.com/tips/power-supply-ratings-exactly-what-do-they-mean/
@@TechIlliterate Thank you, I appreciate the info.
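The point above can be put into numbers: what you pay for is wall draw, which is the DC load divided by the PSU's efficiency at that load. The two efficiency figures below are illustrative of an 80 Plus Gold-class unit at different load percentages, not measurements:

```python
# Wall draw = DC load / efficiency. The PSU's wattage rating is a ceiling,
# not a usage rate; only the load and the efficiency curve matter.
def wall_draw(dc_load_watts: float, efficiency: float) -> float:
    """Power pulled from the wall for a given DC load inside the PC."""
    return dc_load_watts / efficiency

load = 300  # watts of actual DC load

# Hypothetical efficiencies: a 650 W unit near its sweet spot vs. an
# 850 W unit at a lower load percentage, slightly less efficient there.
draw_650w_psu = wall_draw(load, 0.92)
draw_850w_psu = wall_draw(load, 0.90)

print(f"650 W PSU: {draw_650w_psu:.0f} W from the wall")
print(f"850 W PSU: {draw_850w_psu:.0f} W from the wall")
```

The difference works out to roughly 2%, which matches the "0%-2%" estimate in this thread.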
Power efficiency should actually be a standard BIOS feature.
Thanks to a little project of mine, my computer now talks directly to the Smart Meter on my home. The power company now thinks it owes ME money.
Very good. I learned not to leave my PC on way back when the motherboard started melting. Granted, it was a cheap one, and the PSU might not have been much good either. If I hadn't been in the house, it would have gone up in flames, which would definitely have wasted a lot of energy.
Love the video!
I actually just used one of your other videos to undervolt my graphics card. I'm only rocking a 1660 Super, but I cut the wattage down by nearly 40% and my temps by 30%, while only losing about 6% of my FPS.
I still hate how much energy desktops use compared to a laptop for just browsing the web or simple document work. My desktop uses about 48-70 watts for those simple activities vs. my laptop's 10-20 watts. I wonder why desktops can't go into some super-high-efficiency mode when you're doing stuff that doesn't stress the system at all?
You make a good point. Something I have wondered as well. I am going to look into it. :)
Hey, I'm also rocking a 1660S; I can't wait to try it myself.
My PC uses 130 W at idle haha
If you have a gaming laptop, it uses an extremely complicated method of switching video processing from your GPU to the more efficient onboard video. This is a massive PITA to deal with, but it is more efficient. Desktops don't do this; they're always using your GPU, which uses more power as a baseline even if you're undervolting. One thing you CAN do is tell Windows not to use hardware acceleration for certain applications; I think this is under the Gaming settings on Win 11. That way the GPU stays idle instead of consuming more power.
The other option is, of course, to use an SoC that combines graphics and CPU on one chip; AMD calls these "APUs". It is much slower for gaming but still usable, and more power-efficient (and of course cost-efficient). In this case you use the video outputs on your motherboard instead of a separate GPU.
Nvidia needs to get their stuff in order. In Germany and many other EU countries, power costs over 35 US cents per kWh. Real money. My 1660 Super alone consumes 15-20 watts just idling on the desktop, so I pay over $50 for the privilege of having a dGPU. My entire Ryzen 5700G system, with 32 gigs, 3 SSDs and the APU, draws only 25-30 watts idling; the card alone almost DOUBLES that. NVIDIA needs to stop wasting power; there is NO REASON these cards suck up this much wattage just sitting around. I'm sure if you pulled out that 3090, you would halve your idle draw as well.
That's a pretty terrible price, damn. For me it currently costs a bit over 13 euro cents/kWh. My PC with a 3600X and 5700 XT pulls 95 W at idle. It could be lower, but I find it acceptable, imo.
@@imrileth6618 A large part of Europe has these prices, now... and it can get worse.
If it makes you feel any better, my 7800 XT uses an insane 60 W just to idle.
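It's easy to put a yearly price on those idle watts. The 17.5 W is the midpoint of the 15-20 W idle draw reported above, and the 0.35/kWh rate is from the same comment; the 8 h/day scenario is my own assumption:

```python
# Yearly cost of a GPU's idle draw alone, at EU-crisis prices.
idle_watts = 17.5           # midpoint of the reported 15-20 W idle draw
price_per_kwh = 0.35        # ~35 cents/kWh, from the comment

kwh_247 = idle_watts / 1000 * 24 * 365   # if the PC never sleeps: ~153 kWh
cost_247 = kwh_247 * price_per_kwh       # ~54/year, matching "over $50"

kwh_8h = idle_watts / 1000 * 8 * 365     # 8 h/day of desktop use: ~51 kWh
cost_8h = kwh_8h * price_per_kwh         # ~18/year

print(f"24/7: ~{cost_247:.0f}/yr; 8 h/day: ~{cost_8h:.0f}/yr")
```

The same arithmetic works for any component: watts / 1000 × hours × price.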
Modern video game consoles also have some solid power settings. On the Xbox Series S/X you don't even have to lose the "Quick Resume" functionality if you turn off the idle power-draw features. The PS4 will lose its ability to pick a running game back up if you disable "Rest Mode", though. Not really a big deal, and you can always toggle it off and on as needed if you don't want to keep going through the motions of launching and loading specific games (The Last of Us 2, for example, takes quite a while to load from launch, so you can leave Rest Mode on until you beat it). Not sure about the PS5, as I don't own one.
How on earth do you not have 300k subscribers!? Thanks for your videos.
A few things to consider: the PC in question had a fairly high-power GPU (a 350 W 3080 Ti) and an AIO pump, which will use much more power than an air cooler, especially when idling. According to Steam's hardware survey, most gamers rarely get a card above Nvidia's XX70 series, and those cap out at around 250 watts under max load and tend to average closer to 200-220 watts.
Leaving your PC to idle is definitely a foolish thing, imo. Running your hardware for longer periods reduces its lifespan faster (especially the fans), and the tiny setbacks of extra power-supply strain (minimal) and slightly slower boot-up times (not an issue if you're on an SSD) are minuscule compared to the cost of running your PC at idle for so long.
I definitely wish Windows had more power-management options built in, as stuff like Windows Update and Windows antivirus running at random can really ramp up the power consumption of your storage and CPU.
I agree. Changing my sleep settings now. Good advice, thanks.
Coming to this video helped me a lot. I'm upgrading my GPU and learning about wattage and costs. Now I've found the sweet spot where my monthly bill might increase by 10 euros, but thankfully no more.
I have an idea for a "hybrid-powered PC" which would run off a small uninterruptible power supply (UPS) when idle. During higher-load scenarios, the UPS would be charged from the grid while running the PC at the same time. I'm also trying to figure out how to get the UPS to supplement grid power to the PC during unusual demand spikes (Turbo Boost, etc.). With your car analogy, idling this style of PC would be comparable to leaving a Prius in [READY] mode (running), where it only starts the engine when required to charge the battery.
The answer is simple. You take a secondary power source like solar panels or a windmill, plus a connection to the grid. You then create a relay with sensors and other needed utilities, connect both secondary and mains power to the relay's inputs, and connect the relay's output to the input of a UPS. The UPS powers the computer, and the relay keeps the UPS charged. During minimal use, the relay keeps the secondary source flowing as long as it keeps up with demand, and when demand outstrips capacity, the relay switches to the mains grid. All the while, the computer is happily chugging along with stable power from the UPS.
That is not at all comparable; it's just worse and more expensive. Hybrid cars save so much fuel because the engine runs far less; that's a massive difference. What does running the PC off a battery accomplish??? It would kind of work out if you charged the battery at night and then used it during the day, but the savings are still tiny. It's easy: every time you leave your desk, just put the PC into sleep mode.
@@gerble36 That's still ridiculous. Put solar panels on your roof if you have a house; that alleviates some of the cost, but it needs to amortize first. Using a solar panel or windmill to power a computer is ridiculously inefficient. Small-scale power generation just doesn't make sense from a numbers perspective.
The computer still uses the same amount of energy either way; there are no energy savings in this situation. And charging a battery instead of using a direct line actually reduces efficiency, requiring even more energy.
This also made me rethink my work-from-home situation. Here I am thinking I'm being efficient by preventing e-waste because I use a 2009 iMac as a secondary screen (Target Display Mode)... only to find out its average power draw is 100-200 watts. Looking up Energy Star certified monitors, I realized most modern high-efficiency work monitors use something like 8-15 watts when they're on. I don't even really use the "computer" part of the iMac, so I'm actually being wasteful energy-wise (it also generates a ton of heat). No-brainer: time to upgrade to a more power-efficient monitor and sell the iMac to somebody who needs an actual computer, to recoup some of the cost.
I used to be the idiot who left his PC 'idling' overnight, went to work the next morning, and only came back to the PC at some point later for a gaming session of roughly 2-6 hours.
It took me a year to realize how much that wasted power had cost me on the yearly bill.
It isn't even funny anymore.
So, to all the other idiots still idling around the globe: turn OFF your PC whenever you leave it for more than roughly an hour. Seriously.
On top of all that, it isn't just the idle power: you are shaving running hours off the lifespan of your components. This may not have a direct impact, but in the long run it will shave off years, because electronic devices simply die over time from usage.
your videos are very high quality. you deserve more attention.
Another comment, separate from my other one pointing out the weird mistake.
You are correct that turning off computer at night when idling saves money, of course. However, there is evidence to suggest leaving the computer on is *better* for the "health" of the computer:
1) Most computer failures happen on boot-up. Fewer boot-ups, fewer chances for things to go wrong. Booting up a PC takes a toll on it, a BIGGER toll than leaving it idle.
2) If you are using sleep or hibernate, you are relying on the RAM or the HDD/SSD respectively, which adds wear to those parts every night, again leading to more degradation of your components compared to leaving the machine idle.
3) Cold temps. The computer warming up from cold also takes a toll on it; this kind of goes along with point 1, but whatever.
I'm not saying this is a universal truth. For example, an exception to point 3 would be if you are in a small room with poor ventilation: there you should turn it off, because the heat will build on itself and that extra heat will damage the computer. But in a big room, or a well-ventilated room, that doesn't matter.
BUT there is a lot of evidence to suggest (assuming you have a very well setup-computer in all aspects) leaving it on when not in use is *better* for it.
As for the environmental effect, it's negligible. Individuals using electricity is NOT the cause for global issues. It's fossil fuels and LARGE corporations massive power hungry machines. They are still necessary, but *they* need to be looking at ways to ease environmental impacts, not us individuals.
Lastly: you might be thinking that the cost of that nightly use is not worth. Well. If the computer completely breaks 3 years earlier because of all that boot-ups, then the thousands of dollars to replace may out match it lol. But generally speaking, yes, if you can't afford it, probably better idea to just use sleep (if you use HDD) or hibernate (if you use SSD). That distinction between storage drives is due to SSD having a much longer life span than a HDD, since using sleep will use RAM, so HDD is safe. Hibernate will use primary drive, which should be an SSD (assuming you have one).
*TL;DR* Contrary to this video, if you can afford it, and if your PC setup is proper, leaving your computer idle (obviously turn off monitors and stuff, just the tower we are talking here) has actually been shown to be BETTER for the life-span of the computer and it's parts individually as well. So the research, given those previous assumption, says you should leave it on.
Those are some good points. Thanks for sharing. I want to look more into the impacts of PC startup and heat cycles on hardware.
Thank you, you magnificent bastard!
I don't even have to look anywhere else, all the info is here!
Awesome video man!
In the rural mountains of California, our cost per kWh is now 53¢. PG&E has really upped their rates these last 2 years.
Greetings from Europe. 7 USD cents per kWh is something we can only dream of. Where I live prices are 12 times higher, and still going up.
Yes, that's right. 12 times higher. I'm not kidding you. We are just under 0.9 USD per kWh (autumn 2022).
Makes you think twice before leaving all your appliances on standby.
And there's the environmental impact as well.
I am in BC as well. But I am using an AMD 3200G APU. No discrete GPU, so I just sip power. So my yearly cost for computer use is probably what I would pay for one beer in a nightclub.
FYI, you can also get smart plugs that give you energy readings :) Gives you more control over when they're running.
Do not trust sleeping devices. A Windows 10 machine will usually wake up to check for updates. If updates are found, it may not go back to sleep again until you tell it to update. A sleeping mini-PC draws about 3 W; an awake one draws 20-30 W.
TVs will also switch themselves on to download the adverts and spam they throw at you, or to download the TV guide and retune the aerial.
Not all standby devices pull
Basically, if I run my PC 24/7 at moderately high workloads, it costs me $0.06 per hour I work. For me, that is acceptable.
A good comparison: my rent is $4.50 per hour I work, and my car loan is $1.80 per hour I work.
So in comparison, even if I built a new $3000 computer every year and included 24/7 power consumption, it's barely $2 per work hour.
I rebuild my PC once every 3-5 years and I turn it off most of the time, so getting entertained costs me a lot less per year / per work hour.
The car analogy is a bit different, though: if you idle for 30 seconds, you've already wasted more than double the gas it takes to start the engine; it's like living in Hawaii for electricity prices. Idling is actually one way I see electric cars as much better. So many people idle for a long time.
What about "turn off monitor after X amount of time"? Mine turns off after 15 minutes, so it basically becomes a non-factor at ~10 W on standby.
Power (and thermal) cycling is what will usually kill electronics. I pretty much never power my PC off (since c. 2012, now on the 3rd build), never had a _single_ hardware failure in that time, not even an HDD. The one from 2012 (P8Z77 / 2500K, later 3770K), according to the SMART data on a couple of WD Greens running in RAID 0, was running for 6.5 years of 7.5 years before I upgraded. It drew c. 45W at the wall, the current one under 40W.
It also depends on the game you are playing and the graphics settings. If you are playing Yakuza, LoL, or Rocket League, these games are not demanding. But try this: assume you are playing Red Dead Redemption 2, Alan Wake 2, Baldur's Gate 3, or other badly optimized games. These can cause your PC to shut down if you have a PSU below 450 W; simply put, the GPU demands too much power.
When I use my PC, it draws about 1 kWh every 3 hours or so. A kWh is now 37 cents.
I use it for 3-9 hours, so I could spend, call it, a buck ten a day.
My gaming PC costs me about $300 a year to use by this calculation.
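The figures above can be sanity-checked with a couple of lines; all inputs (1 kWh per 3 hours, $0.37/kWh, 3-9 hours a day) come from the comment itself:

```python
# Sanity check of the comment's numbers: ~1 kWh per 3 hours of use
# (about 333 W average) at $0.37 per kWh, for 3-9 hours per day.
PRICE = 0.37               # $/kWh
KWH_PER_HOUR = 1 / 3       # "1 kWh every 3 hours"

def daily_cost(hours: float) -> float:
    """Electricity cost of one day with the given hours of PC use."""
    return hours * KWH_PER_HOUR * PRICE

low, high = daily_cost(3), daily_cost(9)
print(f"${low:.2f}-${high:.2f} per day")              # heavy days land near $1.10
print(f"~${(low + high) / 2 * 365:.0f} per year at the midpoint usage")
```

A 9-hour day comes out at about $1.11, matching the "buck ten", and the midpoint usage lands in the neighborhood of the $300/year estimate.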
What are your specs?
Mmmmmmm...... Lots of constructive and mindful advice. Good job, Man.
Maaaan, such a cool white hairline. Although this video wasn't what I was looking for, have a good one!
Thank you.
Well done mate, congrats for your work!
This is great. I was so curious about my PC's power consumption; I appreciate the information.
Good vid man; good message, good advice.
Great video. Here in Denmark we pay, for example, 0.44 USD per kWh, and even more in winter when the big solar parks don't produce as much power as usual. This of course depends on many things; often it's less than 0.44 USD.
Actually the electricity itself costs around 0.075 USD, but on top of that come all kinds of taxes, such as delivery tax, sales tax, grid tax and other things, and then the price ends up around 0.44 USD per kWh.
But it depends a lot on whether it's the peak time of day, 17-22, when it's most expensive (around 0.44 USD),
while from 12-16 the price can be around 0.22 USD, since there's an overproduction of power; in such cases you only pay the taxes and get the power itself for free. So prices go up and down a lot.
The main issue for many people is that you're not at home when the power is "cheap", so in an ordinary family most of the gaming, cooking and TV watching happens from 17-22. The same goes for your electric car, which you can't charge in the middle of the day when it's away from home. (Some electric cars can even be used as a house battery: you charge the car cheaply at night and, when it's home, it feeds the house during peak hours so you don't have to buy power at the expensive times.)
Personally, when the war in Ukraine started, prices skyrocketed and my bill was 2-3 times higher than normal for some months, and then it dropped back to a more normal level.
But with all the unrest in the world that can affect prices, I bought a hybrid solar system, so now I make my own electricity; in summer I produce about 99% of all the power I use myself. From 7 in the morning to 20 in the evening it can power my gaming computer (and the house) on solar.
And in winter I can charge the battery at night when it's cheap, etc.
I think the electricity prices/taxes are crazy, BUT in, say, 10 years I think prices will be lower, when there are more wind and solar parks than today.
At idle my gaming PC runs at about 150 W (that's with the PC, a 4K monitor and USB devices), and about 550 W when gaming.
Tip: if you turn off HDR on the screen you save about 50 W, and you can also reduce the game's FPS and the monitor's Hz so the computer doesn't have to use quite as much power. Honestly, most games do NOT require more than 60 FPS for a fluid, great gaming experience. There are also options in Windows 11 under power/graphics where you can select which games should run with "reduced power" and turn off HDR and such things, though honestly I don't know how well that works. You can also cap FPS in the NVIDIA and AMD software; just remember to reduce your monitor's Hz as well for the best result.
P.S. My 10-year-old laptop uses only 45 W when surfing the net; the gaming PC uses 150 W.
Think about that.
P.P.S. I'm considering getting one or two small 500 W windmills in the garden to charge the battery in winter.
Power is very expensive in my country.
This is something I've been thinking about lately concerning the old PC I use as a Plex server. I'm trying to think of some way to automate putting the machine to sleep when I don't need it awake. I need to do more research on this, but it seems like Plex could add a feature to wake the machine when you ping it.
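One common way to do what the comment above describes is Wake-on-LAN: let the server sleep, and have a client broadcast a "magic packet" before streaming. Here is a minimal sketch; the MAC address is a placeholder, and WoL must be enabled in the server's BIOS and NIC settings for this to work:

```python
# Minimal Wake-on-LAN sender: wakes a sleeping machine on the local
# network by broadcasting the standard "magic packet" over UDP.
import socket

def magic_packet(mac: str) -> bytes:
    """Build the WoL payload: six 0xFF bytes, then the MAC repeated 16 times."""
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    return b"\xff" * 6 + mac_bytes * 16

def wake(mac: str, broadcast: str = "255.255.255.255") -> None:
    """Broadcast the magic packet on UDP port 9 so the sleeping NIC sees it."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(magic_packet(mac), (broadcast, 9))

# wake("aa:bb:cc:dd:ee:ff")  # placeholder MAC of the sleeping Plex box
```

Pairing this with an OS-level idle-sleep timer on the server gives roughly the "sleep when unused, wake on demand" behavior the comment is after.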
I'm thinking of buying a laptop or a handheld PC for lower power usage, but at the same time I could limit the refresh rate, undervolt the GPU, cap the FPS and choose more efficient components to reduce power draw on the desktop.
That's tough :c
My house only has three amps of power most of the time (using a generator).
And I think I need another three to power a gaming PC.
My gaming setup (LG CX 48", RTX 3090 and 5950X) consumes around 250 W just browsing the web... The CX alone consumes about 70 W... Sigh lol
250 W just for browsing and video is HUGE! No way.
A huge part of that is your 48" screen. Turn the brightness down to 50% and the refresh rate down to 144 Hz. Then for your CPU, look into lowering the voltage, etc.
Actually fantastic tech channel. You have really interesting premises and you follow through in a natural way that makes for a really entertaining video. Keep it up.
Undervolting also helps.
LOL, in the UK we can only dream of power costs like that; we're more like 36¢ per kWh!
It would be interesting to compare costs now that we have a global energy crisis. Here in London, for example, costs can be 5X higher!
......do most of your general computing on a NUC :)
I also advise running everything from a power strip with individually switched sockets. Turn off chargers, power bricks etc when not in use.
after watching this video you sir have earned my subscription
The list at 7:00 makes me so mad as a German. It seems like we're a small island, because our great politicians have been brain-dead since 2000.
How can you have such high energy costs and still try to be a leading industrial country? Same with fuel and pretty much everything else.
I've heard they plan to turn that around with nuclear. Maybe I heard wrong.
Thanks for this. Quality content right there. Did you see Germany at #10 on the list of most expensive countries? Only small island nations are more expensive 🤔
I know this. We are shutting down nuclear power plants, just so we can buy nuclear power from France and Finland. It‘s stupid.
I live in Germany :(
Paying fines for all those coal power plants is not cheap.
My school actually tells us to just "log out" of the computers, and they force us NOT to turn them off.
welcome to germany
I'm not trying to discredit your research; I really appreciate the effort you put into it. But would a 5000-series CPU give the same results? Let's be honest, those are the ones people are looking at.
Good point. Time for me to upgrade?
Depending on the CPU, could be more, could be less. My R7 3700X has the same TDP (65W) as the R5 5600X. (Of course real world power draw will be somewhat different) R7 5800X is rated at 105W
My power consumption is skewed in one direction by the GPU and arguably the other way by the CPU. I still think it's quite efficient compared to the 5800X
@@TechIlliterate Let's take into account that's the TDP, which usually doesn't really indicate the power consumption of the CPU. According to AMD, the R5 3600 and R7 3700 CPUs have a 65 W TDP, but in reality they use about 80 W under heavy load (with the built-in boost). If PBO is enabled they can easily use 100 W.
To be fair, this may not be AMD's fault or a false statement from them; it might just be related to the "issue" of AM4 motherboards forcing more voltage than is set and reported to the CPU. So a configuration that claims to provide 1.2 V may well be feeding 1.3 V to the CPU.
Many sites report in their tests that the 5600X uses less power than the 2600, 3600, 3700 and 3700X, with about 62-67 W under heavy load (varying a bit with the load).
The reason I watched this video is that I got hold of my friend to see if he could upgrade my computer for me. He didn't have any spare parts lying around, but what he DOES have is a Lenovo ThinkCentre. It's supposedly quite big: 128 GB of RAM and dual CPUs at 4.2 GHz, for a total of 32 cores.
1500 W power supply.
He also told me it usually runs at around 600 watts, and even then I still thought it was too much. My current PC runs at around 375.
Look, I'm not a power-hungry guy. I had an electrical fire once, so I tend to unplug and put away things I'm not using anyway, and as someone who lives in the Toronto area, I found this very insightful for the power bill. Thank you for this video.
P.S. Does anyone know any other ways I can conserve power, to help me make a decision?
I don't need all of this power, as I said before, but as someone interested in doing things with operating systems or even game development, I thought maybe it would be useful to have so I don't need to go out and get it later. It would be really nice to force the computer to run at around a quarter of its power and then bring it back up when I decide I want to develop, similar to the power-saving mode on an Android phone. I also plan to use Linux, which is very power-efficient last I checked.
Bro, even $80 a year is nothing. Thanks for putting me at ease. Here in California it's our central AC that's killing our power bill.
Very unique information!
Interesting. How much does that equate to in watts used? If I were to buy an inverter, what size would I need to run a rig like yours in gaming mode for 3 hours? Any advice appreciated.
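A back-of-the-envelope answer to the question above looks like this; the 500 W gaming draw and the 25% headroom are assumptions for illustration, not figures from the video:

```python
# Back-of-the-envelope inverter/battery sizing for a 3-hour gaming session.
# The 500 W system draw and the 25% headroom are assumed values.
GAMING_WATTS = 500         # assumed full-system draw while gaming
HOURS = 3
HEADROOM = 1.25            # spare inverter capacity for transient spikes

energy_wh = GAMING_WATTS * HOURS        # battery energy needed for the session
inverter_w = GAMING_WATTS * HEADROOM    # minimum continuous inverter rating
print(f"~{energy_wh} Wh of battery and a >= {inverter_w:.0f} W continuous inverter")
```

So the two numbers to shop for are battery watt-hours (draw times hours) and the inverter's continuous watt rating (peak draw plus some margin); plug in a measured wall-draw figure for a more accurate answer.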
Just found the channel. Keep up the good work
thanks a lot!
"Don't live on small islands to do PC gaming!" Now I'm considering moving off a small island. 😅
There is a town in Siberia where people have to leave their cars running all winter, because otherwise they'd freeze in 20 minutes and you couldn't start them anymore -.- XD
Heheh, OTR truckers do it all the time, leaving their trucks idling overnight as they sleep in their cabs. ALL cruise ships run 24/7 because they cannot shut off the onboard systems without damaging the vessel and causing mold to form all over the interior.
another great video! as always!!!
Thank you very much!
what a cool video and ways to show info like with the globe and the paper
Cool video.
I wonder if someone has measured how much a PC uses when turned OFF (it's not 0, believe me).
I leave my PC on 24/7, and while idle it uses under 50 W (tested). How is yours using over 100? That's definitely not idle!
5600x / 1080Ti / SSD / HDD / EK X360 Extreme Kit (both CPU+GPU) / 7 Noctua Fans
It's probably his 3080 Ti that's using excessive amounts of power even on idle.
The 3080 Ti uses 140 W more than the 1080 Ti.
He compared his usage to a 3080 🤣
@@dharmacode27 yeah, he might be a few watts out 😂
@@Kizarat Probably, yes.
Gone are the good old days of cheap electricity, when I had my PC running flight sims on 3 screens with a ton of added hardware. I never really measured power consumption in those days, but it was surely a lot higher than your 3-hour test because of the 3 screens and close to 100% CPU/GPU usage pretty much all the time. As of next month I'm going to pay 89 euro cents per kWh (I'm already paying half that). Really glad I barely use my PC these days and have moved my gaming over to a Nintendo Switch & Switch Lite, which only use a few watts, especially in handheld mode. Desperate times call for desperate measures 🙂
You can game on power-efficient PC hardware, albeit low-end. The RX 6400 and GTX 1650 draw 75 W from the wall; combine one with a 35 W CPU like Intel's T or S series, or an older i3 or newer Pentium (54 W), and the system won't draw a lot of power. Games are cheaper on PC for me personally, while Nintendo charges in dollars, euros, pounds and yen, which is expensive.
@@main_tak_becus6689 Mid-tier hardware is pretty decent on efficiency as well. Total system power on my rig is probably around 250-300 watts after going with a 5900X. My 6700 pushing 1440p by itself averages around 100-110 watts, +130 watts at 4K. The 5900X is around 50-70 watts according to Adrenalin.
Good advice thanks I'll show my kids this video 😄👍🏻👍🏻
Any update? We need new calculations with the new price per kWh. In Poland we pay about $0.40/kWh, and we are not an island :-) And I can say more: companies, which don't get household rates, have to pay around $0.80/kWh.
Hooray for my solar panels, keeping my gaming PC cruising along for no dollars at all!
…
Just don’t ask what the panels cost. 🤫
😉
You didn't show what was more important to some of us: the actual wattage at maximum GPU and maximum possible CPU load during gaming.
Idling computer? My PC makes me money when I'm not using it... high power usage, but higher income to offset the parts and power needed to make a few bucks. A 3090 can make about $8-10 a day running 21 hours. It paid for the entire cost of the computer in a few (6-ish) months. I use mostly solar and battery banks to supply the home 9 months out of the year. Winter, however... well, there's always one toasty warm room! lol
My computer has 12 hard drives, an NVIDIA 1660, an AMD Threadripper 2950X and a 1000 W power supply. My solar scam monitoring software shows I'm using more power for my computer than for my air conditioner. I'm upgrading to a 4090 and a 1300 W power supply.
In some places it might be good to go solar. What about mining?
They're saying the Ryzen 7000 CPU is a beast; it could be equal to a 3050 GPU. I wonder how much power you would save?
Love your videos. I bought the Samsung Odyssey monitor after seeing your review, sir. Thanks again, love from India.