Cooling performance being directly correlated to the temperature of the cooling medium still needs to be proven? Really? Are people becoming THAT dumb?
In one generation we've gone back 70 IQ points, all due to staring at phones and consuming short-form media from birth or early childhood. Err, sorry, what I meant to say was: put the skibidi in the bag, little bro.
@ Who says the computer has to be in the same room? During the winter I put mine in a cold area. The hotter a machine gets, the more inefficient it is; that's why the Apple silicon chip is outperforming an equivalent PC.
If you're good in that field: I'm not, and I have one question. Why is warm-water cooling more efficient than cold-water cooling? With warm-water cooling the cooling delta is smaller than with cold-water cooling. To my understanding it should be linear, and the difference irrelevant or small enough to ignore, but someone told me the smaller delta makes warm-water cooling more efficient. The temperature for warm-water cooling is between 30-50°C and the outlet delta is 4-6°C. It would be really nice if you could try to explain that to me, because I want to know why without going deep into the field just to answer one question.
@@Junky1425 You have to spend energy to cool your water down to that temperature. That kills your efficiency compared to just using the ambient (or outside) air to cool your "hot" water.
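Rough toy numbers for that comparison, if it helps; every figure here (heat load, fan/pump overhead, chiller COP) is a made-up assumption for illustration:

```python
# Illustrative only: why "warm water" (free) cooling wins on total energy.
# Rejecting heat to ambient needs only pump/fan power; cooling the water
# BELOW ambient needs a chiller, which costs extra watts per watt moved.
heat_load_w = 10_000  # hypothetical IT heat to remove

# Free cooling at ~40 C water: just fans and pumps
free_cooling_overhead_w = 500

# Chilled water at ~15 C: a chiller with an assumed COP of 4
chiller_cop = 4.0
chiller_overhead_w = heat_load_w / chiller_cop + free_cooling_overhead_w

print(free_cooling_overhead_w)  # 500 W of overhead
print(chiller_overhead_w)       # 3000.0 W of overhead for the same heat load
```

Same heat removed either way; the chiller just burns extra power to create a below-ambient delta you didn't need.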
I'm gonna let the video play, but I did remain conscious in physics 101, and know you're correct. Hopefully the video will also remind people "water cooling doesn't make your room cooler", an argument I had just last week.
The funny thing is adding water cooling can actually make your room *hotter* if you have a CPU or GPU that dynamically adjusts power based on available cooling capacity. More power = more heat generated no matter how you look at it.
@@vollkerball1 The amount of actual power consumed for "work" on a CPU is so minuscule by comparison that you'd have trouble measuring it, component-by-component, with consumer-grade instruments. The VAST majority of the power consumed by silicon transistors is converted to heat. So, while you are, TECHNICALLY correct (the best kind of correct), when we're talking about a PC consuming 400w of power from the wall, I'd estimate, whether on water cooling or air cooling, 350-380w of that power is being output as heat.
As an electronics engineer using temperature constraints and ambient offsets for safety-critical applications, it never occurred to me that this is something that needed explaining. I'm glad you had the chance to make this video and share what we know with a wider audience.
There is stuff in this world we're 99% sure is right but that still isn't confirmed 100% by scientific data. It might not be important to you, but for people who are humble and wise, that 1% unknown will affect the confidence with which they present things. Which in turn affects whether people listen. That is why most people will listen to someone who sounds confident but is wrong over someone humble and reflective with a 99% chance of being correct.
Wait, has this become a "basic physics channel"?!? First law of thermodynamics: "The total energy of an isolated system is constant; energy can be transformed from one form to another, but can be neither created nor destroyed." The "isolated system" in this case is the entire room. Is this really something people have been questioning? LMAO
Ironically, that's the wrong thermal principle though. What you're stating would be the way to argue why better CPU coolers don't keep the room cooler. We're not talking about _total_ energy/heat, but rather the _distribution_ of it (e.g. constant temperature everywhere vs. all the heat concentrated at a single point).

The reason the temperature difference between PC and room is constant is that thermal resistance is constant and doesn't change with temperature. Or alternatively: heat "flows" down a linear gradient from a hotter place to a cooler place. As a simple example, assume you have two sides of a heat-resistive barrier: left 50°C, right 20°C. Then the rate of energy flow (= "heat transfer speed") is proportional not to the 50°C but to the difference, 30°C. So with, say, 70°C left and 40°C right, it stays the same, because the difference is still 30°C. The basic formula is: flow = temperature delta / thermal resistance.

Now imagine instead a non-linear relation between flow and temperature. Say the hotter the left side gets, the lower the barrier's effective resistance and the faster heat is exchanged: flow = 30 K / R = 30 arbitrary units with 50°C on the left (taking R = 1), but flow = 30 K / R' = 60 arbitrary units with 70°C on the left (taking R' = R/2). Then energy would leave the left side at double the rate, cooling it down quicker, purely because the baseline temperatures were different. Real thermal resistance doesn't behave that way, which is why the offset stays linear.
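The linear-gradient point in numbers (the thermal resistance value here is an arbitrary made-up figure):

```python
# Minimal sketch: conductive heat flow depends only on the temperature
# delta across a constant thermal resistance (Q = dT / R), not on the
# absolute temperatures themselves.
def heat_flow_watts(t_hot_c: float, t_cold_c: float, r_k_per_w: float) -> float:
    return (t_hot_c - t_cold_c) / r_k_per_w

R = 0.25  # K/W, hypothetical barrier/heatsink resistance

# Same 30 C delta at two different baselines gives identical flow
print(heat_flow_watts(50, 20, R))  # 120.0
print(heat_flow_watts(70, 40, R))  # 120.0
```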
@@andreas_rr No I'm not; the principle is correct, but I should also have sprinkled in a bit about the second law, entropy, which is what you're on about.
@@gamestopper15 Can you point to one? Also, on top of the clickbait thumbnail: heatsink efficiency will go down as absolute ambient temperature increases, and the internal components will reduce their power output as the die temperature reaches the safety limit, which shrinks the gap between die temperature and ambient temperature.
I did wonder if the specific heat capacity of air changes with temperature, like water does, which would make this a non-linear effect. Turns out it's a very minor effect, so yes, you would get what Jay finds here (certainly within the margin of error) 👍
You need to block anyone who's been arguing with you on that. There's a point where stupidity becomes dangerous, and those people are way past it.
Yeah, Jay has one of the few working prototypes of the 4090 EVGA would have made if they had stuck around and not left the market. It's the "EVGA *not a 4090* 4090" :) Sad that they left :( Edit: thanks to @tnadawg89 who corrected me; it wasn't the "not a 4090". I rewatched the video to the part where the GPU was visible.
@@tnadawg89 Ah lol, you're correct; I didn't realize he used the 3090. I thought I saw his EVGA 4090 in the build but didn't really look at it on the first watch-through.
As someone who lives in a relatively cold area, I never doubted this causation: I can just open the window to lower the room temperature from 25C to 10C and watch my GPU drop from 80C to 70C (plus the fans slow down due to the temperature curve, which accounts for another 5C of difference). But really, all you need to understand this point is a 7th-grade physics book.
@@overyonderjustapiece Yes, when the temperature is lower, fans no longer need to run as fast to keep it under control, which is why the difference between ambient temp and GPU temp is not the same. Part of the change of ambient temperature goes into allowing fans to run slower while still maintaining lower GPU temperature than at higher RPM but with higher ambient temp.
My gaming setup is in a small room that measures 10x14 feet. Although the room is not very spacious, it becomes quite warm when the temperatures rise. The temperatures on the PC also increase slightly.
Same here: once the room is saturated with heat, the PC runs just a little bit hotter, but eventually it stops getting hotter, of course. I thought this was common sense, but I guess our education system is such a failure that people don't know.
Unless you're actively moving heat out of the room, you can only get as cool as ambient, which will rise, since moving energy tends to take energy. Put another way: you can only get as cold as the coldest thing in the exchange chain.

A Peltier just moves the heat to the other side of the module while maintaining the same temperature delta (at the cost of current; you'll need more power to maintain the delta-T as the hot side gets hotter, which is why they're so horribly inefficient: neat, yes, but power hungry), so the ratio still holds. Sure, the cold side will be below ambient, but as ambient warms, so does the hot side, and since the delta-T is constant, so does the cold side.

I think where people get hung up is thermal exchange efficiency. They assume that because the air is hotter, it can't take away as much heat. And, I mean, there is some nuanced physics there, but it's not really a concern; it's barely noticeable on human temperature scales. The working fluid (ultimately air; water cooling is just a big thermal capacitor, thermally dense, but the water will reach a steady state too) has a specific heat: how much energy it takes to raise its temperature by one degree per kilogram (or per cubic meter, whatever). And that's essentially constant. When you boil water and turn the burner up, the water doesn't get hotter; it stays at 100C, it just boils faster. So as long as you CAN put more energy into the fluid, the thermal exchange delta holds. In phase change, you're boiling off the refrigerant: just like the water, it sucks the heat out of whatever it touches and boils like mad; it's just that the refrigerant boils at a much lower temperature, pulling heat out of the evaporator coil.

But in the case of purely passive cooling (and even there there are caveats; in the physics of energy there are no free lunches), whether water or heatsink-and-fan (what's a radiator? It's a heatsink for liquid), you cannot lower the heat (energy) below the lowest level in the environment. It can at best reach equilibrium. As long as there is thermal headroom for a watt in the air, and the heat source isn't outrunning that exchange (and even then, remember the boiling thing), it's one watt made, one watt moved, one watt out, plus the watt it took to transport it. No free lunches.
There should be a very slight non-linearity due to silicon being less efficient at warmer temperatures. However it should be pretty tiny over any reasonable range.
I was about to say this. As your computer gets warmer, resistance increases, so you need more voltage across more resistance, and that means more current. Your power goes up, and things keep going that way until the chips start throttling.
The problem is, when they made one that used less power and was more efficient, they got put on blast for having lower performance, even though efficiency matters more for the lifespan of both the CPU and the GPU when they're air cooled. That's why I gave up on Linus and Gamers Nexus: they both said Intel was dumb for trying to make chips more efficient.
The reason people think ambient temperature doesn't affect max temps is that they simply don't know how CPU/GPU coolers work. They don't see radiators/heatsinks as "heat exchangers," which is what they technically are. They see the coolers (especially water coolers) as refrigerant-based, i.e. compressor >> refrigerant >> condenser, where the refrigerant flows over the cold plate and actively cools a CPU/GPU, in most cases well below ambient.
Yup. Reminds me of the time on Gamers Nexus when a few very vocal people in the comments argued that you can cool a CPU/heatsink to below air temperature (with an air cooler) if you just run the fans fast enough. They seemed to think that by blowing 20-degree-C air fast enough, the CPU could end up at 15 degrees. They would not accept that 20 C air will never cool a CPU to below 20 C. At best it will get the CPU down to 20 C, and even that is extremely unlikely. If anything, if the air moves fast enough, it starts heating up the CPU/heatsink because of friction, although that requires very high air speeds.
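For the curious, that friction heating can be estimated with the stagnation-temperature rise, dT = v² / (2·cp); a quick sketch:

```python
# Air brought to rest against a surface warms by dT = v^2 / (2 * cp).
# Shows why "blow faster to get below air temp" fails: faster air can
# only add heat at the surface, never chill it below its own temperature.
CP_AIR = 1005.0  # J/(kg*K), specific heat of air at constant pressure

def stagnation_rise_c(speed_m_s: float) -> float:
    return speed_m_s ** 2 / (2 * CP_AIR)

print(round(stagnation_rise_c(10), 3))   # case-fan speeds: ~0.05 C, negligible
print(round(stagnation_rise_c(100), 2))  # near-hurricane speeds: ~5 C
```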
@@PixelatedWolf2077 There's also a funny thing: if you point your fan towards a window, you gain more airflow into the room. If you have a normal oscillating fan it's even worse, because it pulls lots of air from the sides, not the back.
I've been thinking about you saying this a few times. I have baseboard heating in my room, and when I have to crank the heat it definitely makes a difference in my overall temps, even at idle. Thanks for the awareness!!
Jay, let's get you in the gym and get ya a bit ripped. I built my first computer a couple years ago and you were a big part of that, I watched your stuff constantly and your education was invaluable to me. I appreciate all that you do and I'm so happy for your success. Here's my two cents: I'm 51 years old and do a few things to keep myself around longer. Drastically reduce my wheat and sugar intake, eat red meat constantly and only eat up to twice a day. I only say this because guys like you in the YT scene are important to me and I want you happy and healthy. Keep up the good work and let your honesty towards the industry stay strong. The advice you gave us when I was building mine all turned out to be correct historically. Stay solid!
I first started noticing this type of change recently myself. I've got some physical health issues that don't let me keep my home as cold as I once did; to keep it simple, the colder it is, the sorer my body becomes. I have an Aquacomputer real-time temperature sensor and display for my CPU, and as I experimented with how low I could keep my home while still tolerating the pain, I noticed that my CPU would reflect the same 1 or 2 degree change as I adjusted my thermostat, waiting at least a full day between adjustments so things could normalize before changing it again. What probably helps the almost exact corresponding temperature change is that my system is "in" a fully open-air "case" (a TT Core P3 Pro with no glass or additional radiator/fan mounting brackets).
Keep in mind that thermal conductivity is measured in W/(m·K). If your ambient rises closer to the temperature of the object you're cooling, you need more airflow or more powerful cooling to dissipate the same energy. Sure, as long as the cooling can keep up, it's still X watts being dissipated; it's just that a higher ambient makes cooling less effective unless you were already running well below TjMax. Say you're running 60C on the die in a 20C room: allowing 70C on the die in a 30C room would in practice require the same airflow, but holding 60C in a 30C room would require some amount of extra airflow.
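To put rough numbers on the airflow point (a sketch only; it treats the die-to-ambient delta as the most the air could warm by, ignores the heatsink's own resistance, and the 300 W figure is an arbitrary assumption):

```python
# To carry P watts away in air that may only warm by dT degrees, you need
# a mass flow of m = P / (cp * dT). Shrink the available delta and the
# required airflow grows proportionally.
CP_AIR = 1005.0  # J/(kg*K), specific heat of air

def required_airflow_kg_s(power_w: float, allowed_delta_c: float) -> float:
    return power_w / (CP_AIR * allowed_delta_c)

print(round(required_airflow_kg_s(300, 40), 5))  # 60 C die, 20 C room
print(round(required_airflow_kg_s(300, 30), 5))  # 60 C die, 30 C room: more air
```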
It matters more at the extremes, though, and those extremes aren't very applicable to average PCs. For example, the lower the delta, the worse the transfer efficiency, and the more diminishing the returns from further cooling-system improvements (such as adding rads to a liquid-cooling loop). And at high deltas, when efficiency is highest, the cons (e.g. throttling) outweigh the pros. But as a rough approximation, a linear relation between ambient temps and the temps of generic PC hardware with generic cooling works reasonably well for practical purposes.
Longtime viewer here. This is exactly why too; you made a video about something that should be common knowledge. But you are so interesting to watch, I still watched all of it! Love ya man and all the people behind the scenes love yall too! ♥♥♥
That's totally logical and intuitive. With my old Phenom II I tested AI upscalers, and in winter everything was OK, but in late spring/summer my PC heated up so much I had to be careful about how many tests I ran every 10 minutes. Even my 7950X3D can get very hot, and my PC has shut down three times, if I remember correctly. So I have to be careful even though the CPU is cooled with liquid metal.
Being a sensor freak, I have done the same test in the past with my watercooled PCs: the result is the same, except for a slight delay in the curves caused by the "thermal inertia" of the liquid in the loop, which responds less immediately than air. Across various loop sizes I have seen 30 to 90 seconds of delay.
Depending on contextual conditions and methods of cooling, ambient humidity can play a rather significant role. That as a variable really needs to be part of and accounted for, in tests like these.
I was going to suggest that the small 1.6° difference in delta could also be due to the short duration of the test and the thermal mass of the environment [cold walls] causing improved radiative heat transfer. However, after calculating the maximum potential magnitude of blackbody radiation transfer, this can only account for 7% of the observed difference. (Assumptions are 300 W of power per device and 100 square cm of area per device with an emissivity coefficient of 0.8.) So the 39.6°/38° difference is most definitely a matter of measurement methods and materials.
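For anyone who wants to sanity-check an estimate like that, here's a sketch using the stated emissivity and area; the 70°C surface and the 10°C/20°C wall temperatures are my own added assumptions for illustration:

```python
# Stefan-Boltzmann net radiative exchange: P = e * sigma * A * (Ts^4 - Tw^4).
SIGMA = 5.670e-8  # W/(m^2*K^4), Stefan-Boltzmann constant

def radiated_w(emissivity, area_m2, t_surface_k, t_walls_k):
    return emissivity * SIGMA * area_m2 * (t_surface_k**4 - t_walls_k**4)

hot = radiated_w(0.8, 0.01, 343.15, 283.15)   # 70 C device, 10 C walls
warm = radiated_w(0.8, 0.01, 343.15, 293.15)  # 70 C device, 20 C walls
print(round(hot, 2), round(warm, 2))  # a few watts either way, vs ~300 W total
```

Either way, radiation moves only single-digit watts at these sizes and temperatures, so it can't explain much of the delta.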
I always had the idea of "if I want my computer running at 100% without heat saturating my room, I should have a water-loop lead outside to an outdoor-grade radiator." Glad to know that'll work after having seen this video
*When it gets hot in my room, my temps go up. When it's cold in my room, they stay normal. Living in Alaska this can be annoying, but I have learned to work with it.*
You are literally using the ambient temperature to do the cooling. So the components temperature is always a set difference from the ambient temperature for the same cooling system. This is very basic thermodynamics.
Here's an idea for your next video. "What happens when you mix baking soda and vinegar!?! No one believed me until they saw this!" That'll really blow everyone's mind.
Best time for my system is always winter. I use it as a mini space heater in my office when it hits -11F outside XD. My system hovers anywhere from 40-60C depending on what I'm doing and the season. In summer I'm a monster in everyone's opinion because I keep my house at 76-78F. One, it saves my electric bill, and two, I like being WARM! >>
I have an idea for the long-term test benches... keep them indefinitely. Build a new one when hardware has improved enough to require it. Imagine if you had a top-of-the-line and a bang-for-buck build from the first year of your channel, and another pair every 3-5 years. Watching the same known systems perform year after year, decade after decade, trying to keep up as tech moves on and games get more complicated. Eh, I'm being silly. Thanks for all the content ;)
Ambient definitely drives internal, but whether it's exactly linear probably depends on a host of factors. In my 2,300-ish cubic foot "tech" room, I have an 8K BTU Midea window AC unit that can turn the space into a refrigerator. In the summer, I can drop ambient from the house's 76F to a chilly 65F in 30 minutes. That lowers PC temps by about the same amount, but more importantly, it stops PCs from raising the ambient temperature and making it uncomfortable. (Yes, 76F is perfectly fine in the summer as long as the air is fairly dry.) In the winter, with my house set to 66F, running a couple of gaming PCs in the room for an hour can raise ambient by 2-3F.
You are right, Jay. The non-linear part is in cooling efficiency = power / temp delta. You can change that efficiency by adjusting the pump speed and fans.
You're right about it, and always have been, of course, but it's nice to have something to point to with data. It's not a hard principle to work out, though: when your system is off, it sits at ambient room temperature (any low standby draw ought to be irrelevant, since the whole case acts as a passive heatsink). Ambient raises the hardware to its own temperature, or the hardware cools down to ambient when switched off.

When you switch it on, the PC's temperature rises because it draws power inefficiently, and that inefficiency is always expelled from components as heat or light. In a perfect world the PC wouldn't rise in temperature when running, and wouldn't heat the room either, because that's wasted power; but perfect components (resistors, diodes, etc.) would be too expensive for anyone to afford. Also, the rise in temps always ADDS to the room temp by a small amount; the ambient temperature of the room is the base temperature before you turn it on and create heat.

I worked out that my old TV would warm the room by 1 to 1.5C when turned on. My PC isn't the room heater I was hoping for, because my current TV draws fewer watts than the old one and heats the room much less, and the computer heats the room at about the same rate a 60W incandescent bulb would.
Thank you, Jay, for proving a well-known fact: the laws of thermodynamics. If you're interested in making more of these amazing videos proving yourself right about things that are widely known to be exactly that way, you could make a video proving that water is wet.
I've noticed this because my fans spin faster in summer, since my coolant is a lot warmer when it's 26°C inside here in summer compared to 16°C in winter.
I wouldn't use a paste with silicone in it on a test bench; it will pump out beyond the margin of error (2C) by your 100th case. If you want to remove any doubt, use a KryoSheet. Steve mentions using Hydronaut for his test benches in the TG sponsorship; ask him why. If you want to measure a room's ambient temperature, don't put the probe in a case behind a fan hub, just put it in the room. Want it to look cool? Zip-tie it to a mic stand. As far as a CPU probe goes, I think I remember Roman milling an IHS for this purpose. You don't need to max out your fans, just set flat fan curves in the BIOS and MSI Afterburner; saving and exporting the settings is a good idea too. Let the AC do its thing and run the test for a longer period of time. You don't need the same system or environment. The TL;DR Jay is trying to get across is: the delta between your ambient and system temperature stays constant. It does not matter what CPU and/or GPU you are using, or what cooling you have: if you can drop your room's temp by 5C, your system temp will drop by 5C (within margin of error).
Just one recommendation: whenever you measure anything involving temperature, you should let the system stabilize completely. That means running the same load until all temperature curves go totally flat, then waiting another 20-60 minutes; THEN you've got accurate results... Also, if you export your results to Excel, you can create a "delta column" between the two temperatures you want to show and plot it alongside your results.
Nice to finally have some validation on this (beyond knowing how physics works). Now I can rest my case and point my friends at this video should they wish to argue otherwise, haha. Having lived in a flat with floor-to-ceiling, south-facing windows, winter temps were about 19-20 in the room, while summer temps could hit 34C (no outside blinds, only internal ones). I even had tracked data from hardware info showing higher temps in the summer without changing anything, even at idle. My logic therefore said I should water cool due to the smaller delta; I'd love to see that shown and confirmed too, especially where a hot system may struggle more under air than under water. PS: your chart said EVGA 4090 FTW3, but the box showed a 3090 :P
Assuming the system isn’t airflow restricted, yes, a linear correlation. Also, using a water cooler won’t make the ambient cooler. The dissipated heat will be the same.
If anything, a water cooler might make a room warmer, since it's more efficient at removing heat from the source: the heat gets dumped into the air instead of staying inside the PC case or the parts getting warmer.
Seeing the self-described know-it-alls go into apoplectic fit over a "The More You Know" PSA in the comments section makes this video all the more worthwhile.
Since I started building computers, I've noticed that websites showing the performance of cooling components express it as a delta over ambient. Then I realized the best result for the cost comes from culling the list down to coolers that can hold an acceptable (and preferably quiet) delta on that chip above the hottest temperature the spot it's going in will reach, and picking from what's left.
The higher the temperature difference, the more W/(m·K) the thermal paste will deliver. If you start measuring after heat soak, it will be almost 1:1. Your only variables will be the heat pipes, the thermal paste, and electromigration. Heat pipe efficiency changes with absolute temperature, thermal paste changes because of that, and electromigration gets worse as temperatures go up.
So... this was clear right from the start. Cooling always has a maximum performance that is linear and constant within measurement tolerances and directly related to the ambient temperature. In any case, I learned this in 1977 in secondary school, again in 1980 in high school, and in the first semester of physics in both of my university courses (electrical engineering in 1983 and general computer science in 1986).
I noticed this myself just recently, unscientifically. I built a new 7800X3D system in September, when we were still seeing a few 100-degree-Fahrenheit days here in California. I was a little alarmed that my system was booting and hovering around 50 Celsius at idle, central AC running; the previous 5600X setup would boot and sit at 33. Under full load the 7800X3D system runs 70 degrees tops. All the same hardware except mobo, RAM and CPU; case, hard drives, all identical. Outside temps took a sudden drop last week, and now my system boots to around 40 Celsius at idle, no AC running. Under full gaming load it still runs 70 degrees tops, though.
Yay, Lancool 216! It's the case I used last week to build my very first PC. Your videos were a great help for building it and setting up the system.
People are hating on this video, but it's interesting to see a scientific approach. I'm interested in real-world application testing of a modern computer, not just the math; most of us know the math. As a guess, I'll posit that because thermal transfer between two materials (air/cooler) gets less efficient at a lower delta-T, the temps will track nearly, but not exactly, 1:1. Full guess: a 10C ambient change becomes a 10.5-11C system-temp change. For the hot-climate folks' sake, I'd also like to see 20C versus 40C ambient, Jay; I've been in countries with 18C nights and 40C days. Edit: 1.12C, I was close.
People living in cooler climates most of the year should consider two overclock BIOS templates (winter: max OC speeds; summer: reduced OC speeds). Winter months should give you faster overclocks, because lower ambient air temperature means better cooling and higher speeds from overclocking. In the summer, those PC temperatures would run warmer, which could start making the system unstable, especially during long gaming sessions. Living in a warmer climate most of the year shouldn't really affect you: in the winter months your PC just runs cooler, so you could get some extra overclocking speed over the winter.
I have 2 different fan profiles for summer and winter in NL next to a maximum profile for whatever. In summer we can have tropical temperatures and in winter we are part of the cold north
For decades, people have known that cars cool better in the winter than in the summer. This is the same. Who was it that ever questioned it? Are they gonna share what they were smokin'?
This is great! I've always wondered about this. Where I live we can go 20 days in a row with ambient temperatures above 35°, getting close to or passing 40°. I have no air conditioning; it's just that absurd temperature all day long. From watching my PC's temperature, I guess it caps around 98° so it doesn't burn, so I think this high ambient also makes my PC hold its performance back so it doesn't overheat.
I mean, I've seen this anecdotally with my system here in Phoenix. When it's 120 out and I cool my house to 80 degrees, the system runs hotter than in the winter, when it's 50 outside and I heat my house to 70.
Good vid!! How do people not know this?! I work in a server trailer at Boeing, and it's one of the first things taught, 15 years ago. The temp is an ambient 37F; not sure who decided on that specific temp. We wear arctic coats in there.
Proof positive that personal gaming as a leisure activity is a _lifestyle._ If your lifestyle is trashy, messy and _dusty,_ then your machine will become such in kind. If you don't manage the temperatures in your room, your machine will also suffer in kind. And if you _simply can't,_ then you could set your machine in a grow tent and vent the heat out with a fan providing active cooling, like LTT's Alex set up that one time in a small, stuffy Canadian apartment.
I always keep my office at 69f(20.6c). PC never goes above 70c. Newer components definitely run cooler than my old ones. My 5900X+3090 combo was a space heater and my ac had to kick in more often. Now with the 7800X3D+4080 build I have now, my ac barely kicks on to maintain ambient.
Glad you used an air cooler. Linus had the "genius" idea of testing if putting a PC in the fridge would make it cooler but used a liquid cooler which made no difference lol.
Wow... let's measure the ambient temperature in the room. Jay: tapes thermometer to the inside of the case... A thermal camera would have been nice, to see whether radiating heat from the components was reaching the "ambient room temperature" probe.
It would be interesting to do a test with no AC, starting from a base ambient value, comparing water vs. air cooling, showing that water is more efficient and in theory raises the ambient temp faster.
At steady state this is conservation of energy, but the differential equation shows that a larger delta T will result in more energy transferred in the same amount of time. With the thermal mass of the heatsink this is pretty negligible, but it in fact would be nonlinear for transients
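A quick Euler sketch of that lumped model (the resistance, capacitance, and power figures are arbitrary illustrative values) shows both points: the steady-state delta is independent of ambient, while the transient depends on the instantaneous delta:

```python
# Lumped thermal model: dT/dt = (P - (T - T_amb)/R) / C.
# Steady state gives T - T_amb = P*R regardless of T_amb (the linear
# offset); during the transient, a bigger delta means faster transfer.
def settle(t_amb_c, power_w=150.0, r_k_per_w=0.2, c_j_per_k=50.0,
           dt_s=0.1, steps=20_000):
    t = t_amb_c  # start at ambient: system just switched on
    for _ in range(steps):
        t += dt_s * (power_w - (t - t_amb_c) / r_k_per_w) / c_j_per_k
    return t

for amb in (20.0, 30.0):
    print(amb, round(settle(amb), 2))  # delta over ambient is ~30 C both times
```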
The lower delta is because the temps are close together on the high end. Your data was correct; that's about what I expected. Science still works 😊 Jay was right.
Of course it is; it's literally a linear offset in the heatsink formula, Tj = Pd (Rjc + Rcs + Rsa) + Ta, where Ta stands for ambient temperature.
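That formula as a quick sketch (the resistance and power values below are hypothetical):

```python
# Junction temperature from the standard heatsink stack:
# Tj = Pd*(Rjc + Rcs + Rsa) + Ta. Ambient enters as a pure linear offset,
# so +10 C ambient means +10 C on the junction at the same power.
def junction_temp_c(pd_w, rjc, rcs, rsa, ta_c):
    return pd_w * (rjc + rcs + rsa) + ta_c

# A 150 W part through a hypothetical 0.05 + 0.03 + 0.12 K/W stack
print(round(junction_temp_c(150, 0.05, 0.03, 0.12, 20), 2))  # 50.0
print(round(junction_temp_c(150, 0.05, 0.03, 0.12, 30), 2))  # 60.0
```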
I can confirm your results. My system with the 13900K and 7900 XTX got exactly the same results. In the hot German summer I had an ambient temp of 30°C, and the CPU core and GPU hotspot showed exactly the same 9 K delta compared to 20°C ambient. Another interesting part: after an hour of stress testing in a closed room, my ambient temp rose to nearly 39°C. With 400W from my GPU plus 310W from my CPU, it's insane how well I can heat up my room. The funny part is that my system didn't reach its temp limit at 30°C ambient, but at 38°C the VRM became the limiting factor. My VRMs can be water cooled like my CPU, but MSI/EK used thermal pads that are too thick.

Another funny story from last winter: my heater was broken for a few days, and heating the room with the PC worked really well; with over 700W I could raise my room from 14°C to 22°C in two hours. By the way, with the lower room temp of 12°C I got a Cinebench R23 score of 43,700 points! That's way higher than a 14900KS score, and my core temp stayed under 80°C at only 270W power consumption. For just 10° less room temp I can save over 40 watts; that's really insane. The bad thing: my power supply, the NZXT 850W, had a failing fan bearing that made bad noises once the system pulled above 600W after the long heat-up session :(
The little wavy line on the ambient curve is most likely the result of the heat pump unit's oscillating flap moving up and down. The unit itself does not cycle on and off that fast. They make baffles that help eliminate that.
I believe that at higher room temperatures it's normal to get a lower delta on the CPU/GPU, because air coolers aren't really cooling with air but with evaporation: the working fluid in the heat pipes starts evaporating sooner when temperatures are higher. It doesn't change the capability of the air cooler, but (I believe) it changes the reaction (evaporation) inside the heat pipes. If I were you, I'd run two identical builds simultaneously, the only difference being air vs. water coolers, to test that theory.
Is it just me, or is Jay doing normal sponsor spots kinda eerie? Like the world is ending, but he still has a job to do. "There's zombies eating everyone, but check out these monitors." Is it possible to do a sponsor spot for Baby Bottle Pops or Nickelodeon Gak instead?
At uni I had to run multiple simulation calculations periodically on a short time budget, so I set the AC in my dorm to 16°C and propped my laptop in a way that maximised airflow and cooling, so my CPU could turbo higher for longer. I saw at least a 50% reduction in computation time with that technique.
ofc ambient is heavily related. The ambient air is the "material" any cooling solution is using to get rid of the heat and also the ambient temp is always "touching" whatever heat sink or fins you are using, regardless of tower or water cooler. Did some people really question that?
Funny enough, for several years now, in the summer when it is too hot (heatwave), I don't use my main PC even though it has water cooling — it stays off, and I use a laptop near my air conditioner.
Jay. This was a good video and I'm sorry that you had to make it. This however is another moment I'm reminded that we as humans at large may not make it. I can't imagine a reality where one can process things in a way to conclude that the temperature of the room doesn't matter.
Ambient temps matter, it's the reason the most accurate pc temps will be measured as "delta over Ambient". That way you know that all you have to do is add the delta to your room temps and that's the pc temps you can expect.
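The "add the delta to your room temps" rule can be sketched in a few lines. The delta value below is a made-up example for one hypothetical cooler and load, not a measured figure:

```python
# Delta-over-ambient: for a fixed heat load, a given cooler holds a
# roughly constant temperature delta above room temperature.
# CPU_DELTA_C is an illustrative assumption, not a measurement.

CPU_DELTA_C = 45.0  # hypothetical full-load cooler delta, in degrees C

def expected_cpu_temp(ambient_c: float, delta_c: float = CPU_DELTA_C) -> float:
    """Predict steady-state CPU temperature as ambient plus the cooler's delta."""
    return ambient_c + delta_c

for ambient in (20.0, 25.0, 30.0):
    print(f"ambient {ambient:.0f} C -> expected CPU {expected_cpu_temp(ambient):.0f} C")
```

Run it and the 1:1 relationship is obvious: every degree added to ambient shows up directly in the predicted CPU temperature.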
I want to see someone create an intake for outside air that can be used when the outside temperature is lower than the temperature in your home. As for the test you did: I always considered it common sense that your room temp affects your cooling ability.
Hi Jay, I hate to be THAT guy, but just FYI — you ran an EVGA 3090 in the build, but the slide is labeled as a 4090. Thanks for all your (and the team's) work.
Cooling performance being directly correlated to the temperature of the cooling medium still needs to be proven? Really? Are people becoming THAT dumb?
then why are server rooms cold
@@Lyleestling Because HVAC and floor planning. 😂
Look, people don't realize you need to use the same signature on all your legal documents. They're getting really dumb
In one generation we've gone back 70 IQ points, all due to staring at phones and consuming short-form media from birth or early childhood. Err, sorry, what I meant to say was put the skibidi in the bag little bro.
@ Who says the computer has to be in the same room? During the winter I put mine in a cold area. The hotter a machine gets, the more inefficient it is — that's why the Apple silicon chip outperforms an equivalent PC.
Thank you, Jay, for providing proof of the laws of thermodynamics.
Right? I'm a bit confused as to what the alternative thinking was here.
I think some folks are confusing evaporative cooling with adiabatic expansion?
How to reverse entropy?
If you're good in this field (I'm not), I have one question: why is warm-water cooling more efficient than cold-water cooling? With warm-water cooling, the cooling delta is smaller than the delta with cold-water cooling. To my understanding the relationship should be linear, and the difference irrelevant or small enough to ignore, but someone told me the smaller delta makes warm-water cooling more efficient than cold-water cooling. The temp for warm-water cooling is between 30-50°C and the outlet delta is 4-6°C.
It would be really nice if you could try to explain that to me, because I want to know why without going deep into the field just to answer one question.
@@Junky1425 You have to use energy to cool your water down to that temperature. That kills your efficiency compared to just using the ambient air (or outside air) to cool your "hot" water.
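The energy argument in that reply can be sketched with a back-of-the-envelope comparison. The chiller COP and pump/fan overhead figures below are illustrative assumptions, not datacenter measurements:

```python
# Warm-water cooling can reject heat straight to ambient air via a dry
# cooler, while chilled-water cooling must also run a refrigeration cycle.
# All figures below are illustrative assumptions.

IT_LOAD_W = 100_000.0        # heat to remove from the IT equipment
DRY_COOLER_OVERHEAD = 0.03   # pumps/fans as a fraction of IT load (assumed)
CHILLER_COP = 4.0            # heat moved per watt of compressor power (assumed)

# Warm water (30-50 C) is hotter than ambient, so a dry cooler suffices:
warm_water_power = IT_LOAD_W * DRY_COOLER_OVERHEAD

# Chilled water below ambient needs pumps/fans PLUS compressor power:
chilled_water_power = IT_LOAD_W * DRY_COOLER_OVERHEAD + IT_LOAD_W / CHILLER_COP

print(f"warm-water cooling:    {warm_water_power:,.0f} W of cooling power")
print(f"chilled-water cooling: {chilled_water_power:,.0f} W of cooling power")
```

The point is not the specific numbers but the structure: the chiller term exists only when you cool below ambient, which is why "warm" water cooling wins on facility efficiency even though its thermal delta at the chip is larger.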
I'm gonna let the video play, but I did remain conscious in physics 101, and know you're correct. Hopefully the video will also remind people "water cooling doesn't make your room cooler", an argument I had just last week.
The funny thing is adding water cooling can actually make your room *hotter* if you have a CPU or GPU that dynamically adjusts power based on available cooling capacity. More power = more heat generated no matter how you look at it.
What a dumb argument. But in all fairness, if your components run cooler that means less power leakage and so less heat — just nitpicking here.
@@vollkerball1 The amount of actual power consumed for "work" on a CPU is so minuscule by comparison that you'd have trouble measuring it, component-by-component, with consumer-grade instruments. The VAST majority of the power consumed by silicon transistors is converted to heat. So, while you are, TECHNICALLY correct (the best kind of correct), when we're talking about a PC consuming 400w of power from the wall, I'd estimate, whether on water cooling or air cooling, 350-380w of that power is being output as heat.
@@reikoshea Me blushing at the "technically correct": thank you.
If someone makes that argument again, just ask them: if they're correct, why does an AC need an outside unit or need to vent outside?
Next video you need to prove that air exists even though you can't see it.
well the us's incoming defense secretary doesn't believe in germs cause he can't see them. wish i was joking
The outgoing secdef believes men can get pregnant...
As an electronics engineer using temperature constraints and ambient offsets for safety-critical applications, it never occurred to me that this is something that needed to be explained. I'm glad you had the chance to make this video and share what we know with a wider audience.
Apparently it's a crazy concept for some people that blowing colder air results in better cooling than blowing hot air
Wait.... People actually said that this was not how it worked?? Wtf are they teaching in school...
Don’t know, that’s a good question.
Oh lots of people on his channel argued with him over this lol.
literally everything that is irrelevant in life
Gender ideology?
There is stuff in this world we 99% know is right but that still isn't confirmed 100% by scientific data. Might not be important to you, but for people who are humble and wise, that 1% unknown affects the confidence with which they present things. Which in turn affects whether people listen. That is why most people will listen to someone who sounds confident but is wrong over someone humble and reflective with a 99% chance of being correct.
I fucking love Jay, Instead of just telling everyone to go back to highschool and learn thermodynamics, He's like fuck this shit we're doin' it LIVE
Wait, has this become a "basic physics channel"?!?
First law of thermodynamics: "The total energy of an isolated system is constant; energy can be transformed from one form to another, but can be neither created nor destroyed." The "isolated system" in this case is the entire room.
Is this really something people have been questioning?
LMAO
Ironically, that's the wrong thermal principle, though. What you're stating would be the way to argue why better CPU coolers don't keep the room cooler. We're not talking about _total_ energy/heat, but rather the _distribution_ of it (e.g. constant temperature everywhere vs. all the heat concentrated at a single point). The reason the temperature difference between PC and room is constant is that thermal resistance is constant and doesn't change with temperature. Or alternatively: heat "moves" along a linear gradient from a hotter place to a cooler place.
As a simple example: assume you have two sides of a heat-resistive barrier, left 50°C, right 20°C. The rate of energy flow (= "heat transfer speed") is proportional not to the 50°C but to the difference, 30°C. So with, say, 70°C left and 40°C right, it stays the same, since the difference is still 30°C. The basic formula would be: "Flow = temperature delta / thermal resistance".
Imagine alternatively a nonlinear relation between flow and temperature delta. Say the hotter the barrier gets (based on the left side's temp), the faster heat is exchanged. Then the flow might be 30°C × k = 30 arbitrary units (taking a transfer coefficient k = 1 for 50°C on the left), but 30°C × k' = 60 arbitrary units (with k' = 2k for 70°C on the left). The energy/heat would then leave the left side at double the rate, cooling it down quicker, just because the baseline temperatures are different. That's not what happens in practice.
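That linear model can be written out directly: steady-state die temperature is just ambient plus power times thermal resistance. The resistance value below is a made-up example, not a real cooler spec:

```python
# Linear conduction model: heat flow Q = dT / R, so at steady state
# (when Q equals the chip's power P) the die sits at T = T_ambient + P * R.
R_THERMAL = 0.10  # kelvin per watt, hypothetical cooler thermal resistance

def die_temp(ambient_c: float, power_w: float, r_kw: float = R_THERMAL) -> float:
    """Steady-state die temperature under a constant power load."""
    return ambient_c + power_w * r_kw

# Raising ambient by 10 C raises the die by exactly 10 C at the same load:
print(die_temp(20.0, 300.0))  # 50.0
print(die_temp(30.0, 300.0))  # 60.0
```

Because R doesn't depend on temperature over normal room-temperature ranges, the ambient offset passes through 1:1, which is exactly what the video's measurements show.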
@@andreas_rr No I am not; the principle is correct, but I should also have sprinkled in a bit about the 2nd law, entropy. Which is what you are on about.
Was this ever really a question?? Holy shit man
It's been a question people have argued over for years, lol. People just don't understand simple thermodynamics.
Yes, you'd be surprised at the number of people fighting Jay. So it really was a question. Holy crap, man.
There are people out there that believe cpu water cooling keeps their room cooler
@@gamestopper15
Can you point to one?
Also, on top of the clickbait thumbnail
Heatsink efficiency will go down with an increase in absolute ambient temperature.
Also, the internal components will decrease their power output as die temperature reaches the safety limit, and that will make the gap between die temperature and ambient temperature shrink.
I did wonder if the specific heat capacity of air changes with temperature, like water does, which would make this a non-linear effect. Turns out it's a very minor effect, so yes, you would get what Jay finds here (certainly within the margin of error) 👍
"Delta T Over Ambient" feels like a concept I learned in high school chemistry class.
You need to block anyone who's been arguing with you on that.
There's a point where stupidity becomes a dangerous thing and those people are way past that point.
Hey Jay, your graph says evga 4090 instead of 3090, just a heads up :D
Yeah, Jay has one of the few working prototypes of the EVGA 4090, from before they left the market. It's the "EVGA *not a 4090* 4090" :) Sad that they left :( Edit: thanks to @tnadawg89 who corrected me — it wasn't the "not a 4090"; I rewatched the video to the part where the GPU was visible.
@@deanderlp6892 That's correct but what he was saying is that on the chart itself, it says EVGA RTX 4090 when he's actually using the EVGA 3090 FTW3
@@tnadawg89 Ah lol you're correct, didn't realize he used the 3090, thought I saw his EVGA 4090 in the build but didn't really look at it in the first watch through.
@@deanderlp6892 that's not the card he used.
ok I will give gamers nexus a call! /s.
As someone who lives in a relatively cold area, I never doubted this causation, since I can just open the Window to lower the temperature in the room from 25C to 10C and see how the temperature of my GPU drops from 80C to 70C (+ Fans are slowing down due to temperature curve which accounts for another 5C difference).
But really all you need to understand this point is a 7th grade physics book.
slowering???
@@overyonderjustapiece Yes, when the temperature is lower, fans no longer need to run as fast to keep things under control, which is why the difference between ambient temp and GPU temp isn't exactly the same. Part of the drop in ambient temperature goes into letting the fans run slower while still keeping the GPU cooler than it was at higher RPM with higher ambient temp.
@@overyonderjustapiece To be clear, you can change it in MSI afterburner or other software, but this Is a stock behavior.
@@overyonderjustapiece He skipped the English classes.
@@vampe777 Slowing, not slowering :P
I'm also not a native English speaker.
My gaming setup is in a small room that measures 10x14 feet. Although the room is not very spacious, it becomes quite warm when the temperatures rise.
The temperatures on the PC also increase slightly.
Same here, once the room is saturated with heat the PC runs just a little bit hotter but eventually it stops getting hotter of course. Thought this was common sense but I guess our education system is such a failure people don’t know.
then ur room temperature rises even more and thats infinite loop till ur pc overheats
@@darkfury3914 Yeah but what kind of the chart is it? Is it really linear meaning +10 = +10?
@@darkfury3914 At some point the room loses more heat than it gains. When the room gets hotter than its surroundings it starts to lose heat.
@samiraperi467 I know, I know, I was joking.
I've always thought this was common sense — unless you're using some sort of Peltier or other sub-ambient cooling.
Unless you're actively moving heat out of the room, you can ONLY get as cool as ambient, which will rise, since moving energy tends to take energy. Put another way: you can only get as cold as the coldest thing in the exchange chain. A Peltier just moves the heat to the other side of the module while maintaining the same temperature delta (at the cost of current — you'll need more power to maintain the delta-T as the hot side gets hotter; they really are horribly inefficient, neat but power hungry), so the ratio still holds. Sure, the cold side will be below ambient, but as ambient gets warmer, so does the hot side, and since the delta-T is constant, so does the cold side.
I think where people get hung up is thermal exchange efficiency. They assume that because the air is hotter, it can't carry away as much heat. And... I mean... there is some nuanced physics there, but it's not really a concern — barely noticeable over human temperature ranges. The working fluid (ultimately air; water cooling is just a big thermal capacitor — it's thermally dense, but the water reaches a steady state too) has a specific heat: how much energy it takes to raise its temperature by one degree per kilogram. And that's essentially constant; it doesn't change over these temperatures. When you boil water and turn the burner up, the water doesn't get hotter — it stays at 100°C, it just boils faster. So as long as you CAN put more energy into the fluid, the thermal exchange delta holds. In phase-change cooling you're boiling off the refrigerant: just like the water, it sucks the heat out and boils like mad, except the refrigerant boils at a MUCH lower temperature, pulling heat out of the evaporator coil or whatever is touching it.
But yeah, in the case of purely passive heat rejection (and even there there are caveats — in the physics of energy there are no free lunches), whether it's water or a heatsink and fan (and what's a radiator? A heatsink for liquid), you cannot bring the heat (energy) lower than the lowest level in the environment. At best they reach equilibrium. As long as there is thermal headroom for a watt in the air, and the heat source isn't outrunning that exchange (and even then, remember the boiling thing), it's one watt made, one watt moved, one watt out — plus the watt it took to transport it. No free lunches.
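The specific-heat point above can be turned into a quick energy-balance sketch. The airflow figure is an illustrative assumption, not a measurement of any particular cooler:

```python
# Energy balance on the air moving through a heatsink: the exhaust air
# is warmer than the intake air by dT = P / (m_dot * c_p), regardless of
# what the intake temperature is. Airflow value is an assumed example.

AIR_DENSITY = 1.2  # kg/m^3, dry air near room temperature
AIR_CP = 1005.0    # J/(kg*K), roughly constant over PC-relevant temps

def exhaust_temp_rise(power_w: float, airflow_m3_s: float) -> float:
    """Temperature rise of cooling air carrying away a given heat load."""
    m_dot = AIR_DENSITY * airflow_m3_s  # mass flow in kg/s
    return power_w / (m_dot * AIR_CP)

# A 300 W load through roughly 0.03 m^3/s (about 64 CFM) of air:
print(f"{exhaust_temp_rise(300.0, 0.03):.1f} K rise over intake")
```

Notice the intake temperature never appears in the formula — the air picks up the same number of degrees whether it enters at 20°C or 30°C, which is the mechanism behind the constant delta over ambient.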
There should be a very slight non-linearity due to silicon being less efficient at warmer temperatures. However it should be pretty tiny over any reasonable range.
I was about to say this. As your computer gets warmer, leakage current increases, so the chip draws more current at the same voltage. That makes your power go up, and things continue that way until the chip starts throttling.
If this isn't common sense to anyone, that scares me.
3:15 I went with the CPU that generates heat, that tells you everything that is/was wrong with intel.
On point!
The problem is, when they made one that used less power and was more efficient, it got put on blast for having lower performance, even though efficiency matters more for the lifespan of not only the CPU but also the GPU if both are air-cooled. That's why I gave up on Linus and Gamers Nexus — they both said Intel was dumb for trying to make chips more efficient.
Imagine thinking that this doesn't happen. The fact you tried to prove it, PROVES that people are stupid. Nothing else.
I don't think people disagree; it might just be that people don't believe it's the same step in temperature...
The reason people think ambient temperature doesn't affect max temps is that they simply don't know how CPU/GPU coolers work. They don't see radiators/heatsinks as "heat exchangers", which is what they technically are. They see the coolers (especially water coolers) as refrigerant-based — compressor >> refrigerant >> condenser — where the refrigerant flows over the cold plate and actively cools a CPU/GPU, in most cases well below ambient.
Already knew this — that's why I keep my room at 69 degrees 24/7. But hey, how about that Mike Tyson fight? It's still buffering. Netflix, sheesh.
Nice.
Pls
Just use Celsius degrees
It is good for health
Also metric system good for back pain
I notice this in the summer when we haven't set up the ACs yet, lol. Ambient temp in the room definitely affects how cool the PC runs.
That is really obvious — it is ridiculous that people would say otherwise.
Yup. Reminds me of the time on Gamers Nexus where a few very vocal people in the comments argued that you can cool down a CPU/heatsink to below air temperature (with an air cooler) if you just ran the fans fast enough. They seemed to think that by blowing air that is 20 degrees C fast enough the CPU could end up being 15 degrees. They would not accept the fact that 20 degree C air will never cool down a CPU to below 20. It will at best get the CPU down to 20 C, but even that is extremely unlikely. If anything, if the air starts moving super fast, it will instead start heating up the CPU/heatsink because of friction, although that requires very fast air speeds.
Were you so sure before watching? That +10 in the room = +10 for the CPU?
Of course you are right. Basic physics. Ambient go up, computer temps go up in proportion.
Winter gaming ftw. Open windows gaming wearing a parka and snow pants for better performance.
If you really need it, put a fan at the window blowing in 😂
@@PixelatedWolf2077 It's actually more effective to move the warm air towards the open window — blow out rather than in.
@@toddblankenship7164 Hmmm, I guess that is a good point. I've always just had the outside cold air blowing in especially during winter
@@PixelatedWolf2077 There's also a funny thing: if you point your fan towards the window, you gain more airflow into the room.
If you have a normal rotating fan it's even worse, because it pulls lots of air from the side, not the back.
@@PixelatedWolf2077 ya me too i just crack the window and that cold ass -30C air keeps it cold at floor level where my comp is 🙂
Proving a long felt point is always good :P
Been thinking about you saying this a few times. I have baseboard heating in my room, and when I have to crank the heat it definitely makes a difference in my overall temps, even at idle. Thanks for the awareness!!
Love your approach to getting data! Been a fan of your down-to-earth method and personality!
Jay, let's get you in the gym and get ya a bit ripped. I built my first computer a couple years ago and you were a big part of that, I watched your stuff constantly and your education was invaluable to me. I appreciate all that you do and I'm so happy for your success. Here's my two cents: I'm 51 years old and do a few things to keep myself around longer. Drastically reduce my wheat and sugar intake, eat red meat constantly and only eat up to twice a day. I only say this because guys like you in the YT scene are important to me and I want you happy and healthy. Keep up the good work and let your honesty towards the industry stay strong. The advice you gave us when I was building mine all turned out to be correct historically. Stay solid!
I first started noticing this kind of change recently myself. I've got some physical health issues that don't allow me to keep my home as cold as I once did — to keep it simple, the colder it is, the sorer my body becomes. I have an Aquacomputer real-time temperature sensor and display for my CPU, and as I experimented with how low I could keep my home while still tolerating the pain, I noticed my CPU would reflect the same 1 or 2 degree change whenever I adjusted my thermostat by 1 or 2 degrees. I left at least a full day between adjustments so things could normalize before changing it again. What probably helps the almost exact corresponding temperature change is that my system is in a fully open-air "case" (TT Core P3 Pro with no glass or additional radiator/fan mounting brackets).
Keep in mind that thermal conductivity is measured in W/(m·K).
If your ambient rises closer to the temperature of the object you're cooling, you need more airflow or more powerful cooling to dissipate the same energy.
Sure — as long as the cooling can keep up, it's still X watts being dissipated; a higher ambient just makes cooling less effective, unless you were already running way below TjMax. Say you run 60°C on the die in a 20°C room: going to 70°C on the die in a 30°C room would in practice require the same airflow, but holding 60°C in a 30°C room would require some amount of extra airflow.
It matters more at the extremes, though — extremes that aren't very applicable to average PCs. For example, the lower the delta, the worse the heat transfer, so further cooling-system improvements (such as adding rads to a liquid-cooling loop) net diminishing returns. And high deltas, where transfer is most effective, bring bigger cons (e.g. throttling) than pros.
But as a rough approximation, a linear relation between ambient temps and the temps of generic PC hardware with generic cooling systems works reasonably well for practical purposes.
Longtime viewer here. This is exactly why too; you made a video about something that should be common knowledge. But you are so interesting to watch, I still watched all of it! Love ya man and all the people behind the scenes love yall too! ♥♥♥
That's totally logical and intuitive. With my old Phenom II I tested AI upscalers, and in winter everything was OK, but in late spring/summer my PC heated up so much I had to be careful how many tests I ran every 10 minutes. Even my 7950X3D can get very hot, and my PC shut down three times, if I remember well. So I have to be careful even though the CPU is cooled with liquid metal.
Being a sensor freak, I have done the same test in the past with my water-cooled PCs: the result is the same except for a slight delay in the curves caused by the thermal inertia of the liquid in the loop, which responds less immediately than air. Across various loop sizes I have seen 30 to 90 seconds of delay.
That burn on Intel — saying that AMD can't produce heat like Intel can.
Depending on contextual conditions and methods of cooling, ambient humidity can play a rather significant role. That as a variable really needs to be part of and accounted for, in tests like these.
I was going to suggest that the small 1.6⁰ difference in delta could also be due to the short duration of the test and thermal mass of the environment [cold walls] causing improved radiative heat transfer.
However, after calculating the maximum potential magnitude of blackbody radiation transfer, this can only account for 7% of the observed difference. (Assumptions are 300w power per device and 100 square cm area per device with emissivity coefficient of 0.8) So the 39.6⁰/38⁰ difference is most definitely a matter of measurement methods and materials.
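For anyone who wants to reproduce that estimate, here is a Stefan-Boltzmann sketch using the same assumed surface (100 cm², emissivity 0.8); the device and wall temperatures are illustrative picks, not values from the video:

```python
# Net radiative exchange between a device surface and surrounding walls:
# P = eps * sigma * A * (T_dev^4 - T_wall^4), temperatures in kelvin.
# Area and emissivity mirror the assumptions stated above; the device
# temperature of 60 C is an illustrative assumption.

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiated_w(t_dev_c: float, t_wall_c: float,
               area_m2: float = 0.01, eps: float = 0.8) -> float:
    """Net radiated power from device surface to walls, in watts."""
    t1, t2 = t_dev_c + 273.15, t_wall_c + 273.15
    return eps * SIGMA * area_m2 * (t1**4 - t2**4)

# A 60 C device surface against 20 C walls vs 30 C walls:
cold_walls = radiated_w(60.0, 20.0)
warm_walls = radiated_w(60.0, 30.0)
print(f"{cold_walls:.2f} W vs {warm_walls:.2f} W radiated")
```

Both values come out on the order of a couple of watts against a ~300 W load, which supports the conclusion that radiation can only explain a small fraction of the observed delta difference.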
I always had the idea of "if I want my computer running at 100% without heat saturating my room, I should have a water-loop lead outside to an outdoor-grade radiator."
Glad to know that'll work after having seen this video
*When it gets hot in my room, my temps go up. When it's cold in my room, they stay normal. Living in Alaska this can be annoying, but I have learned to work with it.*
I would think the heat capacity of air changes at elevated temperatures, but not enough to be noticeable in Jay's tests.
You are literally using the ambient temperature to do the cooling. So the components temperature is always a set difference from the ambient temperature for the same cooling system. This is very basic thermodynamics.
Nice to see a Vid on it for folks that dont quite understand the basics of thermodynamics.
I just ordered the Y70 so I’m excited to switch over to it from the Lian Li Vision
If it could, Jay would marry all the AC units he owns...
Something about a big guy and his AC is funny.
Here's an idea for your next video.
"What happens when you mix baking soda and vinegar!?! No one believed me until they saw this!"
That'll really blow everyone's mind.
Best time for my system is always winter — I use it as a mini space heater in my office when it hits -11°F outside XD. My system hovers anywhere from 40-60°C depending on what I'm doing and the season. In summer I'm a monster in everyone's opinion because I keep my house at 76-78°F. One, it saves my electric bill, and two, I like being WARM! >>
Because I live in a tropical country, I can really feel the difference in performance between day and night caused by ambient temperature.
have an idea for the long term test benches ...
keep them indefinitely. Build a new one when hardware has improved enough to require it.
Imagine if you had a top of the line and a bang-for-buck build from the first year of your channel, and another one every 3-5 or so years.
Watching the same known system perform year after year and decade; trying to keep up as tech moves on and games get more complicated.
Eh, I'm being silly, thanks for all the content ;)
Keep up the good work Jay and team. Love these basic videos.
Ambient definitely drives internal, but whether it's exactly linear probably depends on a host of factors. In my 2,300-ish cubic foot "tech" room, I have an 8K BTU Midea window AC unit that can turn the space into a refrigerator. In the summer, I can drop ambient from the house's 76F to a chilly 65F in 30 minutes. That lowers PC temps by about the same amount, but more importantly, it stops PCs from raising the ambient temperature and making it uncomfortable. (Yes, 76F is perfectly fine in the summer as long as the air is fairly dry.) In the winter, with my house set to 66F, running a couple of gaming PCs in the room for an hour can raise ambient by 2-3F.
You are right, Jay. The nonlinear part is in cooling efficiency = power / temp delta. You can increase cooling efficiency by reducing the pump speed and fans.
You are right about it and always have been, of course, but it's nice to have something to point to and show data. It's not a hard principle to work out, though: when your system is off, it sits at ambient room temperature (any low standby draw is irrelevant, since the whole case acts as a passive heatsink). Ambient temp raises the hardware's temperature to match itself, and the hardware cools back down to ambient when switched off. When you switch it on, the PC's temperature rises because it draws power inefficiently, and that inefficiency is always expelled from components as heat or light. In a perfect world the PC would not rise in temperature when running, and would not heat up the room either, because that is wasted power — but perfect components (resistors, diodes, etc.) would be too expensive for anyone to afford. The rise in temps also always ADDS a small amount to the room temp. The ambient temperature of the room is the base temperature before you turn the PC on and start creating heat.
I worked out that my old TV would warm the room by 1 to 1.5°C when turned on. I'm thinking my PC isn't the room heater I was hoping for, because the current TV draws fewer watts than the old one and heats the room much less, and the computer heats the room at about the same rate a 60W incandescent bulb would.
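Room-heating claims like this (and the 700 W, two-hour figure another commenter reported) can be sanity-checked with a sealed-room estimate. The room volume is an assumption, and the model deliberately ignores heat lost through walls and furniture, so it's an upper bound:

```python
# Adiabatic (no-loss) estimate of room air warming:
# dT = P * t / (m_air * c_p). Room size is an illustrative assumption;
# real rooms lose heat through walls, so actual rises are far smaller.

ROOM_M3 = 40.0     # roughly a 4 x 4 x 2.5 m room (assumed)
AIR_DENSITY = 1.2  # kg/m^3
AIR_CP = 1005.0    # J/(kg*K)

def room_temp_rise(power_w: float, hours: float,
                   volume_m3: float = ROOM_M3) -> float:
    """Upper-bound air temperature rise in a sealed, lossless room."""
    m_air = AIR_DENSITY * volume_m3
    return power_w * hours * 3600.0 / (m_air * AIR_CP)

# 700 W running for 2 hours in a sealed 40 m^3 room:
print(f"{room_temp_rise(700.0, 2.0):.0f} K rise (no wall losses)")
```

The no-loss number comes out around 100 K — vastly more than the ~8°C rise actually reported — which shows how much of a PC's heat output is absorbed by walls, furniture, and air exchange rather than staying in the room air.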
Thank you Jay for proving a well-known fact: the laws of thermodynamics.
If you're interested in making more of these amazing videos proving yourself right about things that are already widely known, you could make a video proving that water is wet.
I've noticed this because my fans spin faster in summer, since my coolant is a lot warmer when it's 26°C inside here in summer compared to 16°C in winter.
I wouldn't use a paste with silicone in it on a test bench. It will pump out beyond the margin of error (2°C) by your 100th test. If you want to remove any doubt, use a KryoSheet. Steve mentions using Hydronaut for his test benches in the TG sponsorship — ask him why.
If you want to measure a room's ambient temperature, don't put the probe in a case behind a fan hub — just put it in the room. Want it to look cool? Zip-tie it to a mic stand. As far as a CPU probe goes, I think I remember Roman milling an IHS for this purpose.
You don't need to max out your fans; just set flat fan curves in the BIOS and MSI Afterburner. Saving and exporting the settings is a good idea too.
Let the AC do its thing and run the test for a longer period of time.
You don't need the same system or environment. The TL;DR Jay is trying to get across is: "The delta between your ambient and system temperature is constant." It does not matter what CPU and/or GPU you are using or what cooling you have — if you can drop your room's temp by 5°C, your system temp will drop by 5°C (within margin of error).
Just one recommendation: any time you measure temperature, you should let the system stabilize completely. That means running the same load until all temperature curves go totally flat, then waiting another 20-60 minutes — and THEN you have accurate results...
Also, if you export your results to Excel, you can create a "delta column" between the two temperatures you want to show and plot it alongside your results.
Nice to finally have some validation on this (beyond knowing how physics works). Now I can rest my case and point my friends at this video should they wish to argue otherwise, haha.
Having lived in a flat with south-facing, floor-to-ceiling windows, winter temps were about 19-20°C in the room and summer temps could go up to 34°C (no outside blinds, only internal ones). I even had tracked data from HWiNFO that showed higher temps in the summer without changing anything, even at idle.
My logic therefore said I should water-cool due to the smaller delta. I would love to have that shown and confirmed too, especially where a hot system may struggle more under air than under water.
PS. your chart said EVGA 4090 FTW3, but the box showed a 3090 :P
Assuming the system isn’t airflow restricted, yes, a linear correlation. Also, using a water cooler won’t make the ambient cooler. The dissipated heat will be the same.
If anything, a water cooler might make a room warmer, since it's more efficient at removing heat from the source: the heat gets dumped into the air instead of staying inside the PC case or the parts getting warmer.
Seeing the self-described know-it-alls go into apoplectic fit over a "The More You Know" PSA in the comments section makes this video all the more worthwhile.
Since I started building computers, I've noticed that websites showing cooling performance express it as a delta over ambient. Then I realized the best result for the cost comes from culling the list down to coolers that can hold a good (and preferably quiet) delta on that chip at the hottest temperature the spot they're going into will reach.
Good to see Jay using the correct case. :)
The higher the temperature difference, the more the thermal paste's W/(m·K) comes into play. If you start measuring after heat soak, it will be almost 1:1. Your only variables will be the heatpipes, thermal paste, and electromigration: heatpipe efficiency changes with absolute temperature, thermal paste performance changes because of that, and electromigration gets worse as temperatures go up.
So ...
... this was clear right from the start.
Cooling always has a maximum performance that is linear and constant within measurement tolerances and directly related to the ambient temperature.
In any case, I learned this in 1977 in secondary school, again in 1980 in high school, and in the first semester of physics in both of my university courses (electrical engineering in 1983 and general computer science in 1986).
btw, I love the Lancool 216
I noticed this myself just recently, unscientifically. I built a new 7800X3D system in September, when we were still seeing a few 100°F days here in California. I was a little alarmed that my system was booting and hovering around 50°C at idle consistently, central AC running. The previous 5600X setup would boot and sit at 33°C. Under full load the 7800X3D system runs 70°C tops. All the same hardware except mobo, RAM, and CPU; case, hard drives, all identical. Outside temps took a sudden drop last week, and now my system boots to around 40°C at idle, no AC running. Under full gaming load it still tops out at 70°C, though.
Very nice! So for thermal testing, ambient temp always needs to be reported, then it's easy to extrapolate for different conditions
Yay, Lancool 216. It's the case I used last week to build my very first PC. Your videos were a great help for building it and setting up the system.
People are hating on this video, but it's interesting to see a scientific approach. I'm interested in real-world application testing of a modern computer, not just the math; most of us know the math.
As a guess, I'll posit that because thermal transfer gets less efficient at a lower delta T between the two materials (air/cooler), the temps will be nearly but not quite 1:1. Full guess: a 10°C drop in ambient gives a 10.5-11°C system temp change. For the sake of people in hot climates, I'd like to see 20°C versus 40°C ambient as well, Jay. I've been in countries with 18°C nights and 40°C days.
1.12°C, I was close.
3:19 I literally laughed out loud right here 😂
Jay was right, everybody rejoice
People living in cooler climates most of the year should consider two overclock BIOS templates (winter: max OC speeds; summer: reduced OC speeds). Winter months should give you faster overclocks, because lower ambient air temperature means better cooling and higher stable speeds. In the summer those same settings would run warmer, which could start making the system unstable, especially in long gaming sessions. Living in a warmer climate most of the year shouldn't really affect you: in the winter months your PC just runs cooler, so you could get some extra overclocking speed over the winter.
I have two different fan profiles for summer and winter here in NL, next to a maximum profile for whatever comes up. In summer we can get tropical temperatures, and in winter we're part of the cold north.
For decades, people have known that cars cool better in the winter than in the summer.
This is the same. Who was it that ever questioned it? Are they gonna share what they were smokin'?
This is great!
I've always wondered about this.
Where I live we can go through 20 days in a row with ambient temperatures above 35°C, getting close to or passing 40°C. I have no air conditioning; it's just that absurd temperature all day long.
From watching my PC's temperature, I'm guessing it caps out around 98°C so it doesn't burn, so I think this high ambient also makes my PC hold its performance back so it doesn't overheat.
I mean, I've seen this anecdotally with my system here in Phoenix. When it's 120°F out and I cool my house to 80°F, the system runs hotter than in winter, when it's 50°F outside and I have to heat my house to 70°F.
Good vid!! How do people not know this?! I work in a server trailer at Boeing, and it's one of the first things they taught me 15 years ago. The temp is an ambient 37°F; not sure who decided on that specific temp. We wear arctic coats in there.
I didn't know people questioned that; ambient temp is often mentioned in tests for exactly that reason. It's good to have the experiment, though.
Jay out here with some "How to reverse entropy?" shit
Proof-positive that personal gaming as a leisure activity is a _lifestyle._ If your lifestyle is trashy, messy, and _dusty,_ then your machine will become such in kind. If you don't manage the temperatures in your room, your machine will also suffer in kind. And if you _simply can't,_ then you could set your machine in a grow tent and vent the heat out with a fan providing active cooling, like LTT's Alex set up that one time in a small, stuffy Canadian apartment.
I always keep my office at 69°F (20.6°C). The PC never goes above 70°C. Newer components definitely run cooler than my old ones. My 5900X + 3090 combo was a space heater and my AC had to kick in more often. With the 7800X3D + 4080 build I have now, my AC barely kicks on to maintain ambient.
Glad you used an air cooler. Linus had the "genius" idea of testing whether putting a PC in the fridge would make it cooler, but used a liquid cooler, which made no difference lol.
Wow... let's measure the ambient temperature in the room. Jay: tapes the thermometer to the inside of the case... A thermal camera would have been nice, to see whether the radiating heat from the components was reaching the "ambient room temperature" probe.
It would be interesting to do a test with no AC, starting from a base ambient value, comparing water vs. air cooling to show that water is more efficient and, in theory, raises the ambient temp faster.
At steady state this is conservation of energy, but the differential equation shows that a larger delta T results in more energy transferred in the same amount of time. With the thermal mass of the heatsink this is pretty negligible, but it would in fact be nonlinear for transients.
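A minimal numerical sketch of that point, assuming a simple Newton's-cooling model (dT/dt = -(T - T_ambient)/tau) with made-up numbers, not anything measured in the video: the instantaneous cooling rate scales with delta T, but the steady state just tracks ambient.

```python
# Newton's cooling: dT/dt = -(T - ambient) / tau. Illustrative numbers only.

def cool_step(temp, ambient, tau, dt):
    # Explicit Euler step of dT/dt = -(T - ambient) / tau.
    return temp + dt * (-(temp - ambient) / tau)

def simulate(start, ambient, tau=30.0, dt=0.1, seconds=300):
    t = start
    for _ in range(int(seconds / dt)):
        t = cool_step(t, ambient, tau, dt)
    return t

# Same heatsink (same tau); a hotter part sheds heat faster at first:
rate_hot = (90.0 - 25.0) / 30.0    # deg/s when 65 C above ambient
rate_warm = (40.0 - 25.0) / 30.0   # deg/s when 15 C above ambient
print(rate_hot, rate_warm)               # initial rates differ ~4.3x
print(round(simulate(90.0, 25.0), 2))    # settles near ambient
```

The transient is nonlinear (exponential decay), but the endpoint shifts one-for-one with ambient, matching the steady-state argument.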
The lower delta is because the temps are close together on the high end. Your data was correct. That is about what I expected. Science still works 😊 Jay was right, as I expected.
Of course it is; it is literally a linear offset in the formula for calculating a heatsink: Tj = Pd·(Rjc + Rcs + Rsa) + Ta, where Ta stands for ambient temperature.
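A tiny sketch of that formula, with hypothetical thermal resistance values (not any specific cooler's datasheet numbers), just to show Ta acting as a pure linear offset:

```python
# Tj = Pd * (Rjc + Rcs + Rsa) + Ta, all resistances in C/W. Values are made up.

def junction_temp(pd_w, rjc, rcs, rsa, ta_c):
    """Junction temp from dissipated power, thermal resistances, and ambient."""
    return pd_w * (rjc + rcs + rsa) + ta_c

# Raise the room by 10 C and Tj rises by exactly 10 C, nothing else changes:
tj_cool = junction_temp(150.0, 0.10, 0.05, 0.20, ta_c=20.0)
tj_warm = junction_temp(150.0, 0.10, 0.05, 0.20, ta_c=30.0)
print(round(tj_cool, 1), round(tj_warm, 1))  # 72.5 82.5
```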
I can confirm your results.
My system with a 13900K and 7900XTX got the exact same result.
In the hot German summer I had an ambient temp of 30°C, and the CPU core and GPU hotspot showed exactly the same 9 K delta compared to 20°C ambient.
Another interesting part: after one hour of stress testing in a closed room, my ambient temp rose to nearly 39°C.
The 400W from my GPU plus 310W from my CPU, it's insane how well I can heat up my room. The funny part: my system didn't reach its temp limit at 30°C ambient, but at 38°C the VRMs were the limiting factor. My VRMs can be water cooled like my CPU, but MSI or EK used thermal pads that are too thick.
Another funny part was last winter, when my heater was broken for a few days and I tried to heat my room with the PC. It worked really well: with over 700W I raised my room temp from 14°C to 22°C in two hours. By the way, with the lower room temp of 12°C I got a Cinebench R23 score of 43,700 points! That was way higher than a 14900KS score, and my core temp stayed under 80°C at only 270W power consumption. For only 10°C less room temp I save over 40 watts; that's really insane.
The bad thing: the cooler in my NZXT 850W power supply had a broken bearing that makes bad noises when the system draws above 600W after the long heat-up session :(
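Out of curiosity, a back-of-the-envelope check on that room-heating anecdote, with an assumed room size (my assumption, not the commenter's): 700 W for two hours is far more energy than the room's air alone needs for an 8°C rise, so with walls and furniture soaking up most of it, the numbers are plausible.

```python
# Rough sanity check; room size and air properties are assumed, not measured.

power_w = 700.0
hours = 2.0
energy_j = power_w * hours * 3600.0  # total heat dumped into the room

room_volume_m3 = 40.0   # assumed ~16 m^2 room with a 2.5 m ceiling
air_density = 1.2       # kg/m^3
cp_air = 1005.0         # J/(kg*K), specific heat of air

air_heat_capacity = room_volume_m3 * air_density * cp_air  # J per K of air temp

# If ALL the energy went into the air alone, the rise would be huge; walls,
# furniture, and leakage absorb most of it, so an 8 K rise is believable.
temp_rise_air_only = energy_j / air_heat_capacity
print(round(energy_j / 1e6, 2), round(temp_rise_air_only, 1))
```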
The little wavy line on the ambient trace is most likely the result of the heat pump unit's oscillating flap moving up and down; the unit itself does not cycle on and off that fast. They do make baffles that will help eliminate that.
I believe that at higher room temperatures it is normal to get a lower delta on the CPU/GPU temperatures, because air coolers aren't really cooling with air alone but with evaporation inside the heatpipes.
Evaporation starts faster when the temperatures are higher.
It doesn't change the capabilities of the air cooler, but (I believe) it changes the reaction (evaporation) inside the heatpipes.
If I were you, I'd test two identical builds running simultaneously, the only difference being air vs. water cooling, to test that theory.
Is it just me, or is Jay doing normal sponsor spots kinda eerie. Like the world is ending, but he still has a job to do.
"There's zombies eating everyone, but check out these monitors."
Is it possible to do a sponsor spot for Baby Bottle Pops or Nickelodeon Gak instead?
At uni I had to run multiple simulation calculations periodically on a short time budget, so I set the AC in my dorm to 16°C and propped my laptop up in a way that maximized airflow and cooling, so my CPU could turbo higher for longer. I saw at least a 50% reduction in computation time with that technique.
I can't believe there were enough people saying this wasn't true that you had to make a video proving it.
Of course ambient is heavily related. The ambient air is the "material" any cooling solution uses to get rid of the heat, and the ambient temp is always "touching" whatever heatsink or fins you're using, regardless of tower or water cooler. Did some people really question that?
Funny, man. For several years, in the summer when it's too hot (heatwave), I haven't used my main PC even though it's equipped with water cooling; it stays shut down, and I use a laptop near my air conditioner.
I believe you. Isn't this exactly why Steve always shows a delta T instead of an actual measured temperature?
Jay, this was a good video, and I'm sorry you had to make it. This is, however, another moment I'm reminded that we humans at large may not make it. I can't imagine a thought process that concludes the temperature of the room doesn't matter.
Ambient temps matter; it's the reason the most accurate PC temps are reported as "delta over ambient." That way, all you have to do is add the delta to your room temp, and that's the PC temp you can expect.
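For anyone new to the convention, a trivial sketch of what "delta over ambient" means in practice, with illustrative numbers only:

```python
# A reviewer measures an absolute temp and the room temp during the test:
measured_cpu_c = 68.0
ambient_during_test_c = 22.0
delta_over_ambient = measured_cpu_c - ambient_during_test_c  # the reported figure

# A reader in a warmer room just adds their own ambient back:
my_room_c = 28.0
expected_cpu_c = my_room_c + delta_over_ambient
print(delta_over_ambient, expected_cpu_c)  # 46.0 74.0
```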
I want to see someone create an intake for outside air that can be used whenever the outside temperature is lower than the temperature in your home. As for the test you did, I always considered it common sense that your room temp affects your cooling ability.
Where’s the guitar ?
It would have been nice to see a scatter plot, x-axis ambient, y-axis computer temp. The angle/linearity would be extremely easy to visualize.
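To illustrate, a least-squares slope over some hypothetical (ambient, component temp) pairs would make the ~1:1 relationship readable even without plotting; the data points here are invented for the sketch, not taken from the video.

```python
# Hypothetical (ambient C, component C) pairs consistent with a 1:1 trend:
pairs = [(18.0, 58.1), (21.0, 61.0), (24.0, 64.2), (27.0, 66.9), (30.0, 70.1)]

# Ordinary least-squares fit, y = slope * x + intercept:
n = len(pairs)
mean_x = sum(x for x, _ in pairs) / n
mean_y = sum(y for _, y in pairs) / n
num = sum((x - mean_x) * (y - mean_y) for x, y in pairs)
den = sum((x - mean_x) ** 2 for x, _ in pairs)
slope = num / den
intercept = mean_y - slope * mean_x
print(round(slope, 3), round(intercept, 1))  # slope comes out close to 1.0
```

A slope near 1.0 is the scatter-plot argument in one number: component temp moves degree-for-degree with ambient.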
Hi Jay, I hate to be THAT guy, but just FYI: you run an EVGA 3090 in the build, but the slide is labeled as a 4090. Thanks for all your (and the team's) work.