Pros Liquid Cool Everything... Even the 3000W PSUs

  • Published Dec 27, 2024

Comments • 108

  • @abavariannormiepleb9470
    @abavariannormiepleb9470 3 months ago +40

    Tip from someone with watercooling experience since 1999: Even if all the power-hungry chips on the motherboard and add-in cards are properly watercooled, make sure there is still a bit of airflow throughout everything to lengthen the lifespan of small parts like capacitors. Even two small, silent fans in a completely closed case that move the enclosed air in a circle help a lot here, since stagnant air is a great thermal insulator.

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  3 months ago +8

      That is why the liquid cooled one has two small fan modules, I think. There are a few passive components not being cooled.

    • @JessicaFEREM
      @JessicaFEREM 3 months ago +3

      Arctic desktop water coolers include a tiny little fan that's powered off the water pump to give airflow to the motherboard in an area that usually expects some airflow. It's a good idea.

  • @semosesam
    @semosesam 3 months ago +18

    Absolutely fascinating video, thanks! What I'd love to see as a follow-up is a facility tour of the infrastructure required at a data center to support these water cooled nodes: from the QD manifold at the back of each rack, down to the distribution pipes in the floor, all the way back to the pumps and water chilling facilities. I know you focus more on the server side, but something like that would be an incredible companion piece to show the full picture of this coming transition.

  • @csvscs
    @csvscs 3 months ago +76

    Yeah, but what happens when a stick of RAM breaks? The amount of expertise + time + precision + knowledge required to service these seems problematic.

    • @UnfortunatelyAj
      @UnfortunatelyAj 3 months ago +28

      They calculate failure rates and replace things preemptively. It's how we find a lot of server-grade Micron SSDs on the market from 5 years ago, as they're planned to be replaced even if they're not faulty.

    • @CoreyPL
      @CoreyPL 3 months ago +15

      With the cost of saved power and the ability to use the hot liquid elsewhere (heating the building, for example), it is still more cost effective even if servicing is more expensive. Node density is king here, and having a few nodes down for a moment is not that big of a deal when you have hundreds or thousands of them available. These kinds of solutions are not for single-server deployments.

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  3 months ago +25

      In large clusters like these you often see spare trays. So it can take longer to service the DIMM, but you get to the node in seconds, faster than in a 2U server.

    • @coreyhipps7483
      @coreyhipps7483 3 months ago +14

      In addition to the other comments, a few things I'd note:
      Cost-wise, these nodes use 14-17% less energy, as noted by Patrick. So, depending on the failure rate, that can be a lot cheaper.
      Second, it looks like these are using standard RAM sticks with a clip-on heat sink and one copper, water cooled plate between every two sticks of RAM. So, in the grand scheme of things, this may add a bit of downtime and overhead when replacing a RAM stick, but probably not that much.
      Lastly, my understanding of the data centers these kinds of things are operated in is that you just swap the node with another working node and do the repair later if uptime is that critical. Most of the loads are heavily virtualized, so they can be migrated and load balanced across nodes.

    • @esunisen3862
      @esunisen3862 3 months ago +7

      You just have spare nodes on a shelf, ready for a quick swap. The IT tech then has plenty of time to repair the faulty one and put it back on the shelf when done.
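
Rough, hypothetical arithmetic on the trade-off this thread is weighing. The 14-17% reduction is the figure cited from the video; the node count, per-node wattage, and electricity price below are assumptions for illustration only.

```python
# Hypothetical numbers only: what a 14-17% node-level power reduction is worth
# across a cluster, before counting any facility-side cooling savings.

def annual_savings_usd(node_watts, num_nodes, reduction, usd_per_kwh):
    """Energy cost avoided per year if every node draws `reduction` (fraction) less power."""
    saved_kw = node_watts * reduction * num_nodes / 1000
    return saved_kw * 24 * 365 * usd_per_kwh

# Assumed cluster: 1,000 nodes averaging 1.5 kW each, $0.10/kWh.
for reduction in (0.14, 0.17):
    usd = annual_savings_usd(node_watts=1500, num_nodes=1000,
                             reduction=reduction, usd_per_kwh=0.10)
    print(f"{reduction:.0%} lower node power -> ~${usd:,.0f} saved per year")
```

Even under these modest assumptions, the savings are large compared to the cost of an occasionally slower DIMM swap, which is the trade-off being debated above.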

  • @Blustride
    @Blustride 3 months ago +13

    Liquid cooled power supplies are something else.
    I also really appreciate that you pointed out that Intel DSG went to MiTAC. I got a beat-up Coyote Pass server from work and had no idea where to get spare parts after Intel sold their server business. Next I need to figure out how to afford Ice Lake…

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  3 months ago +5

      Check eBay as well. There is a seller in the northwest that has had parts for those systems listed inexpensively at times.

  • @tokiomitohsaka7770
    @tokiomitohsaka7770 3 months ago +13

    I'm a big fan (pun intended) of these highly liquid cooled systems. It would be awesome if you could tour their place and have a chat with their engineers.

  • @Felix-ve9hs
    @Felix-ve9hs 3 months ago +8

    I think eliminating fans in servers at this scale makes a lot of sense because it reduces power consumption and removes another point of failure.

    • @satibel
      @satibel 3 months ago

      Yeah, my fans are like 150W on a 600W server. (Though they're not running at 150W)

    • @allanwind295
      @allanwind295 3 months ago +2

      How does that remove a point of failure? You still have a couple of fans, and most fans are replaced with liquid cooling, which has its own failure modes. The cooling liquid is probably routed externally to a centralized heat exchanger, so that's another failure mode on top of the site AC.

    • @noname-gp6hk
      @noname-gp6hk 3 months ago

      This adds a bigger, much worse point of failure by adding liquid fittings. Get a leak at the top of the rack and it can flood all the servers under it and kill the entire rack.

  • @DrivingWithJake
    @DrivingWithJake 3 months ago +5

    I do like that plastic cover having the two tabs to push in. The screw ones we end up trashing and ripping out on so many of our older 2U 4-node SM systems.
    Also, on the 10G port: while it's not as great, two would have been much better for a public/private setup without using the PCIe slots for networking. Still better than having nothing, like a lot of them have done.
    The amount of cooling on that is quite nuts. We have talked about getting some of our new suites in our current data center linked up for water cooling like this; basically it would just interconnect with the data center's existing cooling loop that they would otherwise feed into the CRACs, from what it sounded like, which is a big cost savings. But they would want a large amount to set it up either way, and our demands are not quite there yet.
    We've done some small-scale tests and the best way is tanks, but it's a mess and better suited for a large warehouse space vs. a data center. Which is where this cooling design is nice.

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  3 months ago +2

      Asus had a screw design that was annoying when we had dozens of nodes we used for E5 v3/v4.

  • @jolness1
    @jolness1 3 months ago +3

    Wow, these are an impressive engineering feat. Way more than just “slap some CPU water blocks on”. The power savings alone seem worth it, and CoolIT does good work. I wouldn't be worried about running these, provided they're taken care of.

  • @AG-pm3tc
    @AG-pm3tc 3 months ago +2

    Cool Stuff!

  • @ewenchan1239
    @ewenchan1239 3 months ago +2

    I have an older Supermicro Twin^2 system (6027TR-HTRF), a quad-node/2U blade server, and the way you pull the blades out is pretty much the same as it is here, with the two latches on the side.
    I still remember trying to design my own waterblock, back in 2006 or so, for the Socket 940 AMD Opteron as a third-year mech student.
    Watercooling the PSU is pretty cool though.
    I wonder how they prevent the blocks from getting gunked up though...

  • @satibel
    @satibel 3 months ago +2

    A datacenter where you don't need ear protection sounds great.

    • @wileamyp
      @wileamyp 3 months ago

      Leadership-class HPC facilities (like the top 20 of the Top500 over the last 5 years or so) have already been using liquid cooling for a while, and yes, it is a massive difference in noise levels compared to the older air-cooled facilities.

  • @EyesOfByes
    @EyesOfByes 3 months ago +6

    Now I get why Oracle etc. want nuclear power ASAP.

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  3 months ago +1

      Looks like Microsoft beat them to it.

  • @redtails
    @redtails 3 months ago +1

    We really need to work towards more efficient workloads. Endlessly throwing more compute and electricity at problems is an ancient concept that needs to change. I have a small dual-core Opteron X3216 as my home server doing all sorts of automation and server tasks, and every time I consider getting a bigger system, I at least try to optimize my workload to still fit on that system. That approach is key to not wasting a heckton of electricity. Data centers having their own nuclear power plants is something we will see in our lifetime if this trend continues.

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  3 months ago

      The reactor DCs are in progress. Less compute is unlikely. More compute and solving bigger problems is a competitive innovation advantage.

  • @stephen-boddy
    @stephen-boddy 3 months ago +2

    You mention 14-17% less power. Is that just the server consumption, or does it include the pro-rata power for running the off-system cooling loop?
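
A minimal sketch of why the boundary in this question matters. None of the PUE or wattage values below come from the video; they are assumptions chosen only to show how a server-level saving and a facility-level saving can differ.

```python
# Illustrative only: PUE folds facility overhead (CRACs, pumps, CDUs) into the total.
# Whether the video's 14-17% figure includes that overhead is exactly the question above.

def facility_power_w(it_load_w, pue):
    """Total facility draw for a given IT load at a given PUE."""
    return it_load_w * pue

air_node_w = 1500                   # assumed air-cooled node draw, internal fans included
liquid_node_w = air_node_w * 0.85   # ~15% lower at the server, per the range cited above

air_total = facility_power_w(air_node_w, pue=1.5)        # assumed air-cooled facility
liquid_total = facility_power_w(liquid_node_w, pue=1.2)  # assumed warm-water liquid loop

print(f"server-level saving:   {1 - liquid_node_w / air_node_w:.0%}")
print(f"facility-level saving: {1 - liquid_total / air_total:.0%}")
```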

  • @nidafatima2005
    @nidafatima2005 a month ago

    Great review.

  • @Stefan_Payne
    @Stefan_Payne 3 months ago +2

    The funny thing, if you think about it, is that you could, in theory, use the heat from the servers to heat other buildings and the like...
    In reality it's not that easy, as the water might be too cold and it might not work efficiently...

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  3 months ago

      This is more common in places like Europe than the US right now.

    • @BioTechproject27
      @BioTechproject27 3 months ago +2

      It's good for pre-heating, with the last 10-20 K needed to reach ~60°C or so supplied by other means.

    • @eDoc2020
      @eDoc2020 3 months ago

      The server cooling loop could be used with a water-source heat pump (the type usually used with geothermal) to provide the rest of the temperature rise.
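
A minimal sketch of the heat-reuse idea discussed in this thread, assuming a water-source heat pump lifts warm server return water to a ~60°C heating loop. The heat load, temperatures, and COP are illustrative assumptions, not figures from the video.

```python
# Assumed numbers: one liquid-cooled rack rejecting 40 kW into ~45 degC return water,
# and a water-source heat pump with COP 4 lifting that heat to a ~60 degC heating loop.

def heat_pump_electric_kw(heat_delivered_kw, cop):
    """Electricity a heat pump needs for a given heating output (COP = heat out / electricity in)."""
    return heat_delivered_kw / cop

heat_delivered_kw = 40
cop = 4.0  # plausible for a modest 45 -> 60 degC lift; the real COP depends on the machine

electric_kw = heat_pump_electric_kw(heat_delivered_kw, cop)
print(f"Delivering {heat_delivered_kw} kW at ~60 degC takes about {electric_kw:.0f} kW of "
      f"heat-pump electricity; the other {heat_delivered_kw - electric_kw:.0f} kW comes "
      f"from server heat that would otherwise be rejected outdoors.")
```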

  • @abavariannormiepleb9470
    @abavariannormiepleb9470 3 months ago +5

    Can anyone name the specific manufacturer and part numbers of the quick disconnect couplings/connectors that are used in data center liquid cooling solutions?

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  3 months ago +5

      Stäubli, but I do not know the model numbers.

    • @hexd0t
      @hexd0t 3 months ago +4

      In some of the close-ups (e.g. 15:07) you can see this system is using Stäubli connectors, specifically the CGO line, at least for the blade-to-chassis connection.

    • @abavariannormiepleb9470
      @abavariannormiepleb9470 3 months ago +2

      Thanks!

  • @ChaosHusky
    @ChaosHusky 2 months ago +1

    I had one of the first CoolIT Domino ALC AIO coolers back in the day, around 2007 or 2008. Sadly, the block head eventually developed a pinhole and dumped its coolant into my Radeon HD4890. But it worked alright until then, lol. That was on an AMD Phenom. I switched to Intel in 2010 with an i7 920 and Nvidia in 2014 with a GTX 970, and I haven't looked back!

  • @TheFullTimer
    @TheFullTimer 3 months ago +4

    I like the coverage of MiTAC (& more Tyan), but at the same time, I'm burnt out on Intel. A watercooling example is cool, but why liquid cooling? Well, because TDP specs are a joke. Did I miss something recently, or is there a comparison of how many watts Intel & AMD pull under heavy load on air vs. liquid, how they perform to targets on air vs. liquid, and how much liquid might unleash?

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  3 months ago +4

      So both AMD and Intel will be at 500W TDP this upcoming generation. Liquid vs. air will be similar. Density and lower system power is what you get with liquid.

    • @b127_1
      @b127_1 3 months ago +2

      @@ServeTheHomeVideo This. 140W TDP like the good old days just wouldn't work for the crazy top-end CPUs we have now. Turin will have up to 128-core regular and 192-core dense configs. Just imagine how crazy that is: in 5 generations of Zen, we'll have gone from 32 cores to 192. That's 6 times as many! And we're certainly not using 6 times the power.

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  3 months ago

      @@b127_1 Also, Intel will be at 128 P-cores by the end of the quarter (stay tuned on STH), is already at 144 E-cores in only 250W, and will be at 288 cores in Q1 2025. AMD loses the P-core count lead to Intel in the next few days.

    • @b127_1
      @b127_1 3 months ago

      @@ServeTheHomeVideo Oh, I almost forgot. Turin vs. Granite Rapids will definitely be interesting. Probably much closer than we've had in years past.
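
The core-scaling point above comes down to per-core arithmetic. A quick sketch using the core counts from the thread and approximate socket TDPs (the TDP values are assumptions for illustration, not quoted figures):

```python
# Approximate figures for illustration: per-core power keeps falling even as
# socket TDP climbs toward the ~500 W discussed in the video.

generations = {
    # label: (cores_per_socket, approx_socket_tdp_watts) -- TDPs assumed, not quoted
    "Zen 1 (32 cores)":          (32, 180),
    "Zen 4c Bergamo (128 cores)": (128, 360),
    "Zen 5c Turin dense (192 cores)": (192, 500),
}

for label, (cores, tdp) in generations.items():
    print(f"{label}: {tdp} W socket TDP -> {tdp / cores:.1f} W per core")
```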

  • @keyboard_g
    @keyboard_g 3 months ago +2

    If these are meant for large clusters, why aren't DC bus bars with a power supply shelf for the entire rack catching on? Seems like a lot of wasted conversion loss and server space for 40 small power supplies in a rack.

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  3 months ago +2

      There is a DC bus bar version of the Tyan server we reviewed recently at the start of this video.
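
The bus-bar argument is mostly about conversion efficiency. A sketch under assumed efficiencies (the 94% and 97.5% figures are illustrative, not measurements of these systems):

```python
# Assumed: 40 nodes at 750 W each, discrete AC PSUs at ~94% efficiency versus a
# centralized rectifier shelf feeding a DC bus bar at ~97.5% end-to-end.

rack_it_load_w = 40 * 750

def wall_power_w(it_load_w, efficiency):
    """AC power drawn from the facility to deliver a given load to the boards."""
    return it_load_w / efficiency

discrete_psus = wall_power_w(rack_it_load_w, efficiency=0.94)
power_shelf   = wall_power_w(rack_it_load_w, efficiency=0.975)

print(f"discrete PSUs: {discrete_psus:,.0f} W from the wall")
print(f"power shelf:   {power_shelf:,.0f} W from the wall")
print(f"difference:    {discrete_psus - power_shelf:,.0f} W of conversion loss per rack")
```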

  • @rkan2
    @rkan2 3 months ago

    Just what I need under my bed as a space heater!

  • @BobHannent
    @BobHannent 3 months ago +1

    Notable that the liquid cooled server has 3kW PSUs while the air cooled one is 2.7kW...
    I could see these servers being useful in office mini-DCs. If they are linked into the air-conditioning chilled water loop they'll work okay.

    • @LtdJorge
      @LtdJorge 3 months ago

      You need a pretty beefy AC to cool more than one of these 😂
      Because it would be running non-stop, unlike an AC.

  • @Entity8473
    @Entity8473 3 months ago +3

    The day one of these videos comes to an end without Patrick having a key lessons learned segment, world beware! An extinction-level event is probably approaching our doorstep.

  • @Doktoreq
    @Doktoreq 3 months ago

    A data hall full of those sounds like an absolute commissioning nightmare.

  • @xeon_1705
    @xeon_1705 3 months ago

    Can they release the RAM cooling for sale?
    My old waterblocks for DIMMs don't fit DDR5 because of the voltage regulator circuit that sticks out a bit too far on the DIMM.

  • @nyftn
    @nyftn 3 months ago

    So instead of wasting energy on fans and AC for the whole room, they just install a huge water chiller and use less power in the end. Maybe not in smaller setups, but for the huge ones. And if they install a few heat exchangers they can also cool the room when needed with the same setup.

  • @wileamyp
    @wileamyp 3 months ago

    I wonder how this arrangement compares with full-immersion systems.

  • @AraCarrano
    @AraCarrano 3 months ago +14

    Odd, no "Paid Promotion" tag in the upper left at video start.

    • @brucefay5126
      @brucefay5126 3 months ago +4

      There was when I viewed the video.

  • @snwbrdn777
      @snwbrdn777 3 months ago +1

    How about immersion cooling? I would think that given the market's trend towards increased density, immersion cooling would be the way to go?

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  3 months ago

      Perhaps at some point. 2-phase can handle a lot more, but it was pushed out a bit due to environmental concerns. www.servethehome.com/2-phase-immersion-cooling-halted-over-multi-billion-dollar-health-hazard-lawsuits/
      Also, with 2-phase immersion you have the challenge of what happens to the vapor when you open the tank.

  • @allanwind295
    @allanwind295 3 months ago +2

    Don't play the drinking game when Patrick says "cool" in this one ;-)

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  3 months ago +1

      Hard when you have to talk air-cooled versus liquid-cooled.

    • @allanwind295
      @allanwind295 3 months ago

      @@ServeTheHomeVideo That will be 2 shots for you, sir. All in good fun. It was a great video, Patrick, and I really appreciate how you take the time to respond to comments (even dumb ones like mine).

  • @nemesis851_
    @nemesis851_ 3 months ago

    Of the 3 DCs I've serviced in over the last decade, I've never come across anything water-cooled by the DC.

  • @MrHav1k
    @MrHav1k 3 months ago +1

    Super cool (yes, pun intended 🤣🤣)

  • @dota2tournamentss
    @dota2tournamentss 3 months ago +2

    Why did you stop uploading so often? :/

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  3 months ago +1

      We will have a lot more soon. We had 3 videos where products got delayed or the vendor said they would not be releasing the product this month.

    • @dota2tournamentss
      @dota2tournamentss 3 months ago +1

      @@ServeTheHomeVideo Looking forward to more!

  • @HyPex808-2
    @HyPex808-2 3 months ago +1

    Mini blade servers, that's pretty cool. I don't think liquid cooling is needed, though. Seems like a nice-to-have…

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  3 months ago +2

      Above 300W the power savings tend to be huge with liquid.

    • @HyPex808-2
      @HyPex808-2 3 months ago

      @@ServeTheHomeVideo That may be the case, but I work at a hospital and I don't think anyone would want to take the chance of the liquid cooling unit somehow breaking down and the coolant leaking onto anything racked underneath it. Seems like a liability… it would have a direct impact on patient care…

  • @Sama_09
    @Sama_09 3 months ago

    Take my money!! But how much more expensive is it than the air-cooled counterpart?!

  • @OVERKILL_PINBALL
    @OVERKILL_PINBALL 3 months ago +1

    *Fun Fact:* When you have *100% pure water*, electricity cannot flow through it.

    • @ChristopherGoggans
      @ChristopherGoggans 3 months ago +3

      True, but the distilled water can strip metallic ions from the metal and in turn become conductive, even in all-copper cooling loops. Also, if something isn't correctly grounded, it can create an electrical potential difference and in turn cause various forms of rapid corrosion.

  • @danbrit9848
    @danbrit9848 2 months ago

    People laugh at me when I say I want to water cool everything on my PC... and oh look, in the high-end stuff they do it... motivating my build.

  • @testing2517
    @testing2517 3 months ago +1

    Neat system. 10G is not great though.

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  3 months ago

      Yeah, usually you add higher-speed NICs in the add-in card slots.

  • @AIC_onyt
    @AIC_onyt 2 months ago

    Lame.
    The CMOS battery doesn't get any cooling.

  • @A-BYTE94
    @A-BYTE94 3 months ago +1

    LTT style

  • @keyboard_g
    @keyboard_g 3 months ago +1

    To service a fan you need to pull the power supply. Wat.

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  3 months ago +1

      It has four though, so I get it.

  • @BloodyIron
    @BloodyIron 3 months ago +1

    Proprietary dongles are yucky.

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  3 months ago

      Sure, but figure these are designed to be installed in clusters (mostly using remote management), so you end up getting a ton of them.

  • @kiri101
    @kiri101 3 months ago

    Sponsored video?

  • @MunKpereng
    @MunKpereng 2 months ago

    Hazrat Hamza

  • @Anonymous______________
    @Anonymous______________ 3 months ago +4

    Everything's great until that wonderful liquid cooling system is compromised or starts leaking.

    • @ecotts
      @ecotts 3 months ago +3

      Like Linus's server, which took out his UPS and server... lol

    • @torr136
      @torr136 3 months ago +1

      That's why you use dielectric liquid.

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  3 months ago +4

      Right, but this is when Pros do liquid cooling.

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  3 months ago +5

      The big issue these days is not leaking. It is instead microbes growing in the lines when these run 24x7x365 for many years.

    • @kmeronek
      @kmeronek 3 months ago +3

      ​@@ServeTheHomeVideo Can you elaborate on the microbe growth?

  • @sMv-Afjal
    @sMv-Afjal 3 months ago

    Liquid helium

  • @inflatablemicrowave8187
    @inflatablemicrowave8187 3 months ago +1

    14:50 Did you say 12 to 15 or 12 to 50 here? Can't make it out, sorry.

    • @CoreyPL
      @CoreyPL 3 months ago +3

      12-15%. Having it at 50% of total system power just for the internal fans would be crazy.

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  3 months ago +1

      15

    • @inflatablemicrowave8187
      @inflatablemicrowave8187 3 months ago +2

      @@CoreyPL Linus: LADIES AND GENTLEMEN TODAY WE ARE GOING TO BE COOLING A PC WITH A BOEING 747 JET ENGINE!

    • @CoreyPL
      @CoreyPL 3 months ago

      @@inflatablemicrowave8187 I wouldn't put it past him 😂😂😂😂