“Half the price”
Me: oooooooo
“...of $32k”
Me: that makes more sense...
Yet it's still out of my budget ಥ‿ಥ
Even if it was a quarter or an eighth or even a twelfth (seeing how long it has to be divided till I can afford it)
Or a 1/16
Maybe a 1/32 aha but then I couldn't get ps5 ಥ‿ಥ
@@maryjaygomes4441 I am gonna buy a PS5 as well as a new PC
TLDR: they showed one stupid benchmark that like 10 people in the world use outside of benchmarks.
Ah my hope... It burns.
@@curt8806 It crushes anything, but it would not handle Lord of the Rings/Avatar special effects...
Yeah mom, I need this one for my school powerpoint presentation
It technically does count as a school supply since we are all at home and rely on computers
No, I need it to run MS Paint
@@ygobe2 not everyone is at home, some countries let people go to school
No mom, I know that I'm also going to need something for Word >:(
@@alexstromberg7696 yeah but people gotta do computer work at home... to do homework!
"Why buy a space heater for your bedroom for the winter"
In College when it got cold my roommate would run folding@home on his rig all night to keep our bedroom warm.
what is "folding@home"?
@@buttnutt A distributed computing project where volunteers donate spare compute to simulate protein folding for disease research.
should have mined crypto and made money...
@@wazuzoe I don't mean to date myself, but crypto (Bitcoin specifically) wasn't released until the spring semester of that school year. And he didn't find out about it until it started getting warm. He definitely got into mining once it did come out.
@@nigelwang2447 so did he get the lambo?
Linus yanking on those 4 GPUs while everyone else almost had a heart attack; perks of being the boss.
We know he’s a klutz, but Comino aren’t going to get this exposure anywhere else.
I was incredibly scared... this is suuppeerr cool, and I know he knows what he is doing, but in my head I’m just like: yep, four 2080 Tis, even one of them being worth more than my whole setup
It was Alex's sharp intake of breath that got me.
not like they have 2080 Tis lying around where they can easily replace it themselves, so meh really
I almost shat my pantaloons
$32,000 machine = middle of the road performer.
**looks at my $1,000 machine**
Don't worry, I still love you.
Me, to my $700 PC: you are strong and wise and I am very proud of you
@@davey_rulez7301 sighs on an i7 860.
but how big is the road
how is 1000 a low amount!!
Me to my $450 PC: don’t listen to the crazy bearded man, you’re still speedy inside
"Half the price"
Oh neat, something I can afford
"of our $32,000 machine"
oh
Most of us plebs can barely afford 1/10th of that LOL
@@AlexRamosDrTaz $3,000 for a computer? No thanks. It's a hard sell for me to buy a new console for $500, but at least I know that I can easily pull 6 years out of that console before it becomes technically "outdated", and I can still play it forever as long as I have a catalog of games for it.
@@brettcasale Good for you. I've gotten much longer out of most PCs I've used personally, and it didn't cost me $3k a pop.
@@brettcasale Ok I'll bite. What makes you think you can't play current games with a current pc 6 years from now?
@@paapali Of course you can. But you can't play a 2020 game on a 2014 computer that cost the same or less than, say, a PS4. Which was what, $399?
C'mon, it's laughable.
I know this machine has a pretty specific job title, but it's still weird to see LTT not run a game on it.
alrightalrightalright probably wasn’t allowed, this was an ad episode after all
I wanna see how well it mines :D
ikr
it's still gonna run games like FKIN CRAAAAAZY tho
alrightalrightalright they’re not running in SLI btw
Me: *actually about to cry from hearing Linus may not be upgrading again*
Linus, a day later: "Okay maybe just 1 more upgrade"
Never say Never....
@@eisenklad only say never while saying never say never
@@eisenklad while I am using a 🥔 PC from the year 2005. An Intel Pentium D. 😢
@@SoumyadeepBanerjee007 dude, literally ANYTHING is better than that. There are $80 Core 2 Duo FULL systems on eBay with board/case/RAM/power supply/CPU (or less; around here I can get them free from recyclers) that will run loops around the D. The Pentium D was considered by many one of the worst generations of Intel CPUs ever. It also chews through so much power that if you were to "upgrade" to a Duo it would pay for itself in 6 months to a year lol
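For what it's worth, that payback claim is easy to sanity-check. A back-of-envelope sketch, where the wattage gap, the 24/7 duty cycle, and the $0.12/kWh rate are all assumed round numbers, not measurements:

```python
# Rough payback estimate for replacing a Pentium D with a used Core 2 Duo system.
# Assumptions: ~130W vs ~65W draw, running 24/7, electricity at $0.12/kWh.
watts_saved = 130 - 65                       # typical gap between the two chips
kwh_per_year = watts_saved * 24 * 365 / 1000
dollars_per_year = kwh_per_year * 0.12
payback_months = 80 / dollars_per_year * 12  # against an $80 used system

print(f"~{kwh_per_year:.0f} kWh/year saved, ~${dollars_per_year:.0f}/year")
print(f"an $80 system pays for itself in ~{payback_months:.0f} months")
```

At those numbers it comes out to roughly $68/year and a ~14-month payback, so the "6 months to a year" figure works out if the machine runs flat out or electricity costs more than 12 cents.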
In the tech space you can never not upgrade.
"Only 760w? Yeah its efficient"
Me over here with a 750w psu
Me over here watching it on 6W Odroid N2+. ;)
450W... that machine is taking three times as much TDP as my system is even able to :D
I use a 300W PSU because that was the smallest available for my undervolted CPU PC (10+ years old). It actually draws 60-130W. I've still got more RAM than these wonderful new M1s, lol. Hard to believe that 16-64GB is not the standard these days; 16GB is the bare minimum.
700w +White
@@rageagainstthebath add the monitor and now it's 8006 watts
One GPU in that beast alone is the cost of 90% of everyone's PCs here
😂😂😂😂 PCs are so expensive
RiGht that makes sense
Costs more than my home
Oh hey! I didn't know you like watching Linus. I mean, who doesn't!
REeeEeeEEeeEee
5:06 Yo, whoever filmed/edited that amazing shot.... GIVE THAT PERSON A RAISE. That was freaking awesome!!!
Gotta be a render, right?
@@clutchhutch3287 Probe Lens? Maybe.
@@clutchhutch3287 It looked like a render at first, but that is just because of those rubbery pipes.
probably Taran
it was shot with a probe lens
"Ohhh a cool 760 watts..."
"Is that all?"
Yep, the power company loves Linus. Greg at the electric company is employee of the month... 10 years and counting.
The wattage alone could make this video one of the most expensive ones to make, even excluding the giant beasts themselves
Everyone: My space heater emits heat.
Linus: My space heater emits heat, and emits amazing render times.
Elon musk: my space heater heats space...
@@grummhd3020 If only the car he shot into space had a gas engine, then we could pollute space enough to have an atmosphere.
Linus: and it emits segways, like this Segway to our sponsor.
And has a pretty high price tag
@@kaldogorath Made no sense but i like it regardless.
3:50 Task manager doesn't show that the GPU is under use when used for 3D rendering. You gotta go to performance>GPU and set one of the graphs to Compute.
Was just about to say the same - discovered this during Blender renders
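If you'd rather not trust Task Manager's default graphs at all, here's a minimal sketch that asks the driver directly (this assumes an NVIDIA card and that nvidia-smi, which ships with the driver, is on your PATH):

```python
import subprocess
import time

# Poll overall GPU utilization straight from the NVIDIA driver, sidestepping
# Task Manager's per-engine graphs entirely.
while True:
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu,memory.used",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    for line in out.strip().splitlines():  # one line per GPU
        util, mem = line.split(", ")
        print(f"GPU: {util}% | VRAM used: {mem} MiB")
    time.sleep(2)
```

Run it during a render and the utilization number will sit high even while Task Manager's default "3D" graph shows next to nothing.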
After the 3000 series announcements, this video did not age well.
Funny thing is the 3000 series cards aren't even that much faster in non gaming scenarios. We use a bunch of them in a 3D Render Farm and the 3080s we got are just 15% faster than the 2080tis we had before. 15% is what you would expect from a newer gen card but not the leap people are making out of it.
That being said there is a chance this might still improve a bit once drivers mature.
@@Klokopf52 it will get better once they optimize it more
@@Klokopf52 yeah, but they're also like $500 cheaper lol
@@Klokopf52 The point is, they are better AND almost half the price...
the 2080 Ti is about the same as the 3070, and the 3000 series isn't even fully optimized yet (I know this because my 3070 has signal problems with the newest drivers, so I have to use the manufacturer ones, which still have problems)
*Waiting for Linus to combine all his computer builds into a one Supercomputer*
Not possible lmao
Says who mwah ha haaaaa
Adam Samaha I mean, with some special software, special wiring, and special parts it might work
@@hellomoto6300 no, it's impossible to combine multiple computers into one; whoever told you that is delusional
@@jacksheneman6242 multi-computer load splitting looking at you like: bruh, am I a joke to you?
the fact that linus has just given up on actual segways and is just saying "just like this segue" is amazing
Segue*
Segway is an overpriced electric two-wheeled transport.
Now I can't get the idea of Linus literally hopping on a segway and rolling off screen out of my head.
@@omgMBP That is no longer. The phonetic spelling though, haunts us.
Segue*
I instinctively double clicked the screen to jump 10 seconds when I read your comment
Linus : "We ran this benchmark in a room temperature about 22 degree celsius"
Me living in Malaysia : "22?!?!? i have to turn my air conditioning on for the whole night to achieve that temperature!"
just put cool fever on it
Heh, where I live, in winter I could really overclock the shit out of my computer if I put it outside 🤣 It's like -30 degrees Celsius sometimes
@@qairulaimanabdulhakim6967 LMAOOO HAHAHA
Amateurs
This comment was made by 35+ C gang
In Malaysia, West Malaysia, highlands: 15C at night
Linus 3 years later: uses it as toilet paper
More like 3 years later
MickTheMan yeah ok I will change it
@@beefboy8812 🙄🙄🙄🙄 idiot Android user, that's what he said
@@loveiphones4550 dude he changed it from 20 years
@@loveiphones4550 btw iPhones are more or less just expensive shit compared to Android devices
I'd still love to see those pcie riser/extenders available to just buy
Having something like that is so useful for really small form factor PCs where you don't want some stupid large ribbon cable getting in the way
ŁTT/
My guy has ltt confirmed.
I'd also want to see that waterblock available, might find some use in a tiny rendering pc
Did a quick little search and Comino has them for sale! comino.com/en/risers/
@@satibel screw your tiny rendering PC i want my SLI in like a 3.5ish slot package
Employee: We need more rendering power
Boss: Slap some gpus on it
6:38
PS4: Finally! A Worthy Opponent! Our Battle Will Be Legendary!
Username checks out
our battle will last 20 seconds
The beard is growing on me... I mean, it's growing on Linus...
@Drifters Smorgasbord bruh, you gotta open her canal. Whatever that means.
@@ReyaadGafur LMAAO
My girlfriend said the same thing...
@Drifters Smorgasbord that makes no sense.
@Drifters Smorgasbord wha, what the fek did i just read
@ 5:05 that camera work is really cool. Whoever decided to do that zoom, give them a bonus. That's what we like to see!
5:03 That's some great, satisfying B-roll.
Came looking for this comment 😁
Absolutely was going to say the same. Delivering the experience right there
Truly fantastic camera work! How did they achieve the "fly through" shot through the chassis grid? Is that just a zoom coupled with a focal lens adjustment?
@@AntonyCoote Probe lens, they recently got one
@@AntonyCoote there's a dope tech episode by MKBHD about a lens that enables sick shots like that
Welcome to another episode of "Something I can't buy"
But do you need to buy it?
@@dantat5713 cries on DSL
Can't buy Now or ever*
That is a great alternative title for the "Holy $h*t" episodes!
LTT in a nutshell, really. xD
Splitting up coolant between hot parts is ONLY weird in the mind of someone who is convinced that putting fans, radiators, and water blocks in series is a good idea.
Giving everything its own supply of fresh cool water and cool air is a much more efficient use of hardware.
"maybe you just need to force it", as Linus swings open the PCI-E daughter-board.
(everyone in the studio gasps and goes white in the face)
Everyone: Linus NO!
Linus: Linus YES!
Linus is one of my favorite idiots
I wasn't in the studio and I went white in the face. ;)
th-cam.com/video/R3tv5NsZMyU/w-d-xo.html
Best moment in the video 😂
6:48 "room temperature of 22°C"
*sweats in equatorial 32*💀
In Sweden we got 32C like last week
Alex Strömberg wtf? We have like 28C here in the Philippines at the same time
@@JustAnNPC69 global warming is catching up with us I guess.
Room temperature room
It's almost 40°C here and it's winter.
Imagine, a Comino that's 72x as fast as the Mac Pro. That's epic. :-D
Nope. That's Threadripper. Epyc is used in the PC Pro :kappa:
@@maYdaY1337 [lol] And please note, I spelled "epic" properly and deliberately. Context matters.
i know. it had to be said though :D
Knowing Apple, give it a couple of updates and the original Doom will have trouble running
Yes it is
Myself
and the people who like this appreciate the probe lens shot at 5:05
The little effort matters.
I thought it was CGI marketing material it looked that good.
Got the Laowa lens out. Sooo gooood!
How did they do it?
@@BlackedBeast A probe lens; it's a really thin camera lens used for shooting through holes, like you just saw at 5:05
@@BlackedBeast laowa macro lens
"Half of the price"
Me : Noice this is gonna be good
"At a budget of 32 k"
Me: *"Ah shit here we go again"*
If it's for ML and your ML employee earns $200k+ a year, $32k is probably reasonable. 🤷♀️ In grad school, our bosses don't blink at dropping $3-4k on individual builds and who knows how much in cloud compute.
CoPiEd CoMmEnt!
@@emma70707 people don't understand that
which is annoying
@@emma70707 it's crazy to think businesses will be charged such high prices as well... like, for $4k I could build pretty much the best workstation you could ask for (reasonably)...
@@emma70707 while my school has i5-2400 PCs
10:47 - "woah! hey, hi, ho hey, how's it going?" absolutely dead 😂😂
linus: one you've never seen before...
*proceeds to introduce GlassWire for the 2784th time*
eh... he introduces Squarespace more
It's being cooled by one 420mm rad, that's kinda insane
Pls, the rad is only doing half the work. The Noctua fans, the case design, and the hardware layout are holding the temps in check.
Yeah, they're missing the triple rad stack!
@@synceware1453 well you are poor
@@AznTony360 shut up
No one respond to @@walidfakhfakh3660 because they're just a lonely child looking for attention so it'll save you time and braincells
Linus and team, I've been watching you guys for years, please keep it up and know that you are an educational channel for some of us. I wish you a merry Christmas, and may you have a happy, healthy and prosperous new year
"Just like this segue, it's like nothing you've ever seen before." Oh, Linus..
*segway
That was as smooth as sandpaper... 😅
@@ned-gr9ur *segue
SquirmyEmpire66 *segue
@@d35p0 *segway
Every other weekend
Linus : The fastest computer ever!!
6:36 Holy crap, the Chevrons are locking, the Flux Capacitor is fluxing, and the Warp Core's going critical! RUN!
Are you really going to replace me that quick?
Rip PC Pro ain't so pro anymore
PC Noob
Pc Proh Moment
@@fenrir1g shut up
No one reply to @@walidfakhfakh3660 it'll save you time and brain cells
me watching using an Intel HD Graphics laptop:
haha nice vid
me watching using an Intel HD Graphics laptop:
Linus "this one is faster than this other pc"
me: haha piece of trash
I miss the days when Linus reviewed tech for average middle-income consumers.
This stuff is interesting, so we keep watching, but not particularly useful.
This stuff is cool and all but I miss the budget series
@Hudson Hamman budget computers change much more slowly.
You don't understand, though. He reviews things with exorbitant prices for two reasons. #1: He's showcasing the newest tech (at the time) for people who are getting into this type of high-end computing. It makes it easier for them to determine the best solution for their application, and while they might not spend this much to get something like this, it puts them on the right path. #2: This stuff is the future of our normal everyday tech. Just like the cheapest Tesla was originally $100k: the more people bought them, the further the price dropped. Currently, you can get a brand new Tesla for $32k, and there are rumors of a $25k version coming soon. Everything in your desktop or laptop was, at one point, a concept only; a one-off that cost thousands of dollars to build, not to mention the millions it cost in R&D.
@@gothnate I also don't care about industry scaling or tech doubling rates. I care about tech I can afford. That's what I used to come to Linus for. That's Linus's roots. Unboxing motherboards on a park bench, that's what I miss. Not industry-scaled tech nobody I will ever meet in my life could ever afford.
Like the petabyte project, nobody, NOBODY I have ever met will ever be able to afford any of the data servers he goes on about.
His content has lost its relevance for me, and just about everyone I know.
Sure it's cool, sure it's interesting, sure it's.... probably sponsored a lot of the time, but where's the carrot? Running out of reasons for Linus to be relevant for me is a bad thing.
@@ellemnist no, they change faster. Options for better budget builds change nearly daily due to price variances. Any schmuck can achieve performance with an RTX 3090 if they have 3 grand to blow. What if they only have 300 dollars?
I would like to see a comparison of an Epyc, a Threadripper and a Ryzen at the same core count, and at max core count, for compilation and other memory-bandwidth-dependent tasks. Because (I think) the main difference is the memory bandwidth? (dual-, quad-, octa-channel)
Manuel M. th-cam.com/video/75ot0F7c0-k/w-d-xo.html
Octa-channel memory??? Wow, I didn't think we were there already. WOW
@@WarriorsPhoto But in many places Threadripper is listed with "Memory Channels: 4" and Epyc with "Memory Channels: 8". Are they still in a dual-channel configuration?
Edited:
I looked again, and if I understand correctly, every CCD on the processor has two memory channels. So it's not octa-channel per CCD, but the Epyc can access eight channels at the same time; two per CCD.
@@parkchanyeolexo7572 Did I understand you correctly that you're saying: "You have no idea about technology, go away and watch other videos"? If so, I don't think that's okay. Better to explain the actual facts about the topic.
Manuel M. That’s awesome either way. Thank you for the information. (:
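The channel count is exactly where the bandwidth gap comes from. A quick sketch of the theoretical peaks (this assumes DDR4-3200 on all three platforms purely for an apples-to-apples comparison; the platforms' actual supported speeds vary):

```python
# Theoretical peak memory bandwidth = channels * transfer rate * 8 bytes/transfer.
# DDR4-3200 (3200 MT/s) is assumed across the board for comparison purposes.
per_channel_gbs = 3200 * 8 / 1000   # 25.6 GB/s per DDR4-3200 channel

platforms = [("Ryzen (dual-channel)", 2),
             ("Threadripper (quad-channel)", 4),
             ("Epyc (octa-channel)", 8)]
for name, channels in platforms:
    print(f"{name}: {channels * per_channel_gbs:.1f} GB/s theoretical peak")
# -> 51.2, 102.4, and 204.8 GB/s respectively
```

Whether a compile job can actually saturate that is another question, which is why the benchmark comparison would be interesting.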
BEARDED LINUS IS AN OMEGA-CLASS MUTANT
@RITA 25 y.o , I WANT SЕХ !!! OPEN MY CANAL !!! These kinds are common - they're spams. Report them.
I don't know why, but I didn't expect to see you comment on this channel.
@@Jack-im2wd This is just like a Dyno roller.
@@Jack-im2wd who is he
Rob corps?
Love it. Two double-taps at the cue "sponsor" get you exactly where you want to be. LTTed
Linus: PC Pro is a middle of the road performer
Me: what
I think it means it's prepared to do whatever task you throw at it instead of just being built for editing or gaming
The black Plague Pretty much. It’s a jack of all trades, which is great, but more expensive than something specific, and it will never perform any given task as fast as something specialized.
thegamer fromjuipiter it’s like the old question, would you rather be pretty good at everything, or be extremely good at one thing?
@@BlackedBeast I know lol, but for a layman like me, it is still blisteringly fast
It's funny because he described the PC Pro as the "jack of all trades", but didn't provide one example where it would actually be better than the half-price Comino.
This computer: exists
me: I don't need two kidneys, I need this PC
Your kidney is worth less than that, mate. You'd need to sell your cornea for that kind of money
This video exists: benchmarks using one useless piece of software.
Blank Blank I think kidney is 100k
I don't think two kidneys are enough to get you that PC
In 5-7 years a high end $2000 PC will be as powerful.
When the segues are just like getting stick bugged, but we still get surprised and somehow enjoy it
Me: HALF THE PRICE?! I'm going to build this
Also me: later realizes it's half of $32,000
Linus: "These memory chips don't actually output that much heat"
GDDR6: "o rly now?"
By comparison I'm sure it's fine. Plenty of overhead
I love how the sponsor ad is never longer than 10 secs so you can skip it easily
3:52 Task manager is weird with CUDA workloads. When I render scenes in Blender on my GPU, task manager reports almost no usage unless I switch one of the graphs to CUDA.
Reporting usage on AMD graphics cards is even worse. Sometimes the workload will be under headers you expect like "Video Encode" but then other times it's in the most random process names like "Engine 15".
5:05 awesome probe lens shot 👌
Wish I had one of those since you can't really do anything like that without it.
MKBHD gets a probe lens and now all the tech youtubers can't resist it.
It's a cool lens, and I'm not accusing anyone of just copying him, but it is a lens that produces a very specific, very unique type of shot.
I don't think he was actually first, maybe just the first big youtuber. Anyway, it's a tool to express creative ideas; just using it doesn't mean you're stealing from others.
that machine design was perfection... haven't seen something so clean in a long time
7:58 "I'm a little worried" and he does that recklessly anyway
"a little" key-words
Linus: This PC is made by Comino
Me: The cloners are making PCs now.
Just make sure they are not sticking in a remotely-activated Order 66 chip and you will be fine.
@@Carahan but how are they going to make way for the empire?
“Middle of the road at everything”; my dude, it’s a $32k PC, I wouldn’t say it’s middle of the road at any task 😂
It is middle of the road, like Michael Phelps is a middle of the road swimmer
Linus being hyperbolic?! NO WAY!
I wish they had said: “It lacks focus, commitment, and sheer fing will”.
Ah! Men of culture!
r/unexpectedwick
PC Pro is a jack of all trades, master of none, while the Comino is a master of one.
I love watching things I can't afford.
makes you appreciate that humble pc we all enjoy a bit more.
Same
Nobody cares
I can relate to that to a spiritual point man
HAHAHAHAHAHA love the comment XD
6:38 *(Insert F1 V12 engine)*
I'm glad they included that. Most people don't realize how loud server rooms can be. Imagine that clip, times a rack of 16-32 servers, and you'll understand why many datacenters mandate hearing protection for those out on the floor.
5:05 - hot damn, now THAT'S some B-roll!
For a moment I thought I was watching Star Wars.
that Laowa probe lensss
2:08 look at that charming smile
A "room-temperature room". Also known as a room. :p
well, room and room would cancel out, so just "temperature"
get your math right
@@tahabashir3779 your "math" is wrong. I'll let you think it over.
@@TheNefastor c'mon
@@tahabashir3779 sorry, I never joke about math 😉
Growing up my “room temperature” was about 16C
9:35 Linus: These memory chips don't output that much heat.
Me: laughs in Tuf RX 5700XT
Hmm, I swapped out the stock cooler on my RX 5700 XT and put an ARCTIC Accelero on, no problem. It has mini heatsinks to put on the memory that didn't seem like they'd be adequate, but so far so good.
@@MrJohnboyofsj I added some M.2 heatsinks and a few thermal pads to solve the problem. It's still loud but it doesn't die
Me: laughs in nothing
I don't understand any of these comments
I'm against the whole company loyalty thing and certainly am no fanboy for Nvidia but I really don't understand why anyone would buy a Radeon card /shrug
Actually, splitting the coolant flow seems like the ideal method to me. This is a classic case of series vs parallel.
1. The specific heat capacity of water at a lower temperature is higher. Putting CPU and GPU in series reduces the heat capacity of water for the device that comes second in the loop. The second device may receive inadequate cooling.
2. The heat current from source to sink is higher if the temperature gradient is higher. In this case, since both CPU and GPU get cold water, heat from both components flows equally fast into the coolant.
3. It seems like splitting the flow will reduce the flow rate to each component. While that is true, splitting also reduces the loop's resistance to the flow of water. And it is easy to compensate for this with thicker tubes and a more powerful pump.
LTT Video on series vs parallel cooling?
Wouldn't splitting increase the resistance rather than lower it? Considering more water would be in contact with the tube walls?
The specific heat capacity of water is a reasonably constant value for any temperature your computer is likely to see. I think it's a difference of less than a percent between 20C and 40C, if I remember right. Please point me to some sources if I'm wrong on this.
They did videos on series vs parallel and they found that there is absolutely no difference. There's not even a difference if you use two loops instead of one. (one loop with say a 280 rad and two loops with two 140 rads, equivalent cooling capacity). In series it will raise the temp of your GPU if you're hammering your CPU and not using the GPU, but not a bothersome amount - and vice versa. But running the two together nets very similar performance to a separated loop.
The reason for this is because each pass the water is not warmed significantly. Maybe it will warm up 1C from CPU then 1C from GPU, but then it goes through the radiators and is cooled a lot, let's say 1.8C. 23C water instead of 22C is not earth shattering. And next loop it will be 23.2C. Point being, it's not as if the CPU is warming the water by 20C or something to legitimately heat the GPU.
@@kaldogorath " Maybe it will warm up 1C from CPU then 1C from GPU,"
The water definitely warms up a lot more than that. I'd be surprised if it didn't warm up at least 15 C, especially in a system like this when under the full 1200W+ GPU load.
I think splitting the flow is reasonable in many cases to reduce coolant travel time between component and radiator. In this case, splitting the flow in half between one 280W CPU and four 277W GPUs is questionable, and unlikely to enable sufficient cooling for overclocking the GPUs.
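For anyone following this back-and-forth: the per-pass rise both sides are estimating falls straight out of ΔT = P / (ṁ·cp). A sketch with assumed flow rates, since the actual flow rate of this loop isn't stated anywhere:

```python
# Per-pass coolant temperature rise: dT = P / (mdot * cp).
# cp of water is ~4186 J/(kg*K); the flow rates below are assumptions.
CP_WATER = 4186.0
heat_watts = 280 + 4 * 277           # one 280W CPU + four 277W GPUs = 1388 W

for lpm in (1.0, 4.0, 8.0):          # liters/minute; water is ~1 kg per liter
    mdot_kg_s = lpm / 60.0
    dt = heat_watts / (mdot_kg_s * CP_WATER)
    print(f"{lpm:.0f} L/min: ~{dt:.1f} C rise per pass")
# -> ~19.9 C at 1 L/min, ~5.0 C at 4 L/min, ~2.5 C at 8 L/min
```

So the "~1C" and "15C" estimates above aren't really contradictory; they just assume very different flow rates, and typical loops land somewhere in between.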
9:33 How quickly things can change, hey!
With the Ampere cards, GDDR6X is even on the brink of bursting into flames on some models
This is the coolest custom build system I've ever seen, with a few really clever ideas. The GPU cooling with the sandwich cooler is just genius, and reasonably easy to swap one! Seems like the steroids version of my budget-optimised work build for Autocad Inventor & Matlab. At a far higher price :D
6:39 We've reached 1.21 gigawatts, Marty. We're going back to the future!
An interesting point about this build is that it really only has three fans (minus those built into the power supplies). So often we see custom builders go fan-crazy, with 7 or 9 in a case. But in reality it seems the most important thing is using your fans effectively and directing air where it needs to go (in this case, through the radiator!)
That's insane
(Haven’t watched the vid yet) but what if it takes a bit off its lifespan? Or is that only for overclocking?
That’s insane
you're not even paying attention to the video
yeah it is
But can it run doom?
Well, this is just proof that when you buy real pro stuff, it does the pro stuff well :-)
For that price, it would want to
You know what this machine would be FANTASTIC at, I bet? That deepfaking you did recently. With that CPU and those GPUs, it would be a BEAST at machine learning. Would have loved to have seen a comparison of just how fast it could tear through that training despite not having a Quadro card. I know in that video you said something about worrying SLI would slow it down, but I expect that's not a realistic concern. Tensorflow doesn't use the GPUs the way games do; it doesn't need to synchronize each frame or anything, it's just submitting compute tasks and pulling results. I don't know the specific software you were using, but I imagine it uses Tensorflow under the hood and would probably absolutely sing on that platform.
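For the curious, multi-GPU data parallelism in TensorFlow really is nearly a one-liner. A minimal sketch (the model and data here are placeholders, and whether the deepfake tool actually runs TensorFlow under the hood is, as the comment says, a guess):

```python
import tensorflow as tf

# MirroredStrategy replicates the model onto every visible GPU and splits each
# batch across them -- no SLI involved; each card just receives compute work.
strategy = tf.distribute.MirroredStrategy()
print("Replicas in sync:", strategy.num_replicas_in_sync)

with strategy.scope():  # variables created in this scope are mirrored per-GPU
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
        tf.keras.layers.Dense(10),
    ])
    model.compile(
        optimizer="adam",
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    )

# x_train / y_train stand in for whatever the real training data would be:
# model.fit(x_train, y_train, batch_size=256, epochs=10)
```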
at 5:05, Friggin 10/10 camera shot!
Probe lens 👌
3:58
This is what I expect Linus's heater to be for the Linus house tour.
Don't disappoint us Linus you got us excited
Well it's all in a server rack :)
Linus going “yeah it’s efficient” is like me talking to a non pilot about a twin engine plane saying “yeah it only uses 35 gallons of gas per hour, it’s super efficient”
Linus: Makes big Boi pc
Weird Computer: I'm about to end this man's whole career.
It doesn't even have RGB!
@@jerryli7787 sure??? Take a better look at it... it actually does have RGB
He always gets the focus of the pro workstations wrong. 2, 3, or 4 GPUs is the way to performance for the kind of apps he’s benchmarking. Super-expensive CPUs aren’t that relevant.
Probe lens! 👌
8:25
6:38 Those server power supplies sound scary. If my PC made that sound I would be like RIP
"Yeah it's efficient."
That smile. That damned smile.
“Middle of the road computer”
- Linus 2020
lmao Linus has lost all sense of "middle of the road" after his latest PC upgrade. hahaha only kidding, but that made me crack up; this dude really called a $32k PC middle of the road for anything
True
Those closeups on the Comino were really awesome
Linus: More powerful at HALF THE PRICE!
Me: Looks at my wallet.
*Weeps*
The Windows task manager has been weird recently - I was watching something on YouTube and the task manager said network utilization was 0%.
YouTube could have already downloaded the whole video, or you could have been looking at the wrong network adapter.
@@mikeuk1927 Maybe but he's right that task manager has been wonky. I've been playing MW and 90% of the time it shows my CPU pinned at 99% and my GPU at 2-3%
I know, I was downloading something on steam and at the performance tab of task manager it didn't even show a single megabit being used, literally 0.
"Maybe we need to just ,force it.."
*Everyone dies of a heart attack
Yeah, I was looking for this comment! I immediately ran for my pills after that!
2:08 "Yeah, it's efficient"
That smile
"new gpu encoding" yeah good job adapting technology that is nearly a decade old Adobe.
5:05 wow looks like someone got a new probe lens XD
Hey Linus, I know you may not see this, but I just wanted to thank you for making great and funny tech videos. They got me motivated to finally get my gaming rig built about 6 months ago. I had been wanting to build for a few years but I didn't know what I was doing; I found your channel in 2018 and you guys taught me what I needed to do. Keep making great videos, thank you!!
:-)
But it won't help me afford one lol.
This guy does no giveaways (not that I know of anyway).
mary jay gomes Linus has done many giveaways including giving away a 1440p 165hz aorus monitor recently
"This is the segway you have never seen before"
You'll never guess what the segway was.
Segway is the thing you can ride like Paul blart mall cop, segue is what Linus always does. 😁 Now you know!
@@tflyfoster3018 not funny
@@walidfakhfakh3660 Did I say lol? I could have worded it way better for it to be a joke, but I'm glad you found something that was intended to not be funny, not funny. Thank you for that, you gave me a chuckle!
Edit: @Dave Jones thank you, good sir!
@@walidfakhfakh3660 stop commenting on everything, nobody likes you and you're only further embarrassing yourself
TacticalIdiot17 Ngl you’re the one embarrassing yourself
We need a benchmark roundup from all of the high level PCs LMG has built in the last 5 or 10 years.
linus: Like this segway you've never seen before
me: wait, i've seen this before...
Segue ^
Deym, why do I love to watch expensive stuff while my wallet is crying
Editing is getting better and better day by day, and congrats on the new intro
jeebus
...with socks under his sandals
diabeetus
them: This AMAZING PC performs just as well as our most expensive PC - at half the price!
My money: bro i am out of here.
The shriek at 8:09 when Linus damn near ripped those GPUs out damn near took me out lmao
PSU bay: folds out
Linus: I figured it out!
Alex: “...DAMN!”
Me too Alex, me too. That was pretty sick.
Not funny
I think Linus is the only tech youtuber who always keeps his desk clean....
Weak arm boom boom
Because he just drops everything off it
They didn't show his desk in this video...