It’s Back and I’m SO Excited! - Threadripper 7000

Comments • 1.6K

  • @嵓
    @嵓 10 หลายเดือนก่อน +5359

    I love watching videos on products I can't afford

    • @cip0llo
      @cip0llo 10 หลายเดือนก่อน +66

      sameeee

    • @chef_ekan
      @chef_ekan 10 หลายเดือนก่อน +23

      RIGHT, ME TOO LOL

    • @alexanderroodt5052
      @alexanderroodt5052 10 หลายเดือนก่อน +34

      My favourite pastime

    • @bondkings
      @bondkings 10 หลายเดือนก่อน +27

      the same thing my nephew tells me whenever he finds me watching pc builds

    • @survil321
      @survil321 10 หลายเดือนก่อน +14

      It's the best

  • @randomher089
    @randomher089 10 หลายเดือนก่อน +430

    I honestly expected my 7950X to last longer before being referred to as low end....

    • @ailestriker9888
      @ailestriker9888 10 หลายเดือนก่อน +48

      Hah! Peasant!
      *He says whilst sitting computerless*

    • @randomher089
      @randomher089 10 หลายเดือนก่อน +20

      @@ailestriker9888 Apparently we're almost in the same boat then, but I guess a low end pc is slightly better than no pc technically...

    • @i_like_Peanuts
      @i_like_Peanuts 10 หลายเดือนก่อน +35

      Me slowly petting my Ryzen 5 3600 :
      "There there buddy.. You're high end to me"

    • @leadwolf32
      @leadwolf32 10 หลายเดือนก่อน +22

      @@i_like_Peanuts
      Gently pats my i7 4790K
      "Please don't off yourself because I watched this video"

    • @humanitylostmusic
      @humanitylostmusic 10 หลายเดือนก่อน

      @@leadwolf32 pats my i9-10900KF

  • @PenguinPolar
    @PenguinPolar 10 หลายเดือนก่อน +1240

    For those wondering, the specs of the highest-end PRO model are:
    2TB of ECC RAM
    96 Cores / 192 Threads
    128 PCIe 5.0 lanes
    $9,999
    Note: No, I'm not going to buy one, I can barely afford rent.

    • @hambotech9954
      @hambotech9954 10 หลายเดือนก่อน +11

      bruh

    • @Moondust7
      @Moondust7 10 หลายเดือนก่อน +34

      Those are intended for server use, right?

    • @Swarailiaball
      @Swarailiaball 10 หลายเดือนก่อน +129

      So the specs of my body after that purchase will be:
      1 kidney
      1 lung
      1 arm
      1 leg
      1 heart
      1 brain

    • @AFE-VoidSmasher
      @AFE-VoidSmasher 10 หลายเดือนก่อน +8

      My PowerEdge R620 tops out at 768GB of RAM, so 2TB is like X_x, and as for the cores, I wouldn't say no to seeing Task Manager... adjust to that core graph 😂

    • @bionicgeekgrrl
      @bionicgeekgrrl 10 หลายเดือนก่อน +60

      @moondust6034 No, high-end workstations. Epyc is the server range.

  • @JanasV
    @JanasV 10 หลายเดือนก่อน +202

    These graphs are a lot easier to understand at a glance than the old ones. Good improvement!

    • @totallynotmyeggalt6216
      @totallynotmyeggalt6216 10 หลายเดือนก่อน +8

      Except for the thermal draw graphs, where the highlighted box over the 7980X and 7970X in the legend changes the color of their labels!

    • @the_undead
      @the_undead 6 หลายเดือนก่อน

      @@totallynotmyeggalt6216 the thermal graph not being super readable doesn't bother me too much, because the average person really doesn't care.

  • @oakley6889
    @oakley6889 10 หลายเดือนก่อน +1287

    The competition is actually so disappointing rn, I'm glad AMD is still actually making improvements, although it would be nice to see the prices come down even just a little

    • @ventilate4267
      @ventilate4267 10 หลายเดือนก่อน +95

      To be fair they themselves killed threadripper

    • @syncmonism
      @syncmonism 10 หลายเดือนก่อน +2

      The prices are coming down

    • @bionicgeekgrrl
      @bionicgeekgrrl 10 หลายเดือนก่อน +42

      Until Intel gets something remotely competitive, they can bank the profits. Though the reality is that they'll probably make more Pro and Epyc chips than non-Pro, which bumps the non-Pro price up.

    • @blank141
      @blank141 10 หลายเดือนก่อน +43

      @@bionicgeekgrrl it's how the market works, what do you expect, even Nvidia is doing it because they know there is no competition

    • @divinehatred6021
      @divinehatred6021 10 หลายเดือนก่อน

      @@bionicgeekgrrl the only reason Intel is competitive at all is because they're trusted by people who don't care to use their brains tbh

  • @D3cibyte
    @D3cibyte 10 หลายเดือนก่อน +771

    Can't wait for another of Linus' personal rig upgrades, back to HEDT

    • @korosaki13
      @korosaki13 10 หลายเดือนก่อน +56

      He's probably going to say he's not going to use it, then one year later we learn on the WAN Show that he's been using it for 6 months just because "no one was using it". Yeah sure Linus, your editing team "couldn't" use it, right.

    • @Tophatguy_vr
      @Tophatguy_vr 10 หลายเดือนก่อน +3

      Nice Profile pic

    • @FinneasJedidiah
      @FinneasJedidiah 10 หลายเดือนก่อน +79

      @@korosaki13 lmao my guy, you're getting mad at him over an imaginary scenario you made up in your mind

    • @TechnoBabble
      @TechnoBabble 10 หลายเดือนก่อน +14

      @@FinneasJedidiah Seriously, dude needs to touch grass.

    • @bbggakkba
      @bbggakkba 10 หลายเดือนก่อน +2

      @@korosaki13 they can't... their boards aren't compatible....

  • @Epicgamer_Mac
    @Epicgamer_Mac 10 หลายเดือนก่อน +1218

    I love watching huge amounts of cores crush tough projects 😁

    • @shapelessed
      @shapelessed 10 หลายเดือนก่อน +17

      Meanwhile I love what Apple's smaller core-count baseline M chips can do in web dev...
      In web dev we actually mostly want super fast single-core performance due to everything being largely single-threaded.
      But in everything else? Yes. Threadrippers kick ass.

    • @Epicgamer_Mac
      @Epicgamer_Mac 10 หลายเดือนก่อน +9

      @@shapelessed I’m actually a huge M-series Mac fan myself!
      I mean, if you’re like me and you got the full M3 Max the minute it was available, then you have almost desktop i9 performance in your lap right now. The thing is insane. Threadripper level? Not quite, but 16 cores up to 4.1 GHz with VERY wide decode per clock cycle? It punches well above its weight to say the least.

    • @sk-sm9sh
      @sk-sm9sh 10 หลายเดือนก่อน +12

      @@shapelessed what's the deal with web? Applications running on web tech should typically be built so that the average consumer PC can run them with no issues. And if you need anything heavier, multithreading is possible on the web, though it's not as efficient due to costly cross-thread communication in JavaScript.

    • @IDv8I
      @IDv8I 10 หลายเดือนก่อน +2

      A giant, expensive processor that sucks for gaming...yaaaa.

    • @najeebshah.
      @najeebshah. 10 หลายเดือนก่อน +3

      @@Epicgamer_Mac lol NO it's not near an i9 😂

  • @Haarschmuckfachgeschafttadpole
    @Haarschmuckfachgeschafttadpole 10 หลายเดือนก่อน +712

    It's insane that AMD is actually bigger (money-wise, at the current moment) than Intel is. Lisa Su has to be one of the most successful CEOs in modern history.

    • @AdrianEdelen
      @AdrianEdelen 10 หลายเดือนก่อน +318

      helps when your executive is actually an engineer and not just a business person

    • @Antimonious
      @Antimonious 10 หลายเดือนก่อน +144

      Fr! The entire handheld gaming PC market, which is booming, only exists because of Ryzen APUs!

    • @MarkusNemesis
      @MarkusNemesis 10 หลายเดือนก่อน +56

      "money wise" being? I love AMD, don't get me wrong, but I cannot see how AMD is in any financial metric stronger than Intel right now.

    • @blueprint7
      @blueprint7 10 หลายเดือนก่อน +15

      Found the team red fanboy

    • @CapArchy
      @CapArchy 10 หลายเดือนก่อน +56

      Intel has a lot of business outside of CPUs, and still owns massive market share there anyway. AMD is financially much smaller than both of its competitors, Intel and Nvidia, which makes wins against them all the more impressive.

  • @4RILDIGITAL
    @4RILDIGITAL 10 หลายเดือนก่อน +130

    I'm thoroughly impressed with AMD's threading prowess here, really ripping those threads! Love how they're still pushing the high-end even in this class of CPUs. Besides, as you said, time equals money, so those extra threads could be a lifesaver. And being an AMD fan, these prices still feel like a justifiable investment. Can't wait to see how the landscape changes when Intel brings their mojo back.

  • @michaelrichard9122
    @michaelrichard9122 10 หลายเดือนก่อน +169

    Can't wait for Linus to invest into the new Threadripper platform for the office. Only for it to be discontinued a year later.

    • @the_undead
      @the_undead 10 หลายเดือนก่อน +18

      If they're going to upgrade their workstations, this is really their only option right now, unless they go with the Threadripper Pro.

    • @michaelrichard9122
      @michaelrichard9122 10 หลายเดือนก่อน +3

      @@the_undead If they do. They made the decision to go to LGA 1700.

  • @Drrobverjones
    @Drrobverjones 10 หลายเดือนก่อน +228

    As an AI developer whose training runs can sometimes take days or even weeks, something powerful like this would be amazing. Not all AI is trainable on GPUs.

    • @pietheijn-vo1gt
      @pietheijn-vo1gt 10 หลายเดือนก่อน +9

      Which type of AI can't be trained on GPUs?

    • @Deja117
      @Deja117 10 หลายเดือนก่อน +17

      That's the first thing that came to my mind... Even if you're training on GPU's, the CPU still needs to do a significant amount of work. Especially if you have say... A server full of GPU's all running different scenarios at the same time.

    • @lhl
      @lhl 10 หลายเดือนก่อน +34

      @@pietheijn-vo1gt There are definitely some types of models (like RNNs) that don't parallelize well because they have sequential dependencies. There's probably also some other corner cases for higher precision or other constraints, but regardless having a beefy system can be a benefit even if GPU training because the bar for data processing and task management can be pretty high, especially as models grow bigger (not to mention that CPU offloading is becoming more and more common and in those cases, more memory bandwidth/faster CPUs absolutely crushes).

    • @DanielFerreira-ez8qd
      @DanielFerreira-ez8qd 10 หลายเดือนก่อน

      @@pietheijn-vo1gt They didn't say it couldn't be, and from a quick Google search it seems anything less dense than large language models can be trained on a CPU instead without much issue

    • @romanpul
      @romanpul 10 หลายเดือนก่อน +10

      @@pietheijn-vo1gt Reinforcement learning, for example, since these types of models rely on a trainable decision tree rather than a neural network, or in some more elaborate versions like Q-learning combine such a tree with a neural network. And updating that tree and handling the agent's interaction with the environment is mostly a CPU task.
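
A minimal sketch of the sequential dependency @lhl describes above: each RNN timestep consumes the previous hidden state, so the time dimension cannot be farmed out across cores the way independent samples can. The shapes and values below are arbitrary illustration, not figures from the video.

```python
import numpy as np

# Hypothetical toy RNN forward pass: h[t] depends on h[t-1], so the loop
# over timesteps is inherently sequential -- extra cores help inside each
# matrix multiply, not across time.
rng = np.random.default_rng(0)
hidden, features, timesteps = 256, 128, 1000
W_h = rng.standard_normal((hidden, hidden)) * 0.01
W_x = rng.standard_normal((hidden, features)) * 0.01
x = rng.standard_normal((timesteps, features))

h = np.zeros(hidden)
for t in range(timesteps):  # cannot be split across workers
    h = np.tanh(W_h @ h + W_x @ x[t])
print(h[:4])
```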

  • @eugenes9751
    @eugenes9751 10 หลายเดือนก่อน +60

    A $5000 CPU only costs about 80hrs of developer time savings, so it's surprisingly quick to ROI.

    • @Chipsaru
      @Chipsaru 10 หลายเดือนก่อน +9

      Compared to some high-end consumer CPUs, a developer would "save" 1 hour a month at most by upgrading to this, so a ~7-year ROI is way too long.

    • @eugenes9751
      @eugenes9751 10 หลายเดือนก่อน +19

      @@Chipsaru Maybe if your developer is using the computer to watch YouTube, OK, but saving several minutes per short render adds up very quickly. Obviously nobody's going to pay 5k for a CPU in a computer that does spreadsheets all day long.

    • @Chipsaru
      @Chipsaru 10 หลายเดือนก่อน +8

      @@eugenes9751 Developers don't do rendering, we are talking code compilation, AI training, DB queries, running VMs or containers, etc. This CPU will be faster in some scenarios, but not by that much. If my project compiles in 5 minutes now, it will compile in 4-4.5 minutes with this CPU because the bottleneck is somewhere else.

    • @andrewgrant788
      @andrewgrant788 4 หลายเดือนก่อน

      If you are building large C++ code bases, a Threadripper will pay for itself very quickly.
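
For concreteness, the break-even argument in this thread boils down to simple arithmetic; the hourly rate below is an assumption for illustration, not a figure from the video.

```python
# Rough ROI sketch for the thread above; all inputs are assumptions.
cpu_cost = 5000.0   # USD, roughly the 7980X price discussed in the video
dev_rate = 62.5     # USD/hour, assumed fully loaded developer cost

break_even_hours = cpu_cost / dev_rate
print(f"Break-even after ~{break_even_hours:.0f} hours saved")        # ~80 h

# At the pessimistic estimate of 1 hour saved per month:
print(f"...which takes ~{break_even_hours / 12:.1f} years at 1 h/month")
```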

  • @SunnyZ
    @SunnyZ 10 หลายเดือนก่อน +8

    Let's be honest, this isn't really a desktop cpu, more of a low end enterprise/business chip.
    *Low End Enterprise Threadripper*
    LEET for short

    • @chiefjudge8456
      @chiefjudge8456 10 หลายเดือนก่อน

      No it isn't. First of all there's no such thing as an "enterprise/business chip", which is a buzzword with no definition. This just sounds like mental gymnastics because you can't afford one and only have a low end desktop processor with 16 PCIe lanes. People who aren't satisfied with toys will go Threadripper.

    • @smittyvanjagermanjenson182
      @smittyvanjagermanjenson182 10 หลายเดือนก่อน +2

      ​@chiefjudge8456 the mental gymnastics to convince me to spend 5 grand on a cpu for my personal desktop doesn't exist in any conceivable reality.

    • @SunnyZ
      @SunnyZ 10 หลายเดือนก่อน

      @@chiefjudge8456 lol mental gymnastics?
      Missed the whole LEET joke part aye?

  • @artyomexplains
    @artyomexplains 10 หลายเดือนก่อน +10

    THERE IS COMPETITION FROM INTEL - the Xeon W-2400 lineup! So many reviews and no one even tried to compare against Intel's actual fresh HEDT - the W790 platform. What kind of review incompetence is this? Yes, Intel would lose, but so what, does that mean we can just ignore it and compare to desktop CPUs? LTT made a video about W790 Xeon W, and when a fresh TR comes out you just ignore them? Mindblowing.

    • @newearth9027
      @newearth9027 10 หลายเดือนก่อน +1

      Make a thread about it on their forum, it's much more likely to get traction

    • @Mioniks
      @Mioniks 10 หลายเดือนก่อน

      +

    • @alexos7977
      @alexos7977 10 หลายเดือนก่อน

      +

  • @nipa5961
    @nipa5961 10 หลายเดือนก่อน +23

    AMD's 64 cores consume just as much power as Intel's 8 (+16 "efficiency") cores. Damn.

    • @Zarod89
      @Zarod89 10 หลายเดือนก่อน +1

      Don't think you can compare just the number of cores tho. For example, one efficiency core on an Intel 14900K is much stronger than one AMD core on the Threadripper 7000. Not even speaking about the performance cores. Just the raw clock speed of 6GHz vs 5.1GHz, and ofc the TDP of Intel being much more efficient and "easier" to cool.

    • @nipa5961
      @nipa5961 10 หลายเดือนก่อน +13

      @@Zarod89 You seem to confuse a lot.

    • @VASILISPAGOURAS7
      @VASILISPAGOURAS7 10 หลายเดือนก่อน

      "Intel being much more efficient" - you have no clue buddy @@Zarod89

    • @ifyouwantmoneythengivemeev8094
      @ifyouwantmoneythengivemeev8094 10 หลายเดือนก่อน +4

      @@Zarod89 What? If you said an Intel performance core was faster than a Zen 4 core, then sure enough. But an efficiency core? Intel THEMSELVES have said that these are about the performance of a Skylake core, and you know what AMD's equivalent to Skylake was? Zen 2.

  • @NateFromIT
    @NateFromIT 10 หลายเดือนก่อน +8

    Can you test this by playing on a 300K plus population Cities Skylines 2 save?
    Need to know if this is a decent upgrade to play CS2.
    Thanks.

  • @peyzah2289
    @peyzah2289 10 หลายเดือนก่อน +29

    I'd love to know how these chips compare under different use cases: compiling, running VMs (EVC vs no EVC), databases, big data map-reduce stuff, AI, etc... Synthetic tests, Blender and video work only tell a small section of the story.

    • @Gramini
      @Gramini 10 หลายเดือนก่อน +2

      Phoronix has a more versatile test coverage up, just FYI.

  • @olavsierotvr4282
    @olavsierotvr4282 10 หลายเดือนก่อน +30

    When I knew nothing about computers 7 years ago, your videos were enjoying. Then I got my degree in IT, and your videos were still enjoying. I'm currently in my second year of computer science and guess what, I still find your videos wildly enjoying! Achieving such depth, while still keeping the concepts easy to understand, is crazy. Great work!

    • @tzuyd
      @tzuyd 10 หลายเดือนก่อน +5

      FYI it's 'enjoyable'.

  • @zwerko
    @zwerko 10 หลายเดือนก่อน +38

    I got my 2950X with an X399 motherboard for $1k back in the day. I might consider upping that to even $2k for an upgrade, as Threadrippers are pretty awesome for my needs (tons of virtualization), but even the cheapest option w/ a motherboard is now going over $3.5k and I just can't justify it... I guess the Threadripper line is a dead end for me. It was awesome while it lasted (before the previous generation), although I won't be changing mine anytime soon - it still does the job adequately.

    • @maximkovac2000
      @maximkovac2000 10 หลายเดือนก่อน +4

      I agree, I am using a 1920X for my server at home and need an upgrade. But even if there were a 12-core Threadripper, paying more than $1000 for just a motherboard is just too much...
      There is nothing new on the market with 12-16 cores, lots of lanes and clock speeds higher than 3.3GHz

    • @niter43
      @niter43 10 หลายเดือนก่อน

      @@maximkovac2000 The Pro series starts at 12 cores (7945WX), but the pricing is yikes

    • @chriswright8074
      @chriswright8074 10 หลายเดือนก่อน +1

      Spend the money

    • @chriswright8074
      @chriswright8074 10 หลายเดือนก่อน

      @@maximkovac2000 spend the money, you'll be happy you did

    • @stpr16
      @stpr16 10 หลายเดือนก่อน

      @@chriswright8074 consoomer mindset

  • @cj_zak1681
    @cj_zak1681 10 หลายเดือนก่อน +1

    I'm a gamer, but videos like this remind me just how small a part of computing gaming actually is. This is just mind blowing computing power...and this is only at the HEDT level

  • @ZintomV1
    @ZintomV1 10 หลายเดือนก่อน +10

    In relation to AVX-512: yes, not a lot of programs explicitly use it, however the runtimes they are built on do, so things such as searching datasets, computing physics for games, etc., will run faster out of the box with AVX-512. An example of this is the AVX-512 support added in .NET 8, which is integrated into lots of the core libraries.
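
A small illustration of that point, assuming a NumPy install as the "runtime": the Python code below never mentions SIMD, but the library kernel it calls can use AVX-512 (or whatever vector width the CPU and build support) under the hood. The array size is an arbitrary example.

```python
import time
import numpy as np

# The interpreter loop uses no SIMD; the library call can, depending on
# how NumPy was built and what the CPU supports (possibly AVX-512).
a = np.random.rand(2_000_000)
b = np.random.rand(2_000_000)

t0 = time.perf_counter()
slow = sum(x * y for x, y in zip(a, b))   # pure-Python element loop
t1 = time.perf_counter()
fast = np.dot(a, b)                       # vectorized library kernel
t2 = time.perf_counter()

print(f"python loop: {t1 - t0:.3f}s   numpy dot: {t2 - t1:.5f}s")
```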

  • @gingaming_gg
    @gingaming_gg 10 หลายเดือนก่อน +1

    If we’re going with “LEDT”…. All I need is for LEDT to throw a few more PCIe lanes our way to fill that gap. Not being able to move my m.2 card into AM5 has me stuck on TRX40…

  • @MarkHawk
    @MarkHawk 10 หลายเดือนก่อน +4

    I'm so happy Threadripper is back. I was so angry when I invested in the line and they pretty much pulled out less than a year later. It'd have been hard to move onto another processor after this.

    • @FleaOnMyWiener
      @FleaOnMyWiener 10 หลายเดือนก่อน +1

      What do you use yours for?

    • @MarkHawk
      @MarkHawk 10 หลายเดือนก่อน

      @@FleaOnMyWiener Houdini Renders

  • @rikou1986
    @rikou1986 10 หลายเดือนก่อน +1

    It's wild that AMD said it will pull 830 watts if you can cool it.

  • @PerfectlyFriedBread
    @PerfectlyFriedBread 10 หลายเดือนก่อน +5

    I think some local LLM benchmarks would be nice to see in general, and would be an arena where a product like this could have some interesting utility (even if it's still not practical for those applications).

  • @SamMurphyHSV
    @SamMurphyHSV 10 หลายเดือนก่อน +79

    I love my 3970x! Nice to see AMD back with more god chips.

    • @嵓
      @嵓 10 หลายเดือนก่อน +26

      Stop flexing on us poor peasants

    • @MU-we8hz
      @MU-we8hz 10 หลายเดือนก่อน +15

      @@嵓 stop flexing with being a poor peasant.

    • @skyecloud968
      @skyecloud968 10 หลายเดือนก่อน +13

      I love my 9970x too from AMD and my RTX 8090ti which has over 46gb of vram.

    • @SamMurphyHSV
      @SamMurphyHSV 10 หลายเดือนก่อน +13

      @@skyecloud968 Only 46gb of VRAM? Posh! I expected at least 10 48 GB quadros from you.

    • @SamMurphyHSV
      @SamMurphyHSV 10 หลายเดือนก่อน +7

      @@嵓 Lmao, not flexing. I use it for my business and it's been a good product to me.

  • @TrueThanny
    @TrueThanny 10 หลายเดือนก่อน +9

    Don't care about the core count. It's the I/O that matters. If you need expansion cards, you simply cannot use the toy computer platforms, because there's nowhere to plug them in.
    That's why it's still frustrating that AMD didn't release 12-core and 16-core parts on the HEDT platform, with comparably lower prices.

    • @SethTaylor1
      @SethTaylor1 10 หลายเดือนก่อน

      what industry / io are you specifically referring to?

    • @TrueThanny
      @TrueThanny 10 หลายเดือนก่อน +1

      @@SethTaylor1 My own personal mix of expansion cards:
      Graphics card - 16 lanes
      RAID controller - 8 lanes
      10G NIC - 8 lanes
      Sound card - 1 lane
      It's simply not possible for me to build a computer using a toy platform. What about onboard sound, you might ask? It sucks. What about boards with 10G onboard? It's RJ45, not SFP+.
      I'm not even talking about any special industry parts. Just normal things that any advanced PC user might have.

    • @levygaming3133
      @levygaming3133 10 หลายเดือนก่อน +2

      @@TrueThanny also I think 40GbE fiber takes x16, and if you wanted to use even just a *single* GPU + 4x M.2 carrier card + 10-gig NIC, that's also more than the consumer platform supports.

  • @canyonrunner331
    @canyonrunner331 10 หลายเดือนก่อน +2

    Does anybody else watch 99% of these videos, but not understand anything that's going on?

    • @kowaisenpai82
      @kowaisenpai82 7 หลายเดือนก่อน +2

      no, I think it's only you

  • @bgezal
    @bgezal 10 หลายเดือนก่อน +7

    "eye-watering $5000"
    Tim Apple: ...and it comes in black.

  • @tompov227
    @tompov227 10 หลายเดือนก่อน +8

    Actually, the M1 Ultra comparison is kinda impressive for Apple. That is a 3-year-old chip that uses 1/5 the power, inside a computer that costs $1000 less than the CPU alone being tested, and it was only ¼ the performance.

    • @benstacey831
      @benstacey831 10 หลายเดือนก่อน +2

      kinda odd to not use the m2 ultra as comparison

    • @Pasi123
      @Pasi123 10 หลายเดือนก่อน +3

      March 2022 was 3 years ago? Though I do get your point that the M1 Ultra is based on the same arch as the regular M1 from November 2020

    • @Pasi123
      @Pasi123 10 หลายเดือนก่อน

      @@benstacey831 The M1 Ultra score they compared to is the one on Cinebench 2024. Cinebench doesn't come with scores for M2 Ultra
      Looking at scores online the M2 Ultra scores around 1918 so it's quite a bit below the LEDT 7950X and 14900K

    • @LeLe-pm2pr
      @LeLe-pm2pr 10 หลายเดือนก่อน

      @@benstacey831 cinebench 2024 might not have reliable data for it

    • @Teluric2
      @Teluric2 10 หลายเดือนก่อน

      Impressive for you, but Apple Silicon can't do what this CPU can do.

  • @gogogomes7025
    @gogogomes7025 10 หลายเดือนก่อน +12

    Can't wait to see the userbenchmark review of this.

  • @raptormonkey253
    @raptormonkey253 10 หลายเดือนก่อน +2

    This seems like a really good CPU for CFD simulations; the high core count and high frequency have me just dying to get my hands on one some day. Recently bought an old Xeon server for simulations and it's awesome, but it doesn't hold a candle to this. I look forward to possibly getting one in 10 years 🤣

  • @pivorocks
    @pivorocks 10 หลายเดือนก่อน +4

    Right after compensator 4, now that has to be redone.

  • @MrDanwilliams
    @MrDanwilliams 10 หลายเดือนก่อน +1

    Best introduction ever Linus. You took it to the next level there, "Makes Intel's fastest consumer chip look like a drugged donkey!" WOW.

  • @ScottAshmead
    @ScottAshmead 10 หลายเดือนก่อน +5

    The rub is going to be that YouTube benchmarks will not include Threadrippers in the normal benchmark suites, and the consumer version will go away again because no one associates these CPUs with the everyday user, but 4090s are somehow OK 😖

    • @chiefjudge8456
      @chiefjudge8456 10 หลายเดือนก่อน +1

      They will and they won't. Not everyone is satisfied with low end desktop chips.

    • @smittyvanjagermanjenson182
      @smittyvanjagermanjenson182 10 หลายเดือนก่อน +1

      RTX 4090 isn't what any average consumers need smh.. that's all just overhype from Gamers.

    • @Teluric2
      @Teluric2 10 หลายเดือนก่อน

      @@smittyvanjagermanjenson182 average consumers can't afford to pay the per-core licenses on those machines.
      Average users are not the ones who change the world.

  • @enderblazex2718
    @enderblazex2718 10 หลายเดือนก่อน +16

    Rip and tear until it's done I guess.
    Also, HOLY SHIT the 7980X alone costs as much as my car, even before factoring motherboard prices.

    • @RiceCubeTech
      @RiceCubeTech 10 หลายเดือนก่อน +6

      Just wait for like 5 years and get it for nothing lol

  • @pmonet31
    @pmonet31 10 หลายเดือนก่อน +11

    Genuine question out of ignorance: why does everyone avoid comparing HEDT to Xeon W if the price and performance are comparable? Would they not both be options someone looking at a $5000 Threadripper might consider?

    • @vadnegru
      @vadnegru 10 หลายเดือนก่อน +3

      I would suspect that Xeon motherboards for workstation are kinda rare and not obtainable in retail. In that case they need to be compared with Threadripper Pro.

    • @Ecker00
      @Ecker00 10 หลายเดือนก่อน +2

      Agree, major oversight. The Xeon W5-2465X and its siblings are the real competitors to this lineup, and it would be nice to see how they perform head to head with the Threadripper 7000 series.

  • @Hobbles_
    @Hobbles_ 10 หลายเดือนก่อน +2

    THREEAAADDDRIIPPPEERRRRR!!! I'm still really hoping they actually keep up support on this one, after what they did to us before

  • @ThumperJunkie
    @ThumperJunkie 10 หลายเดือนก่อน +11

    Hope to see a new 16 core threadripper that might be a bit more economical. Could really use the PCIe lanes in my server.

    • @TrueThanny
      @TrueThanny 10 หลายเดือนก่อน +6

      Unfortunately, that's very unlikely to happen.
      There are 12-core and 16-core TR Pro parts, but the former is just $100 cheaper than the 7960X, and the latter is $400 more expensive.
      Given the increased cost that a WRX90 motherboard will likely have, the 7960X is still safely the price entry point to getting a modern processor capable of using expansion cards.
      So with the $600 ASRock TRX50 board, $1500 for the 24-core Zen 4 TR chip, and $1200 for the 128GB DDR5-6400 kit, you're looking at $3300 for an upgrade, if you have all the other bits already (case, PSU, cooler, etc.).

  • @lucasrudolph8681
    @lucasrudolph8681 10 หลายเดือนก่อน +2

    1000 IQ move: Buying an old Intel 7980X and making everybody think you are rich😂

  • @Dygear
    @Dygear 10 หลายเดือนก่อน +12

    When it comes to AVX-512, you also need to mention whether the chip clocks down like on the Intel side of things. Intel hobbles their chips when using AVX-512 due to how much power it takes, so they slow the whole chip down to a lower clock speed in order to not blow their entire power budget (and also not create a melted pile of CPU under your already toasty cooler).

    • @TrueThanny
      @TrueThanny 10 หลายเดือนก่อน +7

      Not an issue with Zen 4. There are no AVX-512 offsets, and the dynamic clocks don't drop any lower under full load with AVX-512 than they do with any other kind of load.

    • @Dygear
      @Dygear 10 หลายเดือนก่อน +2

      @@TrueThanny yes but it should be in the video as a highlight of the AMD chip.

    • @omegaPhix
      @omegaPhix 10 หลายเดือนก่อน

      @@Dygear But the chips don't clock down?

    • @TrueThanny
      @TrueThanny 10 หลายเดือนก่อน

      @@Dygear But it's not a highlight. It's just a lack of bizarre behavior found on older Intel chips running AVX-512. A video about those chips is the proper place to discuss a loss of clock speed in that scenario.

    • @Dygear
      @Dygear 10 หลายเดือนก่อน +5

      @@TrueThanny considering that was expected behavior for the Intel chips, the AMD chips not having that behavior is a highlight.

  • @coralanapdx
    @coralanapdx 6 หลายเดือนก่อน +2

    MRDT: Mid-range Desktop

  • @joefries7046
    @joefries7046 10 หลายเดือนก่อน +6

    7000 AMD Threadripper comes out :
    Vegeta : Are you ready now? To witness a power not seen for thousands of years!?

  • @grandmoffporkins
    @grandmoffporkins 10 หลายเดือนก่อน

    My hope is always that this is like the Mercedes E-Class: eventually this type of performance should make its way down to the average consumer product. But maybe that's unrealistic since we're talking nanometers.

  • @TheSlayerN
    @TheSlayerN 10 หลายเดือนก่อน +17

    Not that it's the most important thing, but the mobo design for threadripper has always been 🔥🔥🔥

    • @scottieb1
      @scottieb1 10 หลายเดือนก่อน +2

      I agree, but they were also all overpriced. And with the last gen they dead-ended the chipset and socket after zero upgrades. Fool me once...

    • @username8644
      @username8644 10 หลายเดือนก่อน

      Not really. The X399 boards were great, but after that they really went downhill. There's a reason TRX40 boards are barely even selling at $100 on eBay, while X99 boards are still selling for more than that lmao.

  • @gammaraider
    @gammaraider 10 หลายเดือนก่อน +2

    These graphics are so useful in providing insight. It really drives home how the $5000 7980X completely destroys the $600 Intel 14900K. Almost 3x the performance, at only 9x the price! AMD clearly the better -sponsor- product.

  • @gabrielcreamer5012
    @gabrielcreamer5012 10 หลายเดือนก่อน +24

    can you do some interviews with customers of these higher end products to see what they're actually used for?

    • @LeLe-pm2pr
      @LeLe-pm2pr 10 หลายเดือนก่อน +10

      you'd be seeing mostly prosumers running AI tasks, rendering, computationally expensive algorithms, virtualization, or maybe also needing the PCIe lanes for multi-GPU (rendering/AI) or storage, and probably a ton of other pro workloads; the GN video seems to have some interesting benchmarks surrounding that

    • @lucasrem
      @lucasrem 10 หลายเดือนก่อน

      Who need non supported solutions ?
      Most people need intel and Nvidia computing, Autodesk etc, if one of them is AMD, you will get non supported results too.
      Who needs AMD, if you develop everything on AMD, they can do the same job, or better ? But who is willing to do that ?
      Small experimenting parties ? On what benefits ?

    • @vadnegru
      @vadnegru 10 หลายเดือนก่อน +3

      I know that Threadripper was used in rendering Terminator movie effects.

    • @jockey12022011
      @jockey12022011 10 หลายเดือนก่อน +2

      @lucasrem maybe try to explain that more clearly. Your comment is a bit of a brain vomit. Are you saying the AMD chips will perform worse than the Intel chips in Autodesk applications?
      And are you saying AMD CPUs are not supported by all software in general?

    • @eddycolangelo
      @eddycolangelo 10 หลายเดือนก่อน +3

      ​@jockey12022011 yeah, it seems to me like he's suggesting that AMD CPUs and GPUs are not supported by some software, so you should just stick to Intel and Nvidia.
      Yeah kind of a fanboy take when you don't support that stance with any proof, especially after seeing how Autodesk itself claims to be "hardware agnostic", it is written right on the main page lol.
      By the way, intel produces x86 CPUs just like AMD does, they don't have any kind of proprietary technology that would make them the "only supported" way to do something, in contrast to AMD.
      The only relevant "CPU proprietary stuff" I can think of is virtualization support, where both AMD and Intel offer equivalent tech.
      Talking about GPU stuff is a different matter, and there I can see how some of the proprietary technologies that Nvidia offers could lead someone to prefer them over AMD or Intel offerings just purely based on the brand and the associated features.
      But then again, for most of the stuff, the subject is more about compute and less about specific technologies: audio and video editing, professional movie or music making, 3d modelling, simulation environments, you name it, they all pretty much don't care about the hardware that you're using as long as it's offering similar features, which AMD does, so why would you argue that you just can't do shit on AMD because "they are not supported" when the reality is that you clearly can, and actually, sometimes, it is even the better option?

  • @TheHulkminator
    @TheHulkminator 10 หลายเดือนก่อน +1

    Now this needs an ITX board!

  • @zasbirrahmanzayan8648
    @zasbirrahmanzayan8648 10 หลายเดือนก่อน +4

    ahh yes, technology that i cannot afford.

  • @DavidGoshadze
    @DavidGoshadze 10 หลายเดือนก่อน

    I would love to see some of these chips also tested in engineering applications. Structural simulations, MEP loads and calculations, they all benefit from many multi-core high frequency cpus.

  • @IkethRacing
    @IkethRacing 10 หลายเดือนก่อน +3

    The lower temps on the 7980X are actually due to the much lower power per core. The cores are operating far more efficiently with less leakage. I'd say 90% of the lower temps are due to this and surface area is only 10%.

  • @markdatton1348
    @markdatton1348 10 หลายเดือนก่อน

    I'm an electrical engineer, and we generally use tools that are HIGHLY parallelizable. Using cloud virtualized instances is just much too slow, and even 16 cores really is too little to effectively run simulations and workflows for us. Having 32 or 64 cores would truly save a lot of time, and would likely be worth the high upfront cost.

    • @Teluric2
      @Teluric2 10 หลายเดือนก่อน

      What software do you use for simulation and what OS?
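
As a rough illustration of why core count maps almost directly onto runtime for the highly parallel workloads @markdatton1348 describes: when every case is independent, a process pool can hand one case to each core. This is a hypothetical Monte Carlo-style parameter sweep, not any specific engineering tool.

```python
import math
import os
import time
from multiprocessing import Pool

def simulate(params):
    """Stand-in for one independent simulation case (hypothetical workload)."""
    freq, load = params
    acc = 0.0
    for i in range(200_000):  # burn some CPU per case
        acc += math.sin(freq * i) * math.cos(load * i)
    return acc

if __name__ == "__main__":
    cases = [(f / 100, l / 100) for f in range(1, 9) for l in range(1, 9)]  # 64 cases
    start = time.perf_counter()
    with Pool(os.cpu_count()) as pool:  # independent cases -> one per core
        results = pool.map(simulate, cases)
    print(f"{len(results)} cases in {time.perf_counter() - start:.2f}s "
          f"on {os.cpu_count()} cores")
```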

  • @emf321
    @emf321 10 หลายเดือนก่อน +48

    Twice as fast as the 14900K, 10 times as expensive.

    • @abdou.the.heretic
      @abdou.the.heretic 10 หลายเดือนก่อน +26

      5k for anything consumer grade is theoretical levels of idiocracy

    • @danyuzunov
      @danyuzunov 10 หลายเดือนก่อน +18

      Yup, kinda dumb. But there are indeed tasks in which the threadripper delivers about 4 times the results for about the same power consumption, which for professional use can be quite impactful.
      (Obviously it can have a lot higher power consumption but I am referring to "Some" uses)

    • @MrBrax
      @MrBrax 10 หลายเดือนก่อน +4

      ​@@abdou.the.heretic same with 4090

    • @RiceCubeTech
      @RiceCubeTech 10 หลายเดือนก่อน +16

      Difference is you can support much more RAM and it’s got an entirely different use case. Waayyyyy more PCIE lanes too for expansion cards and faster storage.
      There’s a reason this exists. And it’s not to game or video edit on.

    • @RiceCubeTech
      @RiceCubeTech 10 หลายเดือนก่อน +6

      @@abdou.the.hereticit’s prosumer. Not epyc levels of versatility when it comes to running them in servers, but it has way more PCIE lanes than anything from intels HEDT and it also has support for a lot more RAM.

  • @rano12321
    @rano12321 10 หลายเดือนก่อน

    The problem with GPU rendering isn't that it's slower, it's the limited VRAM. In a big blockbuster scene you can have hundreds of GB of data for your geometry and OpenVDB volumes, and that's not going to fit in your VRAM, so you'd have to chain up multiple GPUs, whereas with CPU rendering you can have scenes upwards of terabytes.

  • @emanuelperez3595
    @emanuelperez3595 10 หลายเดือนก่อน +7

    I wonder how big of a change this CPU could bring to a big VFX studio. I always wonder how and when they justify updating their hardware.
    That would be an interesting video if you or your team members ever have a chance to touch on it

    • @samk9632
      @samk9632 10 หลายเดือนก่อน

      I'm a vfx artist myself, and I can certainly see the appeal, but it's not like a studio will equip all their artists with these chips, a 7950x is just infinitely more economical.
      HOWEVER
      If you're a Houdini artist that handles a lot of very large simulations, I guarantee that your studio will do everything they can to set you up with one of these chips, because the additional RAM capacity is a huge thing when you need to run incredibly large fluid sims. I'm sitting at a comfortable 128GB of RAM rn, and it's certainly got its limits.
      Also if they have a dedicated CPU render farm, these will probably be eaten up for that, although more and more are using GPU render farms, and for most shots, a 6000 ADA will perform great, provided they're using a GPU render engine like octane.

  • @MasterGoku21
    @MasterGoku21 10 หลายเดือนก่อน +1

    Scotty Kilmer: "The threadripper is back and im mad as hell"

    • @Planeandaquariumgeek
      @Planeandaquariumgeek 4 หลายเดือนก่อน

      Gotta swap that 94 Celica ECU CPU with threadripper

    • @MasterGoku21
      @MasterGoku21 4 หลายเดือนก่อน

      @@Planeandaquariumgeek "I've seen these things with a million crypto hours on them and they still run like a clock"

    • @Planeandaquariumgeek
      @Planeandaquariumgeek 4 หลายเดือนก่อน

      @@MasterGoku21 But this traverse is done after just 50,000!

  • @Asaki9013
    @Asaki9013 10 หลายเดือนก่อน +3

    Wake up Intel!

  • @lockitdrop
    @lockitdrop 10 หลายเดือนก่อน +1

    Sounds kinda juicy for running LLM's via CPU. Curious to see what people do with this new line from AMD, should be fun

  • @MajorLeagueAwesome_
    @MajorLeagueAwesome_ 10 หลายเดือนก่อน +7

    This video needs 3 more sponsor ads.

  • @therealplatinumminer385
    @therealplatinumminer385 10 หลายเดือนก่อน +1

    I will fight anyone who says this is a desktop CPU when it costs 5 times what most gaming desktops cost and can handle a terabyte of ram

  • @GGnext.crazycro
    @GGnext.crazycro 10 หลายเดือนก่อน +3

    Games are not coded for this number of cores, but! What if you install a Windows VM and run the games in the VM? This way you would be able to define which cores the VM can use... It's at least an idea to try....

    • @Nelo390
      @Nelo390 10 หลายเดือนก่อน +1

      To explain why this wouldn't help would take too much effort, but it wouldn't.

  • @infin1ty850
    @infin1ty850 10 หลายเดือนก่อน +2

    I love the Threadripper series of CPUs, but I have no idea why anyone would find a way to legitimize purchasing one if their primary use is gaming.

    • @Thecyclingeconomicsdoc
      @Thecyclingeconomicsdoc 10 หลายเดือนก่อน +2

      They are terrible for gaming. They aren't made for it and games aren't optimized for them. I learned the hard way. 😢😢😢

  • @sakurajin_noa
    @sakurajin_noa 10 หลายเดือนก่อน +3

    Could you start including compile benchmarks in your productivity tests? These are relevant not only for devs but also for many engineers, so seeing how a CPU like this performs would be nice.

  • @publicspeaker4009
    @publicspeaker4009 10 หลายเดือนก่อน +1

    The amount of 420 jokes LTT have been making has me thinking Linus is finally joining team green 🍃

  • @jodyyondko
    @jodyyondko 10 หลายเดือนก่อน +3

    This is my daily reminder that Im never going to afford a pc.

    • @Dessert_And_Tea
      @Dessert_And_Tea 10 หลายเดือนก่อน +1

      linus never fails to spread my cheeks and fill me up with happiness

    • @Pasi123
      @Pasi123 10 หลายเดือนก่อน +1

      @@Dessert_And_Tea Oh, happiness. I thought you were talking about something else

  • @staffie85
    @staffie85 10 หลายเดือนก่อน

    IMO it’s should be LEDT for the i3’s and Pie’s of the world, GEDT (Gaming end desktop) for the rest and then HEDT for Threadripper.
    6400 ECC memory 😮 I thought ECC memory was usually magnitudes slower than non ECC memory?

  • @thekoz5158
    @thekoz5158 10 หลายเดือนก่อน +5

    AMD is back but chip looks HUGE!

  • @fedupguy2004
    @fedupguy2004 10 หลายเดือนก่อน +1

    Although it's overkill for most home use cases, I don't think that's the point. Wendell's Threadripper vid today showed him running a game very fast on Proton on Proxmox whilst doing many other things. In 5 years' time we will be running low-power ARM-based PCs with 100s of cores doing many, many things at once, many in the background using AI to support work and home tasks, whilst streaming content to many lightweight clients for the whole family - like the new PlayStation handheld device. It would be useful if LTT set up a video trying to simulate this with a new 96- or 128-core system on a hypervisor base, as new HEDT systems like the forthcoming Mac Studio Ultra and Qualcomm 12 ARM Core System are released next year. The aim being not to see how much compute is left on the table, but with compute fully utilised, how powerful as human beings we can be.

  • @Kennephone
    @Kennephone 10 หลายเดือนก่อน +3

    I think these have their place for one big reason, they have as many cores as Epyc, but they're clocked much higher, so you get the best of both worlds.

  • @speckkatze
    @speckkatze 10 หลายเดือนก่อน

    Damn that upload schedule is crazy, not even 24hrs in between!

  • @dagarath
    @dagarath 10 หลายเดือนก่อน +30

    Definitely planning a Threadripper 7000 system for the future, just need to find a motherboard and a nice case that can support multiple 1200W PSU so I can throw in 3 or 4 5090 GPUs since it will all be for high detail 3D modeling and rendering.

    • @stuffstoconsider3516
      @stuffstoconsider3516 10 หลายเดือนก่อน +1

      My problem with these high-end multi-core CPUs is that most software and applications do not utilize the entire silicon real estate, especially at the consumer level. This is an overkill for us average Joe even if you need an amazing PC.

    • @Wrublos212
      @Wrublos212 10 หลายเดือนก่อน

      Dang, that's a lot of computing power :D It's great that such a CPU hits the desktop market.

    • @doublevendetta
      @doublevendetta 10 หลายเดือนก่อน

      @@stuffstoconsider3516 that's because this product isn't FOR you, so I don't understand why you have a "problem" with it?

    • @dagarath
      @dagarath 10 หลายเดือนก่อน +5

      @@stuffstoconsider3516 It's completely fine with me, I am ok with forcibly committing cores to specific applications so they don't overlap, more cores means more apps without overlap.

    • @knifetheirishman8976
      @knifetheirishman8976 10 หลายเดือนก่อน

      @@stuffstoconsider3516 Most of the technology and innovations in these absurdly high-end monstrously powerful CPUs will eventually trickle down into the consumer end hardware. Which is genuinely a nice thing with EPYC giving us a bit to look forward to in terms of hardware innovations.
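
For what it's worth, the core-pinning @dagarath mentions above is a stock OS feature; a minimal sketch on Linux (os.sched_setaffinity is Linux-only, and the 3/4 split below is an arbitrary example):

```python
import os

# Pin this process to a subset of the cores it is currently allowed to use.
available = sorted(os.sched_getaffinity(0))                  # cores we may run on
render_cores = available[: max(1, len(available) * 3 // 4)]  # e.g. reserve 3/4

os.sched_setaffinity(0, render_cores)                        # 0 == current process
print("now pinned to:", sorted(os.sched_getaffinity(0)))
```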

  • @ashvinla
    @ashvinla 10 หลายเดือนก่อน +1

    For AI, video and rendering workloads, GPUs are far superior. For VMs in a data center, EPYC is the way to go. So what is this good for?

  • @MrChessy-uf5ob
    @MrChessy-uf5ob 10 หลายเดือนก่อน +3

    Nice vid Linus! It’s funny to see you test things that are twice my computer’s price.😂😂

  • @pieterrossouw8596
    @pieterrossouw8596 10 หลายเดือนก่อน +2

    I read somewhere on reddit that the Ryzen C cores have power efficiency to rival Apple Silicon - which would be amazing considering it's still x86. Is this confirmed, or is it like in a super niche edge case?

  • @MarkHawk
    @MarkHawk 10 หลายเดือนก่อน +6

    Linus, please do 3 levels of budget builds with these new Threadrippers when they are in normal rotation next year. I need suggestions and inspiration! I would love to see builds for people looking to render CG shots. I'm not really gonna game on this, and I don't mind seeing those performance numbers, but I mainly care about various types of render times for various types of artists.

    • @MrA6060
      @MrA6060 10 หลายเดือนก่อน +4

      ah yes the 3 levels of budget being 10k, 20k and 30k

    • @MarkHawk
      @MarkHawk 10 หลายเดือนก่อน +1

      @@MrA6060 This type of PC is for work for me. So I get to go a little all out for it since it's how I make my income for the next few years. If the render times are worth it, these prices can be justified :p

  • @Frisky_Panda
    @Frisky_Panda 10 หลายเดือนก่อน

    I didn’t even notice they stopped making these! Can’t believe that

  • @V8Power5300
    @V8Power5300 10 หลายเดือนก่อน +3

    I'm looking forward to these coming down in price to reasonable levels in 5 years or so. I'm really looking forward to get my hands on 64 overclockable cores. This might actually break the 1kW mark as a daily driver

  • @Trevorcraft71
    @Trevorcraft71 10 หลายเดือนก่อน

    this thing is WILD
    i mean, I bought a 1920X waaaay long ago thinking I'd get more use out of it, but to see them jump from $800 USD for the top of the line back then to $5000 NOW is ridiculous. but the performance, good heavens

  • @Zer0_Smith
    @Zer0_Smith 10 หลายเดือนก่อน +1

    Ngl I kinda wish the normal TR 7000 supported UDIMMs, with RDIMMs left for the Pro lineup. I know this is 100% a dumb wish, but I use TR 3000 for my main desktop and I liked that it was this insanely powerful system that could still look like a high-end gaming desktop. With RDIMMs (and the new motherboards for 7000), these are very clearly workstations, and nothing else.

  • @PanduPoluan
    @PanduPoluan 10 หลายเดือนก่อน

    The new Threadrippers are perfect for oil reservoir simulation.

    • @Teluric2
      @Teluric2 10 หลายเดือนก่อน

      Can you explain what software and OS is used for oil simulations?

  • @SONYSIKE
    @SONYSIKE 10 หลายเดือนก่อน +1

    tell msi to bring back the dragon logo on laptops!!!

  • @James2210
    @James2210 10 หลายเดือนก่อน +1

    Just bought a shiny new 12th gen laptop, fresh without AVX-512 or undervolting. Feels like I threw my money in a dumpster fire. Maybe I'll move to AMD next time.

  • @Arnaud57915
    @Arnaud57915 10 หลายเดือนก่อน +1

    I miss the 5960X days
    When is Intel releasing a quad-channel 16-P-core CPU or something?
    That would be a killer in games

  • @Kaptime
    @Kaptime 10 หลายเดือนก่อน +1

    HEDT is too insecure as a SKU to be worth investing in imo, Intel had one then got rid of it, AMD had one then got rid of it. I'd rather just go straight to the server lineup of CPU's or to a dedicated render server (if I ran an edit suite) and use consumer 7950x's in the clients. If it saves dev hours I guess it is worth it, but I just don't see it rather than just picking between server/client tiers already.

  • @mike_dodane
    @mike_dodane 10 หลายเดือนก่อน

    Claw of death holding onto that thing

  • @dtibor5903
    @dtibor5903 10 หลายเดือนก่อน

    Linus happy. Linus got some Threadrippers. The sponsor money ripped the threads off his pockets.

  • @S1nGuLariTY_
    @S1nGuLariTY_ 5 หลายเดือนก่อน

    0:22 Half the speed but a fifth of the money, Linus.

  • @SuodesTzeos
    @SuodesTzeos 10 หลายเดือนก่อน

    no words on efficiency?? i'm sure its a pretty important topic!

  • @Anaerin
    @Anaerin 10 หลายเดือนก่อน +1

    How does it compare to Epyc? What's the price/performance advantage of these over just grabbing an Epyc server/workstation board if you're already spending that much money?

  • @jarrettembry643
    @jarrettembry643 10 หลายเดือนก่อน

    I’m glad they made it, but how stupid is it that the relatively few people who have need for HEDT now have trust issues with them…

  • @ChewableMeteor
    @ChewableMeteor 10 หลายเดือนก่อน

    Its double the size, makes sense

  • @arctrog
    @arctrog 7 หลายเดือนก่อน

    You should start running FPS tests in Squadron 42 when it comes out; it's feature complete now, so beta is all it's got left to go through

  • @reikoshea
    @reikoshea 10 หลายเดือนก่อน

    Editor Feedback: Sound effect at 0:06 sounds like the default iphone ringtone at the start.

  • @moist_ointment
    @moist_ointment 10 หลายเดือนก่อน +1

    Would love to see you guys do a Xeon Max review.
    Not because it makes sense, but I really want to see what that on-package HBM does for gaming

  • @gloriosatierra
    @gloriosatierra 10 หลายเดือนก่อน +1

    Maybe the sweet spot is 8-16 cores.

  • @dallinsprogis4363
    @dallinsprogis4363 10 หลายเดือนก่อน

    Great video on AMD’s new Threadripper processors, loved it!

  • @CameronHarris1986
    @CameronHarris1986 10 หลายเดือนก่อน +2

    Although it's not for gaming, I would like to see what the CPU usage is like on a Cities: Skylines 2 map with 200k+ residents.

    • @chiefjudge8456
      @chiefjudge8456 10 หลายเดือนก่อน +1

      Threadripper is great for gaming, but it can ALSO do many other things at the same time. A CPU is for anything you want it to be. Threadripper can do it all if you need the power.

  • @Aaron-zl5gq
    @Aaron-zl5gq 10 หลายเดือนก่อน +1

    As excited as I am for this, the prices compared to prior TRs are disgusting; even as a pro user those prices are gross

  • @chunk3875
    @chunk3875 10 หลายเดือนก่อน

    I’m gonna buy this for a Minecraft server and school work.

  • @hardwire666too
    @hardwire666too 10 หลายเดือนก่อน +1

    "Still recommend a GPU for your Blending needs" WRONG!!! GPU is only good for small projects that will ALWAYS fit into vram. It doesn't take long AT ALL to have a project that will devastate 16 or even 24GB of vram. CPU is still the go-to for larger projects that you don't want to fail or crash. Sometimes it's not about speed, but reliability. So essentially video game asset development, sure go all GPU. Rendering a demo reel or doing gig work, CPU. All the way. Making a processor that can compete with a GPU is very attractive for some of us.
    There is a reason why the big guys have CPU-based render farms.