Corrections:
3:01 - The M1 Ultra CPU has 16P/4E cores, not 16P/8E.
7:32 - The i9-13900K has 8 performance cores and 16 efficiency cores, not 16 performance cores and 8 efficiency cores.
7:16 - The numbers shown for the MacBook Pro 16 M2 Max are incorrect and are actually the numbers for the i9-13900K, as shown on the following graph at 7:23, where the numbers are accurate.
10:17 - There was a misunderstanding; Blackmagic Design hadn't replicated our results.
I have noticed, and completely agree, that Apple is snail slow. I switched to Apple after my last three PCs bricked less than a year after purchase. The Apple computers lasted me for years.
No, Apple, you cannot use a random number generator for the graphs, even if it is very convenient. And yes, you need to label your axes, even if it doesn't look as good.
Oh, and... one more thing. Here at Apple, we are proud to bring you GRAPHS. Our engineers have been hard at work removing every axis label to give it a clean, streamlined experience that will revolutionize the way you use data.
This is why we NEED independent reviewers to look at products and test them against the claims of the manufacturer (even if they make it impossible with vague wording so they can't be called liars) and why I was so excited to hear about LTT starting up the Lab.
@@taruninja That was a joke btw XD. Everyone is free to do what they want; if they can afford it, there is no need to stop them buying overpriced things. They simply just don't care about performance and the like. I don't even know why Apple keeps this in their presentations; it's probably for the shareholders to see they are doing something lol
The fact that LTT Labs can now test performance at different ambient temperatures is great. Living in a relatively hot part of India (35 degC - 40 degC ambient ) with no air conditioning, I remember not being able to play high performance games in the summer due to excessive stutter. The same games used to work great during the monsoon and winter months.
I can't believe a $3.5k VR headset makes Meta's $1.5k VR headset less of an absurd price XD It's still absurd either way XD Same with Apple pricing and Nvidia pricing XD
It's honestly refreshing to see a shout out to the engineers. There's a lot of skilled people that work at Apple and they do not set the prices, nor do they decide how things will be marketed. A lot of the time they're given requirements that may not even come from a technical individual, and they just have to make it work.
If the Apple messiah Jobs was still alive and in charge, the engineers would be SOL, since Jobs didn't give a f-ck if it worked or not, as long as it was shiny, slick, blah blah. If the product catches fire or can't survive an hourly drop from 5 feet, those are features.
I imagine the enormous pain of the engineers when given the orders: NO expansion, NO repairability, NO space to breathe, and so on. The horrendous model that became selling laptops like smartphones: if any part fails, just buy a new one, because the cost of the crippling (and often impossible) repair doesn't make sense. In Jobs' era, at least, there were machines which through the years became icons of productivity, repairability and longevity. In my case, I had the last of the Intel MacBooks, the 2015 15" with a 1TB M.2 disk (installed by me, of course) and maxed at 16 GB RAM. People today love being shown bling from Apple instead of solid, noble, easily repairable, well designed and durable stuff. They got away with it.
@@Tech-geeky I just don't know. I'm an almost 50-year-old graphic + motion designer with 30 years of using Apple computers (desktops and laptops); because of that, I can talk about Macs. If other brands do shitty practices too, it's a bad sign of the times, isn't it?
Or if they simply licensed AMD RDNA3 cores, or a whole chip, and optimized software for it. AMD cards can do so much, but they lack the resources to get the software /driver optimization Nvidia can provide.
As a photographer and 3D generalist, I had been saving for a year and a half to purchase the Mac Studio with top-notch specs, including 2 TBs with the M2 Ultra chip. However, it turned out to be ridiculously expensive. Little did I know, I could build my own custom PC. With the same budget, I managed to get a fully loaded RTX 4090, an i9 13900K, a whopping 192GB of RAM, and 4TBs of storage. Plus, the added advantage of being able to upgrade my computer in a few years. I am absolutely thrilled with my choice!
Yeah, it's always been hard to gauge Apple performance when coming from the PC side of things, so it's nice to see the Labs showing just how much Apple have been massaging their numbers. The power efficiency of those M2s is insane, though, but the performance claims are near illegal. Reminds me of when they called their AirPods Max "the ultimate listening experience" and I just had to lol in my Orpheus.
When Apple compares itself against products other than its own with "It's faster than...", it's just marketing. Those on the outside don't know, because they don't make the product/software being compared to. It's guesswork only. I reckon Apple would rather show something there than nothing at all.
@@lostkostin sure. and with all those parts you'd for sure be able to build a PC with the form factor of the Mac and the power draw, noise, and performance per Watt. Go ahead and build it. And yes, noise and power draw DO matter.
Thanks for including higher ambient temperatures! You guys take so much flak for the Labs, so, as someone who lives in a hotter climate, this makes the Labs an awesome addition to LTT. I'm very glad that you are putting your money towards better testing methodology :)
@user-qj3iu3dl7f Your reply bears no relevance to the comment you replied to. Furthermore, while I dislike most Apple products myself, raw performance does not always mean a better product. That efficiency is amazing. My "Windows" gaming PC can heat up my room to annoying levels.
What's the benefit of testing at different temperatures? I compared the graphs and they all reach the same numbers regardless of how hot or cold it is, it just happens faster or slower.
This is the kind of video that makes LTT so great. I can't imagine anyone else being able to make such an extensive video like this that is also entertaining and easy to understand.
*Insert Mac fanboys hating on the video for judging it too harshly* *Insert Mac haters hating on it for not putting it down enough and praising it too much* Unfortunate that no matter how honest and factual it might be, such huge chunks of people are still willing to be unreasonable.
Gamers Nexus could do a video just as good as this one if not better BUTT (and that's a big butt) they don't care enough about Apple to do it, as it should be. At this point I declare Apple is a cult.
@@MegaGasek GN would make this into a 35 minute snorefest with so many graphs half of the audience would fall asleep. I love the guy but his videos are not for everyone.
We are really starting to see the benefits of LTT Labs for all of us in the computing industry. It's awesome having LTT as one of the channels pushing itself further to get better quality comparisons and backing it up with tests. Great to see, because we needed that to settle debates and call companies out. When it's good, it's being said. When it's bad, well, the numbers speak for themselves.
I mean, when it comes to Apple you don't even need to test anything. You just look at the specs of their stuff and then at the pricing, and you know it's at least three times as expensive as it should have been, and therefore almost never worth it.
As an Apple user, I hope you don't back down. They lie constantly and get away with it. Very frustrating as a customer to buy their products and find huge gotchas all the time.
Please stop buying their products. Apple sucks giant balls. They treat their customers like shit. It took me over a month to delete my Apple ID/Account because they wouldn't just let me do it myself. I had to CALL them like 5 times to delete my account. Unreal.
Noise reduction is an important part of color grading in Resolve. I'm a professional colorist. NR is very hardware-intensive, so it's a good way to test your GPU; render speeds are also important. To be able to play back large files on multiple tracks with noise reduction enabled at full project fps is a good thing. Having said that, the RTX 3090 and 4090 are also capable of playing large files with noise reduction at a much lower price compared to the new Apple chips, so that's definitely something to think about. And with a proper PC workstation you have the ability to upgrade your GPU later if you need to, without buying a completely new machine.
Loving the hard data to back up the debunking of performance claims. It's great to see more channels doing these kinds of involved and necessary tests.
Doubly nice that they're taking a neutral stance, running a ton of tests (more than just performance), and being able to give more nuanced conclusions. It would have been super easy to bypass any talk about how efficient the system is and focus on debunking their marketing numbers for views.
Somehow these review sample-less, completely honest Apple reviews I’ve seen in the last few years from LTT are just the best. Too much of the Apple content on YouTube is relentlessly positive and Apple-worshipping. I’m not an Apple hater (I use an iPhone) but the realistic, completely honest approach from LTT to these products is incredible. Thanks team.
Reviews of most things can be very hard to trust, as you never know if favoritism is playing a part: reviewers want to keep their invites to shows and freebies from companies, and if they stray out of line and say a product is bad (even when it is), they lose all that. You need to know what deals they have with the company before watching, really.

Game reviews are the same: reviewers know that if they say anything bad about a company paying them for a review, they can say bye to their invites to shows and free review copies, and then they have to go buy it themselves and don't get the scoop on unknown things or early review copies. It is the same with Apple reviews and just about everything else you can think of.

Only people who pay out of their own pocket will give a somewhat real review, but at times they are trying to get into that inner circle, so they might not say how bad it is. That is why you need to watch a few different reviews to get a picture of what you are looking to buy; one review alone is never enough, as it might brush past the bad stuff while others don't.
Well, almost every reviewer is going to hold their tongue a bit since their business relies on sales and not every one is a hardware nerd. Shoes are some of the worst culprits.
This video is 2x better than the last one. 3x more burns and 4.5x funnier! Most people won't notice the "performance" gain from M1 to M2. Just you wait for the M3 next year. It will just keep getting better.
I’m really glad your Labs are revealing these things, because it’s extremely concerning how vague their claims are; they really don’t make any clear point to the average consumer, or even entry-level or intermediate consumers.
That's because they know how much of their userbase are not tech focused. It's sort of what apple's market is overall. They are very much doing this on purpose.
"it’s extremely concerning how vague their claims are and really don’t make any clear point to the average consumer or even entry or intermediate consumers" The average consumer doesn't buy a workstation... The computers we are talking about here are dedicated to high-end video editing, color grading, VFX, 3D rendering, etc. That's not an "average" use, nor an "entry level" use.
I always felt like the Mac Studio is an odd case where it’s overkill for most people but not justifiable for a good portion of the people it’s trying to cater to: professional industries. Potentially one of Apple’s oddest modern products since its introduction.
It's a pro product. Pro as in actual industry professionals using it who make a living off of creative content. Not "pro" in the loose sense of just high-end consumer content.
For the “average” consumer it is overkill, but for REAL professional “prosumers” (say, mom-and-pop productions using SmallRig cages and shooting weddings, versus Hollywood Arri Alexa studios), the Studio is priced just right, I think.

It seems odd because there's a good chance its consumers wouldn't be outside the average social circle. But then again, not very many in that same social circle would buy into it. The average Joe wouldn't buy it, but they likely know at least one person who might, as a “producer” or someone with deeper pockets casually flexing. Not everyone would know someone, though, who would use an Arri Alexa.

If you want to see “odd pricing”, there are knobs off tripods and cameras that could easily be 3D printed for 5 cents being sold for hundreds of dollars at the higher end of Hollywood. Now THAT'S pricing that would make even Apple execs blush.
That’s the Mac Pro for me. If the Mac Pro started at the Mac Studio's price, we would be having a different conversation. The Mac Studio is way cheaper.
It fills the previous hole in their line-up between the Mac Mini and the Mac Pro. Except now, there is absolutely no reason to buy the Mac Pro other than having lots of local storage, and really if you need that sort of storage, you should put it in a NAS and connect it via a 10Gb ethernet cable.
I just wanted to say that the Labs graphics look SOOO good. Very easy to read in video format compared to the previous style and they still look attractive. The slight gradients when highlighting specific sections was a nice addition.
04:50 I see what you did there! A perfect "storm" of conditions. That's some great writing! For the people not getting it, look at Apple's project names for the Mx processors: Firestorm, Icestorm, Blizzard...
Apple: we love the environment and ask you to pay extra for chargers. Apple: your devices are too old and we're trying to limit your experiences by not allowing you to download any software
I absolutely love that it feels like a 100% unbiased, honest review. Apple definitely should have marketed how efficient it actually is. That power/perf of this is unbelievable!
The only way to fight temperature increase is to do some water cooling or something with the i9 and 4090, which would probably end up totalling more, and with a way higher energy bill
@p15209 I live in a country with expensive electricity, and I could run that setup and end up saving money for the lifetime of that PC compared to simply buying the M2. The efficiency savings are minuscule, even if you compare them with running the PCs idle for a whole year.
@@Bob_Smith19 The reason NVIDIA gets away with it is because they simply are the best. They hold the market by the balls and AMD can't catch up. If you want the best feature set and performance, you have to get NVIDIA. Apple does not have that sort of market hold; their value comes from customer ignorance, vanity and laziness. Apple is not on top because they are the best and most cutting edge, they are on top because their brand is powerful and people hate change, and also because their branding is so strong there is a level of snobbery their customer base feels entitled to.
@@Eagle3302PL "Hate change"? I mean, no, they made big changes with their M1 silicon. It's just that Apple people "hate seeing changes" XD. It's the whole shtick about Apple: hiding things from their customers and making sure their customers won't think of complaining lol
It's a shame, because Apple's engineers really put some solid work into the SoC! But between locking some power options, bad pricing and bad marketing, they really do their community a glaring disservice.
@@vlcheish Better value than two years prior does not inherently equal good value, especially when the product is blatantly marketed in a rather deceitful way.
The catch on the headphone jack comparison spec was chef's kiss. Thank you for the detail-oriented work on this review. This sets the bar high for product reviews, in a good way. Great job!
@@Summer-us5ql as a Mac owner, I am incredibly angry at apple. I'm so angry that I'm storming to the nearest apple store to trade in all my apple devices 😡
Your point at 16:07 is why I switched back to a PC last year when the Studio was announced; your dollar figures are close to what I paid for a completely overkill PC (lesser video card than a 4090, though). Swapped out my 12900K for a 13900KS this year for a net $400 additional. That runs my work faster than this year's M2 Ultra. It was a no-brainer. Also at 16:50, good point: if you have multiple jobs that are able to run in parallel, 4x Mac Minis is an option at that price!
Something to make the Apple fans a bit angy? _Don't mind if I dooooo_ On a more serious note, I do appreciate you guys calling out manufacturer marketing BS. I know that marketing's job is to fluff up the capabilities of the product, but misrepresenting performance is a big no-no.
@@johnsalamiiFair enough, though I think you can substitute any other manufacturer fan for Apple fans. Liking something is fine, being an obnoxious fanboy isn't. :b
Would be interesting to see the total power usage for a given render. While the PC might use more power, it finishes much quicker. The PC was 3 times quicker, so a 2-hour render on the PC would be a 6-hour render on the Mac. So what is the power saving over an actual workload?
I had the same thought. Only in a CPU v CPU (13900K) scenario could you perhaps make an argument for efficiency, with M2 Ultra having close to 70% of the former’s performance.
I really like that LTT is now making comparisons in different ambient temperature scenarios. I live in Brazil, and the biggest reason I switched from a Dell XPS 13 to a MacBook Air M1 was the amount of heat that the previous one generated in my room. I basically had to use my A.C. all the time because of that.
Sounds about right. My Dell Precision 5550 for work (basically an XPS 15) is constantly thermal throttling and even emergency hibernates from time to time from an overheat event. It both runs hot and is unable to cool itself properly despite having fresh thermal paste and a dust-free interior.
@@v1BroadcasterSome people don't understand why people prioritize insta and snap so highly. Which really is fine either way even if others don't agree.
I used the old Intel MacBook Pros in Australia during the summer, and man, it was a struggle. When the M1 came out I bought a MacBook Air, and my god, the performance in hot conditions was amazing. I main an M1 Max Studio now and I freaking love it!!
I love apple as a product company and how they’re one of the few design focused companies but regardless of how I feel about them I’m glad you’re holding their feet to the flame here Linus! Keep it up
@@MrMali22Eh, it really depends heavily on the product. Apple has a mix of very well priced products, overly priced products, and ridiculously priced products. Their base MacBooks are still great for the price right now, especially year old models. Their premium desktops simply don’t sell enough to be priced well honestly.
That's my largest criticism of Apple. They say they're focused on design, but then make extremely bad design choices. And that's why iOS is so clunky, lacks functionality, and some menus are straight up messy. I don't get how they don't look at the competition and go "this function brings such a smooth experience, let's implement it". This is written from my iPhone btw.
Or even a Ryzen 9 7945HX. It draws like 90W at a 32000 Cinebench R23 multicore score, and 34000 at 120W. That's pretty comparable efficiency to Apple, with superior peak power.
Also also, it would be fun to performance match a 13900K + 4090 system to the M2 Studio, because both the i9 and the 4090 actually have very good efficiency scaling curves when you performance normalize with lesser hardware. I can get the same results with my 13900K and 4080 sipping half of their max tdp rating, so a full system power draw for me under similar loads as the M2 Studio was put through, is actually very close in efficiency. But I still have the benefit of being able to crank my hardware to the max when I just need to get shit done - which Apple just can't do. And for professionals, when you want those calculations done and animations rendered, time and workflow counts 10x more than avg consumption over a full workday. It's not like either system is pushed to the max constantly anyways. But when they need to be pushed, I want my system performance to go through the roof instead of being gimped by whatever tdp target and inefficient airflow design Apple is operating with.
The problem is the 7950X3D is laser-focused on gaming. Sure, it's a fairly efficient chip, but productivity is lacking due to its lower base clocks, and most productivity tasks do not benefit from the additional cache.
My main takeaway from this is that if you have an M1 Mac Studio as a workstation, then you won't be rushing out to buy a new one. The upgrade isn't worth it. If you've got a beast of a PC, then this also isn't gonna be an upgrade you need to have. Having said that: it's insanely power efficient. And I for one will happily take 80% of the performance with basically no noise and no space heater sitting on my desk. That's actually awesome.
But you could also just get more efficient PC parts to match the Macs performance and they would also cut the power down tremendously... Price as well.
@@gyrozeppeli00 Oh, Apple flat out lying on the benchmarks is inexcusable and I don't get why they do it. They got the perfect marketing angle by selling this thing as a power efficient machine that runs cool and quiet. Just go with that.
@@kentsutton4973 if you know anything about DIY you would know it’s not possible to get that power consumption even with a GTX 960 that’s nearly 8 years old, let alone a whole built PC.
Yep, an M2 Pro MacBook or Studio will have you covered for years of music production or video editing. I've got many friends with one (and even the M2 base model) and they're really happy with it.
@@cavifax I always like how, when Apple falls behind in specs, it is all about feelings ("I'm really happy with it, it just feels better... blah blah blah"), but the second Apple wins in the specs, it is all about value and power. You all act like children.
@@thomgizziz Those MacBooks and the Studio are fairly decent in their pricing. I would not defend the M2 Ultra or the Mac Pro, and I won't deny there's an "Apple tax" and their ecosystem is a trap, but you can get an M2 base Mac Mini from 600 USD that would smoke any other 600-dollar PC for video editing.
Super useful review. I couldn’t agree more that ultra-low power and quiet operation should have been the marketing value proposition. That’s exactly why I bought a (refurbished) M1 Ultra. As far as the price, I have a different problem with it. You compared a gaming PC with gaming parts in it to this thing. In the Intel days, Apple’s “pro” meant real, honest, workstation parts. Xeon CPUs, workstation graphics cards, ECC memory, server-grade storage options, so of course I expected to pay more than for a gaming PC. Now, I’m paying way more to get high efficiency iPhone silicon in much greater numbers inside this box than what fits in my iPhone. I’m not sure that’s worth as much a Apple charges just for the low power, even though the low power was the compelling feature that drove my purchase.
This video is insane! It destroys the marketing in the best way possible: completely objectively. There is no way of interpreting this wrong when the numbers speak for themselves.
I know a commercial and indie producer that’s always been an Apple fan. He still is, but no longer uses them for professional video production. Simply because there is no form of GPU expansion. Not even an external eGPU via Thunderbolt 4 is compatible. What you got is what you get. And the M2 Ultra is still not powerful enough for him to do his color grading and edits in Resolve.
I use a 14" MacBook Pro, and absolutely love the hardware (the software is very much a mixed bag); overall I'm extremely happy with it. But this marketing cr*p from Apple has been going on for years and is totally unacceptable.
I still believe, that for the average office user, this is true. You rarely need to maintain the MacBook. Windows just fails way too often. I’ve lost my data so many times because of Windows. Never happened with Apple.
I think one of the great values of macOS is that it's Unix-based and therefore generally easier to work with for programmers. Yeah, there is the Windows Subsystem for Linux, but I still see it as a hassle to use, and you cannot use a Linux package manager for installing Windows apps. Then again, I kinda hate the rest of macOS and I'd rather use Linux.
If you inspect element on the graph bars on Apple's site, you can see the percentage widths: the top bar is 100%, and the Core i9 iMac bar is 16.3934% on 3D rendering.
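If the bars really are drawn proportionally at those widths (the 16.3934% figure is this commenter's observation from the page markup, not something verified here), the unlabeled chart's implied claim can be backed out with one division:

```python
# Back out the performance ratio implied by an unlabeled bar chart,
# assuming the bars are drawn proportionally to the scores.
m2_ultra_bar = 100.0   # top bar, percent of full width
i9_imac_bar = 16.3934  # reported width of the Core i9 iMac bar

implied_speedup = m2_ultra_bar / i9_imac_bar
print(f"Implied claim: {implied_speedup:.1f}x faster at 3D rendering")
# -> Implied claim: 6.1x faster at 3D rendering
```

A labeled axis would make this ratio explicit; without one, readers have to reverse-engineer it from the markup.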
Apple should make servers. You'd think pitching this energy efficiency to companies could definitely make business owners switch to Mac servers. People who have used Mac servers back in the day preach how reliable they are as well. Having a cheap-to-run and reliable server really sounds good.
For "power efficiency" to matter, you need to compare it to the "time taken" to complete a task. It wouldn't matter that a device consumed only 100W if it took 5 times longer than a 200W device.
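The comment's own hypothetical numbers make the point concrete; a minimal sketch of the energy-per-task arithmetic (the wattages and times are the hypotheticals from the comment, not measurements):

```python
# Energy per task = power draw x time taken, not instantaneous draw.
def energy_wh(power_watts, hours):
    return power_watts * hours

task_hours = 1.0                       # time the 200 W device needs
slow = energy_wh(100, task_hours * 5)  # 100 W device, but 5x slower
fast = energy_wh(200, task_hours)      # 200 W device, done in 1 hour

print(slow, fast)  # -> 500.0 200.0
```

So the nominally "efficient" 100W device would burn 2.5x the energy per completed task; only measured wall-clock times let you compare fairly.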
I'm debating buying a Mac and switching my Windows laptop to Linux, because having switched my major from mechanical engineering to computer science, I am no longer locked into Windows. And whatever beefs anyone may have with macOS, it plays really, really nicely with everything I need and want to do.
About the comparison at the end: the thing is that you can keep using some parts of the PC for up to 10 years; I'm not sure you can keep a Mac Studio running if an internal part breaks.
My parents' Dell Inspiron we got in 2016 still functions to this day. Never upgraded, never been in for repairs; the occasional reinstall, but no other issues. She's happy. Recently replaced the SATA HD with an SSD and added more RAM. Aside from reseating the memory every so often due to freezes, it's still OK. Most Apple users would go through several Macs in that time. I know I've gotten 3 or 4 new Macs since then, not because they die, but because I "wanted" to upgrade. In that timeframe I've had one Mac brick after a re-partition. So just because we pay the expense doesn't mean it's all green.
@@Tech-geeky I've been using an i5 4440 with the iGPU since 2015 and only had to replace the power supply because its fan died and it started to overheat. I'm getting a Ryzen 7700X and RTX 4070 build next week but am still gonna keep this computer for general web browsing and stuff like that for the family. It probably can go for 5-6 more years; maybe toss in more RAM, a SATA SSD and a GTX 1650 and it could be decent for lighter games too, good enough for my younger brother. You can't get that with an Apple product.
@@Tech-geeky My first PC which I got as a hand-me-down in 2006 still works without any repairs (only the RAM has been upgraded). It's an Intel 440BX based Celeron 333MHz system built in 1999. These days I mainly use it to run older software and games. My main PC is also old, I built it in 2016 (and rebuilt to a new case in early 2017) from used parts and I've done some upgrades (storage, RAM, GPU) over the years. I had to replace the motherboard in 2021 because lightning killed it, I was lucky to find one of the same model for cheap and even got a much better CPU cooler with it. Asus P6X58D-E, X5670 6c/12t @ 4.4GHz, 24GB RAM, GTX 1080
I never realized I needed it until you did it once. But when you’re comparing 2 metrics in the script, highlighting it visually is very helpful in digesting the data.
As an M1 Ultra Studio owner, a couple of other points: 1. There’s no mic, so that’s another USB port (webcam in my case) 2. No hardware-accelerated ray tracing means Unreal and Twinmotion are not fully compatible (and run terribly). Anyway, your conclusion was spot on: looking forward to a 13900 / 4900 desktop for heavy lifting and a Mac laptop (daily driver). Also, it’s infuriating that Tim Apple nerfed the Mac Pro, essentially painting themselves into a corner all over again. RIP Bootcamp 😢
4900? Do you mean „nVidia GeForce RTX 4090“? The other digit number I could „translate“ with googling it: „Intel Core i9-13900K“. I really hope A.I. will be available one day to help commentators writing their comments :-)
Efficiency is probably the last thing on people's mind when they're paying top dollars for performance & power. I don't think you'll find a single Lamborghini owner asking the seller but "how much mileage am I getting per gallon?" More like, "how fast can I go?!"
@@retrocomputing Trivium, one of the most popular metal bands in the world, uses a setup comparable to the one Linus showed, with a 4090, custom built by Gamers Nexus, and it's silent while giving the best performance you can get, plus you can play the latest games at 4K 100fps+ with ray tracing/path tracing. The vocalist also does streams on Twitch with it. Efficiency is impressive, but in the end no one cares; raw power and freedom are king.
@@retrocomputing A properly set up studio would have the PC in a noise isolated room or set up with multi slot 140mm water-cooling rads so they can run fans at like 10% speed.
@@thomgizziz that's what I heard from audio guys anyway, they use these Macs because of the power/noise ratio. Do they lie to themselves? I doubt it. But if you're into SFF stuff then you know about using lower grade stuff and undervolting, and you know it doesn't mean that it's going to be cheaper than a normal PC. You don't sound like a knowledgeable guy though, just a fanboy who's angry for no reason.
Apple are going back to their PowerPC days with their graphs. I remember seeing graphs with unlabelled X-axes back then too. Or they would overemphasize any performance difference by scaling the graph in such a way that a 5-10 point difference in a benchmark would look massive when the score was measured in thousands. Or simply skew the results by not mentioning what benchmark they used.
The power consumption stuff is great, but when the 4090/13900K machine can complete renders/tasks way faster, then the additional power requirements don't really matter. You're finishing the job quicker, which means the higher wattage is only needed for a shorter time. It's like running a mile vs walking a mile: it's the same work that is getting done.
Keep in mind, the Windows machine will still consume more power at 30-60 percent utilization compared to the Mac at the same level, and that's where these machines will spend most of their time. Also, running a mile still leaves you more tired than walking a mile.
These are desktop machines and people want to talk about efficiency. The two don’t go together. My wife and I do contract editing from home, and our electric bill is $1,500 a month. But we make $15-20k a month from it. Efficiency is not something we are worried about. We run Mac and Windows machines for our projects.
@@miloattal9313 A 13900K PC with a 4090 and 128GB of RAM costs about 3800 dollars on Amazon, while the M2 Ultra with the same 4 TB of SSD and 128 GB of RAM would cost 6800 dollars. That price difference would never be made up in years of power bills, and you get a worse computer for the price.
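A rough payback sketch supports this claim. Only the two system prices come from the comment; the extra power draw, daily usage, and electricity rate below are illustrative assumptions, not measurements:

```python
# Rough payback period for the price gap cited in the comment above.
# ASSUMED values (not from the video): 250 W average extra PC draw
# under load, 8 h/day of such load, $0.30/kWh electricity.
price_gap_usd = 6800 - 3800     # Mac Studio vs PC build, per the comment
extra_draw_kw = 0.250           # assumed average extra PC draw, kW
hours_per_year = 8 * 365
rate_usd_per_kwh = 0.30         # assumed electricity price

extra_cost_per_year = extra_draw_kw * hours_per_year * rate_usd_per_kwh
payback_years = price_gap_usd / extra_cost_per_year
print(f"{payback_years:.1f} years to recoup the price gap")
# -> 13.7 years to recoup the price gap
```

Even with these deliberately PC-unfavorable assumptions (heavy load every workday, pricey electricity), the efficiency savings take over a decade to close the gap.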
Spoiler: the M2 won't even hold a candle to the machine learning performance of a 4090; all AI frameworks were made with NVIDIA CUDA acceleration in mind since ages ago.
@gabrielesilinic Right, the hardware and software support. In my mind, I was just thinking about the essentially nonexistent dedicated VRAM of the Apple GPU, but of course there is more to it.
It's always interesting to see what Apple is doing, even though they fundamentally ignore technical users who may have even slightly less mainstream use cases. For example, if you want/need mass storage (particularly with redundancy), you have to go external. If you want video capture, you have to go for something external. If you need multiple GPUs for any reason, you need something external. If you want a DVD or Blu-ray drive, you have to go external. The list goes on. You either wind up with a rat's nest of external devices around your computer (potentially bottlenecking the IO controllers), you need a desktop PC too to do the heavy stuff, or you just have to accept that there is nothing for you in Apple's ecosystem. It kinda makes the notion some people have that Apple will take over the world with their "better" hardware laughable. You can't take over a space if you don't have an entry in it. And Apple's entries are either super mainstream or disgustingly overpriced.
At least Thunderbolt makes the rat's nest a little less bottlenecked than if it were all hanging off of USB, but I agree. For storage you can use a NAS in these cases, hopefully with at least 2.5G. The rest is gonna hang off of the rat's nest.
There are a lot of cases where Apple caters to technical users, just not to you. Don't conflate what you want with what other technical users want. You could own a Porsche, which is great for the technical use of going around a track but not for hauling timber.
@@sinni800dk if the issues I had were limited to just the machine I was using, but I found that one of the last gen Macs at work didn't allow for daisychaining for a bunch of peripherals. I was running an animation class at the time, so there was like....a decent amount to plug in? But I've never run into IO issues in any other case as long as I had dongles - this machine just wouldn't allow it. I doubt it's cos all the lanes were saturated, but QoL drops dramatically when a configuration seems to completely lock off fairly basic industry use cases.
Although in the previous generation we saw a great leap and a great risk taken by Apple, now it disappoints us again with false numbers... The numbers were too good to be true.
I know it's very niche, but I am quite curious about ML workloads and other GPGPU stuff on the M2 Ultra, for problems that would normally be limited by VRAM. With 192 or even just 128 GB you should be able to solve some massive problem sizes.
That seems to be true. You can load larger LLMs into RAM, for example. The RTX 4090 tops out at 24 GB if I'm not mistaken. Large Apple Silicon RAM options seem great for running AI models. In many cases RAM limits matter more than raw performance.
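As a hedged back-of-the-envelope illustration of why RAM capacity can matter more than raw speed for local LLMs: weight memory scales with parameter count times bytes per parameter. The model sizes and precisions below are assumptions for illustration, and real runtimes also need extra memory for the KV cache and activations:

```python
def weights_gb(params_billions, bytes_per_param):
    """GiB needed just to hold the model weights."""
    return params_billions * 1e9 * bytes_per_param / (1024 ** 3)

print(round(weights_gb(70, 2)))    # 130 -> a 70B fp16 model: far over a 24 GB 4090
print(round(weights_gb(70, 0.5)))  # 33  -> the same model at 4-bit quantization
```

So a 70B model that a 24 GB card can't hold even when heavily quantized together with its cache fits comfortably in 128 or 192 GB of unified memory, even if each token is computed more slowly.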
I really appreciate the 35-degree-Celsius ambient temperature test; in a tropical country it practically never goes below that all year, except (sometimes) in winter.
As an Apple user (I hesitate to say fan for a whole host of reasons not relevant to this video), I appreciate LTT calling Apple out on this. Apple's insistence on pie-in-the-sky marketing undermines the awesome engineering that they do pull off and just continues to annoy users like me who do want to buy their hardware. Every time they do this they push the love-hate relationship I have with them further towards hate.
They've been doing that since way before M1. My 2015 MBP was more repairable and, to a degree, the OS was more usable without security stuff getting in the way. *An informed user is a more secure user*, no?
That's a feature, not a bug. I'm a long-time dev and let me tell you, the stability, speed, silent operation and incredible battery life of MacBooks are worth the tax. Introducing OEM vendors and their shitty drivers always makes everything else shitty too for the entire OS. Apple controls 100% of each component's drivers, which means they can push OEMs to actually deliver decent software. Let me put it this way: W11 on ARM running inside Parallels on my Mac boots faster than it does on an i7 laptop I have.
@@karmatraining Boot time is mainly affected by the speed of the storage drive, so yeah, if the i7 machine has a slower SSD or, God forbid, a hard disk, of course the VM will boot up faster.
I do know someone who has ordered an M2 Mac Pro, but he's a high-end audio engineer (does Dolby Atmos mixing) who uses a ProTools HDX PCIe card with ProTools on MacOS. His studio also has a rack for rackmounting the system in. So yeah, it's a very tiny specialised niche that would get any value from the Mac Pro as compared to the Mac Studio.
Corrections:
3:01 - The M1 Ultra CPU has 16p/4e cores, not 16p/8e.
7:32 - The i9 13900K has 8 performance cores and 16 efficiency cores, not 16 performance cores and 8 efficiency cores.
7:16 - The numbers shown for the MacBook Pro 16 M2 Max are incorrect and are actually the numbers for the i9 13900k, as shown on the following graph at 7:23, where the numbers are accurate.
10:17 - There was a misunderstanding; Blackmagic Design hadn't replicated our results.
Apple workers are going to bully you in the comment section
Could it be that Apple compares the M1 vs the M2 at equal power input? That could produce larger deltas than measuring absolute CPU speeds.
btw in the description it says geforece instead of Geforce. Just a simple typo but thought I should lyk
The second correction makes it worse for Apple, since it's now slower than the Intel chip with half the performance cores.
I have noticed, and completely agree, that Apple is snail slow. I switched to Apple after my last three PCs bricked less than a year after purchase. The Apple computers lasted me for years.
No apple, you cannot use a random number generator for the graphs, even if it is very convenient.
And yes, you need to label your axes, even if it doesn't look as good.
lmao
Apple : hahahahahahah. No
Oh, and... one more thing. Here at Apple, we are proud to bring you GRAPHS. Our engineers have been hard at work removing every axis label to give it a clean, streamlined experience that will revolutionize the way you use data.
And yet people buy their shit and Apple keeps doing it
They used AI to generate their testing environment. Tags: "we win, blue and purple, accuracy optional"
This is why we NEED independent reviewers to look at products and test them against the claims of the manufacturer (even if they make it impossible with vague wording so they can't be called liars) and why I was so excited to hear about LTT starting up the Lab.
Yes, but that doesn't really apply to Apple lol; the people who buy Apple don't watch these videos. If they did, they wouldn't buy Apple products XD
@@bablela26 True, but the more people around them who know better, the more they might help them see they are being misled.
@@taruninja that was a joke btw XD. Everyone is free to do what they want; if they can afford it, there is no need to stop them buying overpriced things.
They simply don't care about performance and the like. I don't even know why Apple keeps this in their presentation; it's probably for the shareholders, to show they are doing something lol
It doesn't matter tho apple people will still buy em 😅
Or stricter laws against false or misleading claims. Or even both; both would be best.
Labs starting to pay off big time with all the additional tests and research into product marketing vs. reality
Yea, i really like the new graphs
@@webbie7503 Don't forget the spinning low-polygon 3D models. When things spin, you _know_ that Science! is taking place.
finally, linus laboratories
@@webbie7503
Clearly Linus needs to pay for Labs to get some lab coats.
@@eddiemate LTT branded labcoats as merch, letsgo
The fact that LTT Labs can now test performance at different ambient temperatures is great. Living in a relatively hot part of India (35 degC - 40 degC ambient ) with no air conditioning, I remember not being able to play high performance games in the summer due to excessive stutter. The same games used to work great during the monsoon and winter months.
For a low budget you can just take a $10 home fan, set it to max, and point it at your PC
@@kimiraikkonen4226 lol
I can't believe apple is making the rtx 40 series look like a great value
I can't believe a $3.5k VR headset makes Meta's $1.5k VR headset seem less of an absurd price XD
It's still absurd either way XD
Same with Apple pricing and Nvidia pricing XD
Well to be fair apple was way worse back in the day, this is actually their best value ever. Remember the 50k USD Mac Pro?
And this is partly why Apple needs to die and why everyone should stop entertaining their ridiculously overpriced anti-consumer products.
Apple's largest market is America. Why do Americans keep buying their products? An overpriced, low-tech scam.
@@bablela26 Expensive tech is expensive.
It's honestly refreshing to see a shout out to the engineers. There's a lot of skilled people that work at Apple and they do not set the prices, nor do they decide how things will be marketed. A lot of the time they're given requirements that may not even come from a technical individual, and they just have to make it work.
If the Apple Messiah Jobs were still alive and in charge, the engineers would be SOL, since Jobs didn't give a f-ck whether it worked or not, as long as it was shiny, slick, blah blah. If the product catches on fire or requires hourly dropping from 5 feet, those are features.
I imagine the enormous pain of the engineers when given the orders: NO expansion, NO repairability, NO space to breathe, and so on. The horrendous model became selling laptops like smartphones: if any part fails, just buy a new one, because the cost of the crippled (and often impossible) repair doesn't make sense.
In Jobs' era, at least, we got machines which through the years became icons of productivity, repairability and longevity. In my case, I had the last of the Intel MacBooks, the 2015 15" with a 1 TB M.2 disk (installed by me, of course) and maxed out at 16 GB of RAM.
People today love being shown bling-bling from Apple instead of solid, noble, well-designed, durable stuff that's repairable without much trouble. They got away with it.
@@gustavosaliola How repairable are other laptop PCs really becoming, though? RAM is soldered to the HP Spectre 14" motherboard.
@@Tech-geeky I think you misread that comment, lol. They're saying laptops are like phones now, exactly like you're saying.
@@Tech-geeky I just don't know. I'm an almost-50-year-old graphic + motion designer with 30 years of using Apple computers (desktops and laptops); because of that, I can talk about Macs. If other brands have shitty practices too, it's a bad sign of the times, isn't it?
Can you imagine how much Apple would charge if they had a 4090 class GPU? It'd make Nvidia blush!
At least $15k.
So true!!!
@@Chopper153 10x sounds about right. 😂
at least a million pennies
Or if they simply licensed AMD RDNA3 cores, or a whole chip, and optimized software for it. AMD's cards can do so much, but AMD lacks the resources for the software/driver optimization Nvidia can provide.
As a photographer and 3D generalist, I had been saving for a year and a half to purchase the Mac Studio with top-notch specs, including 2 TBs with the M2 Ultra chip. However, it turned out to be ridiculously expensive. Little did I know, I could build my own custom PC. With the same budget, I managed to get a fully loaded RTX 4090, an i9 13900K, a whopping 192GB of RAM, and 4TBs of storage. Plus, the added advantage of being able to upgrade my computer in a few years. I am absolutely thrilled with my choice!
Wow, but 192 GB of RAM is overkill even for 3D rendering tasks.
A few questions: how many RAM sticks do you need for 192 GB of RAM? What motherboard are you using? Some server model?
@@tomaszzalewski4541 4x 48 GB DDR5. All Z690 or Z790 motherboards with DDR5 support max out at 192 GB.
@@captainAOG some Hollywood producers use 512 GB of RAM.
@@captainAOG Future-proof?? Apple hardware isn't upgradable today.
Always appreciate some solid data analysis and visualization that tells a story quickly but allows a deeper look. You have some quality folks there.
Yeah, it's always been hard to see Apple performance when coming from a PC side of things, it's nice to see the Labs showing just how much Apple have been massaging their numbers.
The power efficiency of those M2s is insane, though, but the performance claims are borderline illegal.
Reminds me of when they called their AirPods Max "the ultimate listening experience" and I just had to lol in my Orpheus.
When Apple compares their products to others with "It's faster than ...", it's just marketing.
Those on the outside don't know, because they don't make the product/software they're comparing to. It's guesswork only. I reckon Apple would rather show something there than nothing at all.
Only Apple could make the RTX 4090 seem affordable
not only affordable, but also cheap
If you compare a full computer to a GPU alone, then yes. Drink less of that Kool-Aid.
You could 100% fit enough parts in the rest of the budget to make a way better computer than the mac.
@@RandomUser2401 the video showed that not just the GPU itself, but the GPU plus the rest of a full PC, costs less than the whole Apple box.
@@lostkostin Sure. And with all those parts you'd for sure be able to build a PC with the form factor of the Mac and its power draw, noise, and performance per watt. Go ahead and build it.
And yes, noise and power draw DO matter.
Thanks for including higher ambient temperatures! You guys take so much flak for the Labs, so, as someone who lives in a hotter climate, this makes the Labs an awesome addition to LTT. I'm very glad that you are putting your money towards better testing methodology :)
@user-qj3iu3dl7f
Your reply bears no relevance to the comment you replied to. Furthermore, while I dislike most Apple products myself, raw performance does not always mean a better product. That efficiency is amazing. My "Windows" gaming PC can heat up my room to annoying levels.
What's the benefit of testing at different temperatures? I compared the graphs and they all reach the same numbers regardless of how hot or cold it is, it just happens faster or slower.
These are the kinds of videos that make LTT so great. I can't imagine anyone else making such an extensive video that is also entertaining and easy to understand.
Insert mac fanboys hating for judging it too harshly
insert mac haters hating for not putting it down enough and praising too much
unfortunate that no matter how honest and factual it might be, such huge chunks of people are still willing to be unreasonable
Gamers Nexus could do a video just as good as this one if not better BUTT (and that's a big butt) they don't care enough about Apple to do it, as it should be. At this point I declare Apple is a cult.
@@MegaGasek Apple is large enough to be considered a religion.
And religions are just officially recognised cults
@@MegaGasek GN would make this into a 35 minute snorefest with so many graphs half of the audience would fall asleep. I love the guy but his videos are not for everyone.
@@MegaGasek I like Gamers Nexus and even bought the tool kit almost 3 years ago, but sometimes it can be a bit how @Neamow said.
We are really starting to see the benefits for LTT Labs for all of us in the computing industry. It's awesome having LTT as one of the channels pushing itself further to get better quality comparisons and backing it up with tests. Great to see, because we needed that to settle debates and call companies out. When it's good, it's being said. When it's bad, well, the numbers speak for themselves.
This proves what everyone knew all along.
Apple stuff is overpriced for what you get :p
I mean, when it comes to Apple you don't even need to test anything. You just look at the specs of their stuff, then at the pricing, and you know it's at least three times as expensive as it should be, so it's almost never worth it.
As an Apple user, I hope you don't back down. They lie constantly and get away with it. Very frustrating as a customer to buy their products and find huge gotchas all the time.
Stick with Apple and don't complain..
Why do you use apple
@@ThermalWorld_awww are you butt hurt 😂
Please stop buying their products. Apple sucks giant balls. They treat their customers like shit. It took me over a month to delete my Apple ID/Account because they wouldn't just let me do it myself. I had to CALL them like 5 times to delete my account. Unreal.
So then why do you keep buying them?
Noise reduction is an important part of color grading in Resolve. I'm a professional colorist. NR is very hardware-intensive, so it's a good way to test your GPU; render speeds are also important.
To be able to play back large files on multiple tracks with noise reduction enabled at full project fps is a good thing.
Having said that, the RTX 3090 and 4090 are also capable of playing back large files with noise reduction at a much lower price compared to the new Apple chips, so that's definitely something to think about. And with a proper PC workstation you have the ability to upgrade your GPU later if you need to, without buying a completely new machine.
Plus OS will be working for years to come especially since they switched to arm.
Loving the hard data to back up the debunking of performance claims.
It's great to see more channels doing these kinds of involved and necessary tests.
Doubly nice that they're taking a neutral stance, running a ton of tests (more than just performance), and giving more nuanced conclusions. It would have been super easy to skip any talk about how efficient the system is and focus on debunking the marketing numbers for views.
The public trusts these companies too much.
i see..................
@user-qj3iu3dl7f Exactly
Somehow these review-sample-less, completely honest Apple reviews I've seen in the last few years from LTT are just the best. Too much of the Apple content on YouTube is relentlessly positive and Apple-worshipping. I'm not an Apple hater (I use an iPhone), but the realistic, completely honest approach from LTT to these products is incredible. Thanks team.
Reviewers are sometimes unintentionally less aggressive with Apple reviews.
Makes sense, most reviewers are afraid of the backlash from uninformed Apple fanatics.
With reviews of most things, it can be very hard to know if favoritism is playing a part, since reviewers want to keep their invites to company shows and their freebies. If they stray out of line and say a product is bad, even when it is, they lose all that, so you really need to know what deals a reviewer has with the company behind a product before watching.
Game reviews are the same: reviewers know that if they say anything bad about a company they are getting paid to review, they can say goodbye to their invites to shows and free review copies, and then they have to go buy the game themselves and don't get the scoop on unknown things or early review copies. It is the same with Apple reviews and just about everything else you can think of. Only people who pay out of their own pocket will give a somewhat real review, but at times they are trying to get into that inner circle, so they might not say how bad it is. That is why you need to watch a few different reviews to get a picture of what you are looking to buy; one review is never enough, as it might brush past the bad stuff while others don't.
@@davide4725Most of the uninformed Apple fanatics are the people that hate on Apple though lol
Well, almost every reviewer is going to hold their tongue a bit since their business relies on sales and not every one is a hardware nerd. Shoes are some of the worst culprits.
This video is 2x better than the last one. 3x more burns and 4.5x funnier! Most people won't notice the "performance" gain from M1 to M2. Just you wait for the M3 next year. It will just keep getting better.
This was a stopgap chip.
I hope they covered this in the video, watching right now.
It will be the best M chip yet! /s
@@RealJoseph123 That's not an excuse, not a valid one at least.
@@Frozander I didn't say it was an excuse. And it wasn't an excuse to begin with.
@@RealJoseph123 ok tiktoker. Enjoy your tiktok apple toy
LABS did a great job with the graphs, and all the testing! Kudos to you guys
Dang, this feels like a thorough exploration of the device. Well done Labs!
I'm really glad your Labs are revealing these things, because it's extremely concerning how vague their claims are; they really don't make any clear point to the average consumer, or even to entry-level or intermediate consumers.
No, YOU'RE lying!!
😡😡😡
That's because they know how much of their userbase are not tech focused. It's sort of what apple's market is overall. They are very much doing this on purpose.
" it’s extremely concerning how vague their claims are and really don’t make any clear point to the average consumer or even entry or intermediate consumers"
The average consumer doesn't buy a workstation...
The computers we are talking about here are dedicated to high end video editing, color grading, vfx, 3D rendering, etc, etc.
That's not an "average" use, nor an "entry level" use.
I always felt like the Mac Studio is an odd case: it's overkill for most people but not justifiable for a good portion of the people it's trying to cater to, professional industries. Potentially one of Apple's oddest modern products since its introduction.
It's a pro product. Pro as in actual industry professionals using it who make a living off of creative content. Not "pro" in the loose sense of just high-end consumer content.
For the "average" consumer it is overkill, but for REAL professional "prosumers" (say, mom-and-pop productions using SmallRig cages and shooting weddings, as opposed to Hollywood Arri Alexa studios), the Studio is priced just right
I think it seems odd because there's a good chance its buyers wouldn't be outside the average social circle. Then again, not many in that same circle would buy into it. The average Joe wouldn't buy it, but they likely know at least one person who might, as a "producer" or someone with deeper pockets casually flexing. Not everyone would know someone who'd use an Arri Alexa, though.
If you want to see "odd pricing", there are knobs off tripods and cameras that could easily be 3D-printed for 5 cents being sold for hundreds of dollars at the higher end of Hollywood. Now THAT'S pricing that would make even Apple execs blush
That's the Mac Pro for me. If the Mac Pro started at the Mac Studio's price, we'd be having a different conversation.
The Mac Studio is way cheaper.
Sounds like it's totally justifiable for audio and video professionals, but not 3d
It fills the previous hole in their line-up between the Mac Mini and the Mac Pro. Except now, there is absolutely no reason to buy the Mac Pro other than having lots of local storage, and really if you need that sort of storage, you should put it in a NAS and connect it via a 10Gb ethernet cable.
It's never too soon to joke about Darwin Award winners
Linus is always so polite when roasting a whole company
It’s because he is Canadian. 😊🇨🇦
@@jmsadilek and cussing will limit the algorithm promoting his channel
The gloves usually come off on the WAN show 😄
imagine GN doing the same..
@@niks660097 LTT has a bit more focus on being appealing to the youtube algorithm, while GN is just focused on numbers
I just wanted to say that the Labs graphics look SOOO good. Very easy to read in video format compared to the previous style and they still look attractive. The slight gradients when highlighting specific sections was a nice addition.
This is why I will always trust LTT. They show EVERYTHING. Thank you guys for staying true to your brand.
At least when it comes to Apple 😅
@@Roach22 Yeah… their Intel content…
Big difference watching the two brands being covered on Linus Tech Tips.
I wonder why they chose an Intel processor for the comparison when AMD ones consume less
@@RealJoseph123 How, literally how. This just happens to be one of their first videos using most of the new tools on the Lab.
@@pi4795 Maybe - just maybe... it's because Apple used Intel CPUs.
04:50 I see what you did there! A perfect "Storm" of conditions. That's some great writing!
For the people not getting it, look at Apple's project names for the Mx processors: Firestorm, Icestorm, Blizzard...
some amazing content out of the labs with this one. Love to see the extra info it contributes.
The new Labs is really flexing its benchmarking and testing here, and I'm all for it!
Apple: _"It's not lying, it's corporate marketing!"_
_A new spin on the famous phrase "It's not lying, it's commercial real estate."_
Apple: "We never said we were telling the truth. We are an entertainment company! Proof: Apple TV!"
😆
Apple: we love the environment and ask you to pay extra for chargers.
Apple: your devices are too old and we're trying to limit your experiences by not allowing you to download any software
Fuck their logic. They deserve 0 sales.
@@jujuria13😂😂 so true
It's not lying, it's uplifting marketing.
I absolutely love that it feels like a 100% unbiased, honest review. Apple definitely should have marketed how efficient it actually is. That power/perf of this is unbelievable!
Whats that meme where the guy runs out the building while screaming about how pleasantly surprised they are? 😂
The only way to fight the temperature increase is to do some water cooling or something with the i9 and 4090, which would probably end up totalling more, and with a way higher energy bill
@p15209 I live in a country with expensive electricity, and I could run that setup and still end up saving money over the lifetime of that PC compared to simply buying the M2. The efficiency savings are minuscule, even if you compare the PCs running idle for a whole year.
I'll never understand how Apple justifies pricing their products the way they do
Simple. Their customer base allows it.
People buy their stuff without asking any question.
I would do the same tbh.
The same way Nvidia does, people buy them at current costs.
@@Bob_Smith19 The reason NVIDIA gets away with it is that they simply are the best. They hold the market by the balls and AMD can't catch up. If you want the best feature set and performance, you have to get NVIDIA. Apple does not have that sort of market hold; their value comes from customer ignorance, vanity and laziness. Apple is not on top because they are the best and most cutting-edge; they are on top because their brand is powerful and people hate change, and also because their branding is so strong there is a level of snobbery their customer base feels entitled to.
@@Eagle3302PL "Hate change"? I mean, no, they made big changes with their M1 silicon.
It's just that Apple people "hate seeing changes" XD. It's the whole shtick with Apple: hiding things from their customers and making sure their customers won't think of complaining lol
3:34 Can't wait for Linus to review the gamepad OceanGate used.
Those new graphs are absolutely fantastic. So much nicer than the old ones.
"Apple fans, start ..." I thought Apple chips were fanless
It's a shame, because Apple's engineers really put some solid work into the SoC! But between locking some power options, bad pricing and bad marketing, they really do their community a glaring disservice.
It's the best value time in history to get decent performance out of a Mac and yet everyone is complaining
@@vlcheish they are marking up the price by 60% there is zero reason for you to be defending it other than you're a shill
@@vlcheish Of course; they scammed you less and you'd be happy about it
@@windslightly9117 Scam who? Windows is my main computer 🤣
@@vlcheish Better value than two years prior does not inherently equal good value, especially when the product is blatantly marketed in a rather deceitful way.
The catch on the headphone jack comparison spec was chef's kiss. Thank you for the detail-oriented work on this review. This sets the bar high for product reviews in a good way. Great job!
Yeah, it's true that, contrary to popular hater belief, apart from the iPhone every device Apple makes has a headphone jack.
@@RandomUser2401Imagine if they made the shuffle without a headphone jack.
@@techno1561my man PLEASE GO SEE A DOCTOR
Come on, the new iteration is "advanced". That means it's better than the previous one, which is not "advanced"... Can't you read? XD XD XD
As an apple user, I can confirm I am typing my angry response
Waiting for that angry response
@@Summer-us5ql sorry I’m too angry
yeah yeah typical apple fans without a brain xddd
@@Summer-us5ql as a Mac owner, I am incredibly angry at apple. I'm so angry that I'm storming to the nearest apple store to trade in all my apple devices 😡
@@Muhluri trading in? As in, to get the new equipment?
Your point at 16:07 is why I switched back to a PC last year when the Studio was announced; your dollar figures are close to what I paid for a completely overkill PC (lesser video card than a 4090, though). Swapped out my 12900K for a 13900KS this year for a net $400 additional. That runs my work faster than this year's M2 Ultra. It was a no-brainer. Also, at 16:50, good point: if you have multiple jobs that can run in parallel, 4x Mac Minis is an option at that price!
Something to make the Apple fans a bit angy? _Don't mind if I dooooo_
On a more serious note, I do appreciate you guys calling out manufacturer marketing BS. I know that marketing's job is to fluff up the capabilities of the product, but misrepresenting performance is a big no-no.
its ok to make apple fans angry, by that i mean the ones who defend apple on every single thing even if its a bad thing
@@johnsalamiiFair enough, though I think you can substitute any other manufacturer fan for Apple fans. Liking something is fine, being an obnoxious fanboy isn't. :b
Stop saying reasonable things in TH-cam comments, this isn’t supposed to be allowed
@@hobbesrl _Too bad, I'm a rule breaker._ c:
@@Just_a_commenter Critical thinking in my youtube comments? How dare you.
It would be interesting to see the total power usage for a given render. While the PC might use more power, it finishes much quicker. The PC was 3 times quicker, so a 2-hour render on the PC would be a 6-hour render on the Mac. So what is the power saving over an actual workload?
Dont question the Apple... they are the bestest
@@thomgizzizAvg. Apple fanboy 🤓
underrated comment
That is a valid point. Even though the PC eats way more power, it doesn't necessarily cost more money in the end.
I had the same thought. Only in a CPU v CPU (13900K) scenario could you perhaps make an argument for efficiency, with M2 Ultra having close to 70% of the former’s performance.
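This energy-per-job point can be put in a minimal sketch: energy is average power times runtime, so a faster machine can draw more watts yet use comparable total energy. The wattage and runtime figures below are purely illustrative assumptions, not measurements from the video:

```python
def energy_kwh(watts, hours):
    """Energy for one job: average power draw times runtime."""
    return watts * hours / 1000

pc  = energy_kwh(600, 2)   # assumed: PC draws 600 W, finishes in 2 h
mac = energy_kwh(150, 6)   # assumed: Mac draws 150 W but takes 3x as long
print(pc, mac)             # 1.2 0.9 -> under these assumptions it's close
```

Under these made-up numbers the Mac still wins, but by far less than the 4x wattage gap suggests, which is why the per-render total matters more than the instantaneous draw.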
Checking Apple's graph is going to be a full time position at the lab.
You *crushed* it, with that joke about the submarine 😂
The misleading advertising is the issue, it's more in the territory of lying.
Straight up lying tbh
I really like that LTT is now making comparisons in different ambient temperature scenarios. I live in Brazil, and the biggest reason I switched from a Dell XPS 13 to a MacBook Air M1 was the amount of heat the previous one generated in my room. I basically had to use my A/C all the time because of it.
Hahahahahaha
From the same creators of "I switched to iPhone because Instagram is better"
Sounds about right. My Dell Precision 5550 for work (basically an XPS 15) is constantly thermal throttling and even emergency hibernates from time to time from an overheat event. It both runs hot and is unable to cool itself properly despite having fresh thermal paste and a dust-free interior.
@@Saibert1 I mean… Snapchat and Instagram are a lot better on iPhone… and that's why a lot of people switched, sooo 🤷
@@v1Broadcaster Some people don't understand why people prioritize Insta and Snap so highly. Which really is fine either way, even if others don't agree.
I used the old Intel MacBook Pros in Australia during the summer, and man, it was a struggle. When the M1 came out I bought a MacBook Air and my god, the performance in hot conditions was amazing. I main an M1 Max Studio now and I freaking love it!!
The video shows the i9 as a 16+8 configuration a lot. The Intel i9 13900K is 8+16, not 16+8. The M2 Ultra is 16+4
You are correct, it looks like this was accidentally transposed -EY
@@LinusTechTips Linus, can you make a PC with an i9 13900T and an RTX A4000 Ada SFF? Apple fanboys are always talking about power consumption
@@thecon_quererarbitraryname6286 It's not really faster... but it can shine in stock efficiency
I love apple as a product company and how they’re one of the few design focused companies but regardless of how I feel about them I’m glad you’re holding their feet to the flame here Linus! Keep it up
The customers should be holding their feet to the flame. They've been screwing you guys for years
@@MrMali22Eh, it really depends heavily on the product. Apple has a mix of very well priced products, overly priced products, and ridiculously priced products. Their base MacBooks are still great for the price right now, especially year old models. Their premium desktops simply don’t sell enough to be priced well honestly.
That's my largest criticism of Apple. They say they're focused on design, but then make extremely bad design choices. And that's why iOS is so clunky, lacks functionality, and some menus are straight-up messy. I don't get how they don't look at the competition and go, "this function brings such a smooth experience, let's implement it".
This is written from my iphone btw
@@TheHeadincharge Agreed. I actually think the M2 Mini is very fairly priced if you go with a 3rd-party storage solution.
These new graphs are amazing and easily understandable. The colours also contrast with the background and are easier on the eyes.
I misunderstood this comment at first to mean Apple's graphs. The LTT labs graphs are ace though.
true!! i had a hard time comprehending their previous graphs, these new graphs are so good
Would have been nice to see how the 7950X3D would have performed in those tests; it is a lot more energy efficient than the 13900K.
Or even a Ryzen 9 7945HX. It draws like 90W for a 32000 Cinebench R23 multicore score, and hits 34000 at 120W. That's pretty comparable efficiency to Apple, with superior power.
Also, it would be fun to performance-match a 13900K + 4090 system to the M2 Studio, because both the i9 and the 4090 actually have very good efficiency scaling curves when you performance-normalize with lesser hardware. I can get the same results with my 13900K and 4080 sipping half of their max TDP rating, so my full-system power draw under loads similar to what the M2 Studio was put through is actually very close in efficiency. But I still have the benefit of being able to crank my hardware to the max when I just need to get shit done, which Apple just can't do. And for professionals, when you want those calculations done and animations rendered, time and workflow count 10x more than average consumption over a full workday. It's not like either system is pushed to the max constantly anyway. But when they need to be pushed, I want my system performance to go through the roof instead of being gimped by whatever TDP target and inefficient airflow design Apple is operating with.
The problem is the 7950X3D is laser-focused on gaming. Sure, it's a fairly efficient chip, but productivity is lacking due to its lower base clocks, and most productivity tasks do not benefit from the additional cache.
They've only got so much time though, right? They probably picked the i9 because it matched up so well.
I'm not sure, but I think macOS doesn't like AMD CPUs…
Userbenchmark's numbers are more credible than Apple's
At last! Someone doing a decent 3D GPU testing and comparison between the M2 Ultra and M1 Ultra. Thank you!
13:44 If you don't have AC, I don't think you'll be buying a five-thousand-dollar computer.
My main takeaway from this is that if you have an M1 Mac Studio as a workstation, you won't be rushing out to buy a new one. The upgrade isn't worth it. If you've got a beast of a PC, then this also isn't gonna be an upgrade you need to have.
Having said that: it's insanely power efficient. And I for one will happily take 80% the power at basically no noise and no space heater sitting on my desk. That's actually awesome.
If you have a beast of a PC this is a severe downgrade just for the sake of buying Tim Cook a new yacht!
But you could also just get more efficient PC parts to match the Mac's performance, and they would also cut the power draw tremendously... the price as well.
Yeah, now I get it; this is why Apple gets away with flat-out lying every single time.
@@gyrozeppeli00 Oh, Apple flat out lying on the benchmarks is inexcusable and I don't get why they do it. They got the perfect marketing angle by selling this thing as a power efficient machine that runs cool and quiet. Just go with that.
@@kentsutton4973 If you knew anything about DIY, you would know it's not possible to get that power consumption even with a GTX 960 that's nearly 8 years old, let alone a whole built PC.
I feel like the real underrated Mac of this generation is the M2 Pro Mac mini, I got one as a music production machine and it's been killing it.
Yeah, we're thinking of buying it for that reason.
Yep, an M2 Pro MacBook or Studio will have you covered for years of music production or video editing. I've got many friends with one (and even the base M2), and they're really happy with it.
@@cavifax I always like how when Apple falls behind in specs, it's all about feelings... "I'm really happy with it, it just feels better... blah blah blah", but the second Apple wins on specs, it's all about value and power. You all act like children.
@@thomgizziz Those MacBooks and the Studio are fairly decent on pricing. I would not defend the M2 Ultra or the Mac Pro, and I won't deny there's an "Apple tax" and that their ecosystem is a trap, but you can get a base M2 Mac Mini from 600 USD that would smoke any other 600-dollar PC for video editing.
The Mac Mini is a seriously good deal. So small. I abuse mine, and temps never go above 50°C, even with minimal fan speed.
Super useful review. I couldn't agree more that ultra-low power and quiet operation should have been the marketing value proposition. That's exactly why I bought a (refurbished) M1 Ultra. As for the price, I have a different problem with it. You compared a gaming PC with gaming parts in it to this thing. In the Intel days, Apple's "pro" meant real, honest workstation parts: Xeon CPUs, workstation graphics cards, ECC memory, server-grade storage options, so of course I expected to pay more than for a gaming PC. Now, I'm paying way more to get high-efficiency iPhone silicon, just in much greater numbers inside this box than what fits in my iPhone. I'm not sure that's worth as much as Apple charges just for the low power, even though the low power was the compelling feature that drove my purchase.
3:41 Oi! We missed you!!!
You're all the positive spin we need, Linus :)
This video is insane! It destroys the marketing in the best way possible: completely objectively. There is no way of interpreting this wrong when the numbers speak for themselves.
I know a commercial and indie producer that’s always been an Apple fan. He still is, but no longer uses them for professional video production. Simply because there is no form of GPU expansion. Not even an external eGPU via Thunderbolt 4 is compatible. What you got is what you get. And the M2 Ultra is still not powerful enough for him to do his color grading and edits in Resolve.
New labs graphs are so much more readable and understandable - Great job!
I use a 14" MacBook Pro and absolutely love the hardware (the software is very much a mixed bag); overall I'm extremely happy with it. But this marketing cr*p from Apple has been going on for years and is totally unacceptable.
You don't like 6x the performance and 5x reduced render time compared to the M1 Ultra!?
It's funny because for years Apple fans were saying you didn't need super fast machines with Mac because the software is so much better.
@@AllahDoesNotExist Cool name
I still believe that, for the average office user, this is true. You rarely need to maintain a MacBook. Windows just fails way too often. I've lost my data so many times because of Windows; it never happened with Apple.
I think one of the great values of macOS is that it's Unix-based and therefore generally easier to work with for programmers. Yeah, there is the Windows Subsystem for Linux, but I still see it as a hassle to use, and you cannot use a Linux package manager for installing Windows apps. Then again, I kinda hate the rest of macOS, and I'd rather use Linux.
If you inspect-element the graph bars on Apple's site, you can see the % difference: the top bar is 100%, and the Core i9 iMac is 16.3934% on 3D rendering.
Weren't those the ones that thermally throttled at like 25% usage?
That’s a great tip they could use for future analyses….
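As an aside, the inspect-element trick above gives you enough to reverse-engineer the claim behind an unlabeled chart. A minimal sketch (the 16.3934% bar width is the figure quoted in the comment above; the labels are assumptions): dividing the full-length bar by the baseline bar yields the implied "x times faster" multiplier.

```python
# Bar widths read out of the page via inspect element, as percentages
# of the longest bar. The 16.3934% figure is quoted from the comment above.
bar_widths = {
    "M2 Ultra": 100.0,        # full-length (reference) bar
    "Core i9 iMac": 16.3934,  # baseline bar width
}

# Even without axis labels, the chart still encodes a ratio:
# full bar divided by baseline bar.
implied_speedup = bar_widths["M2 Ultra"] / bar_widths["Core i9 iMac"]
print(f"Implied speedup: {implied_speedup:.1f}x")  # Implied speedup: 6.1x
```

So the clean-looking bar quietly encodes a ~6.1x multiplier, with no axis anywhere for a viewer to check it against.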
Apple should make servers. You'd think pitching this energy efficiency to companies could definitely make business owners switch to Mac servers. People who have used Mac servers back in the day preach how reliable they are as well. Having a cheap-to-run and reliable server really sounds good.
For "power efficiency" to matter, you need to compare it to the "time taken" to complete a task.
Like, it wouldn't matter if it consumed 100W if it took 5 times longer than a 200W device.
Good point! I never thought about power efficiency stats that way before!!
But he gave both metrics in the video.
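To put rough numbers on the thread above (using its own hypothetical 100W / 200W devices): what you pay for is energy per task, which is power multiplied by time, so a lower-wattage machine that takes five times longer actually burns more energy per job.

```python
def energy_per_task_wh(power_watts: float, hours: float) -> float:
    """Energy consumed to complete one task, in watt-hours (power x time)."""
    return power_watts * hours

# Hypothetical devices from the comment above:
slow_low_power = energy_per_task_wh(100, 5.0)   # 100 W device, takes 5x as long
fast_high_power = energy_per_task_wh(200, 1.0)  # 200 W device, baseline time

print(slow_low_power, fast_high_power)  # 500.0 200.0
# The "efficient" 100 W device uses 2.5x the energy to finish the same task.
```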
I'm debating buying a Mac and switching my Windows laptop to Linux, because having switched my major from mechanical engineering to computer science, I am no longer locked into Windows. And whatever beefs anyone may have with macOS, it plays really, really nicely with everything I need and want to do.
About the comparison at the end: the thing is that you can keep using some parts of a PC for up to 10 years; I'm not sure you can keep a Mac Studio running if an internal part breaks.
It's an anti-repair thing... you throw it directly in the bin, or if you're lucky you can get it repaired at a repair center.
My parents' Dell Inspiron, which we got in 2016, still functions to this day. Never upgraded, never been in for repairs. The occasional reinstall, but no other issues... she's happy. Recently replaced the SATA HD with an SSD and added more RAM. Aside from reseating the memory every so often due to freezes, it's still OK.
Most Apple users would go through several Macs in that time. I know I've got 3 or 4 new Macs since then. Not because they die, but because I "wanted" to upgrade. In that timeframe I've had one Mac brick after a re-partition. So just because we pay the expense doesn't mean it's all green.
@@Tech-geeky I've been using an i5 4440 with the iGPU since 2015, and I've only had to replace the power supply because its fan died and it started to overheat. I'm getting a Ryzen 7700X and RTX 4070 build next week but am still gonna keep this computer for general web browsing and stuff like that for the family. It can probably go for 5-6 more years; maybe toss in more RAM, a SATA SSD and a GTX 1650, and it could be decent for lighter games too, good enough for my younger brother. You can't get that with an Apple product.
@@Tech-geeky My first PC which I got as a hand-me-down in 2006 still works without any repairs (only the RAM has been upgraded). It's an Intel 440BX based Celeron 333MHz system built in 1999. These days I mainly use it to run older software and games.
My main PC is also old, I built it in 2016 (and rebuilt to a new case in early 2017) from used parts and I've done some upgrades (storage, RAM, GPU) over the years. I had to replace the motherboard in 2021 because lightning killed it, I was lucky to find one of the same model for cheap and even got a much better CPU cooler with it.
Asus P6X58D-E, X5670 6c/12t @ 4.4GHz, 24GB RAM, GTX 1080
@@rudrasingh6354 Exactly.
6:30
$1,000.00 just for 16 more cores for the GPU??? Bruh...
I never realized I needed it until you did it once. But when you’re comparing 2 metrics in the script, highlighting it visually is very helpful in digesting the data.
100% nailed it on the intro summary. The engineering is impressive but they can't pretend it's a massive boost every year.
I'm sorry, guys, but I LOOVVEE the graphs that you're putting out. If this is the standard that Labs is going to be working at, I'm all for it.
As an M1 Ultra Studio owner, a couple of other points:
1. There’s no mic, so that’s another USB port (webcam in my case)
2. No hardware-accelerated ray tracing means Unreal and Twinmotion are not fully compatible (and run terribly)
Anyway, your conclusion was spot on: looking forward to a 13900 / 4900 desktop for heavy lifting and a Mac laptop (daily driver). Also, it’s infuriating that Tim Apple nerfed the Mac Pro, essentially painting themselves into a corner all over again. RIP Bootcamp 😢
4900? Do you mean „nVidia GeForce RTX 4090“?
The other number I could "translate" by googling it: "Intel Core i9-13900K". I really hope A.I. will be available one day to help commenters write their comments :-)
Efficiency is probably the last thing on people's minds when they're paying top dollar for performance and power. I don't think you'll find a single Lamborghini owner asking the seller, "How much mileage am I getting per gallon?" More like, "How fast can I go?!"
In an audio studio, power efficiency means low noise, and that's convenient.
@@retrocomputing Trivium, one of the most popular metal bands in the world, uses a setup comparable to the one Linus showed, with a 4090, custom-built by Gamers Nexus, and it's silent while giving the best performance you can get. Plus you can play the latest games at 4K 100fps+ with ray tracing/path tracing. The vocalist also streams on Twitch with it. Efficiency is impressive, but in the end no one cares; raw power and freedom are king.
@@retrocomputing A properly set up studio would have the PC in a noise-isolated room, or set up with multi-slot 140mm water-cooling rads so they can run the fans at like 10% speed.
@@retrocomputing no, stop lying to yourself to justify and double down on your feelings that don't mesh with reality.
@@thomgizziz that's what I heard from audio guys anyway, they use these Macs because of the power/noise ratio. Do they lie to themselves? I doubt it. But if you're into SFF stuff then you know about using lower grade stuff and undervolting, and you know it doesn't mean that it's going to be cheaper than a normal PC. You don't sound like a knowledgeable guy though, just a fanboy who's angry for no reason.
Apple are going back to their PowerPC days with their graphs. I remember seeing graphs with unlabelled X-axes back then too. Or they would over-emphasize any performance difference by scaling the graph in such a way that 5-10 points in a benchmark would look massive when the score was measured in thousands. Or simply skew the results by not mentioning what benchmark they used.
I guess there's a difference between absolute "points" and relative "percentage", isn't there? Or is it just me?
These graphs are fantastic and really clear to read. Even at 140p 😂 Interesting way to cover these results too
It's apple, marketing is the only thing they do that's groundbreaking.
I'm a big fan of the testing scenarios that come with the Lab. Happily looking forward to all the vids to come 😊
I understand Apple soldering the memory right next to the chip because of the latency, but $400 to upgrade from 16GB to 32GB? Oh my god.
Not soldered; it's built into the chip package.
The power consumption stuff is great, but when the 4090/13900K machine can complete renders/tasks way faster, the additional power requirements don't really matter. You're finishing the job quicker, which means the higher wattage is only needed for a shorter time. It's like running a mile vs walking a mile; it's the same work that is getting done.
Keep in mind, the Windows machine will still consume more power at 30-60 percent utilization compared to the Mac at the same level, and that's where these machines will spend most of their time. Also, running a mile still leaves you more tired than walking a mile.
Yup. PC is still king for desktop, hopefully soon to regain king status for laptops:)
@@miloattal9313 Okay, but time that mile and then tell me which was quicker.
These are desktop machines, and people want to talk about efficiency. The two don't go together. My wife and I do contract editing from home, and our electric bill is $1,500.00 a month. But we make $15-20k a month from it. Efficiency is not something we are worried about. We run Mac and Windows machines for our projects.
@@miloattal9313 A 13900K PC with a 4090 and 128GB of RAM costs about 3800 dollars on Amazon, while the M2 Ultra with the same 4TB of SSD and 128GB of RAM would cost 6800 dollars. That price difference would never be made up in years of power bills, and you get a worse computer for the price.
It's interesting if you need 150GB of VRAM, as I do. Yes, the GPU is not that fast, but it's a great (affordable) alternative to two NVIDIA A100s or H100s.
Love the new graphs. I feel like it's a way more balanced and nuanced review than usual (and the standard was already pretty high for LTT).
Given how machine learning is such a big trend these days, can you guys add a test for it? Perhaps compare the 4090 with Apple silicon.
Spoiler: the M2 won't even hold a candle to the machine learning performance of a 4090; all AI frameworks were made with NVIDIA CUDA acceleration in mind ages ago.
@gabrielesilinic Right, the hardware and software support. In my mind, I was just thinking about the essentially nonexistent VRAM of the Apple GPU, but of course there is more to it than that.
It's always interesting to see what Apple is doing, even though they fundamentally ignore technical users who may have even slightly less mainstream use cases. For example, if you want/need mass storage (particularly with redundancy), you have to go external. If you want video capture, you have to go external. If you need multiple GPUs for any reason, you need something external. If you want a DVD or Blu-ray drive, you have to go external. The list goes on.
You either wind up with a rat's nest of external devices around your computer (potentially bottlenecking the IO controllers), or you need a desktop PC too to do the heavy stuff, or you just have to accept that there is nothing for you in Apple's ecosystem.
It kinda makes the notion some people have that Apple will take over the world with their "better" hardware laughable. You can't take over a space if you don't have an entry in it. And Apple's entries are either super mainstream or disgustingly overpriced.
At least Thunderbolt is making the rat's nest a little bit less bottlenecked than if it was all hanging off of USB, but I agree.
For storage you can use a NAS in these cases, hopefully with at least 2.5G. The rest is gonna hang off of the rat's nest.
There are a lot of cases where Apple caters to technical users, just not to you. Do not conflate what you want with what other technical users want. You could own a Porsche, which will be great for the technical use of going around a track, but not for hauling timber.
@@sinni800dk Not sure if the issues I had were limited to just the machine I was using, but I found that one of the last-gen Macs at work didn't allow daisy-chaining for a bunch of peripherals. I was running an animation class at the time, so there was like.... a decent amount to plug in? But I've never run into IO issues in any other case as long as I had dongles; this machine just wouldn't allow it.
I doubt it's cos all the lanes were saturated, but QoL drops dramatically when a configuration seems to completely lock off fairly basic industry use cases.
I make a habit of ragging on Apple to my friends, but as MKBHD said, isn't that the whole point of the Mac Pro: all the PCIe slots?
@@sandmaster4444 I will refer you to my comment about "disgustingly overpriced".
It only solves the problem for companies with more money than sense.
Linus should have a ‘trust score’ on his website for each company.
It would be very interesting to see the Labs' data being used by other parts of LMG, like in this case, Mac Address using the same info.
I love seeing the labs equipment and data making its way into these videos, very cool to see.
Although in the previous generation we saw a great leap and a great risk taken by Apple, now they disappoint us again with false numbers... The numbers were too good to be true.
Loved seeing the labs benchmarks. Informative and direct. Keep up the good work
2:35 bro really tried hiding a titanic submersible joke ☠☠
Lol, Emily shouting "too soon...!" made my day 😂
I know it's very niche, but I am quite curious about ML workloads and other GPGPU stuff on the M2 Ultra, for problems that would normally be limited by VRAM. With 192 or even just 128 GB, you should be able to solve some massive problem sizes.
That seems to be true. You can load larger LLMs into RAM, for example; the RTX 4090 tops out at 24GB if I'm not mistaken. Large Apple silicon RAM options seem great for running AI models. In many cases RAM limits matter more than raw performance.
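A back-of-the-envelope sketch of why the big unified-memory pool matters here (the 70-billion-parameter model and 16-bit precision are illustrative assumptions, not figures from the video): a model's weights alone need roughly parameter count times bytes per parameter.

```python
def weights_gb(params_billions: float, bytes_per_param: float) -> float:
    """Rough memory footprint of model weights alone, in GB.

    Billions of parameters times bytes per parameter gives gigabytes
    directly (1e9 params * N bytes / 1e9 bytes per GB).
    """
    return params_billions * bytes_per_param

# Illustrative example: a 70-billion-parameter model at 16-bit (2-byte) precision.
footprint = weights_gb(70, 2)  # 140 GB of weights

RTX_4090_VRAM_GB = 24        # single consumer GPU
M2_ULTRA_UNIFIED_GB = 192    # top unified-memory configuration

print(footprint <= RTX_4090_VRAM_GB)     # False: does not fit on the 4090
print(footprint <= M2_ULTRA_UNIFIED_GB)  # True: fits in unified memory
```

This ignores activations and KV-cache overhead, so real headroom is smaller, but the fit/no-fit picture holds.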
I really appreciate the 35-degree-Celsius ambient temperature test; in tropical countries it practically never goes below that all year, except sometimes in winter.
@elfrjz We still call it winter since it's colder than our usual temperature, even though some days only reach 20-22°C.
Mine reached 14°C on record, lol.
@elfrjz Nah, I'm in Bandung, lol.
As an Apple user (I hesitate to say fan, for a whole host of reasons not relevant to this video), I appreciate LTT calling Apple out on this. Apple's insistence on pie-in-the-sky marketing undermines the awesome engineering that they do pull off and just continues to annoy users like me who do want to buy their hardware. Every time they do this, they push the love-hate relationship I have with them further towards hate.
Thank you for stress testing this machine in 35C ambient temperature, which is pretty much many countries near the equator.
Would have been interested to have seen the Final Cut Pro video benchmarks, as Davinci Resolve is quite slow on a Mac...
Apple could have seriously set the standards for arm PCs if they wanted to, but they want their systems to be completely locked down.
They've been doing that since way before M1. My 2015 MBP was more repairable and, to a degree, the OS was more usable without security stuff getting in the way. *An informed user is a more secure user*??
@@Tech-geeky Yeah, so locked down that you can install custom Linux on them. Drink less of that Kool-Aid.
That's a feature, not a bug. I'm a long-time dev and let me tell you, the stability, speed, silent operation and incredible battery life of MacBooks is worth the tax. Introducing OEM vendors and their shitty drivers always makes everything else shitty too for the entire OS. Apple controls 100% of each component's drivers, which means they can push OEMs to actually deliver decent software. Let me put it this way - W11 on ARM running inside the Parallels emulator on my Mac boots faster than it does on an i7 laptop I have.
@@karmatraining Boot time is mainly affected by the speed of the storage drive, so yeah, if the i7 machine has a slower SSD or, god forbid, a hard disk, of course the VM will boot up faster.
What does "completely locked down" mean?
The level of data and the way it's displayed is amazing. Great job, Labs!
Hearing Emily's voice AT ALL again made me smile.
We love you!!
I do know someone who has ordered an M2 Mac Pro, but he's a high-end audio engineer (does Dolby Atmos mixing) who uses a ProTools HDX PCIe card with ProTools on MacOS. His studio also has a rack for rackmounting the system in. So yeah, it's a very tiny specialised niche that would get any value from the Mac Pro as compared to the Mac Studio.
The worst part is it's called the M2 Ultra Mac Studio