Thanks Guard.io for sponsoring this video. Protect yourself from those pesky scammers! Save 20% on a monthly subscription plus a 7-day trial of Guard.io at guard.io/LinusTechTips
People who bought the previous generation of Mac Pro got screwed by Apple, just like buyers of the earlier models that promised upgradability but didn't really deliver it. Why do people who want to upgrade to the latest Mac Pro have to buy another chassis, power supply, and storage all over again?
Not only is that slot for the Thunderbolt card proprietary, it doesn't actually use PCIe at all. The only thing it has in common with MPX is the connector; the signals on it are completely different. MPX used 8 PCIe lanes for the two Thunderbolt controllers on the GPU and routed two to four DisplayPort streams from the GPU to the Thunderbolt controllers on the motherboard. This slot only routes Thunderbolt from the integrated controllers on the M2 Ultra to the USB-C ports, plus likely some I2C from the PD controllers.
@@juannaym8488 Of course you do; the "old" Mac Pro wasn't aimed at casual video editing and "some rendering." More RAM is always better in a lot of professional work.
@@juannaym8488 Some video rendering programs require more than 16 GB of VRAM, let alone system RAM. Remember, some Disney studios use these to render movies; they easily fill up more than 192 GB of RAM.
@@mr.cookie8265 ah yes, nothing more ironic than some climate activist screaming about the environmental damage humans are doing to the planet from their iPhone
I think the reason the Honey Badger SSDs show up as external drives is hot-swap support: macOS treats any hot-swap-capable drive as external so the user can eject it.
Hi Linus, good to hear Pro Tools HDX mentioned, as it is a critical workflow for some people. One correction if I may: an Avid HDX card is not $500. It is $5000, and you need IO for it, which could be an extra couple of grand, but their current MTRX Studio and MTRX II interfaces are north of another $5000, plus you need a copy of Pro Tools Ultimate, which is $600 a year on subscription and used to be $2000 for a permanent license. A fully loaded MTRX II (like mine) can be north of $20k. Each HDX card gives 64 channels into Pro Tools, so you can stack up to three in a machine. An HDX card also doesn't stress the machine very much: it is PCIe 1.1, not PCIe 3.0 or 4.0. Almost all of the expansion chassis operate at PCIe 3.0, not PCIe 4.0; Sonnet chassis have a switch to operate at PCIe 2.0 for HDX compatibility.

Yes, you can use an expansion chassis with HDX, but we also have to be aware of the limitations of the Mac Studio over Thunderbolt. This is especially an issue when using PCIe storage over Thunderbolt: you simply can't get the sort of transfer speeds on the Studio over Thunderbolt that you can get with internal PCIe expansion in the Mac Pro (or with the internal Studio SSD). I've never had anything faster than 2700 MB/s, where the internal (M1 Ultra) Mac Studio drive will do around 6000 MB/s. My HighPoint SSD7505 RAID card does abysmally in an expansion chassis at 1700 MB/s.

Otherwise, I agree with everything you've said. It is a slap in the face, but for some it is still the best option. Logic doesn't exist on PC, so if, like me, you compose in Logic and mix in Pro Tools HDX, then the Mac Pro is still the way to go; you just have to suck up the cost. One thing to consider is that a lot of people who buy this machine are doing it through their business. As capex it is written down over a number of years, so yes, $3000 extra for the case does sting, but it means everything can be in the one box, with no Thunderbolt transfer-speed limitations.
I write articles on this sort of stuff for Production Expert and Pro Tools Expert and I have a 2019 Mac Pro and an M1 Ultra that I use with Pro Tools HDX. Feel free to hit me up if you ever need clarification on this stuff.
Totally agree with you. And, as I mentioned in another comment, once you start adding all of that external stuff, you're no longer looking at a $3k savings. Not to mention that if you upgrade whenever Apple brings out a new machine, the old machine is still worth a good chunk of money, and you'll recapture some of that $3k. Have you seen that new Sonnet PCIe 4.0 eight NVMe card that's coming out? Pretty insane performance. Also, I'm looking forward to the software products GPU Audio is going to be bringing out. Harnessing a lot of that GPU power for audio is going to be amazing.
In the end you nailed it: it's a Mac Studio with 6 Thunderbolt enclosures built in! And it still has an external Thunderbolt port, so it's more like paying $3k to change the shape of your Thunderbolt ports to "wide".
Generally, yes. Especially with things like Akitio's dual-slot enclosure, which makes it a lot easier than hauling around multiple chonky enclosures just for a fiber card and a video capture card; Sonnet also makes a 3-slot one. The exceptions would be the SSD card he mentioned and the 100Gb Ethernet card. Thunderbolt 4 only tunnels four lanes of PCIe 3.0, so it can't handle more than 40 Gb/s of bandwidth. Some highly specialized cards (or any high-end GPU) toss more ones and zeros back and forth across the PCIe bus. So if you need a 25GbE network card, an Avid card, and, say, a SAS card for a tape drive, get a Mac Studio and one Sonnet enclosure. If you're getting PCIe for 100Gb (or 40Gb) Ethernet and running multiple NVMe RAID cards, the Mac Pro actually does have an edge.
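That bandwidth ceiling is easy to sanity-check. A rough sketch in Python: the ~32 Gb/s figure for PCIe tunnelling is an assumption based on Thunderbolt 3/4's published link limits (real-world PCIe throughput over Thunderbolt is often closer to 22-25 Gb/s), so treat these numbers as ballpark, not spec.

```python
# Rough feasibility check: does a card's peak bandwidth fit through
# a single Thunderbolt 4 link? All figures are approximations.

TB4_LINK_GBPS = 40   # total Thunderbolt 4 link rate
TB4_PCIE_GBPS = 32   # assumed ceiling for PCIe tunnelling (real-world is lower)

def fits_over_thunderbolt(card_gbps: float, budget_gbps: float = TB4_PCIE_GBPS) -> bool:
    """True if the card's peak bandwidth fits within the assumed PCIe tunnel."""
    return card_gbps <= budget_gbps

print(fits_over_thunderbolt(25))    # 25GbE NIC fits
print(fits_over_thunderbolt(100))   # 100GbE NIC does not
```

By this estimate a 25GbE card is fine in an enclosure, while 100GbE (or a stack of NVMe RAID cards) genuinely needs the Mac Pro's internal slots.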
This should in fact be the future of computers in general. SoCs with embedded RAM will hit the PC market sooner rather than later; having tiered RAM is just a natural step.
@@big0sro0boss I don't give a sh!t about Apple, but I have to give them credit for what they did with the M1/M2. So now it's time to improve it, by adding openness and expandability, but of course not on their hardware.
Thanks to the outstanding efforts of the Asahi Linux team, running Linux on Apple Silicon is now possible. It would be interesting to compare the support for PCIe expansion cards between Linux and macOS, especially since AMD GPUs are supported on ARM Linux.
And the video clearly shows macOS recognized the GPU; it just didn't have drivers, so the hardware doesn't look like the limitation. Maybe Apple didn't want to get AMD to make ARM drivers for macOS since they weren't selling variants with an AMD GPU, but perhaps there is some deeper issue. Would love to see what the Asahi team finds out; maybe it could even lead to eGPU support over Thunderbolt, but that's probably just being hopeful.
@@johnatkinson1111 Since the AMD GPU was seen by the OS as a device, most likely AMD drivers could be written for MacOS on M1/2. Just like with Nvidia, Apple would still have to allow signing of the drivers for use and we know they won't do that. This locks you in to using whatever graphics Apple allows and 3rd party GPUs is not it.
The open source community is amazing. When I'm downloading these crazy big and thorough content mods for Skyrim or other RPGs, I'm always blown away by the work they put in for pennies on the dollar in donations compared to big studios. They're probably the main reason software use and implementation keeps advancing so drastically.
Still kind of beyond stupid that Linux may be the only possible way to make the Pro usable with GPUs other than the integrated graphics. That kinda defeats most of the purpose of these things, IMO. It looks more like Apple intended to make an expensive storage box rather than a computer.
Yeah. I guess BootCamp isn't a thing anymore but any method of booting Windows on the Mac Pro could also potentially allow use of the GPU cards. There just won't be native OS support for the GPUs.
Holy cr*p, the test results at the different temperatures blew my mind. The level of thoroughness is impressive, absolutely incredible work from the team!
Speaking of ProTools HDX support on the MacPro, I work in a recording studio where we had the issue of wanting to replace our old (like 2010 or something) MacPro, but needed a solution to continue using our HDX card without spending a fortune. Funnily enough, it wasn't even due to performance, it was due to the fact that the latest version of ProTools and some other drivers were no longer supported. We settled on a MacMini and a thunderbolt chassis for the HDX and DeckLink card, which cost us about 2 grand... and this tiny little Mac Mini handles cinema-length 5.1 surround sessions like it's nothing :)
16:10 I agree 100%. The lack of high RAM capacities and extra GPUs basically means that if the Mac Studio isn't enough for your high-end needs, you'll need to look elsewhere.
Would love to see how this thing compares to a real ARM computer like one from ADLINK: 128 cores, drivers for Nvidia and AMD (and basically anything that already has standard Linux kernel driver support over PCIe). SSD performance is way higher, especially if you use all 4 M.2 slots in a ZFS array with zstd compression. And it can actually run Steam and games, server software, virtual machines, containers, and other actually useful things. If I remember correctly, it has a BMC too.
@@JeffGeerling I am super excited to see what else you do with your ADLINK system. I've been playing with arm64 and riscv64 VMs for a while but haven't had the chance to get anything bigger than an RPi 4 yet.
@@JeffGeerling Any idea how well Intel GPUs work with the ARM server? Will you try it out? Also, it might be a bit much to ask, but how does it perform at ambient temperatures close to 30°C?
@@ericwood3709 Apple Silicon Macs do not have drivers for add-in GPUs at all. They have a built-in iGPU on the SoC, but it is nowhere near the performance level of a competent modern dedicated GPU with a giant heatsink, 3 fans, and 300+ watts of additional power running through it. These Macs won't be able to properly run modern games, which need massive-core GPUs and specialized RAM (GDDR6/6X/7); the unified memory on the Mac is way slower than the RAM on a high-powered GPU. I would like to see them put Linux on the Mac and the ADLINK (because drivers), put the same GPU in each, then run tests to see how the hardware handles without being kneecapped by poor OS development decisions.
Thank you so much for the clearer lab test results and the simple design template. Nice Labs logo teaser there; very hyped for the Labs website's public release. I would love to suggest a feature: device/component comparisons of performance, thermals, power use, and cost, plus an upgradability score (for laptops, brand-made systems like Apple's, and system integrators) that also covers hardware compatibility. Anyways, love the video.
Glad you mentioned the HDX cards. It's a very niche use case, but the pro audio community is in fact one of those groups that really likes/needs internal PCIe slots, and also one of those groups that mostly uses Macs. HDX cards work alright in expansion chassis but are more reliable internally (which is obviously very important in a post-production environment). In short: yes, I'll gladly take that lightly used Mac Pro off your hands lol
Lol, music production can still easily be done on 5.1, and the HDX card is like 20 years old and a lie: it was designed for gen 1/2 PCIe slots when USB/FireWire bandwidth was low. Modern computers don't need 'cards', even when doing Atmos music or post mixes. Even USB 3.2 has 20 Gbit of bandwidth, and even a 1 Gbit connection would give you around 500 tracks at 48 kHz, never mind Thunderbolt. The only reason people are forced to buy these cards is to make Avid interfaces look more 'professional' and 'complicated' and milk the customer, because now instead of just a TB or USB cable you get to sell them an extra card (or two), overpriced proprietary DigiLink cables, and a useless support plan.
I think the only studios that need HDX cards and would buy a Mac Pro instead of a Mac Studio are either enterprise studios that record something like symphonic orchestras, or Dolby Atmos dubbing stages that need to output 128 channels from Pro Tools. In other words, their main gear costs so much that the price difference between the Mac Pro and Studio is completely negligible for them.
And this is exactly why I'm sticking with Windows systems. Yeah the systems aren't ARM based, and use more power, but they're SO much cheaper and actually allow for customization, which is something Apple has seemingly hated for a long long time. I wish we could get a solid translation layer to run x86 content without issues on ARM based Windows systems, but Apple DOES have an advantage in that for now. Regardless, the lower power draw isn't enough of an advantage when nearly all the other major components are massively in favor of Windows systems. Plus Davinci Resolve is amazing on Windows, and has been my primary editor for 7 or 8 years now.
I feel like the reason they said it doesn't work with GPUs is that they don't want to be compared to GPUs. Like, imagine you put in a 4090 or 7900 XTX and it gets absolutely destroyed... It's just saving face at this point.
It’s not quite saving face, but it definitely helps. In synthetic benchmarks the M2 Ultra is around a 6900 XT, so it's about a generation behind on GPU while costing way more (mostly due to the complexity of making an M2 Ultra). Apple's higher-end GPUs have some major benefits (way more VRAM, next to no transfer cost between CPU and GPU, etc.), but you need software that takes advantage of that, and you still end up with something less powerful. That said, in the majority of Macs Apple sells, the GPUs are a big improvement over what you could usually get, and what Apple Silicon loses in performance it makes up for in efficiency. I've rarely seen my M1 Ultra Studio go above 100W power draw or above 60°C even when maxed out. You don't appreciate how much of a difference that makes until you start realising how much colder your office feels in winter (or how much less hot it gets in summer) 😅
They also want to force you to buy a new one 3 years down the line instead of the 10 or 15 years you get from upgradable GPUs. There's no good reason for a workstation to rely on integrated graphics. Maybe someone just wants more monitors or something? But that should still be possible without having to install Linux.
This looks like an excuse for Apple to cancel the Mac Pro line altogether. It'd be interesting for modders to pick one of these up in the future to try to build a PC with the chassis though, or maybe LTT can try their hand at it? It'd take some fabrication but it'll be cool.
Apart from the nasty logo on the side I kinda like the look of the case. I'll never buy an Apple device though (unless their philosophy around lock-in and their own complete ecosystem changes to something more inter-operable with the rest of the desktop market). Maybe a case manufacturer will come up with something aesthetically similar in time (just different enough to avoid the lawsuit). All the black/white windowed boxes we have to choose from and I'm not a fan of any of them...
Apple won't cancel the Mac Pro line, because if they did, all the TV and film production companies would swap to PC while hammering Apple's management team in the trade press, as all the PCIe-based hardware used to run things like the LED wall of "The Volume" gets shifted away from Apple hardware. And Apple management is addicted to the good publicity of having Mac Pros be the driving force behind the live rendering and production.
If they wanted to cancel the Mac Pro line, they'd just do it. Nobody's forcing them to make these things. If anything I think it shows the opposite - Apple wants to keep the Pro line going even if the current state of Apple Silicon means it's not as useful as it could be yet. But consider that the M1 supported a max of 2 displays and 16GB of RAM. They're improving it, bit by bit, and I'm sure support for ever more RAM and proper external GPU support is on the horizon.
Why not just buy the intel version that comes in the same case if you're going for a windows machine? it costs less and will run windows out of the box.
One potentially massive performance benefit of unified memory is that, in theory, there is no longer any time spent copying buffers between the GPU and CPU. Having worked on optimizing programs with CUDA, the biggest slowdown was frequently tied to memory management alone. If Apple Silicon can actually share memory between GPU and CPU without extra copies/synchronization, any program could benefit from GPU acceleration at any time: without the memory management overhead, you just get raw compute power, which could be a significant game changer for accelerating programs by simply sending workloads to the GPU. Traditional PCs with discrete GPUs won't be able to achieve this until the day they also share memory with the CPU; the overhead of syncing the memory alone can outweigh the performance boost from GPU execution.
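The trade-off being described can be sketched with a back-of-the-envelope model. All numbers below are illustrative assumptions (made-up compute times, a roughly PCIe 3.0 x16 bandwidth), not measurements from any real machine:

```python
# Toy model: when does GPU offload pay off? A discrete GPU must copy the
# working set over PCIe in both directions; unified memory skips the copies.
# All timings and bandwidths are illustrative assumptions.

def discrete_gpu_time(buffer_gb: float, pcie_gb_per_s: float, gpu_compute_s: float) -> float:
    copy_s = buffer_gb / pcie_gb_per_s      # one direction over PCIe
    return copy_s * 2 + gpu_compute_s       # copy in, compute, copy out

def unified_gpu_time(gpu_compute_s: float) -> float:
    return gpu_compute_s                    # no copies needed

cpu_s = 0.050    # hypothetical CPU-only time for the job
gpu_s = 0.005    # hypothetical GPU compute time for the same job
pcie = 16.0      # GB/s, roughly PCIe 3.0 x16

# With a 2 GB working set, the copies alone cost 0.25 s and erase the GPU's win:
print(discrete_gpu_time(2.0, pcie, gpu_s))  # 0.255 s, slower than the CPU
print(unified_gpu_time(gpu_s))              # 0.005 s, 10x faster than the CPU
```

In this (simplified) model the discrete GPU loses to the CPU despite being 10x faster at the compute itself, which is exactly the class of workload unified memory helps.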
Having worked with Apple Silicon, can confirm it's great not having that bottleneck. The only problems are that a) you have to use Metal or a translation layer and b) 99% of programs are written with the assumption that copying stuff between memories is a necessary step and that kinda negates the whole thing.
I love it. A few years ago it felt like there was this attitude of "Don't mess with Linus, he'll be honest about the products... but I want to see data, so waaaa, this thing he says sucks doesn't suck." And now there's Labs making real data, as close as possible. Love it.
I'd be curious to see if Asahi Linux can make any use of the GPUs considering that they were at least enumerated. Linux's amdgpu driver does have ARM support as AMD cards work in the Ampere Altra ARM systems. Some Nvidia cards also work in those systems as well.
I expect that they know they can’t compete directly and instead are going for highest margin halo products. AMD TR Pro and dual Epyc are simply incomparable for CPU workloads. The CXL GPUs are likewise unbeatable in the top end configs.
Apple never managed to compete with them regardless. Very few people prefer Apple, even in the Intel era, because PC has more support for third party solutions for specific problems. Apple's one size fits all approach just isn't compatible with professional usage. So they use it instead to make their other products look better.
I loved how, when opened, all you find is a Mac Studio mounted on a board. Apple really knows how to sell e-waste. I mean, that's what it is: a Studio that wouldn't sell, torn apart, mounted on a board in a bigger case, and sold at 2x the price so users think it must be better...
But that case is still a beauty. Perhaps not in external design, but construction quality and the quality of the materials used. They will bring quite a good price at the recycler.
@@blahorgaslisk7763how many times is it that the working professionals pull their Mac pros from under their table and cuddle with its case, admiring its "construction quality" 😂
@@blahorgaslisk7763 construction quality? There is no special construction on this thing lmao, no moving parts, no special trickery. It's an aluminium encasing a board in a pretty color scheme, and some metal tubes. That's all there is to it, there is nothing to improve the quality of, no moving parts that see actual wear over time, etc. Any machine shop could build this on an average Tuesday while mildly intoxicated.
@@akshatsingh9830 Yeah but to be fair it's consumer hardware. When threadripper 7000 comes out in a few months, the performance per watt and the performance in general (64 cores minimum) will blow it out of the water.
With Asahi Linux and Jeff Geerling's work on getting the AMDGPU Linux driver working on Arm, it may very well be possible to install a GPU in this machine.
Given that the GPU is one of the things that differentiates the different Apple Silicon chips, my tinfoil-hat conspiracy is that Apple kneecapped that functionality to enforce their product segmentation. I wonder if Asahi will be able to get external GPUs working.
Very likely not: the PCIe controller in M1 and M2 series chips has a fatal hardware flaw that prevents GPUs from working properly, so Apple has just decided not to support them at all in macOS.
Or in other words, the Mac Pro only exists to be a price anchor to sell Studios, and then the unlucky few idiots actually buying the Pro are just a lucky benefit.
@@ThatStella7922 Just another YouTube "expert" claiming some bullshit over the internet without any source or proof. PCIe has a spec: if they meet the spec, it works; if not, it won't. Considering that the rest of the demonstrated PCIe products worked, all that is needed for the graphics cards to work is correct drivers.
@@CS_Mango You could install Asahi Linux and get a really powerful ARM Linux system. Graphics are still a work in progress, but hey, the graphics card drivers for ARM mostly exist.
I'd argue that for a proper parallel, rather than an i7, a CPU from a workstation/server lineup like Epyc or Xeon might have been more appropriate. As a CompSci major, most of the professional workstations we use for large datasets have server CPUs on a workstation motherboard. This would also have let you get closer to the Mac Studio's 6k price tag.
That would also be because corps and universities have deals with suppliers who will simply go "pro stuff for pro work," when it's not always needed. As a professional 3D artist I can attest to that. I was supplied with an abysmal PC to do work on at a company (dual-socket Supermicro mobo, 2x 10-core Xeons, etc.). I later replaced the tower with my own build, which cost less, and a single consumer CPU gave me like 80% of the total rendering power while running laps around the Xeons in single-threaded workloads (basically everything other than rendering, including basic tasks).

At the end of the day, how much can you *really* expect from an ARM SoC from Apple, when the 4090 test shows how it absolutely slaughters that silly chip in GPU workloads? IMO the comparison with a "homebrew" consumer workstation is on point, because most professionals will either buy the Mac Pro, buy a prebuilt system from the likes of BOXX, or build their own. And if you know what you need and don't work with stupidly large datasets, the self-built option is the best bang for less money.

My 4090+3090 rig would run laps around the Mac Pro in GPU workloads at a fraction of the cost, and I just keep upgrading GPUs on an older system I built in 2016, something Apple's SoC screws you out of, because there's nothing you can swap or reuse (aside from maybe the SSD). I've got 128 GB RAM, so I'm not starving on that front either, and my rig is built from prosumer parts, no server stuff.

So... I get your point, but I think it's unnecessary to go with server/pro parts in most cases when a regular PC can already do everything these Macs can and more, and THAT is the comparison. If anything, we already know the PC, whether server or consumer parts, will win, simply because of the aforementioned upgrade/reuse path!
My PC alone is like that - I've got 9 drives in my PC, multiple GPUs - used to have 4, then 3, now 2 (due to their size, lol), extra front panel USB3, extra usb3 card for the back, a DVD RW drive...with space for more stuff...
I mean they used to use xeons, but the m2 is no xeon, it is no server cpu, it doesn't have ECC memory, it doesn't have many cores. You could compare it to an amd threadripper with 2 or 3 RTX 4090s but that would just be brutal and cruel XD
@@karpouzi And yet, for most of the stuff you do, you don't need an Epyc or M2 and can get much more performance for much less money. Aside from wanting an Apple because it's an Apple, what is the benefit here? Spending a lot of money to get less performance? And even if some people can bring up valid arguments... are they worth it? That's a question everybody has to answer for themselves. I don't like wasting money xD
I am glad I have a 2010 Mac Pro 5,1 running Linux now, as Apple has dropped support for it. With the 2010 Mac Pro I can upgrade everything easily. So far I have upgraded the CPUs, the GPU, RAM, and hard drives, and added a USB 3 card. It runs very well, and I saved a huge amount of money going with the 2010 Mac Pro that Apple killed off by dropping support and limiting its macOS compatibility.
I really loved my old original "Cheese Grater" Mac Pro. I was back working in audio and needed power, storage, and expansion slots. The old Cheese Graters weren't cheap, but they weren't crazy expensive like every Mac Pro since. As soon as Apple announced they were moving to an SoC, I knew the days of the Mac Pro were over. The Mac Studio is a good replacement for, as you said, about 90% of the people doing production or workstation-class computing. I doubt Apple will sell many of the new Mac Pros, and they'll use that as their excuse to kill it off.
That is precisely their reason for the overpriced turds. Kill off the internally expandable desktop (again). The company keeps looping back to Steve Jobs original ideal of a computer that has no expansion or upgrade capability, and selling it at a high price.
That was the friction between Steve Wozniak and Steve Jobs. Woz believed in open, expandable computers; Jobs, the control freak, wanted to dictate to users how they used their Apples. That's what killed NeXT Computer: Jobs finally was in a position to dictate everything, hardware, OS, and even 3rd-party software. By the time Jobs figured out his control-freak side was killing the company, it was too late.
Your old Mac Pro could probably out-perform this one in a few years if you upgraded it lmao. At least graphically and memory-wise. This thing sounds like e-waste.
I think this is one of the best videos made yet about the M2 Ultra Mac Pro.

I switched to Apple back in 2009 and really loved everything about it: the "playfulness" of the OS, which honestly seems to be gone now, as well as the "ease of use." That is still there in a big way, but I loved that Apple didn't constantly bother me with updates like Windows did. Nowadays, Apple seems to be catching up on that front too.

I have a 16-core 2019 Mac Pro, which I really do love. I can at least upgrade to some of the "newer" MPX GPUs. Even though I mostly use Logic Pro X, I do hit some limits with my GPU, which is only an AMD Radeon Pro W5700X 16 GB. The only use case for the "new" Mac Pro seems to be as some kind of big "storage/pro audio" computer; for anything else, the Mac Studio would be more than enough for, indeed, about 90% of the pros. I'm happy that I can at least still upgrade my GPU and my RAM. I can also put in SSDs and audio cards, and if I really wanted I could upgrade my CPU to a 28-core, but I don't think I'll need that.

I do think Apple Silicon is great, but I also think Apple should have left the Mac Pro Intel-based, with different components available to upgrade later on, at least until they figured out a comparable way to do this with Apple Silicon. It's actually pretty sad for pro Apple users. I count myself more as a "prosumer," but that doesn't mean it isn't sad for us either, because eventually I will want to upgrade my machine as well. Which means I only have a choice between the Mac Studio or building a Windows PC again, which I'm planning on doing anyway, since I want to start gaming on a gaming PC again.
Linus almost dropping the $7000 Mac off the table @13:16 because its back leg is hanging over the table edge; I expect nothing less, Linus. Keep up God's work.
Hey LTT, could you put some indicator on the graph bars themselves to show whether higher or lower is better, so that we don't have to check the corner every slide? Maybe a gray-to-color gradient towards the better side?
Support for NVidia and AMD graphics cards would have been a killer feature, though. Add a much bigger power supply and support for 4x RTX 4090 or something.
Wow Avid actually has support for something relatively close to launch? Amazing. I had to wait nearly 6 months for Pro Tools to be fully functional back when Big Sur came out.
Technically you don't need a PCIe expansion slot on the Mac Pro to leverage Avid's DSP card: you can link to a card installed inside one of their mixers over AVB-capable Ethernet. It'll do the processing on the mixer and feed it back to the Mac Pro. The Mac Studio and other AVB-capable Macs (which is all of them released since 2013 with built-in hardwired Ethernet) can do so as well. The difference between 10 gig and 1 gig networking here is just how many audio channels can be moved simultaneously.
What sample rate and bit depth are the audio channels? If my maths is correct, with 192kHz 32bit you could squeeze in around 160 per gigabit.. or 1400 mono CD quality channels 😅, or more sensibly, 640 48kHz 32bit channels. That said, obviously there has to be some overhead.. If I found the correct card, it supports 64 channels of 192kHz 32bit. Don't know anything about it. I haven't touched a mixer in more than a decade, and that was old fashioned analog.
@@phizc Recent Macs have the AVB features built in. You need to go into Audio MIDI Setup to create the virtual audio IO device for the system and then select it as an audio output. The typical max for audio over IP is 144 channels at 48 kHz, 32-bit over 1 Gbit. Due to chip implementations, another common limit is 64 channels at 96 kHz, 32-bit. I ran an AVB bandwidth calculator for a 10 Gbit link, and the most channels I could get it to produce was 1152 at 48 kHz, 32-bit. The gotcha with AVB isn't channel count but rather stream count, which can bundle up to 64 channels together; that is where overhead comes into play. The maximum quality of the protocol is 384 kHz at 32-bit with 64 channels in a single stream, though I don't know of any hardware that implements that for actual IO. There is also the overhead of maintaining network time clocks via IEEE 1588 Precision Time Protocol (PTP) or the AVB-interoperable gPTP variant. And of course you have to factor in Ethernet packet overhead, which eats up about 2.5% of bandwidth.
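For anyone who wants to reproduce the back-of-the-envelope numbers from the question above, the raw-payload math looks like this. Note this deliberately ignores AVB stream-count limits, PTP/gPTP clock traffic, and Ethernet framing overhead, which is exactly why real AVB implementations land well below these theoretical figures:

```python
# Theoretical channel counts from raw link payload alone.
# Real AVB caps are much lower (stream limits, framing overhead, PTP traffic).

def max_channels(link_bps: int, sample_rate_hz: int, bit_depth: int) -> int:
    """Ceiling on mono channels that fit in the raw link bandwidth."""
    return link_bps // (sample_rate_hz * bit_depth)

GIGABIT = 1_000_000_000

print(max_channels(GIGABIT, 48_000, 32))    # 651, the "around 640" figure
print(max_channels(GIGABIT, 192_000, 32))   # 162, the "around 160" figure
print(max_channels(GIGABIT, 44_100, 16))    # 1417, the "~1400 CD-quality" figure
```

The gap between these numbers and the practical 144-channel limit at 48 kHz/32-bit shows how much the protocol overhead actually costs.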
Regarding machine learning, TPUs are the state of the art compared to GPUs. As far as I know TPUs work fine on the Mac Pro, so you should test those since graphics cards only represent one side of the professionals using the machine
Labs graphs ... yeey 🥳 Kudos for the designs as well. One problem I noticed with the graphs, though: I always have to check whether it's a "lower is better" or "higher is better" type. Maybe having a distinct background color for each type would help differentiate them and make it easier to follow.
I forgot what the machine looked like before starting the video, and legitimately thought you were just being cheeky and photoshopping a cheese grater as the front of the case for a chunk of the video.
Very interested in the minimum price of a custom-built PC that could match/exceed the Mac's performance. Matching price is great, but matching performance for less is also good info.
@@scottydc Yes, that's great when you want to compare like pricing, but what about when you want to compare respective performance? If you keep the performance the same, or as close as possible, where do the price tags end up? It's just a different data set, changing the fixed variable from price to performance.
@@scottydc Just want to point out the PC had way less RAM, which is expensive. The price of a 13900K + 128 GB DDR5 + 4090 is about the same as the Studio, but it will definitely outperform it in GPU tasks. Apple really needs to double up the GPU performance.
A request for the Linus team: can you guys do a review of TensorFlow using the Mac's unified memory structure? It would be interesting to see what over 100 GB of RAM available to the GPU would do in comparison to the 80 GB in an A100 card. This would probably require running some of the larger NLP models to see if having the extra memory makes it easier to work with them versus having the extra CUDA cores to process the network.
In a way, the switch to all Apple integrated hardware is ruining Apple for me. I liked Apple when it had proprietary exterior and motherboard hardware, but used off the shelf parts. They used to be like PCs, but designed 1000% better, higher quality and very modular. Hell, the Mac Pro itself basically pioneered tool-less swappable bays and hardware upgrades, with beautiful interiors. Macbooks and iMacs had incredibly easy-to-access and quality RAM slots, SATA III bays and such. What made Apple great was that you got very good software in the OS, and premium hardware BUT, it was still just a PC and you could use off the shelf parts, for the most part, to upgrade components that matter the most... RAM and storage (and GPU, in Mac Pro's case). Now they just make these "lifeless" boxes that are untouchable and screw you into spending massive "Apple Tax" for ram and storage upgrades, upfront. I actually despise M-series for this reason. Give me x86 all day.
I'd love to see you install Asahi Linux and retry installing the 5700xt! Thanks to the low efficiency of Darwin Unix (what MacOS, iOS, etc use) you can get an extra 50%-100% performance uplift using Linux, not to mention drivers.
As a PC guy it was hard to reach the end with all the compromises, the cost, and random things that you just can't do. I already know Macs are... special. Thanks for the content!
This comment is to appreciate the editors and the things they add to these videos to make them just that much more entertaining every time. The cheese grater Mac effect got me.
You could install Asahi Linux, which brings GPU support, and see how DaVinci Resolve reacts. I suppose this would make it a great Linux workstation. Although I'm not sure whether Blackmagic offers an ARM version of DVR.
@@DoubleMonoLR And power efficiency, but yeah Apple just made an expensive storage-box. Arguably e-waste too with the soldered CPU and memory in addition to this.
FreeBSD has excellent driver support for devices like networking cards due to enterprise environments. Apple pulls in a large amount of BSD code for macOS, and drivers for relatively normal networking devices would absolutely be a part of that, enabling them to more easily adapt these drivers to their newer operating systems. It actually makes a lot of sense that the networking and storage devices worked just fine.
I'm inclined to agree with Marques at this point; the chassis is just using existing form factor and parts with little worry over efficient use of the cooling, space, I/O, etc. Good sales and more user feedback will drive a redesign with later releases, but not with the introduction.
The worst part of this for me is that I LOVE the look of this machine. Once some of these cases are out there in the trash I hope to buy one for cheap so I can gut it and put a REAL computer in the chassis
You got problems... I guess this is the same way some guys are into obese women. I've got a Mac server from 2006 and it makes this new one look like a crappy cheese grater (because it is)
@@dwayne_dibley As much as I appreciate the sentiment, that is a large cheese grater and it looks like shit, who agrees? Probably like 80% of people that have seen one. This opinion is unpopular and therefore more wrong.
The more I look at these machines, the more happy I am at having a real desktop computer. More performance, more flexibility, more customisation. It does come at the cost of higher power unfortunately, but at least I have a GPU.
Yeah this is dumb as hell. Assuming that regular PCI-E doesn't change within the next few years my 2006 Mac Pro might be able to out-perform this thing graphically with the default config just for having an upgradable gpu. Unless Linux is installed on it maybe, but then what's the point?! They made it obsolete before even revealing it...
Agreed. Depending on your power costs, the cost of owning a PC might still be lower than these Apple machines. Plus the longevity and possibility of upgrades in a PC is the real win, at-least for me.
Just to be clear, the angle that the Mac Studio and the Pro are going for isn't the home or "prosumer" market; it's the actual professional market. That is, scientific labs, research firms, Hollywood, etc. The kind of stuff that goes into a "workstation"-grade computer isn't the usual i7/i9 and a bunch of RAM like you or I can buy off the shelf; they usually run on Xeon Gold/Platinum workstation processors (or similar) with a spectacularly low error rate, and ECC RAM is mandatory. A workstation from Dell or HP can easily start in the $3000 range for specs that would look weak for a gaming PC. The difference is, your gaming PC or mine can make errors and we don't notice them except as a hiccup in gameplay, or maybe a line written to a log somewhere. If a scientific research computer does even a single calculation wrong, it can invalidate an entire experiment, so they have to have significantly tighter tolerances in how their parts operate. All that said, there's no question that the Mac Pro by itself is extraordinarily silly, seeming to offer modularity and upgradeability but being so tremendously proprietary and locked down as to be useless compared to the preconfigured Studio next to it. I just wanted to make it clear that something like a 3.0 GHz workstation processor isn't necessarily "worse" than a 3.5 GHz CPU found for cheaper in a home/gaming rig, when high-requirement professionals are shopping for accuracy and stability before speed.
If you want to run a large language model (like ChatGPT) locally, having 192GB of unified memory is exactly what you want. While the H100 chips do provide similar amounts of memory, good luck getting your hands on them. Even if you had the estimated $40k as suggested in this video, these chips are typically only sold in large batches e.g. $10M at a time. Wanting to run an LLM locally shouldn't be considered "highly specialized" since it is the only way you can get uncensored outputs (which have been proven to have higher quality). And you don't have to use all but 4GB as suggested by Linus, you can leave more aside and still run really big models that can't be run on any consumer level PC.
If the Mac Studio could load and run a 65B model, it would look pretty interesting to me. Did anyone run this kind of test? I wonder how the unified memory would perform. Maybe Linus would be interested in running a test.
It's probably not enough, considering you need an entire DGX (320GB) to run GPTs. But the point stands that if Apple had retained or increased the memory capacity of the previous model (e.g. 2TB), it would be a genuinely viable alternative to NVIDIA's DGX when the choke point is memory rather than compute power itself. I'd hope that maybe an M3 model might be able to hit that threshold, but seeing as Apple always gives Mac creators these half-assed options, I have no faith in Apple being interested in offering this.
I haven't run the numbers, but can't you get a not-so-high-end CPU with a 3090 (a 4090 is arguably not significantly better for LLM applications) and 256GB of system RAM for much cheaper than the M2 Mac Pro?
@@sumitmamoria The difference, though, is how the memory is regarded; there is a big difference in memory utility depending on whether it is system RAM or VRAM. Typically they are not interchangeable for workload demands, so if the GPU has a limited VRAM capacity (like the 24GB of the 3090), that would be the bottleneck even if you had 1TB of system RAM. So in this case, where you can actively use the system RAM as a GPU VRAM substitute on the Mac, with 192GB total capacity, I can see how that could be a benefit in niche use cases. Still makes you wonder why they capped it at 192 when they already know the entire selling point is that the system RAM can be used as a VRAM replacement; that might just have to do with the memory controller not being able to handle an "unlimited" amount of system RAM turned into VRAM. But I'm sure the Apple tinkerers would know a solution, so probably it's just performance anchoring that keeps people wanting more, which makes the next-gen product easier to sell.
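To put rough numbers on the VRAM-vs-unified-memory point (a sketch; the parameter count and bit widths are illustrative, and real runtimes add KV-cache and activation overhead on top of the weights):

```python
# Approximate weight-storage footprint of a dense LLM.
def model_footprint_gb(params_billions, bits_per_weight):
    total_bytes = params_billions * 1e9 * bits_per_weight / 8
    return total_bytes / 1e9  # decimal GB

fp16 = model_footprint_gb(65, 16)  # 65B model at fp16
int4 = model_footprint_gb(65, 4)   # same model, 4-bit quantized

print(fp16)  # 130.0 GB -> fits in 192GB of unified memory
print(int4)  # 32.5 GB  -> still over a 3090's 24GB of VRAM
```

Either way, the weights alone blow past a 24GB card, which is the bottleneck the comment describes.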
I have the "Macs Fan Control" app on my M1 Max Mac Studio. It allows you full control of fan speeds and gives you all the temperatures on your mac. I leave it in Auto but its nice to know temps. This model is massively over cooled. The highest temp I've seen on any sensor temp on any of the 28 available sensors is in the 140F range. The fans only run 1300 to 1400 RPM. I have 45 TB of external hard drives in addition to the 1TB apple drive. It does everything I want it to do without heating the room and you never hear the fans. I have a 27" Innocn Monitor using USB-C from one of the thunderbolt channels using the Thunderbolt display Color Profile. I really think this is the best color I have ever seen. You couldn't dynamite me out of my M1 Max Mac Studio. I also use and recommend Clean My Mac X.
Literally what I've been saying in the comments lmao. These things were obsolete before they even existed thanks to the soldered/proprietary... well everything lmao. In a few years the 2006-2019 ones might even get to out-perform these in some areas just because of their upgradability lol
@@309electronics5 and adding lifespan to the whole machine. If that SSD dies you can throw it away. Edit: it has replaceable SSDs but the other Apple devices don't, so yeah...
@@bartomiejkomarnicki7506 GPUs are not just used in games; you'd have to be an idiot to think that. Think AI, AutoCAD, video editing, etc. And yeah, more than the SoC's iGPU would speed things up.
I could see this being kinda useful for LLM work. 192gb of unified CPU/GPU memory could get really good performance. No way that justifies the price though 😂
But AMD will release the MI300A, which has 128GB of HBM3 unified memory and is much faster... and the MI300X, which has 192GB of VRAM... why should we buy the M2 Mac Pro for LLMs?
@@GH-vi7en Dunno, possibly light work on heavy models. Apple has their Neural Engine, which is decently powerful for inference, and the rest of the hardware has been shown to be pretty good too. Loading 70B-parameter models for inference and fine-tuning should be fast. Apple has never really been the 'value' sell, and it's quite clear with the M2 Mac Pro. Other hardware is faster and cheaper; it just doesn't run macOS.
IDK why Apple didn't support external GPUs. It might be because the ARM system is not compatible, but I've seen someone trying to put a GPU on a Raspberry Pi: the high-end one didn't work, and the low-end one works but has a lot of bugs and issues.
@@0Synergy So I think Apple has locked down the whole GPU thing, like on the iMac, which is only compatible with select GPUs; an RTX 2080 MXM won't work when you put it into a 2011 iMac.
If it has a PCIe connector, the GPU should just work (if there is a driver in the OS). I suspect the only issue is that the GPU doesn't have proper firmware to interface with AArch64 UEFI/BIOS. You could try attaching the GPU to a qemu-virt aarch64 machine, or to an x64 machine by PCI passthrough, and see what happens xD. ARM SoCs don't have the PC concept of a video adapter; they always require a full stack of drivers on the operating system side.
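A sketch of that qemu experiment on a Linux host using VFIO passthrough (the PCI address 0000:01:00.0, the firmware path, and guest.img are placeholders; this is host-specific configuration, so treat it as a starting point rather than a recipe):

```shell
# Detach the GPU from its host driver and hand it to vfio-pci
# (0000:01:00.0 is a placeholder PCI address; find yours with lspci).
echo 0000:01:00.0 | sudo tee /sys/bus/pci/devices/0000:01:00.0/driver/unbind
echo vfio-pci     | sudo tee /sys/bus/pci/devices/0000:01:00.0/driver_override
echo 0000:01:00.0 | sudo tee /sys/bus/pci/drivers_probe

# Boot an AArch64 guest with the GPU passed through and see whether
# the guest kernel's GPU driver can bring the card up.
qemu-system-aarch64 -M virt -cpu cortex-a72 -m 8G \
  -bios /usr/share/qemu-efi-aarch64/QEMU_EFI.fd \
  -device vfio-pci,host=0000:01:00.0 \
  -drive file=guest.img,format=qcow2
```

If the guest driver initializes the card, the limitation on the Mac is software, not the PCIe hardware.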
Great review. Mac Studio seems like the obvious choice in most cases. My understanding is that the plan was for the M2 Extreme chip to go into the Pro, but that chip didn't work out as planned. Maybe M3 will work as anticipated and offer a great deal better performance. So, this MacPro is some kind of stopgap, but not really worth the extra price because it doesn't deliver much compared to the Studio. However, if there was a way to easily swap out the current M2 Ultra for an M3 Ultra or M3 Extreme when they become available, then buying a MacPro now would make more sense. But, it just doesn't seem like Apple really has a clear vision for where this MacPro line is going.
The biggest difference is actually storage: when you price up a comparatively quiet and expandable external storage system you're talking 3-4 grand easy, and I've not found anything as quiet, and the options are not nearly as flexible. So that 3-4 grand over the Studio is actually not bad. I've stuck a 16TB NVMe RAID, 3.5" disks, and SSDs in mine, and there's room for more. All quiet as a mouse. To do that with the Studio would have cost me a little less, but nowhere near as quiet, and I don't need a bunch of external boxes cluttering up my desk. Along with things like DeckLink and Avid cards, it's actually a really neat, quiet, high-end system. GPU was only ever one element of expandability.
Is there any interview anywhere or somebody literally sits down with the people from Apple and the people from Microsoft and put the systems together head to head and have them defend their systems? That would be freaking awesome!
1) While the MacPro may indeed be a poor value, Linus didn't mention one key advantage it offers over the Studio: fast network communication. Highpoint, for instance, advertises a card specifically designed for this Mac Pro that uses a single x16 slot to give 28 GB/s sustained transfer speeds. 2) Linus said: "But unlike the SSDs used for the Mac studio, which are bare flash modules, Apple does in fact still allow users to upgrade these." That language suggests those in the Mac Pro aren't also bare flash modules, but they may be; it could simply be that, in the Mac Pro, they can be upgraded.
Thanks Guard.io for sponsoring this video. Protect yourself from those pesky scammers! Save 20% on a monthly subscription plus a 7-day trial of Guard.io at guard.io/LinusTechTips
As a LTT subscriber I can approve of this segue.
as a guardio user i can confirm you can save 20% on a monthly subscription
If you weren't joking about selling the m2 Mac pro I'd be interested.
@@tipsfadora177 I'd give 'em $3 for it
Can you make a video on how to protect against hacks in general. A lot of people in India are getting hacked and it would be useful to know
People who bought the previous generation of Mac Pro got screwed by Apple, just like buyers of the earlier ones that promised upgradability but really didn't deliver. Why do people who want to upgrade to the latest Mac Pro have to buy another chassis, power supply, and storage all over again?
It shouldn't come as a surprise though... The last upgradable Mac Pro was the Mac Pro 5,1. A machine I kept running for a decade flawlessly!
At least they can reuse the 999 wheels
Just give me that chassis by itself, and I can literally make a computer that is cheaper and more powerful.
Why do people who want to upgrade buy apple?
The Intel Mac Pro was fully upgradeable; Linus has a few videos about it.
I would like to see how far we could get running Asahi Linux, since GPU support on Arm Linux is way ahead of Windows or macOS.
Wondered the same
That would be an interesting idea. Maybe your next video.
upvote for this, definitely want to see that
This response should have been "how much?" I hope Linus gifts it to you at LTX.
Hi Jeff, I know how hard you tried running a GPU over PCIe... I think there's still no hope here.
Not only is that slot for the Thunderbolt card proprietary, it actually doesn't use PCIe at all. The only thing it has in common with MPX is the connector; the signals on it are completely different. MPX used 8 PCIe lanes for the two Thunderbolt controllers on the GPU and routed two/four DisplayPort streams from the GPU to the Thunderbolt controllers on the motherboard. This slot only routes Thunderbolt from the integrated controllers on the M2 Ultra to the USB-C ports, and likely some I2C from the PD controllers.
why does the Pro mac look like a cheese grater? they seem to be moving closer to that sort of computer look
Kinda crazy how much of a downgrade the new mac pro has for RAM. previous gen you could have up to 1.5 TB of RAM and now only a measly 192GB
@@juannaym8488 of course you do, the "old" MacPro was not aimed for casual Video editing and "some rendering". More RAM is always better in a lot of professional work
@@juannaym8488 professional workloads can eat up RAM like crazy. No one would get this much RAM if there was nothing that actually needed it.
But like, 1.5TB of RAM can only do basic tasks; I personally need around 69TB to open up enough Chrome tabs.
@@juannaym8488 I daily run out of RAM with 128GB lol
@@juannaym8488 Some video rendering programs require more than 16GB of VRAM, let alone system RAM. Remember, some Disney studios use these to render movies. They easily fill up more than 192GB of RAM.
The new labs graphs are so much more readable than the old ones. Keep up the good work.
I feel like this machine would have more of a reason to exist if it had dual M2 Ultras, like the old dual Power Macs
omg
quad fx apple edition
They tried that, and could not get it to work. Four "m" chips together might be achieved with M3 or M4
So, 4 m2 maxes? I was thinking that too
@@NutchapolSaland liquid cooled!
The biggest problem is that the power consumption of the M2 Ultra is as high as 300W, while the cooling capacity of the Mac Pro is only about 400W.
Apple has perfected the art of charging you more for less
and the sheeple are buying their electronic waste
Their old slogan was
“Does more
Cost less
It’s that simple”
Needless to say they don’t mention that anymore
@@mr.cookie8265 ah yes, nothing more ironic than some climate activist screaming about the environmental damage humans are doing to the planet from their iPhone
@@Robb_in_Oz not to mention the child labor
Apple figured out how to do that way back in the 80's with the original Macintosh...
I think the reason behind the Honey Badger SSDs showing up as external drives is that if the slot has hot-swap functionality, macOS treats any hot-swap-capable drive as external so it can be ejected by the user.
FALLOUT 4 HONEY BADGER YESSSSSSSSSSSS
Well, technically PCIe is hot-swappable, it's just not particularly convenient and also device and driver dependent.
@@physx_yt1062 And also motherboard firmware dependent.
That's true, I have a PCI-E 4x NVMe card and it always showed as removable drives in both Windows and MacOS.
Just hot swap it by... taking the entire cover off? Oh, right. You'd have to disconnect all of the wires from the back just to do that.
Hi Linus,
Good to hear Pro Tools HDX mentioned as it is a critical workflow for some people.
One correction if I may.
An Avid HDX card is not $500. It is $5000 and you need IO for it, which could be an extra couple of grand but their current MTRX Studio and MTRX II interfaces are north of another $5000, plus you need a copy of Pro Tools Ultimate, which is $600 a year on subscription and used to be $2000 for a permanent license. A fully loaded MTRX II (like mine) can be north of $20k.
Each HDX card gives 64 channels into Pro Tools, so you can stack up to three in a machine. A HDX card also doesn't stress the machine very much- it is PCIE 1.1, not PCIE 3.0 or 4.0.
Almost all of the expansion chassis operate at PCIE 3.0, not PCIE 4.0.
Sonnet chassis have a switch to operate at PCIE 2.0 to have HDX compatibility.
Yes, you can use an expansion chassis with HDX but we also have to be aware of the limitations of the Mac Studio over Thunderbolt.
This is especially an issue when using PCIE storage over Thunderbolt.
You simply can't get the sort of transfer speeds that you can get with internal PCIE expansion in the Mac Pro (or with the internal Studio SSD) on the Studio over Thunderbolt.
I've never had anything faster than 2700 MB/s, where the internal (M1 Ultra) Mac Studio drive will do around 6000 MB/s.
My Highpoint SSD7505 RAID card does abysmally in an expansion chassis at 1700 MB/s.
Otherwise, I agree with everything you've said- it is a slap in the face but for some it is still the best option.
Logic doesn't exist on PC so if, like I do, you compose in Logic and mix in Pro Tools HDX then the Mac Pro is still the way to go- you just have to suck up the cost.
One thing to consider is a lot of people who buy this machine are doing it through their business.
As cap ex it is written down over a number of years so yes $3000 extra for the case does sting but it means everything can be in the one box, no limitation with Thunderbolt transfer speeds.
I write articles on this sort of stuff for Production Expert and Pro Tools Expert and I have a 2019 Mac Pro and an M1 Ultra that I use with Pro Tools HDX.
Feel free to hit me up if you ever need clarification on this stuff.
Totally agree with you. And, as I mentioned in another comment, once you start adding all of that external stuff, you're no longer looking at a $3k savings. Not to mention that if you upgrade whenever Apple brings out a new machine, the old machine is still worth a good chunk of money, and you'll recapture some of that $3k. Have you seen that new Sonnet PCIe 4.0 eight NVMe card that's coming out? Pretty insane performance.
Also, I'm looking forward to the software products GPU Audio is going to be bringing out. Harnessing a lot of that GPU power for audio is going to be amazing.
I love the new benchmarks from LABS, easy the best iteration of graphics so far
They make so much more sense frfr
And I hate clickbait techtips for another clickbait video and wasting 15 minutes of my time.
In the end you nailed it. It's a Mac Studio with six Thunderbolt enclosures built in! And it still has an external Thunderbolt port, so it's more like paying 3k more to change the shape of your Thunderbolt ports to "wide".
Generally, yes. Especially with things like Akitio's dual-slot enclosure, which makes it a lot easier compared to hauling around multiple chonky enclosures just for a fiber card and a video capture card. Sonnet also makes a 3-slot one.
The exceptions to that would be the SSD card he mentioned and the 100Gb Ethernet card. Thunderbolt 4 only tunnels a PCIe 3.0 x4 link, so it can't handle more than 40 Gb/s of total bandwidth, with only about 32 Gb/s of that available for PCIe data. Some highly specialized cards, or any high-end GPU, toss more ones and zeros back and forth across the PCIe bus than that.
So if you need a 25GbE network card, an Avid card, and, say, a SAS card for a tape drive, get a Mac Studio and one Sonnet enclosure. If you're getting PCIe for 100Gb (or 40Gb) Ethernet and running multiple NVMe RAID cards, the Mac Pro actually does have an edge.
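The bandwidth ceiling behind that rule of thumb is easy to compute (a sketch; the 8 GT/s line rate and 128b/130b encoding are from the PCIe 3.0 spec, and real-world throughput lands lower still, as the ~2700 MB/s figure quoted elsewhere in these comments shows):

```python
# PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding.
def pcie3_gbps(lanes):
    return lanes * 8 * 128 / 130  # usable line rate in Gbit/s

tb_tunnel = pcie3_gbps(4)    # the x4 link Thunderbolt tunnels
slot_x16  = pcie3_gbps(16)   # an internal x16 slot in the Mac Pro

print(round(tb_tunnel / 8, 2))  # GB/s ceiling over Thunderbolt (~3.94)
print(round(slot_x16 / 8, 2))   # GB/s for an internal x16 slot (~15.75)
```

A multi-drive NVMe RAID card can saturate an x16 slot roughly four times over what any single Thunderbolt link can carry, which is why internal slots still matter for storage.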
mac studio but THICC
I would like to see a comparison of final cut renders. I suspect this is the audience they are aiming for.
The best thing about these Pro towers is they look pretty cool. They get even cooler when you buy an old one for $80 to put your PC in.
Cheesegraters
Where did you find one of those things for 80 bucks?!?
Locally. Guy has a whole bunch of 2009 mac pros for $65 now.
That's the best (and for me only) use for any Mac
@@pkt1213 I would pay you to ship me one lol
A second tier of DIMM RAM alongside the unified memory with external GPUs would be an interesting selling point for the mac pro
This in fact should be the future of computers in general. SoCs with embedded RAM will hit the PC market sooner rather than later. Having tiered RAM is just a natural step.
You want apple to give people upgradability? Ew! you poor or what?
The new Xeons with on-package HBM cache will support tiering and will be able to run purely with HBM or with added DRAM sticks.
@@big0sro0boss I don't give a sh!t about Apple, but I have to give them credit for what they did with the M1/M2. So now it's time to improve it, by adding openness and expandability, but of course not on their hardware.
@physx_yt1062 that sounds really cool!
@4:27 - Please give your editor a raise. That was brilliant 😂
Thanks to the outstanding efforts of the Asahi Linux team, running Linux on Apple Silicon is now possible. It would be interesting to compare the support for PCI expansion card between Linux and MacOS. This is pertinent especially since AMD GPUs are supported on ARM Linux.
And in the video it clearly shows macOS recognized the GPU; it just didn't have drivers, so it doesn't look like the hardware is a limitation. Maybe Apple didn't want to get AMD to make ARM drivers for macOS since they weren't selling variants with an AMD GPU, but perhaps there is some deeper issue. Would love to see what the Asahi team finds out; maybe it could lead to eGPU support over Thunderbolt, but that's probably just being hopeful.
@@johnatkinson1111 Since the AMD GPU was seen by the OS as a device, most likely AMD drivers could be written for macOS on M1/M2. Just like with Nvidia, Apple would still have to allow signing of the drivers, and we know they won't do that. This locks you into using whatever graphics Apple allows, and 3rd-party GPUs are not it.
The open source community is amazing. When I'm downloading these crazy big and thorough content mods for Skyrim or other RPGs, I'm always blown away at the work they put in for pennies on the dollar in donations compared to big studios. They're probably the majority reason software use and implementation keeps advancing so drastically.
Still kind of beyond stupid that Linux may be the only possible bet to make the Pro usable with gpus besides the integrated graphics only. Kinda defeats most of the purpose of these things imo. Looks more like Apple intended to make an expensive storage box rather than a computer.
@@MitchyTheKid4095 you can install unsigned drivers, you just have to disable SIP
It would be very interesting if the GPU/PCIE passthrough to a virtual machine would work on the Mac Pro.
Under Linux, it would probably work.
@@carbongrip2108 As long as you are a Stack Overflow-certified software enthusiast, or a regular on any Linux forum.
Yeah. I guess BootCamp isn't a thing anymore but any method of booting Windows on the Mac Pro could also potentially allow use of the GPU cards. There just won't be native OS support for the GPUs.
@@MagnumForce51 Windows on M2 chip?
Good point. I wonder if at any stage Parallels would allow GPU passthrough to Windows rather than its own 'virtual GPU'.
Holy cr*p, the test results at the different temperatures blew my mind. The level of thoroughness is impressive, absolutely incredible work from the team!
The Mac Pro just exists to make the Studio look good, and it's working incredibly well.
Looks like a cheese grater
@@bboness713 But can it grate my cheese?
@@loopyloops5652 No, iFixit tested it and it can't grate cheese.
It exists to take money from dumb fanboys
It's not even able to grate cheese which is really sad at $7K
Speaking of Pro Tools HDX support on the Mac Pro: I work in a recording studio where we had the issue of wanting to replace our old (like 2010 or something) Mac Pro, but needed a way to continue using our HDX card without spending a fortune. Funnily enough, it wasn't even due to performance; it was because the latest version of Pro Tools and some other drivers were no longer supported. We settled on a Mac Mini and a Thunderbolt chassis for the HDX and DeckLink cards, which cost us about 2 grand... and this tiny little Mac Mini handles cinema-length 5.1 surround sessions like it's nothing :)
They did that too in 2010 with the HD Accel; the hardware can't run on newer systems just because they don't release a driver.
16:10 I agree 100%. The lack of high amounts of RAM and extra GPUs basically means that if the Mac Studio isn't enough for your high-end needs, you'll need to look elsewhere.
Yep, time to pull the ripcord and take the plunge
If they even supported tiered RAM that would be good. Basically not primary system memory but DRAM as secondary memory.
Would love to see how this thing compares to a real ARM computer like one from ADLINK.
128 cores, drivers for Nvidia and AMD (and basically anything that already has standard Linux kernel driver support via PCIe).
SSD performance is way higher, especially if you use all 4 M.2 slots in a ZFS array with zstd compression.
And it can actually run steam and games, and server software, and virtual machines, and containers, and like actually useful things
If I remember correctly, it has a bmc too.
I'm going to upgrade my 96-core Ampere to 128 soon! Probably right after LTX
@@JeffGeerling I am super excited to see what else you do with your adlink system
I've been playing with arm64 and riscv64 vms for a while but haven't had the chance to get anything bigger than a rpi4 yet
@@JeffGeerling Any idea how well Intel GPUs work with the ARM server? Will you try it out?
Also it might be a bit much to ask, but how does it perform at ambient temperature close to 30C?
Macs do run Steam and games. Not that you would want to get this silly system for doing that.
@@ericwood3709 Apple Silicon Macs do not have drivers for add-in GPUs at all. They have a built-in iGPU on the SoC, but it is nowhere near the same performance level as a competent modern dedicated GPU with a giant heatsink, 3 fans, and 300+ watts of additional power running through it.
These Macs won't be able to properly run modern games without massive-core GPUs and specialized RAM (GDDR6/6X/7); the unified memory on the Mac is way slower than the RAM on a high-powered GPU.
I would like to see them put Linux on the Mac and the adlink (because drivers), put the same GPU in each, then run tests to see how the hardware handles without being kneecapped by poor OS development decisions
Even the old 5,1 "cheese grater" from 2012 can have 192GB of ECC DDR3.
Thank you so much for the clearer lab test results presented. Simple design template.
Nice Labs' Logo teaser there.
Very hyped for Labs' website public release.
I would love to suggest a feature: device or component comparisons for performance, thermals, power use, and cost.
An upgradability score [for laptops, brand-made systems like Apple's, and system integrators] which also includes hardware compatibility.
Anyways, love the video.
same logo they showed a couple of weeks ago.
Glad you mentioned the HDX cards - it's a very niche use-case card, but the pro audio community is in fact one of those groups that really likes/needs internal PCIe slots, and also one of those groups that mostly uses Macs. HDX cards work alright in expansion chassis, but are more reliable internally (which is obviously very important in a post-production environment).
In short, yes, I'll gladly take that lightly used mac pro off your hands lol
Yep, just commented mostly the same - the Pro line is the only computer I ever see at studios.
@@JonatanNoponen Most are probably still 5,1s.
Lol, music production can still easily be done on a 5,1, and the HDX card is like 20 years old and a lie; it was designed for gen 1/2 PCIe slots when USB/FireWire bandwidth was low. Modern computers don't need 'cards', even when doing Atmos music or post mixes. Even USB 3.2 has 20Gbit of bandwidth, and even a 1Gbit connection would give you around 500 tracks at 48kHz, never mind Thunderbolt.
The only reason people are forced to buy these cards is to make Avid interfaces look more 'professional' and 'complicated' and to milk the customer, because now instead of just a TB or USB cable you get to sell them an extra card (or two), overpriced proprietary DigiLink cables, and a useless support plan.
@@一本のうんち kudos, my man... Someone gets it!
I think the only studios that need HDX cards and would buy a Mac Pro instead of a Mac Studio are either enterprise studios that record something like symphonic orchestras, or Dolby Atmos dubbing stages that need to output 128 channels from Pro Tools. In other words, their main gear costs so much that the price difference between the Mac Pro and the Studio is completely negligible for them.
And this is exactly why I'm sticking with Windows systems. Yeah the systems aren't ARM based, and use more power, but they're SO much cheaper and actually allow for customization, which is something Apple has seemingly hated for a long long time. I wish we could get a solid translation layer to run x86 content without issues on ARM based Windows systems, but Apple DOES have an advantage in that for now. Regardless, the lower power draw isn't enough of an advantage when nearly all the other major components are massively in favor of Windows systems. Plus Davinci Resolve is amazing on Windows, and has been my primary editor for 7 or 8 years now.
I feel like the reason they said they don't work with GPUs is that they don't want to be compared to GPUs. Like, imagine you put in a 4090 or 7900 XTX and it gets absolutely destroyed... It's just saving face at this point.
Basically. Apple's reasons for not supporting eGPUs on Apple Silicon devices always struck me as bullshit.
@rokie. A 4090 is gen 4.
Anyone familiar with Mac knows that you want an M3, not the M2. Well, unless you suddenly can't buy chips from Taiwan.
It’s not quite saving face but it definitely helps. In synthetic benchmarks the M2 Ultra is around a 6900 XT, so about a generation behind on GPU while costing way more (mostly due to the complexity of making an M2 Ultra). Apple’s higher-end GPUs have some major benefits (way more VRAM, next to no transfer cost between CPU and GPU, etc.), but you need software that takes advantage of that and you still end up with something less powerful. That said, in the majority of Macs Apple sells, the GPUs are a big improvement over what you can usually get.
That said, what Apple Silicon loses in performance it makes up for in efficiency. I’ve rarely seen my M1 Ultra Studio go above 100W power draw or above 60°C even when being maxed out. You don’t appreciate how much of a difference that makes until you start realising how much colder it feels in your office in winter (or how much less hot it gets in summer) 😅
They also want you to be forced to buy a new one 3 years down the line instead of the 10 or 15 years offered by upgradable GPUs. There's no good reason for a workstation to rely on integrated graphics. Maybe someone just wants more monitors or something? It should be possible without having to install Linux.
This looks like an excuse for Apple to cancel the Mac Pro line altogether. It'd be interesting for modders to pick one of these up in the future to try to build a PC with the chassis though, or maybe LTT can try their hand at it? It'd take some fabrication but it'll be cool.
Apart from the nasty logo on the side I kinda like the look of the case. I'll never buy an Apple device though (unless their philosophy around lock-in and their own complete ecosystem changes to something more inter-operable with the rest of the desktop market). Maybe a case manufacturer will come up with something aesthetically similar in time (just different enough to avoid the lawsuit). All the black/white windowed boxes we have to choose from and I'm not a fan of any of them...
Apple won't cancel the Mac Pro line, because if they did, all the TV and film production companies would swap to PC while hammering Apple's management team in the trade press, as all the PCIe-based hardware used to run things like the LED wall of "The Volume" got shifted away from Apple hardware. And Apple management is addicted to the good publicity of having Mac Pros be the driving force behind live rendering and production.
If they wanted to cancel the Mac Pro line, they'd just do it. Nobody's forcing them to make these things. If anything I think it shows the opposite - Apple wants to keep the Pro line going even if the current state of Apple Silicon means it's not as useful as it could be yet. But consider that the M1 supported a max of 2 displays and 16GB of RAM. They're improving it, bit by bit, and I'm sure support for ever more RAM and proper external GPU support is on the horizon.
Why not just buy the intel version that comes in the same case if you're going for a windows machine? it costs less and will run windows out of the box.
That Mac is worth more as e-waste
One potential massive performance benefit of having the memory unified is that, in theory, there is no longer any time needed to copy buffers between the CPU and GPU. Having worked on optimizing programs with CUDA, the biggest slowdown was frequently tied to memory management alone. If Apple Silicon can actually share memory between GPU and CPU without needing extra copies/synchronization, this could mean that any program can benefit from GPU acceleration at any time. Without the memory management overhead, just raw compute power, this could be a significant game changer: you simply send the workload to the GPU. Traditional PCs with discrete GPUs won’t be able to achieve this until the day they also share memory with the CPU; the overhead of syncing the memory alone can outweigh the performance boost from GPU execution.
Having worked with Apple Silicon, can confirm it's great not having that bottleneck. The only problems are that a) you have to use Metal or a translation layer and b) 99% of programs are written with the assumption that copying stuff between memories is a necessary step and that kinda negates the whole thing.
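A toy model of the copy-overhead tradeoff described above (all the numbers here are illustrative assumptions, not measurements; `pcie_gbps` is a ballpark figure for a fast PCIe link):

```python
# When is GPU offload worth it? Only when the compute saving beats the transfer cost.
# All inputs are hypothetical, illustrative numbers.

def offload_pays_off(buffer_gb: float, cpu_s: float, gpu_s: float,
                     pcie_gbps: float = 25.0) -> bool:
    copy_s = 2 * buffer_gb / pcie_gbps  # copy to the GPU and back
    return gpu_s + copy_s < cpu_s

# 10 GB working set, GPU 5x faster on the kernel itself:
print(offload_pays_off(10, cpu_s=1.0, gpu_s=0.2))  # False: the copies eat the speedup
# Unified memory: the copy term drops to zero, so the same kernel wins:
print(offload_pays_off(0, cpu_s=1.0, gpu_s=0.2))   # True
```

The point is just that the break-even shifts entirely once the copy term disappears.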
I love it. A few years ago it felt like there was this "Don't mess with Linus, he'll be honest about the products" versus "but I want to see data, so waaaa, this thing he says sucks doesn't suck", and now there's Labs making real data, as close as possible. Love it.
I'd be curious to see if Asahi Linux can make any use of the GPUs considering that they were at least enumerated. Linux's amdgpu driver does have ARM support as AMD cards work in the Ampere Altra ARM systems. Some Nvidia cards also work in those systems as well.
I expect that they know they can’t compete directly and instead are going for highest margin halo products. AMD TR Pro and dual Epyc are simply incomparable for CPU workloads. The CXL GPUs are likewise unbeatable in the top end configs.
they are going for the secure environments.
@@TalesOfWar lol
@@TalesOfWar That power cost isn't even a drop in the bucket for the kind of companies who would be buying high end workstations like this.
Apple never managed to compete with them regardless. Very few people prefer Apple, even in the Intel era, because PC has more support for third party solutions for specific problems. Apple's one size fits all approach just isn't compatible with professional usage. So they use it instead to make their other products look better.
@@TalesOfWar do you know how much a commercial kitchen, like a pizza place uses for power?
I really appreciate that someone took the time to do a stop-motion animation on the extension card. It's not time efficient but it's beautiful.
Honey Badger don't give a crap! It does what it wants, and it wants to stop motion animate!
-Butler: Your Spaghetti Napoli Sir... Would you like some Parmesan?
-Me: Thank you Alfred. Could you also bring the new cheese grater?
I loved how, when opened, all you find is a Mac Studio mounted on a board. Apple really knows how to sell e-waste. I mean, that's what it is: Studios that wouldn't sell, torn apart, mounted on a board in a bigger case, and sold at 2x the price so users think it must be better....
But that case is still a beauty. Perhaps not in external design, but construction quality and the quality of the materials used. They will bring quite a good price at the recycler.
@@blahorgaslisk7763 how often do working professionals actually pull their Mac Pros from under the table and cuddle with the case, admiring its "construction quality" 😂
@@confused.cat. Every day
@@blahorgaslisk7763 Construction quality? There is no special construction on this thing lmao: no moving parts, no special trickery. It's aluminium encasing a board in a pretty color scheme, plus some metal tubes. That's all there is to it; there's nothing to improve the quality of, no moving parts that see actual wear over time, etc. Any machine shop could build this on an average Tuesday while mildly intoxicated.
@@confused.cat. It's all about the value at the recyclers, as that's where you send this e-waste in a couple of years.
All I learned from this video is that the windows machine is either pretty much equal or 10x better lol
at half the price too
And fewer problems, but it lags behind in efficiency ☹️
@@akshatsingh9830 Yeah but to be fair it's consumer hardware. When threadripper 7000 comes out in a few months, the performance per watt and the performance in general (64 cores minimum) will blow it out of the water.
@@akshatsingh9830 Bro, the price difference nullifies the power usage. And the fact that the PC is highly upgradable makes it a no-brainer.
I like how it's either the same or 10 times faster, with no in-between like twice the power
With Asahi Linux and Jeff Geerling's work on getting the AMDGPU Linux driver working on Arm, it may very well be possible to install a GPU in this machine.
Given that the GPU is one of the things that differentiates the different Apple Silicon chips, my tinfoil-hat conspiracy is that Apple kneecapped that functionality to enforce their product segmentation. I wonder if Asahi will be able to get external GPUs working
Very likely not, the PCIe controller in M1 and M2 series chips have a fatal hardware flaw that prevents GPUs from working properly, so Apple has just decided to not support them at all in macOS
@@ThatStella7922 Is there a link for that? Pointing to this flaw, that is
Or in other words, the Mac Pro only exists to be a price anchor to sell Studios, and the few idiots actually buying the Pro are just a bonus.
@@ThatStella7922 Just another YouTube "expert" claiming some bullshit over the internet without any source or proof. PCIe has a spec: if they meet the spec, it works; if not, it won't. But considering that the rest of the PCIe products demonstrated worked, all that's needed for graphics cards to work is correct drivers.
@@photobunny So in other words, the idea of price anchoring worked. Amazing
After watching over 200 Linus tech tips videos, I am proud to announce that I can now tell when the sponsor message is coming
I thought I was seeing double but some bot account copy and pasted your comment lmfao
That subtle cheese grater effect actually made me laugh lol 4:25
I wonder how it would compare with a 4090 running on a Linux-based kernel for IO operations, like compiling etc., where it's way faster than Windows.
You want to give apple a chance or just pull down their pants in front of everybody?
You know that it would totally destroy apple
At this price, why even bother going with Apple?
@@CS_Mango you could install Asahi Linux. You get a really powerful ARM Linux system. Graphics are still a work in progress, but hey, the graphics card drivers for ARM mostly exist.
@@CS_Mango blind fanboys exist every where so just to let them feel the pain
I'd argue that for a proper parallel, rather than an i7, a CPU from a workstation/server lineup like Epyc or Xeon might have been more appropriate. As a compsci major, most of our professional workstations used for large datasets use server CPUs on a workstation motherboard. This would have also let you get closer to the Mac Studio's $6k price tag.
I was thinking the same. What would be the x86 equivalent *for the same price.*
That would also be because corps and universities have deals with suppliers who will simply go "pro stuff for pro work", when it's not always needed.
As a professional 3D artist I can attest to that. Been supplied with an abysmal PC to do work on in a company (dual-socket supermicro mobo, 2x 10-core Xeons, etc). I later replaced the tower with my own build, which cost less and a single consumer CPU gave me like 80% of the total rendering power, while running laps around the xeons in single-threaded workloads (basically everything other than rendering - including basic tasks).
At the end of the day, how much can you *really* expect to do with an ARM SoC from apple, when the 4090 test shows how it absolutely slaughters that silly chip in workloads?
IMO the comparison with a "homebrew" consumer workstation is on-point, because most professionals will either buy the mac pro, buy a prebuilt system from the likes of BOXX, or build their own. And if you know what you need and don't work with stupid datasets - the self-built option is the best bang for less money.
My 4090+3090 rig would run laps around the mac pro in GPU workloads at a fraction of the cost - and I just keep upgrading GPUs on an older system I built in 2016 - something apple's SoC screws you over with, because there's nothing you can swap or reuse (aside from maybe the SSD). I've got 128GB RAM, so not like I'm starving on that front either - and my rig is built using prosumer parts, no server stuff.
So... I get your point, but I think it's unnecessary to go with server/pro parts in most cases when a regular PC system can already do everything these macs can + more, and THAT is the comparison.
And, if anything, we already know the PC - no matter what it is, server or consumer parts - will win, simply because of the aforementioned upgrade/reuse path!
My PC alone is like that - I've got 9 drives in my PC, multiple GPUs - used to have 4, then 3, now 2 (due to their size, lol), extra front panel USB3, extra usb3 card for the back, a DVD RW drive...with space for more stuff...
I mean, they used to use Xeons, but the M2 is no Xeon: it's not a server CPU, it doesn't have ECC memory, it doesn't have many cores. You could compare it to an AMD Threadripper with 2 or 3 RTX 4090s, but that would just be brutal and cruel XD
@@karpouzi And yet, for most of the stuff you do, you don't need an Epyc or M2 and can get much more performance for much less money.
Aside from wanting an Apple because it's an Apple, what is the benefit here? Spending a lot of money for less performance? ...Hmm, and even if some people can bring up valid arguments... are they worth it? That's a question everybody has to answer for themselves. I don't like wasting money xD
@@karpouzi custom loops don't cost anywhere near that.
I am glad I have a 2010 Mac Pro 5,1 running Linux now, as Apple has dropped support for the 2010 Mac Pro. With the 2010 Mac Pro I can upgrade everything easily. So far, I have upgraded the CPUs, the GPU, RAM, and hard drives, and added a USB 3 card. It runs very well, and I saved a huge amount of money going with the 2010 Mac Pro that Apple killed off by dropping support and limiting its macOS compatibility.
Yes, that 2009-2010 MP generation was the best design ever. So of course, Apple abandoned it and gave us less for more $$$.
I really loved my old original "Cheese Grater" Mac Pro. I was back working in audio and needed power, storage, and expansion slots. The old Cheese Graters weren't cheap, but they weren't crazy expensive like every Mac Pro since then. As soon as Apple announced they were moving to SoCs, I knew the days of the Mac Pro were over. The Mac Studio is a good replacement for, as you said, about 90% of the people doing production or workstation-class computing. I doubt Apple will sell many of the new Mac Pros and will use it as their excuse to kill it off.
Old Mac Pros were awesome. I used/learned on one of them in college.
That is precisely their reason for the overpriced turds: kill off the internally expandable desktop (again). The company keeps looping back to Steve Jobs' original ideal of a computer that has no expansion or upgrade capability, sold at a high price.
That was the friction between Steve Wozniak and Steve Jobs. Woz believed in open, expandable computers, and Jobs the control freak wanted to dictate to users how they use their Apples. That's what killed NeXT: Jobs finally was in a position to dictate everything, hardware, OS, and even 3rd-party software. By the time Jobs figured out his control-freak side was killing the company, it was too late.
Your old Mac Pro could probably out-perform this one in a few years if you upgraded it lmao. At least graphically and memory-wise. This thing sounds like e-waste.
@@doctahjonez you can grate your cheese with it
My mom: That's a nice cheese grater.
I think this is one of the best videos made yet about the M2 Ultra Mac Pro. I switched to Apple back in 2009. I really loved everything about it: the "playfulness" of the OS, which honestly seems to be gone now, as well as the "ease of use", which is still there in a big way. But I loved that Apple didn't constantly bother me with updates like Windows did. Nowadays, it seems Apple has been catching up when it comes to that.
I have a 16-core 2019 Mac Pro. Which i do really love. I can at least upgrade to some of the "newer" MPX GPU's. Even though i mostly use logic Pro X. I do encounter some limits when it comes to my GPU. Which is only an AMD Radeon Pro W5700X 16 GB.
The only use case for the "new" Mac Pro seems to be as some kind of big "storage/pro audio" computer. For anything else, the Mac Studio would be more than enough for, indeed, about 90% of the pro people.
I'm happy that i can at least still upgrade my GPU as well as my RAM. I can also put in SSD's and audio cards and if i really want i could upgrade my CPU to a 28-core. But i don't think i'll need that.
I do think that Apple silicon is great. But i also think that Apple should've left the Mac Pro intel based with different components available to be able to upgrade later on. At least, until they figured out a comparable way to do this with Apple silicon. It's actually pretty sad for the pro Apple users. I'm counting myself more as a "prosumer." But that doesn't mean it isn't sad for us either. Because eventually i will want to upgrade my machine as well. Which means i have only a choice between the Mac Studio or building a Windows pc again. Which i'm planning on doing anyway since i do want to start gaming on a gaming pc again.
Linus should take a look at Asahi Linux next. It’s basically Linux for M1 and M2 Macs and could make more GPU’s actually work on the Mac Pro
Right now it doesn't, and it's as yet unknown whether it feasibly can, in a manner worth doing, as stated by the Asahi team.
Linus almost dropping the $7000 Mac off the table @13:16 because its back leg is hanging over the table edge. I expect nothing less, Linus; keep up God's work.
Hey LTT, could you put some indicator on the graph bars themselves to show whether maximum is better or minimum is better? So that we don't have to check the corner every slide. Maybe a gray-to-color gradient towards the better side?
or maybe a little trophy and gold outline for the best performer in a given graph
Support for NVidia and AMD graphics cards would have been a killer feature, though. Add a much bigger power supply and support for 4x RTX 4090 or something.
Wow Avid actually has support for something relatively close to launch? Amazing. I had to wait nearly 6 months for Pro Tools to be fully functional back when Big Sur came out.
The pressure is real
Best thing about this video, seeing the new LABS logo and graphs. Pretty sweet.
Finally, we can show Apple the RIGHT way!
One word: cheese grater.
@@arron840 That's two words
This is the way
The right way?? Ditch the soldered crap and build functional and stylish PC's?
4090 and 13900KS with a whole set up that will turn that mac into road kill For that price?
Technically you don't need a PCIe expansion slot on the Mac Pro to leverage Avid's DSP card: you can link to that card installed inside one of their mixers over AVB-capable Ethernet. It'll do the processing on the mixer and feed it back to the Mac Pro. The Mac Studio and other AVB-capable Macs (which is all of them released since 2013 with built-in hardwired Ethernet) can do so as well. The difference between 10 gig and 1 gig networking here is just how many audio channels can be moved simultaneously.
What sample rate and bit depth are the audio channels?
If my maths is correct, with 192kHz 32-bit you could squeeze in around 160 per gigabit... or 1400 mono CD-quality channels 😅, or more sensibly, 640 48kHz 32-bit channels.
That said, obviously there has to be some overhead.
If I found the correct card, it supports 64 channels of 192kHz 32bit. Don't know anything about it. I haven't touched a mixer in more than a decade, and that was old fashioned analog.
@@phizc Recent Macs have the AVB features built in. You need to go into Audio MIDI setup to create the virtual audio IO device for the system and then select it as an audio output.
The typical max for audio over IP is 144 channels at 48 kHz 32 bit over 1 Gbit. Due to chip implementations, another common limit is 64 channels at 96 kHz, 32 bit.
I ran an AVB bandwidth calculator for a 10 Gbit link and the most channels I could get it produce was 1152 at 48 kHz 32 bit. The gotcha isn't channel count with AVB but rather stream count which can bundle up to 64 channels together. That is where overhead comes into play. Maximum quality of the protocol is 384 kHz at 32 bit and 64 channel in a single stream, though I don't know of any hardware that implements that for actual IO.
There is also the overhead of maintaining network time clocks via IEEE 1588 precision time protocol (PTP) or the AVB interoperable gPTP variant. And of course you have to factor in the Ethernet packet overhead which eats up about 2.5% of bandwidth.
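The raw-bandwidth side of those numbers is easy to sanity-check (a sketch; it uses the ~2.5% Ethernet framing overhead mentioned above and ignores AVB's stream-reservation limits, which is why real gear caps out far lower):

```python
# Upper bound on audio channels over Ethernet after ~2.5% framing overhead.
# Real AVB deployments are limited further by stream reservation (SRP).

def channel_ceiling(link_bps: float, rate_hz: int, bits: int = 32,
                    framing_overhead: float = 0.025) -> int:
    usable_bps = link_bps * (1 - framing_overhead)
    return int(usable_bps // (rate_hz * bits))

print(channel_ceiling(1e9, 48_000))   # raw ceiling ~634; typical AVB max is 144
print(channel_ceiling(10e9, 48_000))  # raw ceiling ~6347 vs the 1152 quoted above
```

The gap between the raw ceiling and the practical figures shows how much the stream/clocking machinery costs.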
Regarding machine learning, TPUs are the state of the art compared to GPUs. As far as I know TPUs work fine on the Mac Pro, so you should test those since graphics cards only represent one side of the professionals using the machine
I'm really curious to see whether the AMD GPU recognized on this Mac would run properly under Asahi Linux with the Mesa driver
It won't. The M2 chip simply does not support off-chip memory, and as such, has no (non-hacky) way to access its VRAM to put stuff into it.
@@tonetraveler992 macOS recognised the GPU, that's a big step. I'm sure they will get it running in Asahi
@@racingweirdo It did not, it just detected a PCIe device
Labs graphs... yeey 🥳 Kudos for the designs as well. One problem I noticed with the graphs, though, was that I always have to check if it's a "lower is better" or "higher is better" type. Maybe having a special background color for each type would help differentiate them and make it easier to follow.
I forgot what the machine looked like before starting the video, and legitimately thought you were just being cheeky and photoshopping a cheese grater as the front of the case for a chunk of the video.
Very interested in the minimum price of a custom-built PC that could match/exceed the Mac's performance. Matching price is great, but matching performance for less is also good info.
The entire point of mac is demonstrate that you are able to afford that and therefore better than others. 🤪
the whole point of this pc is to show how much more expensive the mac is. the pc is less than both macs featured in this video by over $1000
@@scottydc Yes, that's great when you want to compare like pricing, but what about when you want to compare respective performance? If you keep the performance the same, or as close as possible, where do the price tags end up? It's just a different data set, changing the fixed variable from price to performance.
@@scottydc Just want to point out the PC had way less ram, which is expensive. The price of a 13900k + 128GB DDR5 + 4090 is about the same as the studio, but will definitely outperform in GPU tasks. Apple needs to really double up the GPU performance.
@@henriberger5420 128 GB of DDR5-6400 is around $450.
If you need 192GB it's around $600.
RAM isn't expensive these days.
A request for the Linus team: can you guys do a review of TensorFlow using the Mac's unified memory structure? It would be interesting to see what over 100 GB of RAM in a GPU would do in comparison to the 80 GB in an A100 card. This would probably require running some of the larger NLP models, to see if having the extra memory makes it easier to work with them vs. having the extra CUDA cores to process the network.
The vram is almost equal to the Ram, with some overhead taken by OS and applications.
Or even better, a PyTorch benchmark test (LLM training and inference).
In a way, the switch to all Apple integrated hardware is ruining Apple for me. I liked Apple when it had proprietary exterior and motherboard hardware, but used off the shelf parts. They used to be like PCs, but designed 1000% better, higher quality and very modular. Hell, the Mac Pro itself basically pioneered tool-less swappable bays and hardware upgrades, with beautiful interiors. Macbooks and iMacs had incredibly easy-to-access and quality RAM slots, SATA III bays and such. What made Apple great was that you got very good software in the OS, and premium hardware BUT, it was still just a PC and you could use off the shelf parts, for the most part, to upgrade components that matter the most... RAM and storage (and GPU, in Mac Pro's case).
Now they just make these "lifeless" boxes that are untouchable and screw you into spending massive "Apple Tax" for ram and storage upgrades, upfront. I actually despise M-series for this reason. Give me x86 all day.
I'd love to see you install Asahi Linux and retry installing the 5700xt! Thanks to the low efficiency of Darwin Unix (what MacOS, iOS, etc use) you can get an extra 50%-100% performance uplift using Linux, not to mention drivers.
Low efficiency of Darwin Unix? Do you mean Mach 3 or the BSD subsystem?
As a PC guy it was hard to reach the end with all the compromises, the cost, and random things that you just can't do. I already know Macs are... special. Thanks for the content!
As a PC guy I appreciate the video and your comment. I think all Mac owners are special too.
0:28 The new model looks like a cheese grater from the side
Uuuh LABS-branded/styled graphs, I like it! Seems like the team over there is starting to really get going 💪🏽
I wonder if Asahi Linux will eventually add support for GPUs, THAT would be cool!
This comment is to appreciate the editors and the things they add to these videos to make them just that much more entertaining every time. The cheese grater Mac effect got me
You could install Asahi Linux, which brings GPU support, and see how DaVinci Resolve reacts. I suppose this would make it a great Linux workstation, although I'm not sure whether Blackmagic offers an ARM version of Resolve.
isn't Asahi under heavy development and not suitable for the general public just yet?
What's the point though, when the PC is already faster & cheaper? The only real use for the Mac Pro seems to be that it runs macOS.
@@DoubleMonoLR And power efficiency, but yeah Apple just made an expensive storage-box. Arguably e-waste too with the soldered CPU and memory in addition to this.
@@DoubleMonoLR I never doubted that. But when you want to see whether you can get a GPU to run on that hardware, go Linux.
Freebsd has excellent driver support for devices like networking cards due to enterprise environments.
Apple pulls in a large amount of BSD code for macos, and drivers for relatively normal networking devices would absolutely be a part of that, enabling them to more easily adapt these drivers to their newer operating systems
It actually makes a lot of sense that the networking and storage devices worked just fine.
I thought macOS had moved away from the FreeBSD codebase a long while ago
@@No-mq5lwit did, also has its own proprietary network drivers
FreeBSD only has good network card support for Intel NICs. The drivers for Broadcom are average at best, to absolute crap at worst.
I'm inclined to agree with Marques at this point; the chassis is just using existing form factor and parts with little worry over efficient use of the cooling, space, I/O, etc. Good sales and more user feedback will drive a redesign with later releases, but not with the introduction.
The worst part of this for me is that I LOVE the look of this machine. Once some of these cases are out there in the trash I hope to buy one for cheap so I can gut it and put a REAL computer in the chassis
You got problems... I guess this is the same way some guys are into obese women.
I've got a Mac server from 2006 and it makes this new one look like a crappy cheese grater (because it is)
@@andreivaughn1468 if we all liked what you like then we’d all just be you. Personally I’m glad that’s not the case.
@@dwayne_dibley As much as I appreciate the sentiment, that is a large cheese grater and it looks like shit, who agrees? Probably like 80% of people that have seen one. This opinion is unpopular and therefore more wrong.
The more I look at these machines, the more happy I am at having a real desktop computer. More performance, more flexibility, more customisation. It does come at the cost of higher power unfortunately, but at least I have a GPU.
Yeah this is dumb as hell. Assuming that regular PCI-E doesn't change within the next few years my 2006 Mac Pro might be able to out-perform this thing graphically with the default config just for having an upgradable gpu. Unless Linux is installed on it maybe, but then what's the point?! They made it obsolete before even revealing it...
Agreed. Depending on your power costs, the cost of owning a PC might still be lower than these Apple machines. Plus the longevity and possibility of upgrades in a PC is the real win, at least for me.
For real, imagine buying Apple 🤣
Just to be clear, the angle that the Mac Studio and the Pro are going for isn't the home or "prosumer" market; it's the actual professional market. That is, scientific labs, research firms, Hollywood, etc. The kind of stuff that goes into a "workstation"-grade computer isn't the usual i7/i9 and a bunch of RAM like you or I can buy off the shelf; they usually run on Xeon Gold/Platinum workstation processors (or similar) with a spectacularly low error rate, and ECC RAM is mandatory. A workstation from Dell or HP can easily start in the $3000 range for specs that would look weak for a gaming PC. The difference is, your gaming PC or mine can make errors and we don't notice them except as a hiccup in gameplay, or maybe a line written to a log somewhere. If a scientific research computer does even a single calculation wrong, it can invalidate an entire experiment, so they have to have significantly tighter tolerances in how their parts operate.
All that said, there's no question that the Mac Pro by itself is extraordinarily silly, seeming to offer modularity and upgradeability but being so tremendously proprietary and locked down as to be useless compared to the preconfigured Studio next to it. I just wanted to make it clear that something like a 3.0 GHz workstation processor isn't necessarily "worse" than a 3.5 GHz CPU found for cheaper in a home/gaming rig, when high-requirement professionals are shopping for accuracy and stability before speed.
The subtle grated cheese while rubbing the front of the case was golden lol
If you want to run a large language model (like ChatGPT) locally, having 192GB of unified memory is exactly what you want. While the H100 chips do provide similar amounts of memory, good luck getting your hands on them. Even if you had the estimated $40k as suggested in this video, these chips are typically only sold in large batches e.g. $10M at a time. Wanting to run an LLM locally shouldn't be considered "highly specialized" since it is the only way you can get uncensored outputs (which have been proven to have higher quality). And you don't have to use all but 4GB as suggested by Linus, you can leave more aside and still run really big models that can't be run on any consumer level PC.
If the Mac Studio could load and run a 65B model, it would look pretty interesting to me. Has anyone run this kind of test? I wonder how the unified memory would perform. Maybe Linus would be interested in running a test.
It's probably not enough, considering you need an entire DGX (320GB) to run GPTs. But the point stands that if Apple had retained or increased the memory capacity of the previous model (e.g. 2TB), it would be a substantially viable alternative to NVIDIA's DGX when the choke point is memory rather than compute power itself. I'd hope that maybe an M3 model might be able to hit that threshold, but seeing as Apple always gives Mac creators these half-assed options, I have no faith in Apple being interested in offering this.
I haven't run the numbers, but cant you get a not-so-high-end CPU with 3090 (4090 is arguably not significantly better for LLM applications) and 256GB system RAM for much cheaper than M2 Mac Pro?
May I ask which models you're referring to? Everything seems to be below ChatGPT's standard (even though it's highly censored).
@@sumitmamoria The difference, though, is how the memory is treated: there's a big gap in utility between system RAM and VRAM. Typically they aren't interchangeable, so if the GPU has limited VRAM (like the 24GB on the 3090), that's the bottleneck even if you have 1TB of system RAM. On the Mac, where the GPU can actively use system memory, the 192GB total capacity can be a real benefit in niche use cases. It still makes you wonder why they capped it at 192GB if the whole selling point is that system RAM doubles as VRAM; that might just be the memory controller not being able to handle an "unlimited" amount of system RAM turned into VRAM. But I'm sure the Apple tinkerers would find a workaround, so it's probably just performance anchoring that keeps people wanting more, which makes the next-gen product easier to sell.
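The "does it fit in VRAM?" question the thread keeps circling is easy to sketch with back-of-the-envelope math. This is my own rough estimate, not from the video: weights take parameter count times bytes per parameter, and the ~20% overhead factor for KV cache and runtime buffers is an assumption.

```python
def model_memory_gb(params_billions: float, bits_per_weight: int,
                    overhead: float = 1.2) -> float:
    """Rough memory estimate for running an LLM: raw weight size
    plus a ~20% allowance for KV cache, activations, and buffers."""
    weight_bytes = params_billions * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

# A 70B model at fp16 needs roughly 168 GB: fits in 192 GB of unified
# memory, nowhere near a 24 GB RTX 3090.
print(round(model_memory_gb(70, 16)))  # 168
# The same model quantized to 4 bits drops to roughly 42 GB.
print(round(model_memory_gb(70, 4)))   # 42
```

Which is why 192GB of unified memory is interesting at fp16, while a 24GB card only gets there with aggressive quantization or smaller models.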
For some reason I didn't see the part where they put the GPU in the Mac Pro
I have the "Macs Fan Control" app on my M1 Max Mac Studio. It gives you full control of fan speeds and shows all the temperatures on your Mac. I leave it in Auto, but it's nice to know the temps. This model is massively over-cooled. The highest temp I've seen across the 28 available sensors is in the 140F range, and the fans only run 1300 to 1400 RPM. I have 45 TB of external hard drives in addition to the 1TB Apple drive. It does everything I want without heating the room, and you never hear the fans. I have a 27" Innocn monitor connected over USB-C from one of the Thunderbolt channels using the Thunderbolt Display color profile, and I really think this is the best color I have ever seen. You couldn't dynamite me out of my M1 Max Mac Studio. I also use and recommend Clean My Mac X.
Fill it up with P5800X Optanes and see if that improves memory-swap speeds
the very essence and pinnacle of e-waste, well done Apple !
Literally what I've been saying in the comments lmao. These things were obsolete before they even launched thanks to the soldered/proprietary... well, everything. In a few years the 2006-2019 ones might even outperform these in some areas just because of their upgradability lol
Ty for all the testing!!!!
13:20
There. Saved you 13 minutes.
I still don’t understand why Apple doesn’t allow GPUs in the M2 Mac Pro
Because Apple doesn't like upgradability; they even solder the SSD to the board on their Macs. It's them wanting the most money
well, the fact that you're a gamer, explains a lot
@@309electronics5 and limiting the lifespan of the whole machine. If that SSD dies you can throw it away.
Edit: it has replaceable SSDs but the other Apple devices don't so yeah...
@@bartomiejkomarnicki7506 GPUs aren't just for games, you gotta be an idiot to think that. Think AI, AutoCAD, video editing, etc. And yeah, anything more than the SoC's iGPU would speed things up
@@crashniels it's a big "if", have you seen how long they last when endurance tested?
Having GPU support would make this Mac Pro make much more sense
I could see this being kinda useful for LLM work. 192gb of unified CPU/GPU memory could get really good performance. No way that justifies the price though 😂
but AMD is releasing the MI300A, which has 128GB of HBM3 unified memory and is much faster... and the MI300X, which has 192GB of VRAM... why should we buy an M2 Mac Pro for LLMs?
@@GH-vi7en Dunno, possibly light work on heavy models. Apple has their Neural Engine, which is decently powerful for inference, and the rest of the hardware has been shown to be pretty good too. Loading 70B-parameter models for inference and fine-tuning should be fast. Apple has never really been the 'value' sell, and that's quite clear with the M2 Mac Pro: other hardware is faster and cheaper, it just doesn't run macOS.
@metaprotium Yes. The M2 is a great chip for energy efficiency, but that's all.
I watched this whole video and I’m still the most shocked by the fact that they put a Magic Mouse as the default configuration for this $7000 computer
13:17 That one foot hanging off the table really scared me. It reminded me of Linus Drop Tips
IDK why Apple doesn't support external GPUs. It might be because the ARM platform isn't compatible; I've seen someone try to put a GPU on a Raspberry Pi, and the high-end card didn't work at all, while the low-end one worked but with a lot of bugs and issues.
Well, according to the video, the GPU gets recognized but doesn't work because of a lack of drivers
You answered your own question.
No, you can do it; the ARM server CPUs that appeared recently work with GPUs.
@@0Synergy So I think Apple has locked down the whole GPU thing, like on the iMac, which is only compatible with select GPUs; an RTX 2080 MXM won't work in a 2011 iMac.
If it has a PCIe connector the GPU should just work (if the OS has a driver). I suspect the only issue is that the GPU doesn't have the proper firmware to interface with an AArch64 UEFI/BIOS. You could probably try attaching the GPU to a qemu-virt aarch64 machine, or to an x64 machine via PCI passthrough, and see what happens xD ARM SoCs don't have the PC concept of a video adapter; they always require a full driver stack from the operating system.
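For anyone curious, the passthrough experiment described above would look roughly like this on a Linux host. These are illustrative commands only: the PCI address `0000:01:00.0`, the vendor/device IDs `10de 2204`, and the firmware filename are placeholders, and it obviously needs IOMMU support and a real card.

```shell
# Detach the GPU from its host driver and hand it to vfio-pci
# (replace 0000:01:00.0 and 10de 2204 with your card's values).
echo 0000:01:00.0 > /sys/bus/pci/devices/0000:01:00.0/driver/unbind
echo 10de 2204 > /sys/bus/pci/drivers/vfio-pci/new_id

# Boot a bare aarch64 "virt" machine with the GPU passed through and
# see whether its option ROM/firmware initializes under AArch64 UEFI.
qemu-system-aarch64 \
  -machine virt -cpu cortex-a72 -m 8G \
  -bios QEMU_EFI.fd \
  -device vfio-pci,host=0000:01:00.0 \
  -nographic
```

If the guest OS has a driver but the card's firmware never brings the device up, that would support the theory that firmware, not PCIe itself, is the blocker.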
Over the weekend, I spoke to a guy who does video production for a living. They had been Mac for years, but now run all PC hardware with Linux.
I'd buy it for $1
Am I the only one that sees a cheese grater? 😭
No you are not the only one
Everyone does.
Nope
No
I appreciate that you quoted Franklin :) He's quite the informed guy.
Great review. Mac Studio seems like the obvious choice in most cases. My understanding is that the plan was for the M2 Extreme chip to go into the Pro, but that chip didn't work out as planned. Maybe M3 will work as anticipated and offer a great deal better performance. So, this MacPro is some kind of stopgap, but not really worth the extra price because it doesn't deliver much compared to the Studio. However, if there was a way to easily swap out the current M2 Ultra for an M3 Ultra or M3 Extreme when they become available, then buying a MacPro now would make more sense. But, it just doesn't seem like Apple really has a clear vision for where this MacPro line is going.
4:26 Linus has radiation coming off of his hand
The biggest difference is actually storage. When you price up a comparably quiet and expandable external storage system you're talking 3-4 grand easy, and I've not found anything as quiet, nor are the options nearly as flexible. So that 3-4 grand over the Studio is actually not bad. I've stuck a 16TB NVMe RAID, 3.5" disks, and SSDs in mine, and there's room for more. All quiet as a mouse. Doing that with the Studio would have cost me a little less but been nowhere near as quiet, and I don't need a bunch of external boxes cluttering up my desk. Along with things like Decklink and Avid cards, it's actually a really neat, quiet, high-end system. GPU was only ever one element of expandability.
Is there any interview anywhere where somebody literally sits down with the people from Apple and the people from Microsoft, puts the systems head to head, and has them defend their systems? That would be freaking awesome!
This product is the direct consequence of Steve no longer being with us.
Tim Cook has officially cooked me
Linus, thanks for doing the review on the Apple Cheese Grater! I do not believe I will buy one, as I have enough appliances in my kitchen! Cheers!
1) While the Mac Pro may indeed be a poor value, Linus didn't mention one key advantage it offers over the Studio: fast internal I/O. HighPoint, for instance, advertises a storage card specifically designed for this Mac Pro that uses a single x16 slot to deliver 28 GB/s sustained transfer speeds.
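That 28 GB/s figure is plausible given what a x16 slot can theoretically carry. A quick sanity check, using the standard PCIe 4.0 per-lane rate (16 GT/s with 128b/130b encoding); the 28 GB/s number itself comes from the comment above, not from my own testing:

```python
def pcie_bandwidth_gbps(lanes: int, gt_per_s: float,
                        enc_num: int = 128, enc_den: int = 130) -> float:
    """Theoretical one-direction PCIe bandwidth in GB/s:
    lanes * raw transfer rate * encoding efficiency, bits -> bytes."""
    return lanes * gt_per_s * (enc_num / enc_den) / 8

gen4_x16 = pcie_bandwidth_gbps(16, 16.0)
print(round(gen4_x16, 1))           # ~31.5 GB/s theoretical ceiling
print(round(28 / gen4_x16 * 100))   # 28 GB/s sustained is ~89% of it
```

So a claimed 28 GB/s sustained sits right under the Gen4 x16 ceiling, which is exactly the kind of throughput Thunderbolt enclosures can't approach.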
2) Linus said: "But unlike the SSDs used for the Mac studio, which are bare flash modules, Apple does in fact still allow users to upgrade these." That language suggests those in the Mac Pro aren't also bare flash modules, but they may be; it could simply be that, in the Mac Pro, they can be upgraded.