i think it’s bc if they do, then there would be significantly less sales of the Macbooks. there’s only so much they can do for iPads which wouldn’t affect the “necessity” of having a Macbook.
I wanna point out that although MKBHD says that the RTX3090 is meant for games, although it is marketed to be a gaming graphics card, it’s actual intended capability is for both 3D rendering and gaming. I know many people who deal with 3D art or 3D modeling use RTX3090s or 3080s to do their work and at the same time run the games that they’re working on.
Exactly this. RTX 3090 is more efficient and performs way faster than the "server grade" cards for GPU rendering. Games don't even utilize NVLink and the 24GB of VRAM the card offers.
@@magnotec. Rendering isn't always about VRAM. If you check the latest Octanebench, 3090 & 3080ti are topping the charts in terms of render speed. The closest thing is the A6000 which costs way more than the two (at MSRP).
I’ll give apple props for sustainability when they let us upgrade the damn machines. Still an incredible tool, the energy specs are also great but come on… without the ability to upgrade the device over time there’s a ton of eventual e-waste there.
@@edgarwalk5637 you're right, it's absolutely nuts. I get they need to integrate everything to keep it small but completely locking everyone out and even going so far as using glue so it's obvious if you try to fix it yourself is just wrong. While I love the size I'd take a bit bigger footprint with the ability to upgrade any day and I'm betting all but the hardest core, "apple can do no wrong" fans would agree.
@@morfelod it is definitely more than 1%. The second hand market thrives on PCs being upgradable. For example, an old Dell Optiplex that was only equipped with 8 GB of RAM could be upgraded with an additional stick of RAM and a GPU to make a decent gaming PC. Apple also used to let you upgrade by having normal Philips screws and standard parts, meaning you could upgrade down the line if you needed more RAM or need to replace the battery. That would potentially extend the life of the laptop. With how frequent some companies upgrade their systems, I’d rather have someone repurpose the machine rather than just throwing it in a landfill.
@@morfelod my entire pc is built off of Facebook market second hand parts. Running a 7 year old graphics card and I’m still able to play modern titles.
The fact that this $4000 computer is even being COMPARED AT ALL or MATCHING the $25000 Mac Pro of a few years ago is just insanity I don't think people are really taking that in enough
@@johnbuscher Didnt this hackintosh build took almost a year or over a year to get done because there were a lot of problems and it still only worked with specific hardware?
@@WestCoastAce27 their chips were behind intel & amd at that time. Only recently they could catch up because of intel’s stagnation at the top of the market
3090 is not exclusively for gaming, it is a quadro replacement. That means that it is for Rendering, by being the best for gaming at the same time. That's the reason that it has 24 GB of VRAM
It's basically where the Titan cards used to be. Nvidias highest end GPU, for Enthusiasts/ professionals, who don't need special Quadro features. Really not for gaming, especially with it's 24gbs of VRAM.
It really isn't a replacement for the Quadro. It's a low tier budget workstation card at best like the Titan was. A Quadro will still be better if for no other reason than the ECC memory and massively optimised studio drivers.
@@theunkownmr.562 I wouldn't say he's "defending" Apple, he just really doesn't know much about gaming. I tend to dismiss anything he says related to gaming, I'm pretty sure the only game he plays is FIFA.
It's possible the Studio gets hot under load and throttles down on heavy computing tasks like exporting the video due to the small inclosure compared to the Mac Pro. And Yeah, Apples packaging is insane. Just go the new iMac 24 and still blown away by the box of all things. Excellent review. I always enjoy watching your channel.
MaxTech did a teardown, and the screws to access the internals are underneath the rubber ring on the bottom. There are of course a lot more screws and connectors to really get into it. But the SSD isn't soldered, and there are actually two SSD slots. I assume that the larger storage options (4 TB and 8 TB) use both, but the 1 TB machine he disassembled only uses one of the SSD slots. They're not standard NVMe slots, but OWC will probably offer upgrades at some point.
Thanks for the hint! I watched that video and I got to say that it looks like Hell to do a simple thermal compound replacement or to clean those fans and fins.
the amount of leaps Apple has been able to make within a single "generation" of chip is REALLY impressive. I'm very excited to see how their chips improve in the future, and maybe one day in the future I'll make the full switch to MacOS.
Just like any other company and also their mobile chips division they won’t be making massive leaps anymore after this generation just to keep the yearly upgrades consistent.
Great review (as always) although I think it’s wrong to dismiss the 3090 as purely being for gaming. There’s a reason Apple made that comparison and had people from Redshift, Maxon and Adobe at the event. The 3090 is the best GPU for motion design and 3D GPU rendering.
From what I could gather the 3090 is more of a card for people that want to do both heavy workloads and game on the same card, since it's the only 30 series card still supporting SLI, which no longer has any application in gaming. It's really just a rebrand of the older Titan cards, which were meant for the same thing.
Yes I had to disagree with that statement too, game developer and 3D artist would like a real gpu. And in camera VFX with Unreal engine (like you can see in the mandalorian) needs the gpu power too.
I’m not even gonna lie I’m not even interested in this Mac and have no intention of buying it but it’s just so satisfying to watch MKBHD videos, the production quality, the editing and the aesthetics make me stay
Mac Studio is really not bang for the buck. I have seen windows machines that are 2x the speed for the same price! I also doubt the cooling in this is any good...
Side note, the 3090 is not really meant for gaming, as it's a "replacement" of the Titan. It's a studio card, for 3D modeling or animation, that can also be used for gaming!
It's absolutely meant for gaming, it's for the people have the money to burn and want the very best gaming PC. It's officially not a Titan replacement because there are no Pro drivers available for it and there are software limitations on its compute features. It to some extent does dual duty as a workstation card for some workloads but there is a reason it's branded the same as the rest of the 3000 series, which can also be used in workstations.
@@TimSheehan The 3080 is a gaming card. The 3090, with its absolutely overkill 24GBs of VRAM and only marginally improved compute power, is definitely a content creation GPU.
@@Krannski no it's absolutely a gaming card in terms of how it's clocked and how it's marketed. The A series (formerly Quadro) are for professional work, they have a tonne of ram and are clocked within the efficiency window.
Peak MKBHD review. Great work! So clean, so smooth, so clear, utterly professional yet accessible. This is exactly what I subbed for years ago and the team still consistently overdelivers. ❤️
I was waiting on them to release news about a new IMAC PRO. But I’m not going to lie, I am really leaning towards getting the Mac Studio with M1 Ultra chip instead . Tired of waiting & this seems like everything I need for years to come.
We were never going to get a new iMac Pro. They announced at the time this was a one off as a stop gap between the old trash can Mac Pro and the new modular one. I'm kind of dissapointed we didn't get a straight up repalcement for the 27" iMac though. The guts of this in the new Studio Display body would be killer.
@@TalesOfWar They said at the end of the presentation that they have one more product to implement Apple silicon with, the Mac Pro. They said that clearly at the end of the event. The M1 series is completed however. So the next product will be specifically for the Mac Pro rendition.
To everyone that reads this there was a youtuber that tore down the Mac Studio and there are 2 SSD spots on there and all the ports are modulus that can be unscrewed and replaced. To open the Studio the screws are under the black ring that is at the bottom.
The efficiency thing is no joke. Switching to an M1 Mac Mini from my Windows desktop not only made a noticeable impact on my power bill alone, the heat generation is making another positive impact because I have to air condition my office less.
3090 build referred to in the presentation walks the M1 Ultra in many real world tests. I recommend those reviews for anyone more curious about the benchmarks Apple chose to market.
The way the M1 Ultra is setup is still insane to me. Even with high end GPUs in the past you couldn't really just use 2 to effectively make 1 super GPU (as far as I know). I wonder if this approach could be used in future processors by different companies. Although the M1 chips is much more integrated including the GPU, ram, etc. while Intel/AMD CPU's are more individual parts. Props to Apple on this achievement.
SLI and Crossfire technologies exist, although the efficiency of using two cards are usually in 70 to 80 percent. 2 cards equal to 140-160% performance of one. They also have to be identical.
@@Impoigness It is my understanding that SLI and Crossfire configurations have to be supported by the software/drivers, so most games don't benefit from it at all. I think NVIDIA dropped support too. The m1 ultra seems to be interpreted in software and OS as a huge processor with everything combined by default. I think Nvlink is a significant improvement but only supported on 3090 current gen, not sure about software support requirement though. I hope more processors and GPUS can be combined in this modular way in pc builds in the future.
@@gvibes69 This satire? April Fools was yesterday. Of course you can probably build more powerful PCs, your reply is irrelevant. I was commenting on the fact that Apple can simply connect two powerful SOCs to be seen on all software as one giant SOC, with all resources (GPU, CPU, ram, etc.) pooled together. There isn't anything like this on the PC building side.
I come from an era of computers where power users like me had an Athlon 1.1MHz with 256Mb RAM and a 20Gb 7200 HDD. The Graphic card of choice was a Savage 3D 32Mb or Riva TNT2. This entire generation of machines is impressive to me, even my humble 3.0Gz octacore Android with 6Gb RAM and 126Gb storage seems like 10 generations newer. I can only imagine the heat the X64 line up is feeling now, and when the apps become 'native' rather than Rosetta 2.0 based, its gonna be a headshot.
@@danielkenney7768 I remember the voodoo banshee, the first stand-alone video card made by 3DFX. I remember using it for quake 3 as well. :) Things have come a long way indeed.
Not to be a tech nerd here but the video cards you named are late 90's models and Athlons are early to mid 2000's CPUs. Those components weren't even produced concurrently, I'm pretty sure they were launched at least 5 years apart. And a 20gb HDD is a small consumer drive from 2000/2001. Interesting choice of components for a power user computer if you ask me. And your "humble" (actually pretty high end) Android phone is literally more than 10 generations of silicone newer than even the Athlon...
I was using Amiga’s and then moved up to Pentium 3dfx Voodoo PC’s. I reassembled a Pentium 2 Voodoo 3 PC a year or so ago, installing it into a new case with a new power supply. It runs like a dream on all those old classic 3dfx games.
Since this is so unique, I'd love to see someone go to each of these different 'studios' and test it out for their workflows. You don't see a lot of reviews talking about how well this works for architects/engineers using archicad/revit for example. That's probably too much to expect though! I'm looking at getting my own professional setup at home since more places are accepting wfh and I feel like this could be a new aspect to the 'consumer' market.
Not sure if Revit is compatible with M1 chip. There was a work around where you could make it work on intel based macs. Please do let me know too if you find a way to make Revit work with m1 macs...
My bad, I threw Revit in there without realising it's not on macs at all. I've been in archicad land for a long time on PC and Mac in different offices.
Honestly I underestimated the Mac Mini M1. It's been able to run all my programs and handle all my Adobe programs without a problem. I would recommend that instead, I think TH-camrs get tunnel vision in focusing on video editing and graphics design. Most of us students don't deal with that. Plus it's like $600. Why spend a couple of grand on a computer if you won't be able to take full advantage of its capacity?
I enjoyed where you discussed “TOOLS”. Yes these machines are tools and knowing how to use them well is where we’ll see our greatest gains. Thank you sir.
Regarding the sustainability element, what is the expected useful lifespan of one of these machines, and how does the fact that it cannot be upgraded or modified (in a way that other products can that extend their lifespan) impact the sustainability element?
I actually think it makes it better. As long as these parts are as reliable as they always have been from Apple (my mom has a 2007 and 2009 iMac she still uses every day without issues), then the reliability aspect of full integration trumps having less reliability and having to throw out and change parts over the course of that 14-16 years. Not to mention: I don’t think almost anyone uses an entire computer that’s 16 years old. For the vast majority of consumers, I’d say this level of reliability is way way way more positively impactful on sustainability than a PC that maybe gets used for 7-10 years max, upgrading and throwing out old parts along the way. (And using more energy) Just my thoughts, I don’t actually know anything :)
@@Deliveredmean42 I feel like that’s what a refurbished product is. It’s not like a damaged device is often thrown out, it’s usually still sold to someone who ends up refurbishing them. (Even these sorts of machines, apple refurbishes them and certified repair places will as well.) The “and it sure shows” comment was pretty rude
@@WarrenWeekly I mean, I don't know about reliability lol, they had laptop keyboard that would randomly break, and they kept the design for multiple years despite knowing the flaws,
The way Apple made their presentation made me wonder “are they claiming Mac studio outperforms every other computer?”. Thanks for painting the full picture Marques.
*EDIT* Anyone talking about the *Mac Pro* PLEASE work on your reading comprehension, this comment was to highlight the *MAC STUDIO* (As this is a video about a Mac Studio) *NOT* the Mac Pro which is blatantly poor value for the money... indicating Apple is moving towards actually providing value in its products which they historically haven’t been (pre-M1). I think the take away here is it nearly performs as good as an intel Mac Pro fully spec’d for tens of thousands of dollars less. A fully spec’d Mac Pro is exceeding $60kCAD, SIXTY THOUSAND… So people saying this is over priced haven’t seen that an equivalent Intel CPU *alone* costs as much if not more than the entire Mac Studio. Very cool to see what Arm can do when really optimized, X86 has some work cut out for it at this rate.
The fully spec'd out Mac Pro is the biggest ripoff on the computer market. I could build a balls to the wall Threadripper PC with dual 3090's and still have money left over while crushing the Mac Pro.
@@theunkownmr.562 not when it comes to Video production, 3090’s are GREAT for gaming but these machines are made for content creation. You’d want something along RTX Quadro cards to be on par and you’d be spending a lot more than the Mac Studio (fully spec’d) at that point still. I am not in any shape or form claiming the Mac Pro was “good value”, Apple never has been BUT in this case for what it is the Mac Studio stomps anything else dollar for dollar WITH CONTENT CREATION. Be it a windows machine or MacOS. On top of that you’re not comparing apples to apples bringing Threadripper into account which on its own smashes Xeon’s price to performance wise.
@@theunkownmr.562 ok? What’s your point, the *Mac Studio* provides insane value for the workload that cannot be replicated in a DIY HEDT platform on windows. If you want to argue about the Mac Pro being bad value I already agree… Seems like you’re arguing with yourself at this point 🙃
@@NicoIsntHere This isn't accurate. The 3090 outperforms an RTX Quadro 8000 in most scenarios, at less than half the price. For anyone doing effect-heavy edits or motion-graphics, a 3090 should be more than enough. You could build a PC for editing that outperforms the fully specced Mac Studio for less than $4k, if you can manage to find the parts in stock. A windows machine with a Ryzen Threadripper 5950X, RTX 3090, and 128g RAM would handily beat Mac Studio in Adobe programs, for somewhere around $3k without including storage costs. And I'm saying this as someone who intends to get a Mac Studio.
@NaN yeah right, because there are not independent studios, freelancer 3d artist, designers and engineers in the world that could benefit with powerful gpu cards. This is just for gamers.
@NaN a rtx3090 is a series 3xxx version of the Titan cards, not the quadros. You talk a lot for someone who's really off target here. Its a hybrid top of the range video editing, 3d softarware , AutoCAD , modeling, photoshop, unreal engine 5 , maya, 3d max, all purpose + gaming card. That's why it has 24g of ram, no game needs more than 10gb of GPU ram. But creators do use it.
Nice review! It should be a dream machine for all Final Cut users. I think that many people buy RTX 3090 NOT for gaming. 3080 Ti is the one to chose, 3090 with it's 24 GB of RAM is perfect for 3D rendering etc.
@@tonyburzio4107 just connect it as an external SSD from the out and then shove the sad connected to the port inside the Mac studio Provided you do some drilling and re routing Or if you don't care about aesthetics just freaking tape it to the Mac studio
@@bodiminds6571 but you’ll always have to eject it before you shut down your Mac. Also, the external SSDs do not have a very fast transfer speeds compared to the internal SSDs and with Internal, you don’t even have to eject it before you shut down and it also connects directly to the motherboard.
@@ZybYt You don't have to eject SSDs before you shutdown your Mac. Dismounting the external SSD happens part of the shutdown process. That said, they do tend to have faster speeds, that much is true, but Thunderbolt enclosures can get very close.
We need to stop praising Apple for being "eco friendly" when their omission of certain basic accessories requires a purchase that makes even more waste, the product when it finally dies and the packaging associated with buying it.
The whole global warming/climate change and needing to be eco friendly is one big scam to make you pay more. It’s disgraceful the excuses they make to charge more for less.
Would really kill for one TH-cam tech reviewer to talk to a developer or two (not just iOS/Mac app developers but regular world devs). It’s probably the most common workflow that would actually benefit from a machine like this (and I mean biiig benefits), but the entire community pretends people with powerful computers are just content creators or scientists.
cause youtubers do not know any better, they just dont have knowledge besides editing in their software basically, I feel like this review is just a hype train with no downsided explained at all
Well, they are TH-camrs. They can only really talk about how good FinalCut or some other editing software runs in a given device, and then talk about specs. They are really clueless about anything outside of that, or to even think that there could be other places were these machines could be used.
All of you should watch the videos on TH-cam that talk about the realities of tech release and reviews. It's not because they don't know any better outside of their sphere, it's not cause they work predominantly on Final Cut, it's not cause they don't know other careers exists. This early on in the product lifecycle, there is no time to go talk to a developer or two (who have used the products extensively enough to make an informed opinion), then talk to designers, then talk to hardware engineers (in the interest of fairness, right?), then make a video in time for them to grab viewer traffic and revenue. This is also not mentioning the fact that most large TH-camrs are catering to the everyday person. "If this Mac can export this type of video file so quickly, you're good to go for your daily Excel or Netflix usage". The average viewer does not care if there are caveats to performance, cause the average person does not code, they don't use Inventor, they don't use Blender, etc. And someone who is at the enterprise level of an experienced engineer, developer, designer, who is making a purchase decision, is not going to look at TH-cam videos like these for advice.
Exactly, computers are used in other fields besides editing, there is 3D modelling like Autocad, scientific simulations and number crunching as well as running mathematical models for economics etc. That’s why PC is particularly special is that is gives you better options on your workflow etc.
Most consumers should see this and think “Cool, whenever these video editors upgrade again in a few years, these are going to be so worth the buy on the used market” But for now, refurb’d m1 Mac mini is probably doing it for 95% of people.
This computer hard drive is focused towards people that do heavy video editing for a full time job, people that make movies, and people that make music. It’s overkill for the majority of people. Get iMac Air laptop or iMac 24 inch all in one desktop.
@@GiovanniCheng exactly. But some “average users” with heavy wallets get hypnotized by figures like “20-core GPU”. I am just hoping they see this and save themselves some cash
For Adobe software such as Premiere Pro, After Effects and to a lesser extent Photoshop, the new M1 and M2 Macs are just not worth it. Adobe hasn't optimized their software for the new Apple silicone yet and compared to their PC counterparts are buggy and sluggish. If you're a Final Cut or DaVinci user, that's another matter completely. They're both highly optimized for the new Apple hardware, and particularly if you're not a tech guy and just wants something which just works reliably the Studio is definitely the answer.
@@HotepOurobo PC Version is more stable and responsive from my experience, but that's not saying much. DaVinci and FinalCut offer a far superior editing experience, even on run of the mill macs, let alone workhorses like the Pro, Studio and Ultra. I've long given Adobe the boot.
still haven't switched to resolve completely from premiere, but the more i see, use and read, the more adobe stuff just suck against any other product, huh
@@cescimes yeah man but where im from its still the industry standard for small to médium projects and avid for big productions , I really like resolve but cant make the full switch because of this
Impressive as the videography was, I was disappointed with the video content: - RTX3090 exclusive for gaming? What??!! - "Intel could never" when talking about raw performance in Cinenebench R23, when Intel has already more powerful consumer CPUs in both single and multi-core for this (and many) benchmark and real world use (and lose in others) - Lack of comparison with PC in regards to performance and value - Talking about saving in electricity bills and environmentally friendlynes but you have to replace the whole computer if something, anything, goes wrong, while a PC you easily just upgrade certain components and even when you upgrade motherboard and CPU, you can still re use PSU, case, NMVEs, etc. Not to mention at equal performance you are saving a lot by building a pc in most cases (unless your specific workflows really take advantage of M1 silicon) - Almost lack of anything other than video editing was touched in the video in regards to performance. I get it, you have a video production studio like you mentioned, but was still disappointed, I guess there are other channels for that and I shouldn't be, but I somehow expected a little more/wide context It's an excellent and impressive machine, performs equally to the best consumer hardware (no prosumer) while being so much more efficient, all in a tiny box. But there is a huge price to pay for having that efficiency and that performance packed in a tiny box, like double the price, and in lack of modularity (repairability and upgradability). Worth it for sure for many people, but still necessary to point out.
Maybe "intel could never" deliver such performance using a TDP this low. But yeah, this thing is impressive, but only if you consider TDP. Other machines far outperform this one. Also, on the environmental aspect, it probably costs a shitload in electricity and machining to make a case out of a solid block of aluminum. Not so eco friendly.
@@possamei good point about the block of aluminium. Also, Apples mention 100% recycling in big size, but you read the small letters and it's just in small pieces of the inside.
The value part is gonna get worse comparatively once AMD and Intel release Zen 4 and 13th gen CPU respectively later this year, and new GPU architectures as well in a few months that will presumably bring huge performance improvements after almost 2 years that the current gens have been in the market.
Isn't upgrading technically less sustainable given you're throwing part of it away? These will last for years, and trust me, the places people will buy these in bulk NEVER upgrade their systems. Ever. They're specced for a task and replaced outright when they no longer manage to perform said task. Lots of them are also bought on lease and replaced on a 3-5 year cycle. Upgrading is also often considered a waste of IT's time given how long it can take to do it on multiple systems when they could be doing something else. I don't even upgrade my PC's other than maybe a new GPU (when they're not crazy expensive). Usually by the time I need to upgrade something the whole platform needs replacing anyway so it's essentially a replacement anyway. The PSU and case may get reused.
@@TalesOfWar When I upgrade my computer I sell the parts I don't need. Even if you do throw it away throwing away just your old gpu is more sustainable than throwing away your entire system.
@@TalesOfWar I'm sorry but what? Throwing away 1 part does not equal throwing away a whole pc. Electronics fail, simple as that. And when it does you either get it repaired or do the apple thing and sell the consumer a new one. It would be nice if the machine is reused but what if the failure was just a drive? How bout a gpu? If you replace that 1 single part thats 1 less cpu, ram, psu, case, mb etc in the landfill, or at the least increasing the longevity of that pc.
@@truthseeker6804 yeah, but apple computers are pretty awesome. But their iphones are so costly which makes it almost useless comparing it to the Samsung's phone
Imagine complimenting a company's product as environmentally friendly because they used recycled materials, but simultaneously made that same product irreparable by design in such a way that if any one of those components of the product fails, it's not a matter of simply replacing that specific component, but essentially the entire product and that, once that product is no longer being produced and all previously existing stock has been exhausted that those devices will inevitably fail and become waste; another item destined for a landfill somewhere.
I can't even imagine what a full fledged Mac pro tower with no power limitations will be like. Remember, these new mac products are mostly mobile and pulling less than 250 watts. The old mac pros fully specd used to pull 1000+ watts power. If performance per watts is to be taken into consideration as we've already seen from the MAX and the Ultras, then 500watts+ M1 mac pro would be insane😤😤
The Mac Pro won't have an M1, this is the last of the M1 chips. It'll be something new we've not seen yet, be it an M2 or something entirely different. But yeah, I'm super excited to see what they do with an Apple Silicon Mac Pro. I hope they keep the same form factor of the current Mac Pro but just turn it up to 11. I'm imagining something that has all the memory on package like the M1 chips but with easily expandable storage and a bunch of 16x PCI-E slots, as well as an expansion of the MPX bus they created. Imagine all the bandwidth they'd have with PCI-E gen 5 and the extra features like the higher power delivery of the MPX slots?
I don't think Apple will do a no power limit Mac going forward. They seem to be really about doing as much as possible within a given conservative power envelope.
I’ve been trying to decide which mac studio to purchase, the reviews have been all over the place, but mostly folks are saying the Ultra is not worth the additional $2,000. I get it, but everybody is wrong to believe this. Everybody is starting off with the wrong premise. If you look at the two base models side by side, there is a $2,000 difference. But the max is lacking two things that I think EVERYBODY should get. First is the hard drive. 512GB of storage is simply not enough for this machine. Upgrading to 1TB, to match the Ultra, will cost another $200. The second is ram, I really think 32 gigs is not enough for this machine - and I absolutely don’t believe many work flows require 128. Going from 32 to 64 on the Max adds another $400. Now instead of a $2,000 difference we have a $1,400 difference with the cost of the two machines. Additionally, I don’t think anybody should underestimate the benefits of the two additional TB ports on front. Upgrading the two front ports on the Max to TB is work at least $100 each. That chops an additional $200 of the difference. So, now our overall difference is only $1,200. So, the real question is, is $1,200 a good price to pay for an additional 10 cores of CPU, an additional 24 cores of GPU, and an additional 16 cores of Neural Engine. We are not debating a $2,000 question; we are debating a $1,200 question. Moreover, some folks may add more value to the two additional Thunderbolt ports, and some may add slight value to the much better heat sink of the Ultra. So, the debate could go down to as little as $1,000. For some folks the answer to the question of whether the additional $1,000 cost of the Ultra is worth it may change, for some it may not. I happen to think the additional $1,000 - $1,200 for the additional cpu, gpu, and NE cores is worth it, I probably wouldn’t say the same if the difference was $2,000. I really wish more reviews would look at this aspect of the debate.
I have the baseline 14" M1 MacBook Pro w/ no upgrades to anything except 2GB of storage. And the more I use it, the more I wonder who possibly would need more power... like I feel like it would be overkill for 98% of people lol. I'm editing 4K video regularly on Premiere and have had zero issues so far.
I have Mac mini and I am very happy with it. But as you mentioned it is 100% true that skills is what matters the most. Tools is tool at the end of the day.
Apple silicon is very interesting and obviously becoming serious competition for the x86 platform, but until the compatibility issues with many workflows are resolved, especially like some specific engineering workflows designed for x86 Linux or x86 Windows only, these impressive machines will not be compelling enough to make the transition. I hope Apple addresses these issues soon.
The sad part is that it’s out and it’s already slow. A 16 core amd 5950x beats the 20 core arm cpu in every benchmark. Nvidia’s h265 compression is 50-100% faster than the one in the m1 chips. The video card is a joke. The only good thing about them is the prores encoder for who ever needs it. It is true, it consumes way less, but you get similar power consumption if you downclock the 5950x as well.
It is kind of rare but it would be very interesting to see how this PC performs in scientific applications. Something like „Molecular Dynamics“. This would be very important for many young scientist.
I finally got my Mac Studio Ultra with 48 cores and I am in computer Nirvana! For C4D, AE PP, PS, Audition it is smooth as silk. It has yet to crash after 1 month. I love this thing. Today I set up a 4X2TB Samsung 980 PRO RAID level 0 array and get 2500 r/w in an OWC Express 4M2 enclosure. I also have 2 Acasis enclosures with a Samsung 980 PRO 2 TB and a Samsung 990 PRO and get 2800 r/w for each. I copied a 22 GB folder in 10 seconds! Redshift is a joy with this machine. I am SO glad I got it! And from what I read that the M2 is only around 10-15% faster, I am glad I could not wait till June 2023.
I like how when the mac pro came out all the reviewers were happy and like "finally a long lasting upgradable machine". Apple then proceeds to replace the mac pro after a single generation and no upgrades whatsoever. Must feel great for those who dropped 50k on those things hoping they would be supported for a long time
I don't know why they had to stick the power button on the back of the machine. If you set it up like you did in the video, it's pretty inconvenient to to reach around the back. Putting it on the front would be much more ergonomic in pretty much any use case, IMO.
Except you don't really turn macs off... Like the only times I reboot my macs are when they get updates. Suppose if you push them hard enough or use some software you can have crashes once in a while but that's about it. My office mac mini was configured to wake-on-lan also so that I could restart it in case of a power outage or something, it was a push of button on my iPhone and it would start.
Sleep mode doesn't suck on Macs. It actually... works. I've had my iMac on and only ever sleep it since I got it in 2013. It's only ever turned off for system updates but thats a restart and not a full on shut down.
When my iMac’s GPU died I was left with best screen I have ever owned, dead because of failed GPU. I thought I would never own a Mac desktop again, but this will definitely be on my list when the time to upgrade comes… in about 8 to 10 years
@@mrlt1151 Thanks man, the screen itself is actually fine, it's getting the glued screen off to get to the internals which is a challenge. It was just so much easier to remove the screen when it was magnetically fixed on that's all.
Technically, all computers are modular. You can connect a keyboard to a laptop if you don’t like the built in keyboard or connect a monitor to an iMac for some reason. I think they were deliberately saying that hoping some people will think that it opens up like a Mac Mini, only to be disappointed when they find out it is not upgradable.
Macs are less modular than a Playstation 5, because even a Playstation or XBOX you can open yourself to install a new NVME drive. I think Macs might actually be more ewaste tier than Dell or even HP. It's specifically designed to be unupgradeable, unrepairable, and easily breaking down and needing expensive repairs or to be replaced. Even something as simple as installing a new SSD you need to get an alleged "technician" aka Apple corp. associate, and it's all designed to just milk you of money and use MLM scheme/cult like things such as the "Genius Bar" which is one of the most laughably stupid things I've ever seen. Basically, you want to con your mark into thinking he's a genius for letting you pick his pockets.
LTT actually opened up the Mac Studio. You just have to remove the rubber ring at the bottom to get to the screws. There's not much you can do inside though, but they discovered a second, empty NVME slot for SSDs.
I like how you talked about the "tools" at the end. Apple is definitely gonna market this as a very desirable product for anyone but yeah, this is overkill for many people, and kinda heavily priced. It's not for light work people or people like you even. The video studios they displayed in the event are professional filmography stuff and not for a 10-minute youtube review at all. For a person who REALLY needs all this power and makes perfect real use of it, this thing is a bargain!!! I know you can't resist and use it anyways, but this is a dope machine for many people who wasted money on any low or mid-specced mac pro......
As a software developer my work consists mostly on typing, and I will never ever in my life need this power. An M1 or M1 Pro will do the work perfectly for me 👌🏻😊
I think the price is competitive as well, I wish you mentioned it in this video. For the price the m1 max and m1 ultra is offering it's a killer deal for video editors. Other machines of comparable performance go for significantly more
Never commented on your viddy's but I really love em. The Mac Studio is def overkill for me as a web developer. But your channel is the perfect way for me stay updated on tech tools. Thanks!
the thing that impresses me the most is how little power it uses to get similar performance. We've been told by Intel more power needs more watts. For 3 decades intel rolls out new chips with 5-15% improvement with big releases getting 30%, but the power requirements kept going up. My AMD ryzen workstation has a 850W PSU!
@@Deses I got 850W in case I decide to run dual RTX or upgrade to 5900x. Sadly both of those were basically unavailable or over-priced since Jan 2020. For 3700x and RTX 2060 I could have gone with just 600W and had plenty. I still remember Pentium pro days when all I needed was 400W PSU for a mid tower and 2-4 HD. I've killed enough Intel CPU, PSU, cpu fans, case fans and HD to have a grave yard of hardware.
@@woolfel You are delusional. The only RTXs that can be run in SLI are the 3090s, 2080s and 2070s and you are not going to run TWO of these with just 850W. It's ok if you got a big 850W PSU. I got one too! Nothing wrong with it! But don't make up excuses to save face lmao.
I had heard that a Blender render is very slow when compared to an RTX card, there is not magic in the chips and they still have limitations on many workflows that no Tech Review TH-camr will cover since everyone of them are into video editing and Apple is making sure to build computers that will always have favorable results for their workflows. At least here we are told that gaming is not what Apple devices are built for, that is much better than just about all the other reviewers.
Blender has always been optimized to run on NVIDIA cards because it uses CUDA. Even AMD cards can’t compete in Blender. This has nothing to do with the chips. Apple is actually now contributing to Blender the necessary optimizations to improve its performance in Apple Chips. So Blender can’t be used for comparisons.
@@nullpex What are you talking about? CUDA is part of the NVidia chipset and that is comparable to other products like Apple’s chip, they even compare to it with their very bad graphs. I am just pointing out that there are still many cases out there where an Apple devices are not better, I do agree that the M1 chips are great for the cases that they are designed for but not outside of those cases. Apple is making video/image processing computers, if that is not your workload, then maybe you shouldn’t buy one. For most people, computers hit the necessary level of performance (in CPU and GPU) 10 years ago in the mid-range laptops and everything since has been mostly unused by them. I have an M1 iPad Pro and find the software lacking for a “Pro” label, Android tablets are about to do much more with lower spec chips (but also much lower performance). But would I still want these crazy high spec chips, yes! Do I have a use for them, no!
Apple is great at restricting software on their systems by not fully supporting programming languages, libraries, and standards that they don’t develop. This makes the system easier to control and can help reduce bugs, but now the user doesn’t own what they pay for and are locked into an ecosystem that appears to be great for users until you get into the details and find that your privacy is still being sold. Linux is the last hope for users to have true privacy, ownership, and freedom to their devices. The problem is most people have given up on having these things to avoid complications with using devices. This is also seen in the automotive industry with cars being built with little regard to user serviceability. If Apple would just sell kits like when they started out, then I would buy another Apple chip. Having to buy whole new computers to upgrade is dumb and this trend has spread to most of the industry. Apple should make a car that you can’t service and we can live in a world where users/consumers have more freedoms taken.
One thing I would be very interested to see is how much better the Mac studio handles AI upscale via Topaz. Since they are adding to the neural engine I wonder how much of a performance boost that actually translates to.
Yea, this didn't seem as impressive to me either when the graph came up showing that it's a little better than a 5 year old, 16 core threadripper and gets handily beat by a 4 year old 32 core threadripper. Yes this CPU is efficient but it's far off from a mobile cpu which most people seem to liken it to just because it's arm based. And it gets beat out by most modern desktop chips from AMD and Intel which are still much cheaper.
Yh I thought this as well. Don't get me wrong it's impressive as hell how much power this thing can kick out with such low tdp wattage, but it seems disingenuous to compare a brand new processor to fairly obsolete processors in comparison charts (i7 7700k????)
This device will become e-waste after the soldered, non-replaceable SSD dies. Flash storage has a limited lifespan after all. Cardboard packaging or recycled aluminum does not make this kind of tightly integrated devices enviromentally-friendly. -- Edit: Apple states that the SSD is not user accessible. Even though it's not soldered, it will not be user-upgradeable.
But honestly, that's like 8 to 10 years out and realistically, as a 28-Core 4x W6800X GPU user who's system smokes not just this Mac Studio but also MKBHD's Mac Pro...this littler device is still awesome. I ordered one for the music studio upstairs. Of course I wouldn't even think twice about it replacing my Mac Pro...BUT...I'm very curious to see what happens with the Apple Silicon Mac Pro because quite frankly, there are a few scenarios I can think of that could tempt me to replace my Beast.
The ssd isn’t soldered it’s a Apple special replaceable affair and Mac systems tend to live for 10 years easy and by then it would be super outdated anyways and it’s fully recyclable
@@jeffp1370 for what SSDs? Apple have a history of making what should be easy, difficult. As an example, in previous laptops and iMacs with SSDs, while Apple may have used say a Samsung made NVMe SSD, they dont use the same pin outs on their motherboards so you couldn't just go buy a retail Samsung SSD to replace a faulty Apple one. That was resolved by third party options including adaptor sockets which were made feasible by relatively high volumes that will not exist for the Mac Studio. And there there is the hardware "security" Apple has been implementing. It will be very interesting to see what happens with these SSD slots and possible upgrades but I doubt Apple will make it easy.
The thing that I hate about apple is that they shamelessly lie in their marketing! They stated that it has the performance of 3090, which is just a blatant lie! In Geek bench 5 the m1 ultra scored 102K points while the 3090 scored 212k!
Technically not a lie cus, it’s like better performance for their mentioned level of power draw. Like it beats it by performing at x speeds by drawing significantly less power at that range. But 3090 draws more than double the power and keeps going up in power draw and performance
@@saberkyo you see... The pc will crash as soon as it starts doing something intensive if it has less than the minimum needed power, so they didn't test it like that, they just lied!
How I wish (and I know its a long shot), that apple would make a M2 Game chip. Double the graphics cores of the M1 max or even just on the Pro and they would appeal immensely to my type: Loving the mac environment, primarily using the macs for light workloads but also loves to game with the opportunity occationally presents itself
3090 can be OC'ed and can take >400W but will hit thermal limits. Likely Apple found 320W-ish to be the limit for sustained performance for stock, air cooled 3090s.
No, 3090 can hitich higher, a TH-camr proved they cropped it on purpose to make the m1 ultra look better, also in blender test , software not native to apple, a 3090 smokes the m1 ultra.
This really reminds me of SGI back in the day. Silicon Graphics were these beatiful computers, great design and powerful too - but most people had zero use for it. Back then, I was thinking about getting into 3D graphics - just to work with something this beatiful and outlandish. I'm too old to start a video editing studio now to have a nice aluminum box :-) My Ryzen with Nvidia GPU is pretty much equal to this and when I'm done with my workstation, kids gonna get it for games or it's gonna be my new home server.
Welcome to the new era of Apple, in two years pay us $500 to install the NVMe drive you just bought from us for $1000. Apple we work hard to set the industry decades back.
Mate, you're amazing. After years and years of product reviews, I keep watching your videos. They make me happy, informed and love all the narrative you build around them. Keep at it! Love from Scotland :)
And the Max is great for those of us who don't want to spend $5k. I've been very impressed by how fast Handbrake went (when you choose the proper video encoder (VideoToolbox)). It was great to do Blu Rays at around 200 fps, dvd at about 800 and 4k at anywhere from 30-75 fps. CaptureOne flies now. To save even a bit more money, I went with 500GB internal storage and keep lots of large objects on external SSDs. This leaves me space to keep what I am working on internal. And with all the ports, you can connect a lot of external devices before you have to invest in a NAS or something. So, there is plenty of machine to had here in the lower end of $2k. Oh, 64GB because as noted you can't upgrade the memory, and going to 64GB wasn't that expensive. The Ultra would have been fun to get, though. But, I can't really justify it as a home user and stills photographer. Nice review!
"But hey, atleast it's $5,000 of overkill vs $25,000 of overkill" OOOOOOOF. The tech companies (not just Apple) CASHED OUT on their consumers/prosumers the past few years... It's FASCINATING to see a "new product" cost a 1/5th of the price for similar performance. The tech boom happening right now is actually mind blowing. Was Apple really behind over the years or were they just holding out?
It's all about the chip really.. They used a intel xeon, now their own. So it's not about being behind or holding out or not, but rather the in-house chip that's causing a paradigm shift..
They weren't behind. Years ago they mentioned how one of their iPhones they were releasing had the same processing power as one of the MacBook Pro's they had out at the time. That should have been a huge hint though I'm sure many people just didn't pay attention or expect them to move that same processing power into personal computers.
@@andremessado7659 exactly, my point proven. They were holding on out on their costumers (as well as other tech companies), because it was a head of it's "time" and profited big on it.
@@justinburgan No Justin... Just no... smh 🤦♂️ Do you know how hard, difficult & time consuming it is to essentially create an entirely new processor that will go toe to toe with AMD and Intel, the two biggest CPU manufacturers in the world? It probably took many many man hours in order to build MacOS for ARM, repurpose rosetta to run X86 code on ARM CPU's without issues and to make it fast enough for people not to notice. Your thinking on this is too simple, I don't think you have a clue as to what you're talking about or understand how processors and code work. They weren't "holding out". It's very difficult to do what they did to bring their ARM chips to a desktop/laptop especially considering Microsoft tried this and failed miserably. Apple has been wanting to ditch intel for a long time and thats what they've been focusing on. Also the power has been available in the iPad/iPad Pro so they again we'rent "holding out" because the power was already there in iOS. It just took time (obviously) to move MacOS, an OS thats been running on X86 Intel chips for almost 15 years to go to a completely difference CPU architecture (ARM).
Looks like that robot arm came in handy for some INSANE shots! This looks incredible!
agreed. marques shots always are sick
Looks awesome but could’ve removed the fingerprints before filming.
Looks like a 3D Render, impressive
@@affectedrl5327 came here to say that!
Factz! That intro was hard!
Apple can give us some of the fastest computers with amazing designs but can’t let me use final cut on my iPad Pro.
Scam
i think it’s bc if they do, then there would be significantly less sales of the Macbooks. there’s only so much they can do for iPads which wouldn’t affect the “necessity” of having a Macbook.
Or it could be they need to get all the iPads running on the M series so they can use macOS.
What's a computer?
because apple wants your money
I love this line "Just ‘cause it’s fast, if you’re bad at editing and you get this machine, you’ll still be bad at editing but just faster" 😂😂
Lol I'm going to bet a lot of people had to hear that
Yeah I liked the video at this point
This was the video highlight for me😂😂😂😂😂😂😂
but the faster you can see your mistakes, the faster you can get better!
Where’s the time stamp?
The first thing I noticed was how seamlessly it fit underneath the desk but seeing you push it in there was so aesthetic and spot on man.
I wanna point out that although MKBHD says that the RTX3090 is meant for games, although it is marketed to be a gaming graphics card, it’s actual intended capability is for both 3D rendering and gaming. I know many people who deal with 3D art or 3D modeling use RTX3090s or 3080s to do their work and at the same time run the games that they’re working on.
Exactly this. RTX 3090 is more efficient and performs way faster than the "server grade" cards for GPU rendering. Games don't even utilize NVLink and the 24GB of VRAM the card offers.
One of my friends designs houses, he loves the 3090's power :) Autocad and the likes run smooooth.
Exactly, it's the best card that you can buy for GPU rendering that's not a Quadro. Octane LOVES my 3090.
@@magnotec. Rendering isn't always about VRAM. If you check the latest Octanebench, 3090 & 3080ti are topping the charts in terms of render speed. The closest thing is the A6000 which costs way more than the two (at MSRP).
@@magnotec. it’s still the best bang for buck card that has 24GB VRAM compared to the Quaddro cards.
The shots in this video + the thumbnail are insane 🔥
views
@@dvngnt likes
Are you half Chinese ?
The shots fired to people compensating their skill with hardware as well. Really thought this would be a Skillshare segue
Drake 6 🙏🏿 Views album thumbnail homage is legendary! Too Good 😂! 🔥🔥🔥
"If you're bad at editing and you get this machine, you'll still be bad at editing but just faster."
So should I buy it then?
Naw, love your editing😂. Good to see you here!
Yes, you can always improve your editing skills, and with a computer like this you'll have one less thing to complain about.
Im DEAD ☠️☠️😂🤣🤣🤣
Lmao hahaha
The more iterations, the better you will get at editing.... YES.
I’ll give apple props for sustainability when they let us upgrade the damn machines. Still an incredible tool, the energy specs are also great but come on… without the ability to upgrade the device over time there’s a ton of eventual e-waste there.
If something goes wrong... Everything is glued together!
@@edgarwalk5637 you're right, it's absolutely nuts. I get they need to integrate everything to keep it small but completely locking everyone out and even going so far as using glue so it's obvious if you try to fix it yourself is just wrong. While I love the size I'd take a bit bigger footprint with the ability to upgrade any day and I'm betting all but the hardest core, "apple can do no wrong" fans would agree.
Question might be, is upgrading a real factor, like how many upgradeable pcs actually do get upgrades? I would bet its less than 1%
@@morfelod it is definitely more than 1%. The second hand market thrives on PCs being upgradable. For example, an old Dell Optiplex that was only equipped with 8 GB of RAM could be upgraded with an additional stick of RAM and a GPU to make a decent gaming PC. Apple also used to let you upgrade by having normal Philips screws and standard parts, meaning you could upgrade down the line if you needed more RAM or need to replace the battery. That would potentially extend the life of the laptop.
With how frequent some companies upgrade their systems, I’d rather have someone repurpose the machine rather than just throwing it in a landfill.
@@morfelod my entire pc is built off of Facebook market second hand parts. Running a 7 year old graphics card and I’m still able to play modern titles.
The fact it fits under that shelf so perfectly 🥵
Ikr so aesthetically pleasing 😩
love it haha
R/perfectfit
Any link to this shelf? Would love to get one
Spawnpoiint in the building. Love your vids
The fact that this $4000 computer is even being COMPARED AT ALL or MATCHING the $25000 Mac Pro of a few years ago is just insanity I don't think people are really taking that in enough
Mostly because that Mac Pro was overpriced as you can get. Linux was building a Hackintosh faster for far less money.
@@johnbuscher Didnt this hackintosh build took almost a year or over a year to get done because there were a lot of problems and it still only worked with specific hardware?
Agree. Apple should be kicking themselves for not bringing chip making in-house yrs ago - they’ve had the cash.
Well the Xeons Apple used were kind of obsolete back then already and easily beaten by their AMD counterparts.
@@WestCoastAce27 their chips were behind intel & amd at that time. Only recently they could catch up because of intel’s stagnation at the top of the market
3090 is not exclusively for gaming, it is a quadro replacement. That means that it is for Rendering, by being the best for gaming at the same time. That's the reason that it has 24 GB of VRAM
It's basically where the Titan cards used to be. Nvidias highest end GPU, for Enthusiasts/ professionals, who don't need special Quadro features.
Really not for gaming, especially with it's 24gbs of VRAM.
Yeah, exactly my thought too. How short sighted of Marcus. It's like he's defending Apple 🍎
It really isn't a replacement for the Quadro. It's a low tier budget workstation card at best like the Titan was. A Quadro will still be better if for no other reason than the ECC memory and massively optimised studio drivers.
@@theunkownmr.562 I wouldn't say he's "defending" Apple, he just really doesn't know much about gaming. I tend to dismiss anything he says related to gaming, I'm pretty sure the only game he plays is FIFA.
Of course, but think about the price. It‘s 2000$ only for the 3090. And now look what you get with the Studio machine.
It's possible the Studio gets hot under load and throttles down on heavy computing tasks like exporting the video due to the small inclosure compared to the Mac Pro. And Yeah, Apples packaging is insane. Just go the new iMac 24 and still blown away by the box of all things. Excellent review. I always enjoy watching your channel.
MaxTech did a teardown, and the screws to access the internals are underneath the rubber ring on the bottom. There are of course a lot more screws and connectors to really get into it. But the SSD isn't soldered, and there are actually two SSD slots. I assume that the larger storage options (4 TB and 8 TB) use both, but the 1 TB machine he disassembled only uses one of the SSD slots. They're not standard NVMe slots, but OWC will probably offer upgrades at some point.
Link ?
Thanks for the hint! I watched that video and I got to say that it looks like Hell to do a simple thermal compound replacement or to clean those fans and fins.
Link?
Even if this is true there's literally no way Apple would allow their users to take apart their products without voiding the warranty
@@josepedrocarmo5885 and do you need to? My iMac is 13 years and does fine without
the amount of leaps Apple has been able to make within a single "generation" of chip is REALLY impressive. I'm very excited to see how their chips improve in the future, and maybe one day in the future I'll make the full switch to MacOS.
@@tatsukialmologist srsly? capitalisation?
@@cthulhuiabs woooosh
If macOS still had Boot Camp support on Apple Silicon, I’d switch. Sadly, I don’t like macOS.
Just like any other company and also their mobile chips division they won’t be making massive leaps anymore after this generation just to keep the yearly upgrades consistent.
Hmmm.. Is it a new generation though? It is quite literally 2 of the other M1 chips just stuck together.
Great review (as always) although I think it’s wrong to dismiss the 3090 as purely being for gaming. There’s a reason Apple made that comparison and had people from Redshift, Maxon and Adobe at the event. The 3090 is the best GPU for motion design and 3D GPU rendering.
And quadro cards are meant for video editing, as well as extra heavy//power limited rendering.
On top of that I seriously doubt the M1 Max stands a chance when it comes to heavy AI driven work such as protein modeling etc.
From what I could gather the 3090 is more of a card for people that want to do both heavy workloads and game on the same card, since it's the only 30 series card still supporting SLI, which no longer has any application in gaming. It's really just a rebrand of the older Titan cards, which were meant for the same thing.
@@I_THE_ME That's because the software isn't optimised for the Mac. Pointless to even worry about hardware at that point.
Yes I had to disagree with that statement too, game developer and 3D artist would like a real gpu. And in camera VFX with Unreal engine (like you can see in the mandalorian) needs the gpu power too.
I love the choise of music for the intros. Love your intros!
The editing skill is increasing a bit by bit every single video. MKHBD is seriously killing it with the video quality.
He probably hasn’t edited his videos for years.
He has an editing team, he does not edit videos anymore
@@Trippyisop No. Just team. He explicitly says he edits the videos.
@@antonemceish1006 sure he does lol
@@antonemceish1006 Dude, he has a bunch of video editors in his team. Soooo, are you saying that he pays their wages just so he edit his own videos?
I’m not even gonna lie I’m not even interested in this Mac and have no intention of buying it but it’s just so satisfying to watch MKBHD videos, the production quality, the editing and the aesthetics make me stay
That’s most of us here. Welcome to the club.
I’m a big apple nerd it’s been working out great lately apple is on top of their game
Mac Studio is really not bang for the buck. I have seen windows machines that are 2x the speed for the same price! I also doubt the cooling in this is any good...
@@aaryamangupta what u talking about windows is not even a pitch of salt in performance at this price point
Yeah I just subbed for that reason
I saw a picture on twitter where someone opened the studio. It was really insane.
The specs for this machine is really dope
Side note, the 3090 is not really meant for gaming, as it's a "replacement" of the Titan. It's a studio card, for 3D modeling or animation, that can also be used for gaming!
It's absolutely meant for gaming, it's for the people have the money to burn and want the very best gaming PC. It's officially not a Titan replacement because there are no Pro drivers available for it and there are software limitations on its compute features. It to some extent does dual duty as a workstation card for some workloads but there is a reason it's branded the same as the rest of the 3000 series, which can also be used in workstations.
@@TimSheehan The 3080 is a gaming card. The 3090, with its absolutely overkill 24GBs of VRAM and only marginally improved compute power, is definitely a content creation GPU.
@@Krannski no it's absolutely a gaming card in terms of how it's clocked and how it's marketed. The A series (formerly Quadro) are for professional work, they have a tonne of ram and are clocked within the efficiency window.
Peak MKBHD review. Great work! So clean, so smooth, so clear, utterly professional yet accessible. This is exactly what I subbed for years ago and the team still consistently overdelivers. ❤️
at the End i was like : wohoho damn this was such a good video. Like Peak. I was honestly really impressed. Jeez.
Your lectures are never boring
preach!
Definitely seems impressive! But not having HDMI 2.1 seems strange.
That seems to be the case with all the M1 devices
@@djstuc that is not what Consoles and most TV manufacturers
I think it's because thats where the bandwith maxes out. if they would have added 2.1 it would have been too much.
I suppose Apple used all lanes for thunderbolt instead of losing a thunderbolt for better HDMI. You can use a usb-c to HDMI cable, you know.
Unless you're going to hook this computer up to an Oled TV, there's zero point to HDMI.
I was waiting on them to release news about a new IMAC PRO. But I’m not going to lie, I am really leaning towards getting the Mac Studio with M1 Ultra chip instead . Tired of waiting & this seems like everything I need for years to come.
Hi! verified person
Think about portability.
For most work flows this will be more than enough, and for either half or literally 1/5th (20%) of the price.
We were never going to get a new iMac Pro. They announced at the time this was a one off as a stop gap between the old trash can Mac Pro and the new modular one. I'm kind of dissapointed we didn't get a straight up repalcement for the 27" iMac though. The guts of this in the new Studio Display body would be killer.
@@TalesOfWar They said at the end of the presentation that they have one more product to implement Apple silicon with, the Mac Pro. They said that clearly at the end of the event. The M1 series is completed however. So the next product will be specifically for the Mac Pro rendition.
To everyone that reads this there was a youtuber that tore down the Mac Studio and there are 2 SSD spots on there and all the ports are modulus that can be unscrewed and replaced. To open the Studio the screws are under the black ring that is at the bottom.
The efficiency thing is no joke. Switching to an M1 Mac Mini from my Windows desktop not only made a noticeable impact on my power bill alone, the heat generation is making another positive impact because I have to air condition my office less.
yeah but you have to use macOS with is garbage
@@ddha0000 MacOS is way better than Windows.
@@ddha0000 especially Windows 11
@@BrianJacobson lol
@@nevergonnagiveyouup4189 I wish linux had better gaming support for anticheat software, cause I'd be using it currently.
I’m always so hyped for your intros 👏🏼👏🏼👏🏼 and double M1 Max chips is WILD 🤯🤯
3090 build referred to in the presentation walks the M1 Ultra in many real world tests. I recommend those reviews for anyone more curious about the benchmarks Apple chose to market.
Ig what they were saying is that at equal performance levels, the M1U consumes way less power compared to a 3090
The way the M1 Ultra is setup is still insane to me. Even with high end GPUs in the past you couldn't really just use 2 to effectively make 1 super GPU (as far as I know). I wonder if this approach could be used in future processors by different companies. Although the M1 chips is much more integrated including the GPU, ram, etc. while Intel/AMD CPU's are more individual parts. Props to Apple on this achievement.
SLI and Crossfire technologies exist, although the efficiency of using two cards are usually in 70 to 80 percent. 2 cards equal to 140-160% performance of one. They also have to be identical.
@@Impoigness It is my understanding that SLI and Crossfire configurations have to be supported by the software/drivers, so most games don't benefit from it at all. I think NVIDIA dropped support too. The m1 ultra seems to be interpreted in software and OS as a huge processor with everything combined by default. I think Nvlink is a significant improvement but only supported on 3090 current gen, not sure about software support requirement though. I hope more processors and GPUS can be combined in this modular way in pc builds in the future.
Another apple shill!!! you know that for 5000-8000 dollars that this are sold, you can make a much more powerful pc !!!
@@gvibes69 This satire? April Fools was yesterday. Of course you can probably build more powerful PCs, your reply is irrelevant. I was commenting on the fact that Apple can simply connect two powerful SOCs to be seen on all software as one giant SOC, with all resources (GPU, CPU, ram, etc.) pooled together. There isn't anything like this on the PC building side.
@@gvibes69 You are the one behaving like a shill here.
I come from an era of computers where power users like me had an Athlon 1.1MHz with 256Mb RAM and a 20Gb 7200 HDD. The Graphic card of choice was a Savage 3D 32Mb or Riva TNT2. This entire generation of machines is impressive to me, even my humble 3.0Gz octacore Android with 6Gb RAM and 126Gb storage seems like 10 generations newer. I can only imagine the heat the X64 line up is feeling now, and when the apps become 'native' rather than Rosetta 2.0 based, its gonna be a headshot.
I remember getting a riva tnt2 for quake 3 arena
@@danielkenney7768 I remember the voodoo banshee, the first stand-alone video card made by 3DFX. I remember using it for quake 3 as well. :) Things have come a long way indeed.
@@infjcafe haha me too, also with Quake III. The downer was I still had a really slow modem so online gaming wasn’t on the table
Not to be a tech nerd here but the video cards you named are late 90's models and Athlons are early to mid 2000's CPUs. Those components weren't even produced concurrently, I'm pretty sure they were launched at least 5 years apart. And a 20gb HDD is a small consumer drive from 2000/2001.
Interesting choice of components for a power user computer if you ask me.
And your "humble" (actually pretty high end) Android phone is literally more than 10 generations of silicone newer than even the Athlon...
I was using Amiga’s and then moved up to Pentium 3dfx Voodoo PC’s. I reassembled a Pentium 2 Voodoo 3 PC a year or so ago, installing it into a new case with a new power supply. It runs like a dream on all those old classic 3dfx games.
Since this is so unique, I'd love to see someone go to each of these different 'studios' and test it out for their workflows. You don't see a lot of reviews talking about how well this works for architects/engineers using archicad/revit for example. That's probably too much to expect though! I'm looking at getting my own professional setup at home since more places are accepting wfh and I feel like this could be a new aspect to the 'consumer' market.
Not sure if Revit is compatible with M1 chip. There was a work around where you could make it work on intel based macs. Please do let me know too if you find a way to make Revit work with m1 macs...
REVIT doesn't work on mac?
@@nascentnaga no, only “light” LT version and only works under bootcamp, shame
Without Revit it’s not really suited for architects, otherwise very nice machine
My bad, I threw Revit in there without realising it's not on macs at all. I've been in archicad land for a long time on PC and Mac in different offices.
Dude, you're such a great storyteller, your reviews can't stop getting better. Nice work!
One of his influencers he interviewed said “be a great storyteller”. #24
I don’t know. Still i feel dave2d does a better job. His reviews have more points to take away from a shorter duration
Honestly I underestimated the Mac Mini M1. It's been able to run all my programs and handle all my Adobe programs without a problem. I would recommend that instead, I think TH-camrs get tunnel vision in focusing on video editing and graphics design. Most of us students don't deal with that. Plus it's like $600. Why spend a couple of grand on a computer if you won't be able to take full advantage of its capacity?
I enjoyed where you discussed “TOOLS”. Yes these machines are tools and knowing how to use them well is where we’ll see our greatest gains.
Thank you sir.
Regarding the sustainability element, what is the expected useful lifespan of one of these machines, and how does the fact that it cannot be upgraded or modified (in a way that other products can that extend their lifespan) impact the sustainability element?
I actually think it makes it better.
As long as these parts are as reliable as they always have been from Apple (my mom has a 2007 and 2009 iMac she still uses every day without issues), then the reliability aspect of full integration trumps having less reliability and having to throw out and change parts over the course of that 14-16 years. Not to mention: I don’t think almost anyone uses an entire computer that’s 16 years old. For the vast majority of consumers, I’d say this level of reliability is way way way more positively impactful on sustainability than a PC that maybe gets used for 7-10 years max, upgrading and throwing out old parts along the way. (And using more energy)
Just my thoughts, I don’t actually know anything :)
@@WarrenWeekly And it sure shows. It still going to be E-waste if you get a defective device or the "accidents" you may encounter with it.
@@Deliveredmean42 I feel like that’s what a refurbished product is. It’s not like a damaged device is often thrown out, it’s usually still sold to someone who ends up refurbishing them. (Even these sorts of machines, apple refurbishes them and certified repair places will as well.)
The “and it sure shows” comment was pretty rude
@@WarrenWeekly thanks for information
@@WarrenWeekly I mean, I don't know about reliability lol, they had laptop keyboard that would randomly break, and they kept the design for multiple years despite knowing the flaws,
The way Apple made their presentation made me wonder “are they claiming Mac studio outperforms every other computer?”. Thanks for painting the full picture Marques.
The way it slid perfectly under the monitor desk stand thing is soooo satisfying
0:38 marques editing a clip showing the Malaysian skyline because the Mac Studio is made in Malaysia
nice
Woah nice
*EDIT* Anyone talking about the *Mac Pro* PLEASE work on your reading comprehension, this comment was to highlight the *MAC STUDIO* (As this is a video about a Mac Studio) *NOT* the Mac Pro which is blatantly poor value for the money... indicating Apple is moving towards actually providing value in its products which they historically haven’t been (pre-M1).
I think the take away here is it nearly performs as good as an intel Mac Pro fully spec’d for tens of thousands of dollars less. A fully spec’d Mac Pro is exceeding $60kCAD, SIXTY THOUSAND… So people saying this is over priced haven’t seen that an equivalent Intel CPU *alone* costs as much if not more than the entire Mac Studio. Very cool to see what Arm can do when really optimized, X86 has some work cut out for it at this rate.
The fully spec'd out Mac Pro is the biggest ripoff on the computer market. I could build a balls to the wall Threadripper PC with dual 3090's and still have money left over while crushing the Mac Pro.
@@theunkownmr.562 not when it comes to
Video production, 3090’s are GREAT for gaming but these machines are made for content creation. You’d want something along RTX Quadro cards to be on par and you’d be spending a lot more than the Mac Studio (fully spec’d) at that point still. I am not in any shape or form claiming the Mac Pro was “good value”, Apple never has been BUT in this case for what it is the Mac Studio stomps anything else dollar for dollar WITH CONTENT CREATION. Be it a windows machine or MacOS.
On top of that you’re not comparing apples to apples bringing Threadripper into account which on its own smashes Xeon’s price to performance wise.
@@NicoIsntHere a quadro 8000 would wipe the floor with the AMD GPU that's in the Mac Pro.
@@theunkownmr.562 ok? What’s your point, the *Mac Studio* provides insane value for the workload that cannot be replicated in a DIY HEDT platform on windows. If you want to argue about the Mac Pro being bad value I already agree… Seems like you’re arguing with yourself at this point 🙃
@@NicoIsntHere This isn't accurate. The 3090 outperforms an RTX Quadro 8000 in most scenarios, at less than half the price. For anyone doing effect-heavy edits or motion-graphics, a 3090 should be more than enough.
You could build a PC for editing that outperforms the fully specced Mac Studio for less than $4k, if you can manage to find the parts in stock. A windows machine with a Ryzen Threadripper 5950X, RTX 3090, and 128g RAM would handily beat Mac Studio in Adobe programs, for somewhere around $3k without including storage costs.
And I'm saying this as someone who intends to get a Mac Studio.
“You buy a 3090 for gaming” every tech reviewer forgets that 3D modelers and engineers exist.
Not disagreeing with you, but gamers massively outnumber creators
@NaN yeah right, because there are not independent studios, freelancer 3d artist, designers and engineers in the world that could benefit with powerful gpu cards. This is just for gamers.
@@gaeborg but the games… they need to get… created.
@@mrtwills so?
@NaN a rtx3090 is a series 3xxx version of the Titan cards, not the quadros. You talk a lot for someone who's really off target here. Its a hybrid top of the range video editing, 3d softarware , AutoCAD , modeling, photoshop, unreal engine 5 , maya, 3d max, all purpose + gaming card. That's why it has 24g of ram, no game needs more than 10gb of GPU ram. But creators do use it.
Nice review! It should be a dream machine for all Final Cut users.
I think that many people buy RTX 3090 NOT for gaming. 3080 Ti is the one to chose, 3090 with it's 24 GB of RAM is perfect for 3D rendering etc.
its, not it's
@12:54 "If you're bad at editing and you get this machine, you'll still be bad at editing, but just faster" aww shucks,there's always a catch
Pump for the full review. Beautiful intro as always.
@Don't read my profile photo Why not ????
Max jus5 did a tear down. Looks like you can upgrade or add extra capacity in the SSD department
Unfortunately, no, the security chip stops you from doing an upgrade. It is likely we can replace an SSD at Apple, which is better than nothing.
@@tonyburzio4107 just connect it as an external SSD from the out and then shove the sad connected to the port inside the Mac studio
Provided you do some drilling and re routing
Or if you don't care about aesthetics just freaking tape it to the Mac studio
@@bodiminds6571 but you’ll always have to eject it before you shut down your Mac. Also, the external SSDs do not have a very fast transfer speeds compared to the internal SSDs and with Internal, you don’t even have to eject it before you shut down and it also connects directly to the motherboard.
@@ZybYt agreed
@@ZybYt You don't have to eject SSDs before you shutdown your Mac. Dismounting the external SSD happens part of the shutdown process. That said, they do tend to have faster speeds, that much is true, but Thunderbolt enclosures can get very close.
We need to stop praising Apple for being "eco friendly" when their omission of certain basic accessories requires a purchase that makes even more waste, the product when it finally dies and the packaging associated with buying it.
You can't remove the POWER cable from their new display lmao! And we're supposed to praise these corporations for some reason.
The whole global warming/climate change and needing to be eco friendly is one big scam to make you pay more. It’s disgraceful the excuses they make to charge more for less.
Would really kill for one TH-cam tech reviewer to talk to a developer or two (not just iOS/Mac app developers but regular world devs). It’s probably the most common workflow that would actually benefit from a machine like this (and I mean biiig benefits), but the entire community pretends people with powerful computers are just content creators or scientists.
cause youtubers do not know any better, they just dont have knowledge besides editing in their software basically, I feel like this review is just a hype train with no downsided explained at all
Well, they are TH-camrs. They can only really talk about how good FinalCut or some other editing software runs in a given device, and then talk about specs.
They are really clueless about anything outside of that, or to even think that there could be other places were these machines could be used.
Alex Ziskind is the TH-camr who does exactly this
All of you should watch the videos on TH-cam that talk about the realities of tech release and reviews. It's not because they don't know any better outside of their sphere, it's not cause they work predominantly on Final Cut, it's not cause they don't know other careers exists. This early on in the product lifecycle, there is no time to go talk to a developer or two (who have used the products extensively enough to make an informed opinion), then talk to designers, then talk to hardware engineers (in the interest of fairness, right?), then make a video in time for them to grab viewer traffic and revenue.
This is also not mentioning the fact that most large TH-camrs are catering to the everyday person. "If this Mac can export this type of video file so quickly, you're good to go for your daily Excel or Netflix usage". The average viewer does not care if there are caveats to performance, cause the average person does not code, they don't use Inventor, they don't use Blender, etc. And someone who is at the enterprise level of an experienced engineer, developer, designer, who is making a purchase decision, is not going to look at TH-cam videos like these for advice.
Exactly, computers are used in other fields besides editing, there is 3D modelling like Autocad, scientific simulations and number crunching as well as running mathematical models for economics etc.
That’s why PC is particularly special is that is gives you better options on your workflow etc.
Oh my gosh, this is probably the best MKBHD video I've ever seen. So well edited, such great shots, and a great review overall. Great job, keep it up!
Most consumers should see this and think “Cool, whenever these video editors upgrade again in a few years, these are going to be so worth the buy on the used market”
But for now, refurb’d m1 Mac mini is probably doing it for 95% of people.
Well to be fair, this Mac is targeted towards professionals. The average user would be more than happy with an M1 MacBook Air or Mac Mini
This computer hard drive is focused towards people that do heavy video editing for a full time job, people that make movies, and people that make music. It’s overkill for the majority of people. Get iMac Air laptop or iMac 24 inch all in one desktop.
@@GiovanniCheng exactly. But some “average users” with heavy wallets get hypnotized by figures like “20-core GPU”. I am just hoping they see this and save themselves some cash
For Adobe software such as Premiere Pro, After Effects and to a lesser extent Photoshop, the new M1 and M2 Macs are just not worth it. Adobe hasn't optimized their software for the new Apple silicone yet and compared to their PC counterparts are buggy and sluggish. If you're a Final Cut or DaVinci user, that's another matter completely. They're both highly optimized for the new Apple hardware, and particularly if you're not a tech guy and just wants something which just works reliably the Studio is definitely the answer.
It seems like they are optimized now and not running on Rosetta. Adobe still have problems on their own.
@@HotepOurobo PC Version is more stable and responsive from my experience, but that's not saying much. DaVinci and FinalCut offer a far superior editing experience, even on run of the mill macs, let alone workhorses like the Pro, Studio and Ultra. I've long given Adobe the boot.
@@MgarrKidhow? I want to change to Mac , but most Tv and TH-cam work in my country is Avid or premiere fucking stuck to Adobe …
still haven't switched to resolve completely from premiere, but the more i see, use and read, the more adobe stuff just suck against any other product, huh
@@cescimes yeah man but where im from its still the industry standard for small to médium projects and avid for big productions , I really like resolve but cant make the full switch because of this
Impressive as the videography was, I was disappointed with the video content:
- RTX3090 exclusive for gaming? What??!!
- "Intel could never" when talking about raw performance in Cinenebench R23, when Intel has already more powerful consumer CPUs in both single and multi-core for this (and many) benchmark and real world use (and lose in others)
- Lack of comparison with PC in regards to performance and value
- Talking about saving in electricity bills and environmentally friendlynes but you have to replace the whole computer if something, anything, goes wrong, while a PC you easily just upgrade certain components and even when you upgrade motherboard and CPU, you can still re use PSU, case, NMVEs, etc. Not to mention at equal performance you are saving a lot by building a pc in most cases (unless your specific workflows really take advantage of M1 silicon)
- Almost lack of anything other than video editing was touched in the video in regards to performance. I get it, you have a video production studio like you mentioned, but was still disappointed, I guess there are other channels for that and I shouldn't be, but I somehow expected a little more/wide context
It's an excellent and impressive machine, performs equally to the best consumer hardware (no prosumer) while being so much more efficient, all in a tiny box. But there is a huge price to pay for having that efficiency and that performance packed in a tiny box, like double the price, and in lack of modularity (repairability and upgradability). Worth it for sure for many people, but still necessary to point out.
Maybe "intel could never" deliver such performance using a TDP this low. But yeah, this thing is impressive, but only if you consider TDP. Other machines far outperform this one.
Also, on the environmental aspect, it probably costs a shitload in electricity and machining to make a case out of a solid block of aluminum. Not so eco friendly.
Bruhh its not that deep...
@@possamei good point about the block of aluminium. Also, Apples mention 100% recycling in big size, but you read the small letters and it's just in small pieces of the inside.
Were you expecting something else from a video studio that primarily uses Mac and Final Cut??
The value part is gonna get worse comparatively once AMD and Intel release Zen 4 and 13th gen CPU respectively later this year, and new GPU architectures as well in a few months that will presumably bring huge performance improvements after almost 2 years that the current gens have been in the market.
I wouldn't call a computer you can't upgrade at all sustainable, recycled once means nothing if it won't get reused or recycled again.
Isn't upgrading technically less sustainable given you're throwing part of it away? These will last for years, and trust me, the places people will buy these in bulk NEVER upgrade their systems. Ever. They're specced for a task and replaced outright when they no longer manage to perform said task. Lots of them are also bought on lease and replaced on a 3-5 year cycle. Upgrading is also often considered a waste of IT's time given how long it can take to do it on multiple systems when they could be doing something else. I don't even upgrade my PC's other than maybe a new GPU (when they're not crazy expensive). Usually by the time I need to upgrade something the whole platform needs replacing anyway so it's essentially a replacement anyway. The PSU and case may get reused.
@@TalesOfWar When I upgrade my computer I sell the parts I don't need. Even if you do throw it away throwing away just your old gpu is more sustainable than throwing away your entire system.
@@TalesOfWar no
@@TalesOfWar I'm sorry but what? Throwing away 1 part does not equal throwing away a whole pc. Electronics fail, simple as that. And when it does you either get it repaired or do the apple thing and sell the consumer a new one. It would be nice if the machine is reused but what if the failure was just a drive? How bout a gpu? If you replace that 1 single part thats 1 less cpu, ram, psu, case, mb etc in the landfill, or at the least increasing the longevity of that pc.
Question: What were you doing to your hoodie with that mask @ 9:36 ? Moire reduction?
Killer review by the way. I am definitely getting that M1 Ultra.
Yes (with unsharpening) and tuning down the color a little bit
@@mkbhd cool response, btw i heard the m1 ultra suffers when you apply video noise reduction as compared to other systems.
@@truthseeker6804 Really?
@@Itz_a_me_mario yes i saw that on reddit, not sure how true it is. but id like to see some tests by youtubers.
@@truthseeker6804 yeah, but apple computers are pretty awesome.
But their iphones are so costly which makes it almost useless comparing it to the Samsung's phone
Imagine complimenting a company's product as environmentally friendly because they used recycled materials, but simultaneously made that same product irreparable by design in such a way that if any one of those components of the product fails, it's not a matter of simply replacing that specific component, but essentially the entire product and that, once that product is no longer being produced and all previously existing stock has been exhausted that those devices will inevitably fail and become waste; another item destined for a landfill somewhere.
12:54 „If you’re bad at editing and you get this machine, you‘ll still be bad at editing but just faster!“
My new favorite Quote of all time!
I can't even imagine what a full fledged Mac pro tower with no power limitations will be like. Remember, these new mac products are mostly mobile and pulling less than 250 watts. The old mac pros fully specd used to pull 1000+ watts power.
If performance per watts is to be taken into consideration as we've already seen from the MAX and the Ultras, then 500watts+ M1 mac pro would be insane😤😤
The Mac Pro won't have an M1, this is the last of the M1 chips. It'll be something new we've not seen yet, be it an M2 or something entirely different. But yeah, I'm super excited to see what they do with an Apple Silicon Mac Pro. I hope they keep the same form factor of the current Mac Pro but just turn it up to 11. I'm imagining something that has all the memory on package like the M1 chips but with easily expandable storage and a bunch of 16x PCI-E slots, as well as an expansion of the MPX bus they created. Imagine all the bandwidth they'd have with PCI-E gen 5 and the extra features like the higher power delivery of the MPX slots?
I don't think Apple will do a no power limit Mac going forward. They seem to be really about doing as much as possible within a given conservative power envelope.
Man these Macs have been just evolving everytime its crazy impressive. They once were the worst and became the best. Good job on this one Apple!
@Don't read my profile photo ok
true but they are basically unaffordable for a normal person now
Wait I thought the benchmarks showed there are better alternatives for the price??
@@hejhejaske Yeah these products are beginning to appeal only to the prosumer base instead of the typical consumer
@@hejhejaske because they arent made for the avarage joe
I’ve been trying to decide which mac studio to purchase, the reviews have been all over the place, but mostly folks are saying the Ultra is not worth the additional $2,000. I get it, but everybody is wrong to believe this. Everybody is starting off with the wrong premise.
If you look at the two base models side by side, there is a $2,000 difference. But the max is lacking two things that I think EVERYBODY should get. First is the hard drive. 512GB of storage is simply not enough for this machine. Upgrading to 1TB, to match the Ultra, will cost another $200. The second is ram, I really think 32 gigs is not enough for this machine - and I absolutely don’t believe many work flows require 128. Going from 32 to 64 on the Max adds another $400. Now instead of a $2,000 difference we have a $1,400 difference with the cost of the two machines.
Additionally, I don’t think anybody should underestimate the benefits of the two additional TB ports on front. Upgrading the two front ports on the Max to TB is work at least $100 each. That chops an additional $200 of the difference. So, now our overall difference is only $1,200.
So, the real question is, is $1,200 a good price to pay for an additional 10 cores of CPU, an additional 24 cores of GPU, and an additional 16 cores of Neural Engine.
We are not debating a $2,000 question; we are debating a $1,200 question. Moreover, some folks may add more value to the two additional Thunderbolt ports, and some may add slight value to the much better heat sink of the Ultra. So, the debate could go down to as little as $1,000.
For some folks the answer to the question of whether the additional $1,000 cost of the Ultra is worth it may change, for some it may not.
I happen to think the additional $1,000 - $1,200 for the additional cpu, gpu, and NE cores is worth it, I probably wouldn’t say the same if the difference was $2,000.
I really wish more reviews would look at this aspect of the debate.
We’re really gonna need a “best of intro” series, and how they were made, or what inspired you. They’re becoming a larger feature of your videos IMO.
I have the baseline 14" M1 MacBook Pro w/ no upgrades to anything except 2GB of storage. And the more I use it, the more I wonder who possibly would need more power... like I feel like it would be overkill for 98% of people lol. I'm editing 4K video regularly on Premiere and have had zero issues so far.
@David McGuigan haha. you are using it wrong.
I have Mac mini and I am very happy with it. But as you mentioned it is 100% true that skills is what matters the most. Tools is tool at the end of the day.
Apple silicon is very interesting and obviously becoming serious competition for the x86 platform, but until the compatibility issues with many workflows are resolved, especially like some specific engineering workflows designed for x86 Linux or x86 Windows only, these impressive machines will not be compelling enough to make the transition.
I hope Apple addresses these issues soon.
"do you really need that much power?" The craziest thing about technology is that in 10 years we'll be laughing at its poor performance
fr
When I first got into PC builds and such, I remember hearing that the GTX 1080 was so amazing! Look at us now
lets make it to 5 ! I say - 5 Years.
The sad part is that it’s out and it’s already slow. A 16 core amd 5950x beats the 20 core arm cpu in every benchmark. Nvidia’s h265 compression is 50-100% faster than the one in the m1 chips. The video card is a joke. The only good thing about them is the prores encoder for who ever needs it.
It is true, it consumes way less, but you get similar power consumption if you downclock the 5950x as well.
@@chrisp9859 gtx 1080 ti is still amazing btw.
I'm stuck with gtx 960 ahahaha
It is kind of rare but it would be very interesting to see how this PC performs in scientific applications. Something like „Molecular Dynamics“. This would be very important for many young scientist.
thats what the mac pro is for.
This is incredible! Thank you for making great quality content.
I finally got my Mac Studio Ultra with 48 cores and I am in computer Nirvana! For C4D, AE PP, PS, Audition it is smooth as silk. It has yet to crash after 1 month. I love this thing. Today I set up a 4X2TB Samsung 980 PRO RAID level 0 array and get 2500 r/w in an OWC Express 4M2 enclosure. I also have 2 Acasis enclosures with a Samsung 980 PRO 2 TB and a Samsung 990 PRO and get 2800 r/w for each. I copied a 22 GB folder in 10 seconds! Redshift is a joy with this machine. I am SO glad I got it! And from what I read that the M2 is only around 10-15% faster, I am glad I could not wait till June 2023.
lol, how much did they pay you for this comment?
@@sdfaasf7574way to ruin the mood there
You guys should be happy for me. Today in Premier Pro a 49 second video rendered in 8 seconds!
I like how when the mac pro came out all the reviewers were happy and like "finally a long lasting upgradable machine". Apple then proceeds to replace the mac pro after a single generation and no upgrades whatsoever. Must feel great for those who dropped 50k on those things hoping they would be supported for a long time
I don't know why they had to stick the power button on the back of the machine. If you set it up like you did in the video, it's pretty inconvenient to to reach around the back. Putting it on the front would be much more ergonomic in pretty much any use case, IMO.
Except you don't really turn macs off... Like the only times I reboot my macs are when they get updates. Suppose if you push them hard enough or use some software you can have crashes once in a while but that's about it. My office mac mini was configured to wake-on-lan also so that I could restart it in case of a power outage or something, it was a push of button on my iPhone and it would start.
a touchID with integrated "breathing" light would have been a nice way of building it in.
They probably expect you to turn it on and never touch the button again. Set it to sleep/hibernate and wake it up with the keyboard or mouse.
Sleep mode doesn't suck on Macs. It actually... works. I've had my iMac on and only ever sleep it since I got it in 2013. It's only ever turned off for system updates but thats a restart and not a full on shut down.
@@jarrodjob Why would they do that? Their keyboard has touchID already and the breathing light is already on the front of the machine.
When my iMac’s GPU died I was left with best screen I have ever owned, dead because of failed GPU. I thought I would never own a Mac desktop again, but this will definitely be on my list when the time to upgrade comes… in about 8 to 10 years
It was a crime that they changed from magnetically fixed screens to glued on ones. From easy repair-ability to being a freakin nightmare to fix.
@@Cyba_IT look on the used market. I see scores of old iMacs and Thunderbolt displays with broken or missing glass.
@@mrlt1151 Thanks man, the screen itself is actually fine, it's getting the glued screen off to get to the internals which is a challenge. It was just so much easier to remove the screen when it was magnetically fixed on that's all.
Technically, all computers are modular. You can connect a keyboard to a laptop if you don’t like the built in keyboard or connect a monitor to an iMac for some reason.
I think they were deliberately saying that hoping some people will think that it opens up like a Mac Mini, only to be disappointed when they find out it is not upgradable.
Macs are less modular than a Playstation 5, because even a Playstation or XBOX you can open yourself to install a new NVME drive. I think Macs might actually be more ewaste tier than Dell or even HP. It's specifically designed to be unupgradeable, unrepairable, and easily breaking down and needing expensive repairs or to be replaced. Even something as simple as installing a new SSD you need to get an alleged "technician" aka Apple corp. associate, and it's all designed to just milk you of money and use MLM scheme/cult like things such as the "Genius Bar" which is one of the most laughably stupid things I've ever seen.
Basically, you want to con your mark into thinking he's a genius for letting you pick his pockets.
LTT actually opened up the Mac Studio. You just have to remove the rubber ring at the bottom to get to the screws. There's not much you can do inside though, but they discovered a second, empty NVME slot for SSDs.
@@pandemicneetbux2110I have had 5k imac since 2016 with zero repairs needed. You are uninformed about the quality.
I like how you talked about the "tools" at the end. Apple is definitely gonna market this as a very desirable product for anyone but yeah, this is overkill for many people, and kinda heavily priced. It's not for light work people or people like you even. The video studios they displayed in the event are professional filmography stuff and not for a 10-minute youtube review at all. For a person who REALLY needs all this power and makes perfect real use of it, this thing is a bargain!!! I know you can't resist and use it anyways, but this is a dope machine for many people who wasted money on any low or mid-specced mac pro......
I believe there’s a reason why they showed high end studios instead of your regular consumer, they’re marketing towards that segment.
The 2k one looks great
Bargain ?
@@imerence6290 depends on how often you use it and how much time frustration it B saves
As a software developer my work consists mostly on typing, and I will never ever in my life need this power. An M1 or M1 Pro will do the work perfectly for me 👌🏻😊
once you start importing TensorFlow and Keras, that's when you'll wish you had this machine
but for such workflows you have gcp or any cloud for it!!
I think the price is competitive as well, I wish you mentioned it in this video. For the price the m1 max and m1 ultra is offering it's a killer deal for video editors. Other machines of comparable performance go for significantly more
👆Thanks for watching 🤗🤗
And commenting
Send a direct massage right away🆙
You have just won a gift 🎁✅.....
nothing apple makes is price competitive you are always paying a big premium for just the logo
Most people believe that getting better equipment raises your skills. Love that he points this out. Very few people would benefit using this.
Whew that intro!!
0:02 Marcus dog's name is MAC🐶
If you don't know
Never commented on your viddy's but I really love em. The Mac Studio is def overkill for me as a web developer. But your channel is the perfect way for me stay updated on tech tools. Thanks!
If you ever read this, I'm trying to break into that career field; would appreciate any real world advice :)
Ports on THE f'king FRONT is the revolution. All a bit late though. Most of our professional editing clients left Apple products behind years ago.
Craaaaazy shot! 0:10
the thing that impresses me the most is how little power it uses to get similar performance. We've been told by Intel more power needs more watts. For 3 decades intel rolls out new chips with 5-15% improvement with big releases getting 30%, but the power requirements kept going up. My AMD ryzen workstation has a 850W PSU!
That Has always baffled me as well
You are using a 850W PSU because you want, but you probably don't need one. A 3700X only uses 90W at 100%.
@@Deses I got 850W in case I decide to run dual RTX or upgrade to 5900x. Sadly both of those were basically unavailable or over-priced since Jan 2020. For 3700x and RTX 2060 I could have gone with just 600W and had plenty. I still remember Pentium pro days when all I needed was 400W PSU for a mid tower and 2-4 HD. I've killed enough Intel CPU, PSU, cpu fans, case fans and HD to have a grave yard of hardware.
Power supply units and 2025 will only require 400 W currently you can get away with 600 mark my words
@@woolfel You are delusional. The only RTXs that can be run in SLI are the 3090s, 2080s and 2070s and you are not going to run TWO of these with just 850W.
It's ok if you got a big 850W PSU. I got one too! Nothing wrong with it! But don't make up excuses to save face lmao.
For my needs Apple replaces a 27" iMac at $2500 with a 27" package at $5000. I can do without the super bass when photo editing. Pass on this one.
And they took away the monitor. $1600 extra.
I had heard that a Blender render is very slow when compared to an RTX card, there is not magic in the chips and they still have limitations on many workflows that no Tech Review TH-camr will cover since everyone of them are into video editing and Apple is making sure to build computers that will always have favorable results for their workflows. At least here we are told that gaming is not what Apple devices are built for, that is much better than just about all the other reviewers.
Blender has always been optimized to run on NVIDIA cards because it uses CUDA. Even AMD cards can’t compete in Blender. This has nothing to do with the chips. Apple is actually now contributing to Blender the necessary optimizations to improve its performance in Apple Chips. So Blender can’t be used for comparisons.
We actually did - a week ago to be accurate
@@nullpex What are you talking about? CUDA is part of the NVidia chipset and that is comparable to other products like Apple’s chip, they even compare to it with their very bad graphs. I am just pointing out that there are still many cases out there where an Apple devices are not better, I do agree that the M1 chips are great for the cases that they are designed for but not outside of those cases. Apple is making video/image processing computers, if that is not your workload, then maybe you shouldn’t buy one. For most people, computers hit the necessary level of performance (in CPU and GPU) 10 years ago in the mid-range laptops and everything since has been mostly unused by them.
I have an M1 iPad Pro and find the software lacking for a “Pro” label, Android tablets are about to do much more with lower spec chips (but also much lower performance). But would I still want these crazy high spec chips, yes! Do I have a use for them, no!
Apple is great at restricting software on their systems by not fully supporting programming languages, libraries, and standards that they don’t develop. This makes the system easier to control and can help reduce bugs, but now the user doesn’t own what they pay for and are locked into an ecosystem that appears to be great for users until you get into the details and find that your privacy is still being sold.
Linux is the last hope for users to have true privacy, ownership, and freedom to their devices. The problem is most people have given up on having these things to avoid complications with using devices. This is also seen in the automotive industry with cars being built with little regard to user serviceability.
If Apple would just sell kits like when they started out, then I would buy another Apple chip. Having to buy whole new computers to upgrade is dumb and this trend has spread to most of the industry.
Apple should make a car that you can’t service and we can live in a world where users/consumers have more freedoms taken.
@@nullpex For the CUDA Cores Nvidia got, the M1 chips have the media engine which makes them better in video and photo editing than an 3090.
One thing I would be very interested to see is how much better the Mac studio handles AI upscale via Topaz.
Since they are adding to the neural engine I wonder how much of a performance boost that actually translates to.
6:51 "Intel could never." Well, in CInebench 12900K scores around 28K points which is 4K more than M1 Ultra
Yea, this didn't seem as impressive to me either when the graph came up showing that it's a little better than a 5 year old, 16 core threadripper and gets handily beat by a 4 year old 32 core threadripper. Yes this CPU is efficient but it's far off from a mobile cpu which most people seem to liken it to just because it's arm based. And it gets beat out by most modern desktop chips from AMD and Intel which are still much cheaper.
Yh I thought this as well. Don't get me wrong it's impressive as hell how much power this thing can kick out with such low tdp wattage, but it seems disingenuous to compare a brand new processor to fairly obsolete processors in comparison charts (i7 7700k????)
I think we can all agree Marques has the best intros on TH-cam.
Damn bots
This device will become e-waste after the soldered, non-replaceable SSD dies. Flash storage has a limited lifespan after all. Cardboard packaging or recycled aluminum does not make this kind of tightly integrated devices enviromentally-friendly.
--
Edit: Apple states that the SSD is not user accessible. Even though it's not soldered, it will not be user-upgradeable.
But honestly, that's like 8 to 10 years out and realistically, as a 28-Core 4x W6800X GPU user who's system smokes not just this Mac Studio but also MKBHD's Mac Pro...this littler device is still awesome. I ordered one for the music studio upstairs. Of course I wouldn't even think twice about it replacing my Mac Pro...BUT...I'm very curious to see what happens with the Apple Silicon Mac Pro because quite frankly, there are a few scenarios I can think of that could tempt me to replace my Beast.
SSD is replaceable, not soldered in (see the tear down by MaxTech)
Two slots for SSD, all connectors can be replaced. This thing can be fine for at least 10 years....
The ssd isn’t soldered it’s a Apple special replaceable affair and Mac systems tend to live for 10 years easy and by then it would be super outdated anyways and it’s fully recyclable
@@jeffp1370 for what SSDs? Apple have a history of making what should be easy, difficult. As an example, in previous laptops and iMacs with SSDs, while Apple may have used say a Samsung made NVMe SSD, they dont use the same pin outs on their motherboards so you couldn't just go buy a retail Samsung SSD to replace a faulty Apple one. That was resolved by third party options including adaptor sockets which were made feasible by relatively high volumes that will not exist for the Mac Studio. And there there is the hardware "security" Apple has been implementing. It will be very interesting to see what happens with these SSD slots and possible upgrades but I doubt Apple will make it easy.
The thing that I hate about apple is that they shamelessly lie in their marketing! They stated that it has the performance of 3090, which is just a blatant lie! In Geek bench 5 the m1 ultra scored 102K points while the 3090 scored 212k!
Technically not a lie cus, it’s like better performance for their mentioned level of power draw. Like it beats it by performing at x speeds by drawing significantly less power at that range. But 3090 draws more than double the power and keeps going up in power draw and performance
@@saberkyo the best way someone described it is that "they don't lie, they just don't tell the entire truth"
@@kianhaynes23 yeah that’s the line haha
@@saberkyo you see... The pc will crash as soon as it starts doing something intensive if it has less than the minimum needed power, so they didn't test it like that, they just lied!
How I wish (and I know its a long shot), that apple would make a M2 Game chip. Double the graphics cores of the M1 max or even just on the Pro and they would appeal immensely to my type: Loving the mac environment, primarily using the macs for light workloads but also loves to game with the opportunity occationally presents itself
they don't care about gaming, and they would need dev support, which is gonna be even harder now that they are on ARM
The entro involving the camera robot was just breathtaking! great videos man.
...... 👆👆👆Congratulations you have being selected among our lucky shortlisted winners.....
3090 can be OC'ed and can take >400W but will hit thermal limits. Likely Apple found 320W-ish to be the limit for sustained performance for stock, air cooled 3090s.
No, 3090 can hitich higher, a TH-camr proved they cropped it on purpose to make the m1 ultra look better, also in blender test , software not native to apple, a 3090 smokes the m1 ultra.
This really reminds me of SGI back in the day. Silicon Graphics were these beatiful computers, great design and powerful too - but most people had zero use for it. Back then, I was thinking about getting into 3D graphics - just to work with something this beatiful and outlandish. I'm too old to start a video editing studio now to have a nice aluminum box :-) My Ryzen with Nvidia GPU is pretty much equal to this and when I'm done with my workstation, kids gonna get it for games or it's gonna be my new home server.
I remember seeing SG exhibit at a tradeshow back in the 90s- i think it was a onyx they had on display fir a few 100 grands
Should have uploaded this to your The Studio channel
Just uploaded a short on how we did the intro actually
Welcome to the new era of Apple, in two years pay us $500 to install the NVMe drive you just bought from us for $1000. Apple we work hard to set the industry decades back.
Mate, you're amazing. After years and years of product reviews, I keep watching your videos. They make me happy, informed and love all the narrative you build around them. Keep at it! Love from Scotland :)
3090 is not for gaming, it’s for heavy workloads like 3D rendering. Awesome review!
The shots in this video + the thumbnail are insane . The shots in this video + the thumbnail are insane .
For me.. my M1 Air itself feels like a overkill.
And the Max is great for those of us who don't want to spend $5k. I've been very impressed by how fast Handbrake went (when you choose the proper video encoder (VideoToolbox)). It was great to do Blu Rays at around 200 fps, dvd at about 800 and 4k at anywhere from 30-75 fps. CaptureOne flies now. To save even a bit more money, I went with 500GB internal storage and keep lots of large objects on external SSDs. This leaves me space to keep what I am working on internal. And with all the ports, you can connect a lot of external devices before you have to invest in a NAS or something. So, there is plenty of machine to had here in the lower end of $2k. Oh, 64GB because as noted you can't upgrade the memory, and going to 64GB wasn't that expensive. The Ultra would have been fun to get, though. But, I can't really justify it as a home user and stills photographer. Nice review!
Hey this was an incredibly well put together review!
I absolutely love how you edit your videos, gives such a sophisticated aura
"But hey, atleast it's $5,000 of overkill vs $25,000 of overkill" OOOOOOOF. The tech companies (not just Apple) CASHED OUT on their consumers/prosumers the past few years... It's FASCINATING to see a "new product" cost a 1/5th of the price for similar performance. The tech boom happening right now is actually mind blowing. Was Apple really behind over the years or were they just holding out?
It's Moores law taking effect.
It's all about the chip really.. They used a intel xeon, now their own. So it's not about being behind or holding out or not, but rather the in-house chip that's causing a paradigm shift..
They weren't behind. Years ago they mentioned how one of their iPhones they were releasing had the same processing power as one of the MacBook Pro's they had out at the time. That should have been a huge hint though I'm sure many people just didn't pay attention or expect them to move that same processing power into personal computers.
@@andremessado7659 exactly, my point proven. They were holding on out on their costumers (as well as other tech companies), because it was a head of it's "time" and profited big on it.
@@justinburgan No Justin... Just no... smh 🤦♂️ Do you know how hard, difficult & time consuming it is to essentially create an entirely new processor that will go toe to toe with AMD and Intel, the two biggest CPU manufacturers in the world? It probably took many many man hours in order to build MacOS for ARM, repurpose rosetta to run X86 code on ARM CPU's without issues and to make it fast enough for people not to notice. Your thinking on this is too simple, I don't think you have a clue as to what you're talking about or understand how processors and code work. They weren't "holding out". It's very difficult to do what they did to bring their ARM chips to a desktop/laptop especially considering Microsoft tried this and failed miserably.
Apple has been wanting to ditch intel for a long time and thats what they've been focusing on. Also the power has been available in the iPad/iPad Pro so they again we'rent "holding out" because the power was already there in iOS. It just took time (obviously) to move MacOS, an OS thats been running on X86 Intel chips for almost 15 years to go to a completely difference CPU architecture (ARM).
Mkbhd never fails to entertain us!❤️
@Don't read my profile photo does this work