i purchased a base M1 Mac Mini a week before this was announced. I have no regrets. This is fun, and the level of power is very impressive, but I don't need it. Just nice to see someone who isn't completely fawning over it and sees it for what it is - a great, powerful option for some people, but not everyone.
Dave, I must commend you for your unbiased reviews. Most youtubers flex the GPU power of the M1's to levels claiming it surpasses all existing GPUs, but your Blender result showed a lot of practical comparison for someone I know who uses Blender and was deciding between an M1 mac (not even M1 Ultra) and a 3080 GC windows one, it's clear from your video what would perform better, I really really love this comparison and really wish more youtubers did comparisons for real world proper non-apple applications.
You’re such an underrated tech TH-camr Dave. I really enjoy your reviews. Nothing flashy, nothing over the top, no attitude or ego, just a nice to watch, honest and credible review. Also love the studio you have. Super minimalist the way I like it. Looks like a spaceship.
Great review, as always, Dave. I’m currently using a MBP 16 Max. I’d love the smaller footprint of the Studio on my desk, but if I’m choosing a single machine… the laptop still wins. Having the flexibility to work from a location other than my desk is value I won’t trade. That said, I do find myself fantasizing about the Studio despite it not making logical sense.
Same here with mbp 16 m1 max, and loving it. You forgot to mention we have liquid retina hdr displays with 120hz refresh rate. They will have more powerful processor next year on 3nm, and will be in the macbook pro, which might end up faster than top of the line mac studio.
@@biosgl The display is a beauty! I’m typically at my desk, so mostly looking at external panels. Even so, I find reasons to work in different parts of the house just to experience the great keyboard and 120hz refresh. 😆
Yeah bro! I finally got my 16 inch a couple of weeks ago. Before that I was working (video editing business) off of a desktop PC. And when I would go and stay up north with my parents for a week I'd have to unplug my cables, monitor and PC and lug it all to my car, plus set it up when I got home again. Hours of effort when now I can literally just pop it in my bag in 2 seconds. Worked in a cafe with it for the first time as well, was so fun, and my battery only drained 30% in 4 hours while editing Multicam in FCP.
I would love to see a video about the M1 Max version of the Mac studio with 32 core GPU and check if performance is better than MacBook Pro with better cooling due to bigger size of the Mac Studio and being plugged in all the time. Ultra is obviously performance king, but way overkill for most.
I can't even imagine what a full fledged Mac pro tower with no power limitations will be like. Remember, these new mac products are mostly mobile and pulling less than 250 watts. The old mac pros fully specd used to pull 1000+ watts power. Now imagine the performance from an silicon with 500+ watts. Insane
Bro, cpu power to perform scaling is not linear, it can use 2x the power for 10-20% improved performance thus expecting night and day difference in performance is not ideal.
@@mehuljain5916 You're right, but I don't think there's been any cpus that have the kind of efficiency these chips have and running at the low wattage it is running at. Thus, they have a lot more head room in terms of drawing more power if they choose to go in that direction. I'm not tooooo well-versed in x86 cpus so I may be wrong about this, but I don't think there are cpus doubling their power draw either from one generation to another or through turbo boost.
Honestly the most impressive thing I find about this is the packaging. I wish more products opened up like a flower like the Mac Studio and display instead of having to tug on a device that tightly heled together by foam that you pretty much will have to crack parts off of it to get the item out of the box.
6:30 If you use a Mac, it might be worth it because MacOS doesn't do fractional scaling, so you'll get slightly blurry text on a 4k screen (because Apple upscales).
So true i had the M1 mac mini with a really good display but the txt looked awful. So i had to sell it and get am imac and oh boy the difference is sooooooo glaring
@@joeldb are you using an Apple silicon chip? I’m not sure if the issue is present in any Intel mac. It was fine with my Intel but when I switched to Apple Silicon suddenly my 4K display was borderline unreadable for long periods because of how fonts were rendered on this technically-not-retina-dpi display. If you’ve seen fonts on non retina iPads it’s similar to that.
M2 in the Air would make a lot of the previous line-up in cluding the macbook pros from the previous gen obsolete, i think they would rather stuff a M1 Pro/Max/Ultra in it instead. Even though apple did "out perform* the MBP with the mac studio it is not portable power yet and what you ask for will kill MBP's market because its sole selling point is that power. Edit:- After reading various kind replies i have understood that i had a severe lapse of judgement.
@@brandenwatt likely if you’re talking the successor to the minimal M1, but..l it’s not released yet. I expect the lower end M2 to be faster, but not hugely faster than the M1 it replaces.
Does it? Or did Apple just decide to make it? I think the demand would be higher if they put the same chips in a chassis 2x as tall with some room for storage expansion.
but wouldn't that make it not upgradable? For a desktop work/gaming computer, I'd much prefer having a better upgrade path (at least upgradable RAM and nvme) then a smaller chassis
I love the termals of the M1 chips! I have the M1 Max 16" MBP and when I play games with it, it stays cool. Before M1 I had the lates 16" Intel MBP and it was almost untouchable when I played games with it.
@@theoneonly5368 calm down fanboy. arm arch may not be a laptop arch pre se but it was build from the ground up for low power consumption first, so there you go. of course, anyone can license an arm arch and build their own chip and blablabla. and no, apple is not the superior computer corp. none is. as with the rest, m1 is great at a lot of tasks and just good at others. but that's true for most chips out there...
I find it weird that Apple pulls out all the stops for their own silicon (on cooling) but really limited Intel CPUs thermally in their products. I get the M1 line is a bit better at power per watt, but this could have been done with a copper heat sink with Xeons or K processors a LONG time ago.
as APPLE knew they would go over to ARM, which would stop them from doing proper cooling for intel's cpu, so they could get hot and look bad in tests, that would make ARM's test look faster and better
They did do it with Xeons in the iMac Pro and Mac Pro, several years ago. While the iMac Pro looks like a standard iMac, it has a totally different internal cooling setup and managed to run a Xeon with little fan noise. Besides, most of their products are laptops and sticking the Studio's cooling setup in a laptop would make for a _very_ different product...
I mean why would they overkill when they can give enough? Afterall r&d ing a pc for another company CPU vs their own CPU is vastly different in terms of resource needed as well as ROI on component. I think Apple silicone is already planned wayy far ahead and their research simply isn't done is the only reason they use intel.
@@andynormancx you do see they basically doubled a Mac mini here and killed off (at least so far) an iMac line to accomidate this cooling tower...right? Right?
Mac Pro Station with Intel were huge, like PC. Intel macbook air had Fan to keep temps down while m1 does nto have it. I dont buy this "Apple pulls out all the stops for their own Silicon on Cooling" since they clearly arent.
I think the winning combination would be M1 MacBook Air + M1 Max Studio instead of m1 max MacBook Pro. If you already have peripherals this combination actually saves you money, while providing incredible portability (with very respectable performance) and crazy performance at home.
agreed. also my preference. thing is i want a bigger display when mobile. if there was a 15-16” Air then i would buy it and the studio with joy. until then i will lug around the new 16” MBP
ive got an ipad Pro 13 inch with 16gb. Will get the Macbook Pro 16 inch with 32gb. The Mac Studio will be with the M1 Max processor with 64gb. iPad Pro I can take with me to the office. The MacBook will be specifically a travel companion (international flights, etc). The Mac Studio will be the home office machine. The Ultra offers too little for double the price. Better off getting an additional PC desktop machine with that extra $3000.
Ya nailed it. Ive got an Air, and bought studio max. I dont want any of the nitpicky issues of having to use clamshell mode. I also dont like having to drag a 6k machine all over the place. I want a cheap ish machine for the road I can bang around and the heavy boy at home.
@@edamameme1789 wouldn’t it be better to plug your 16 inch MBP to your desk setup’s monitor via thunderbolt/type-C? Saves money as well as allowing you to instantly resume working on whatever you were doing on your MacBook. The MacBook can also act as an additional display for improved multitasking.
I’m doing all my creative work off the m1 Mac mini. I’m thinking you are right. But after watching this video. I don’t think I need the m1 ultra. The performance gains aren’t that high. I’m just a hobbyist with 3D and the performance gain doesn’t seem that great especially if I go with cloud rendering.
I think you are one of the better tech reviewers out there, so I would love to see you include software development as a use case when reviewing such high-end hardware. Almost all tech reviewers on TH-cam seem to think that video editing is the only profession out there. They don't understand that not everyone out there is going to be editing videos ... I would guess that a lot of potential Mac Studio buyers are programmers, because tests have shown that this thing runs Xcode really well.
Love that Dave actually called out the fact that most apps are still single-core-working apps… performance in most apps is nearly the same whether you’re on the Pro, Max, or Ultra (3.22Ghz freq). 64-cores is great and all but… yeah… doesn’t mean much if you’re not using even a quarter of them, which is why the Air and the base 14” MBP are probably best for 99% of people.
The people who need it know who they are. It's a very different beast to the Air or base Macbook pro. And yes, there are people who will be tempted but don't actually need this, as he pointed out. Even most video producers can do fine with any of the 16Gb+ Macbook Pros (not so much the air), but if you want an M-series desktop, this is it for now. And it has way better cooling and port selections. A no-brainer for many studios.
“Why is the display so thick? Not everyone wants to pay extra for speakers built into the display.” Dave, don’t jinx it for those of us who cherish built-in audio quality and functionality over thinness. I want iMac Pro to be this thick, too. And include every port you can dream of.
Every component should have only one function. The function of the display should be only to display image. As soon as something alien is added into the design, like speakers or usb hubs, the whole system becomes full of compromises, it becomes heavier, therefore less stable on the arm mounts; bulkier, meaning taking up more space on the desk; pricier, meaning that for the same price you will get worse performing device. And the worst is when you completely don't need all of those features, but you have no other choice but overpay for them because there is no alternative.
I'm a musician so I always have pro speakers connected to an external interface and even I don't care about thickness. It's not a phone or tablet, it's meant to stay on a desk. Im glad the super thin obsession is no longer the priority even in laptops, finally!! Function over form.
@@MikeeRogers, smartphones are all-in-one because they simply cannot be anything else. Yet smartphones are terrible at pretty much every task imaginable and are a one big compromise. 1. They are too small for comfortable video/movie watching, unless you connect it to an external display - device created SPECIFICALLY for this task. 2. Terrible for music listening unless you get headphones - device created SPECIFICALLY for audio listening. 3. Web browsing is highly restricted by the screen size and most of web devs have to cut a lot of functionality in order to make their websites useable on smartphones. 4. Gaming is rudimentary not so much from the lack of computing power, but because of lack of input methods and their low precision. 5. Any smartphone camera is still worse than a cheap DSLR because of physical constrains and is only good for casual users and even then, decent photos/videos could be made only in good conditions. 6. Any professional tasks are out of consideration. 7. No flagship phone can hold the charge for an entire day if you fully utilize your phone, meaning many users have to constantly worry about running out of battery if the use their phones "too much". All of this beacuse of the neccesity to have everything crammed into one spot. With desktops, however, you have no such neccessity. Apple sells their new display for 1500$. Imagine if you could spend just 1000$ and have 500$ saved, or you could buy pretty much top quality desktop speakers and place them wherever you want with any device you want resulting in marginally better sound than you can get with any built-in speakers in any monitor. iMacs are solutions for buisness and you need to be really dumb to by one for personal use, unless you are in a situation of extreme lack of space.
@@MikeeRogers, I definitely didn't mention another type of iMac buyers - people who don't care about money spending efficiency and just want something that works well out of the box and looks good. However, from this comment thread I assumed I am talking to people who value function over form and would prefer to get more for the same price. I guess, my mistake, since you in particular never said so about yourself. But, you know, it really doesn't make a good conversation when "that was a whole lot of nothing" gets thrown at you after you carefully argument your point of view:/ yes, my reply might have been unnecessarily long, but noone is perfect and I am not used to online discussions.
Very interesting to see benchmarks and render times from a system out in the real world... A little surprising that it isn't just crushing the Cinebench scores from the 12900K or 5950X, and while seeing it match a 3080 is impressive, it probably doesn't bode well for the price (i.e. starting at $3999 means you could build/buy a competitive Windows PC for less even with the insane GPU pricing right now). I'm sure there are "professionals" who both **can** use the compute power this system has on tap and **want** the form factor... But from my perspective as a "content creator," I'm less concerned with the size of the machine compared to upgradability, and lacking the ability to swap out or add RAM, SSD/HDD (or to just have both internally, bc trust me that's nice), GPU, or CPU... yeah, that's too big of a compromise. I've incrementally upgraded the system I originally built two years ago to scale with my needs, and I still have headroom to grow in all the categories listed above if needed. Oh, and I haven't yet spent $4000.
I'm not convinced you can easily build a quality PC for much less than the price of the Ultra, with similar specs. Put together a 12900K, 60GB of DDR5, a Thunderbolt Z690 board, 1TB of SSD, a 3080 Ti (it is still hard to find a 3080 actually selling for less than a 3080 Ti most days), decent case and PSU. Add it all up and you're above $3,500 and well on the way to $3,999. That'll come down a bit when GPUs finally settle. And the PC will beat the Mac on some stuff, but so with the Mac beat it on plenty of other stuff. And yes, the PC is upgradeable, but the Mac is a Mac and that is what some of us want. They are a lot closer in price than you'd imagine before you add up all the costs.
@@andynormancx so you just proved their point, it's less, and it is more versatile, especially when you don't have to deal with MacOS (which is a pile of shit)
@@andynormancx Truth is, the mac is great for video editing, but for 3d work and gaming, the PC is just a superior device. Plus, a PC is upgradable. The ugly truth is that in between mac and pc, the pc will benefit consumers more, since there are far more gamers than there are video editors.
Love the review Dave. Question - Was the Blender test definitely using the GPU? Because even in 3.1, Blender still defaults to the CPU. You have to manually select it in the prefs, and then the properties panel afterwards. I’d expect those scores to be much higher if they were on the GPU tbh. Loved the review either way.
@@RenatoRegalado Other tests show the GPU performing really well. It increasingly looks like it’s a lack of ray tracing capabilities on the M1 that’s the limiting factor unfortunately.
Dude finally a perfect product for me! As a colorist it's impossible to get such a powerful mac without breaking the bank heavily getting a Mac Pro, this is finally the solution I've been waiting for.
My doubt it's still this or a high end PC. I work mostly with 3D rendering in C4D+Redshift and the Blender tests really threw me off, unfortunately. I'll wait for some benchmarks on RS Metal, but I think I'll have to stick with my big box for a little longer :c
I have a very small suspicion that Dave didn’t enable GPU rendering in the render settings? It’s very easy to miss as the reasonable expectation is that would be the default. If that isn’t the case though, the 3D performance is really poor. Especially to say that so much of apples marketing material seems to involve 3D artists being showcased now.
@@automaticbutt78 hm, that could be the case, yeah. But given that Metal is pretty recent these results aren't too surprising to me since there haven't been too much optimization for it yet. We'll just have to wait and see, I guess.
For 3d, viewport performance is more important than render times.. you can always render your stuff in render farms, or sure can wait extra few minutes or an hour for a render, but without a smooth viewport performance, its not worth it..
Well but some people just can't look past GB & rendering times (which btw very few actually mention is due to the dedicated encoders+decoders, not necessarily the CPU/GPU)
Yeah the viewport performance sucks currently with Blender on a M1, but it was always shitty on a Mac. Hope Apple will improve this now they're a top tier supporter
Okay. Dave2D you got me with the ultimate badge: the black Apple stickers! Love the chassis size and form factor. I’ll get the Max version w/ 1TB SSD. Think that’s the sweet spot.
For all enthusiasts ( i.e. brand indifferent ) the best value prop is the M1 Max base with x2 32 LG ultra Fine 4k ( 2 are less than 1 Apple display) + a sandisk Extreme pro or 2 Most people already have their choice of keyboard and mouse
im not expecting it to match the 3090 but at least come close because if you look at how it scales in some benchmarks you can tell there's a lot to be desired.
I just picked up the base Studio and it's probably more powerful than what I needed for my basic Final Cut and Logic Pro use but I think I'm good to go for the next few years. I can't imagine how fast that Ultra would be!!!
Last bit is very important. Dont fall into the trap of wanting something vs needing something. If not needed there are always other great things to spend your money on. If you don't care about the money you spend I highly doubt you'd be in the YT comments anyway. I think it's a beast machine but still very much a Mac which has positive and negative sides to it.
@@theoneonly5368 It might feel to you like that but I don't think it's an "Apple-only" argument. I assure you the same thing applies to other brands. I don't really get the second part of your response, are you saying that unless I get the best and newest of everything I should just get nothing instead? Most of my purchases are born from a need, I suppose that's bad? Why do people buy a BMW vs Toyota? Because badge and performance and some people choose to spend their money in the way THEY want. Clearly some people like to buy based on how a product "feels/looks/is marketed" vs what they need (basic transportation). I don't think that is a bad conclusion to spending habits. This being the best computer in the history of computing? That's plain ridiculous. If that's your take then you're just a fish in the pond. Decide for yourself whether this particular product is meaningful and useful to you, I think that's the general idea behind my comment which was basically an agreement with Dave's opinion.
@Dave2D, Intel did *try* to make something that powerful in that small of a chassis, the whole Intel NUC line of products. I don't think they'll be able to compete with a company that can control the experience vertically like Apple can.
6 months later, Cook will remove the A13 chip from the Studio Display and slot in a M1 Max and call it iMac 27 inch. That's why this display is so thick.
Between 4k and 5k, there isn't a very noticable difference in terms of pixture clarity at 27", but the key here is that 5k is exactly 2x 1440p, which means you can simply double the pixels when it comes to displaying UI. With 4k, you're having to rely on 1.5x scaling which is all kinds of pain for a lot of applications. Especially for someone doing graphics design work. 1.5x sits uncomfortably between not being able to design at 1x scale (everything is too small), and not 2x because it will look to big. For example, if you're trying to draw a 1px line, you can't go 1px because that would look too tiny, and you can't go 2px because that would look too big, and you certainly can't go 1.5px because that would just look blurry. This is why it's very frustrating to see only 4k panels in the PC space. A straight 2x scale would be so much better than all the weird inbetween scales. Like do you go for 1.25 at 32" 4k? And if you do 2x scale on a 27" 4k monitor, you end up with everything looking nice and crisp, but you're left with only 1080p worth of screen realestate, AND everything looks physically bigger than it's supposed to. So, it's not about the resolution, it's about the scaling factor you can practically use.
Yep, as someone who has written code for my desktop apps to fix up buggy windows scaling issues, and adapt to many display sizes, you are spot on. In windows 1080p x 2 = 2160p, i.e., 4K, which is 2x the scaling. Hence why most people run 4K displays on windows with a 2x scaling to keep everything nice and crisp. Also many icons sets are optimized for 2k, 4k The mac world, standardized on different defaults, hence why 5K is better for macs. The question is, for a 27inch display, how much crisper it it?. I can see that menus, etc., are more crisp on my windows machine on a 4K, than on a mac mini using the same 4K monitor, however it is not enough to be a deal breaker. I do not have a 5K monitor to see the difference on the mac, so I can only speculate that it will be crisper on the 5K - maybe someone else can comment. And of course for media content (photos, etc.,) on a desktop, since these are rarely displayed in exact multiples on there pixel size, this is not relevant. On something like books on a iPad which has well published standards, which apple backs with app guidelines, then this is relevant, but I would think not so much for desktop.
@@m-stat9 for any new 4K TV or Monitor that need the whole 12bit colordepth with 4:4:4 chroma for HDR. Quite important if you want to master Dolby Vision content. But now you need another dongle again stealing a precious port on a "Pro" device. You can never have enough I/O on a Pro device and you always need more. Its a rule
Will, until 2021 we had to be fine with USB-C only. I regard the HDMI as nifty addition for quick connection to most projectors, monitors and PCs. If you need more fancy stuff, use the dongle just like before. I am sure M2 will add HDMI 2.1 it just wasn't possible yet (48Gbps bandwidth)
So, this is what I'm getting from this. Correct me if I'm wrong. 1. The intake on this thing is tiny and has little access to fresh air. The fans must have really high static pressure to even get enough air to cool this thing, and even then I'm kind of thinking they just have super beefy heatsinks because they don't want to make the fans loud, which means that this computer could have thermal issues once dust gets in there. This looks like a classic Apple "We're gonna bake this thing instead of having noise" approach to me. 2. That monitor has way, way too much going on for the features advertised. It's either a complete waste of silicon during a global shortage, or it's going to get new features added in later via an update. I really don't think it's worth it now, particularly without higher refresh rates and with most other monitors being extremely color accurate and pixel dense for less money, but it will be interesting to see if they add more to it post-launch. 3. Apple really hates high refresh rates. When they finally caved and upgraded SOME of their devices to finally have notoriously low-quality high-refresh-rate displays, they turned around and released multiple "pro" devices at 60 Hz and now the most powerful computer they've ever made has an outdated HDMI port that won't even let you use the higher refresh rates of other monitors even if Apple decides to release some. 4. The main problem I'm seeing is a (relative) lack of single-core performance in their newest chip. I mean, it's still pretty good, but most pro software won't even see a need for this many cores already and as Dave mentioned the lack of multithreading in many tasks means this won't be a meaningful upgrade for lots of people. The single-core performance is the new bottleneck here. If they want to make a new Mac Pro, they are in a rough spot where adding more cores is honestly irrelevant and they can't increase single-core performance without a major leap forward on the scale that M1 accomplished already. I just fail to see how a new Mac Pro could be competitive unless it launches with an M2 long before anything else does, or maybe they have another SOC with fewer, higher performance cores that they will connect together into a massive high-core-count CPU using their version of the Infinity Fabric. It just seems like the hope for a new Mac Pro is pretty much dead for another year or two at least.
I would have loved to hear about how the thermals were doing on the Studio. My mac Mini 2018 i7 gets warm just watching youtube videos at 1080p. This was a big concern for this product! Edit: Oops, he talks about it at 5:00 !!
It's probably not great. Apple has a poor track record when it comes to cooling, and the intake is too small and too removed from the fans to be very effective. Blower fans like this uses also aren't great in general.
@@Bitshift1125 You obviously haven't kept up with the track record. First of all, all of their ARM based M series chips, as with their A series chips, are very low power, stay very cool, and keep their performance without throttling in most situations, even over long periods. Yes, back when Apple used Intel products while trying to keep the laptops efficient and small they skimped as much as they could without causing damage to the mac or cpu. Simple answer is, cooling with their new chips is insanely good, mostly because the chips themselves are insanely good.
@@af4396 Except people noticed less than two months after the M1's release that you can improve their performance by around 20-30% by just replacing their thermal solution. Apple just does not cool processors adequately because they want everything to be as thin and light as possible.
@@Bitshift1125 Now we're not talking about something overheating, but improving the performance on an already impressive chip, which im sure you can do on many laptops and devices by replacing thermal paste as well, except most x86 chips wouldn't have a similar response curve because they're nowhere near as efficient. This is not the same subject that the concerned person was asking about. The M series run very cool, and very fast. Period.
I would love to get this for my music studio but the problem is that I also need a new computer for live gigs with my MIDI keyboard. So that's why Macbook Pro is what I have to get even though it's more expensive. It's just so handy to be able to put it in a bagpack and go wherever to work/play music.
For the Blender test, were you using the Apple Silicon version or the regular (Intel) version? I ask because Intel is the default and you have to look for the Apple Silicon version.
@@beoxsgaming9388 It gives me ARM64 version on my Linux System from official blender download site, so it doesn't seem to be that, anyway. I think that perf is ARM64 one.
@@white-bunny Very interesting. I have an Intel machine myself, but looking at my wife's m1 MacBook Pro it still gives her the Intel version as the default. Again, you would think.... 🤷♂ On my Ubuntu (20.04LTS) machine it gives me the Linux version as the default. Looking at the "Daily Builds" list for Linux there are only x64 builds, not a single ARM64.
Very cool. Apple is on top of their game right now. I use PC, but only really because of gaming and familiarity (also like to be able to have hacky solutions that only exist on windows tho). If apple would get support for gaming for real, id probably switch.
@@52thephotoshop absolute rubbish some far better software on MacOS than windows. Gaming is a nonstarter though. I personally use Mac for work and have a pc for gaming. Some fantastic third party devs for Mac though.
The monitor is thick because the integrated power supply, from the fact the power supply inside can support the display and 96W USB PD makes me guess there is a 150W power supply inside. Unless you go to planar transformer / inductor on both the PFC as well as the flyback or LLC section of the power supply, it is really hard to make the power supply thinner than 20mm. Put this power supply behind an LED panel, you got a fairly thick display. The 24 inch Mac is thin purely due to fact apple choose to use a separate adaptor (or brick).
The Mac Studio is the first Apple device in years, decade+, that "had" me stoked about it. Upon announcement I was like Apple lets goooo!!! I'm a 3D motion designer and after seeing the Blender test I'm back to unimpressed. Seems Apple build their systems for the video editors and 2d motion graphic artist. Looks like I'll be sticking with my 17in razer i9 with the 3080ti for the foreseeable future. Such a let down.
Blender devs could consider updating their app to work with these chips. We’re at the beginning of Apple’s switch away from Intel, so I could see a blender update in the coming twelve months to take advantage of all those graphics cores in there.
Thats the thing, they make such a specialised product that only works well with certain tasks. Its not a windows machine that can handle everything quite equally well.
Nice and early as always Dave! I’m a PC guy but really interested in M1 Ultra to see if there claims were true or not. Well it’s not as powerful as 3090 as apple claims it to be but it’s pretty sweet for the 1% editors and developers.. Thanks for the video btw. Cheers!
Yeah, it's not for gaming brah. This much power in a mac is useless for general consumers when u can't game on it! High end Professionals are super niche, for those this is worth. At this price I would anyday pick up 3090 all day everyday (if I can get it that is)
@Siddhart Jatav lol, leave it. He himself is a kid typing from his mom's iPhone or something hating cause his parents didn't buy him a decent gaming PC.
The power cable for the studio is removable. Wrap the cable around your hand a few times and pull. You need someone extra to hold the display while your pulling it. It comes out really hard so stand clear.
The reason for just an HDMI 2.0 is that it's meant to be connected to a TV. Most TVs still only have HDMI 2.0. Thunderbolt four supports the DisplayPort 1.4 standard so they are definitely the preferred connection for advanced displays such as the Apple studio display.
Professional Grading Monitors often use HDMI, and you just cant use 4K 60Hz 10bit with full Chroma via HDMI 2.0. And you definitely cant use it for 4K 60Hz 12bit with full Chroma. Booth are necessary to master HDR content.
Every time I watch a Dave2D video the more I wanna be friends with Dave. Like he seems like such a genuine person and we would have so much to talk about with tech!
so the biggest downside is that if one of the rams fail, there's no way to replace them. or any of the components for that matter. seems like a huge deal-breaker.
im not a fan but apple care or even without apple care they would swap it and replace it without any issues also i want to say this to everyone that buys apple products, don''t buy them in a third world country or where the customer service is non-existence. it's not worth it
My guess about the Studio Display thickness is that Apple made it thick so they can still include the speakers and other extras without giving the display a chin. I remember one tech reviewer (forgot who), mentioned this when the M1 iMac was announced - he didn't like the chin and wished the iMac was thicker...
The chin housed the fans, the board and part of the speakers. Also, the iMac has an external power supply, while this monitor has it inside and it probably is a beefier one to charge a macbook and run the bigger display.
I am alone in wishing it was prettier? Was Blender just using the GPU ? Or both GPU+CPU ? From the tests i've seen online GPU+CPU can give better perfomance.
Great review as always. Did you test out the camera on the monitor at all? I’ve read mixed to disappointing reviews on the camera and was hoping to hear your take in this video
What I admire the most from Apple is how their fans with "old" intel systems that underperformed poorly still didn't want to see the alternatives being more capable, with more I/O, affordable storage and everything... now M1 laptops and desktops are VERY good value. I not only admit it, I admire it. But man, people where buying intel macbook airs and "pros"(without even i/o for a professional use, and don't even talk about professional keyboard reliability...) and lie to themselves. This is wat I like about Dave's channel. He didn't say old ones where awesome and good value/performance. Now he is, and gets excited and so do I. But don't forget, people, it is not a religion. Don't justify it when they do it wrong!
I don't think the I/O's improved that much. I recently hooked up a 144Hz monitor to my MacBook, and they use DisplayPort. Wouldn't that have been more practical for professionals than HDMI?
@@abubakrakram6208 All modern monitors, especially high refresh rate ones, should support Thunderbolt and HDMI 2.1. Apple unfortunately doesn't support HDMI 2.1 yet, but Thunderbolt should be a requirement, especially considering how it's near standard on most high-end PC laptops.
@@RubmaLione I wish more monitors had Thunderbolt. I miss it from my LG 5K display. Now I have to run 3 cables (power, display, and peripheral hub) where I used to run one.
Best reviewer along with mrwhosetheboss and MKBHD man, thanks for the awesome videos! I do feel like the “ultra” name isn’t like what it suggests since I saw that it couldn’t give out performance compared to the 3080 and 3090. Still amazing though.
Today I filmed on my bike using 4 gopro camera's running maxed out 5.6K and extra audio from a zoom recorder. I loaded it all on my M1 Mac mini which is the lowest model and into Davinci Resolve, shockingly (No proxies, 4K timeline) it is handling all that with no stuttering at all. I Ordered a Mac Studio... Can't wait for it to arrive!
So I purchased a 2020 27inch 5k iMac right as they were launching M1 because it didn't seem like the iMac M1 was on the horizon for the near future. I do most of my work in Final Cut, Premiere, and heavy in After Effects. The M1 24in iMac would not have been enough for what I do. I installed 128gb of Ram myself after only purchasing it w/ 8gb. I maxed out the graphics card when I purchased though. My computer struggles a little w/ 4k 120fps footage and when I do heavy rotoscoping work in After Effects, it takes me a while. That being said, when I purchase a new computer....I'm thinking about the money I'm spending today to at least future proof me for the next 4-5 years. I think I'm going to hold off for another 12 months before upgrading because right now, I feel the FOMO of wanting the heavy M1 Ultra. I'd like to see more vex editors review it and make a more informed decision as to which one I actually need realistically. Thanks for the thorough review!
What I dont like about these pro apple devices that even with that expensive price it can't do everything like gaming,. Blender and game development whereas if a PC might underperform compared to M1 Ultra in some cases but it cam do everything. And also 0 upgradebality
Regarding power consumption and heat. Its not as surprising once you know that ARM is completely different to traditional CPU's. Its a low power architecture that has a way lower number of instructions that the cpu is capable of doing.
this is exactly where i thought apple would go, and its the direction that pc manufacturers need to move in. I did think they would end their mac lineup here though, i wasn't expecting a new mac pro. With this, apple will change how computing is done.
Good GOD what a break from Renee Ritchie review with his Apple loving narrative. So glad Dave also does reviews and does as one of the top best. Fully covered material, handled properly, angles all covered for consumers. @9:03 THATS THE KEY REASON I love this channel. DAVE still stays down to earth even being around top products. He knows how avg person may consider purchasing even though it’s not in their field. I got my advice there. Saved me tons of money
So this is mostly a video power desktop. To be a true workstation, it needs one very important thing: being able to upgrade. It is simply not possible with this one. So choose wisely your configuration. This could also be a great computer for any task that needs a lot of cores. But for ordinary users, stick to M1-based machines. I know the urge we all have for the fastest - newest, but it is just a very, very expensive urge in the period of potential crisis.
Apple's target consumer for this product does not care about upgradability. Professionals, businesses, and institutions will simply buy the new product and sell these.
@@thenoobperspective7588 You can add more storage, more RAM, second GPU, other cards, upgrade CPU,...replace part that is not working anymore... Professionals do upgrade, my 20+ years in the business counts something I think
Great review! How was overwatch on it? It looks like you weren’t hitting 3 digit FPS, but is that because of the 5k resolution? Is this a possible contender in the SFF market using a thunderbolt to hdmi 2.1 cable?
@@propersod2390 Guess what, there are people who do creative prof work and just game occassionally, then it would be stupid to pay another 2-3 grand just for a 2nd PC
I think this Apple Studio is going to be something that will hold value for years, even the M1 Max version. I think 10 years from now, this will still be expensive on the 2nd hand market, but people are going to be buying them.
SMOL BUT THICC
Thicc but smol
Dave being one of the first to post a vid abt the mac studio. W TH-camR
Hi Dave, what Sennheiser device you plugged in at 2:06?
THICCCCCC
4 grand and HDMI 2.0 eh?
Absolutely love Dave’s style of presenting with his vids. It just feels like he’s just having a conversation with you about tech!
Yes you're right.
@@fakepolana1674 ratio
@@fakepolana1674 bro there's no reason to bring up your sexuality here..
@@user-zc2hz3yj2k ohhhhhhhhhhhhhhhhh burnt straight through 🙌😂
@@fakepolana1674 your dad
I love it that Dave tells you when you shouldn't buy a product when you don't need it
that's almost case with every new gadgets
What you should do it expand your business until you can use all the power.
I don’t need it. I just want it…Like Dave said.
Do u need Dave to tell you that and remind you every time?
the only reason he was able to say that is because Apple allowed him to say that
i purchased a base M1 Mac Mini a week before this was announced. I have no regrets. This is fun, and the level of power is very impressive, but I don't need it. Just nice to see someone who isn't completely fawning over it and sees it for what it is - a great, powerful option for some people, but not everyone.
agreed, but i wouldn't mind that cooler on the M1, i bet it'd basically run passive lol
Nowadays, I just watch your videos for a chill, laid-back, and meditative time. Very relaxing.
Dave, I must commend you for your unbiased reviews. Most youtubers flex the GPU power of the M1's to levels claiming it surpasses all existing GPUs, but your Blender result showed a lot of practical comparison for someone I know who uses Blender and was deciding between an M1 mac (not even M1 Ultra) and a 3080 GC windows one, it's clear from your video what would perform better, I really really love this comparison and really wish more youtubers did comparisons for real world proper non-apple applications.
You’re such an underrated tech TH-camr Dave. I really enjoy your reviews. Nothing flashy, nothing over the top, no attitude or ego, just a nice to watch, honest and credible review. Also love the studio you have. Super minimalist the way I like it. Looks like a spaceship.
Great review, as always, Dave.
I’m currently using a MBP 16 Max. I’d love the smaller footprint of the Studio on my desk, but if I’m choosing a single machine… the laptop still wins. Having the flexibility to work from a location other than my desk is value I won’t trade. That said, I do find myself fantasizing about the Studio despite it not making logical sense.
Same here with mbp 16 m1 max, and loving it. You forgot to mention we have liquid retina hdr displays with 120hz refresh rate. They will have more powerful processor next year on 3nm, and will be in the macbook pro, which might end up faster than top of the line mac studio.
@@biosgl The display is a beauty! I’m typically at my desk, so mostly looking at external panels. Even so, I find reasons to work in different parts of the house just to experience the great keyboard and 120hz refresh. 😆
@@biosgl Then the next mac studio will crush the mbp again.
Yeah bro! I finally got my 16 inch a couple of weeks ago. Before that I was working (video editing business) off of a desktop PC. And when I would go and stay up north with my parents for a week I'd have to unplug my cables, monitor and PC and lug it all to my car, plus set it up when I got home again. Hours of effort when now I can literally just pop it in my bag in 2 seconds. Worked in a cafe with it for the first time as well, was so fun, and my battery only drained 30% in 4 hours while editing Multicam in FCP.
My dad uses a 2019 iMac Pro currently for work, (with the intel) so if he does this with a nice monitor he would be amazed.
I would love to see a video about the M1 Max version of the Mac studio with 32 core GPU and check if performance is better than MacBook Pro with better cooling due to bigger size of the Mac Studio and being plugged in all the time. Ultra is obviously performance king, but way overkill for most.
This is the video you're looking for bro: th-cam.com/video/mQxbDVOdBNc/w-d-xo.html
I can't even imagine what a full fledged Mac pro tower with no power limitations will be like. Remember, these new mac products are mostly mobile and pulling less than 250 watts. The old mac pros fully specd used to pull 1000+ watts power. Now imagine the performance from an silicon with 500+ watts. Insane
But does it scale inearly?
Bro, cpu power to perform scaling is not linear, it can use 2x the power for 10-20% improved performance thus expecting night and day difference in performance is not ideal.
The problem is it won't scale like that, the more power you pump the less real world performance you get and Apple got the sweetspot for their product
@@mehuljain5916 Well, in M1 Ultra's case it is almost linear both in power consumption and in performance.
@@mehuljain5916 You're right, but I don't think there's been any cpus that have the kind of efficiency these chips have and running at the low wattage it is running at. Thus, they have a lot more head room in terms of drawing more power if they choose to go in that direction. I'm not tooooo well-versed in x86 cpus so I may be wrong about this, but I don't think there are cpus doubling their power draw either from one generation to another or through turbo boost.
Honestly the most impressive thing I find about this is the packaging. I wish more products opened up like a flower like the Mac Studio and display instead of having to tug on a device that tightly heled together by foam that you pretty much will have to crack parts off of it to get the item out of the box.
6:30
If you use a Mac, it might be worth it because MacOS doesn't do fractional scaling, so you'll get slightly blurry text on a 4k screen (because Apple upscales).
The power cable being unremovable on the Studio Display is absolutely insane. Why does apple do this?
money? of course
Agree! Would have loved the same connector as the iMac 24”
To give the idea of a cheaper product I imagine, so the XDR Display has that extra polish of a super premium display
@@nakedexposure964 so you have to pay for it's repair, that's always the reason
Not like you can go wireless with it anyway.
I can definitely notice the difference between 4k and 5k at 27" because Mac OS is awful at rendering text at non-retina resolutions.
So true i had the M1 mac mini with a really good display but the txt looked awful. So i had to sell it and get am imac and oh boy the difference is sooooooo glaring
Eh, rendering @5k on a 4k display and downsampling looks fine
@@franciscoa750 kind of sad that they can't do it right for 4k.. like there are many people using 4k
Can you expand on this a bit? I have a 28" 4k samsung monitor and haven't noticed any rendering weirdness. What have you seen?
@@joeldb are you using an Apple silicon chip? I’m not sure if the issue is present in any Intel mac. It was fine with my Intel but when I switched to Apple Silicon suddenly my 4K display was borderline unreadable for long periods because of how fonts were rendered on this technically-not-retina-dpi display. If you’ve seen fonts on non retina iPads it’s similar to that.
I love that you reviewed this for what it is. Many in the tech community are talking Benchscores and truly aren't giving this machine credit.
I like that you just tell it like it is! Only review I need to watch.
Nice video, but the graphs are a bit hard to read quickly. I have to check what each color is.
Thanks, for me this one is overkill. I’m waiting for the M2 MacBook Air, which should be good enough for me. 🐯😇😎
Same dude. I need to get a new computer in July, hopefully the new Air will be out
M2 in the Air would make a lot of the previous line-up in cluding the macbook pros from the previous gen obsolete, i think they would rather stuff a M1 Pro/Max/Ultra in it instead.
Even though apple did "out perform* the MBP with the mac studio it is not portable power yet and what you ask for will kill MBP's market because its sole selling point is that power.
Edit:- After reading various kind replies i have understood that i had a severe lapse of judgement.
@@YourLocalCafe They've already hinted at it bro and they will just phase out the old pro line up. pretty standard...
@@YourLocalCafe also the m2 is slower than the m1 pro or max
@@brandenwatt likely if you’re talking the successor to the minimal M1, but..l it’s not released yet. I expect the lower end M2 to be faster, but not hugely faster than the M1 it replaces.
If nothing else, I think this system proves people's desire for an incredibly powerful computer built into as small a chassis as possible.
If you can call this small
@@TankTheSpank compared to Dave's build it's crazy small
Does it? Or did Apple just decide to make it? I think the demand would be higher if they put the same chips in a chassis 2x as tall with some room for storage expansion.
@@the_wiki9408 Then you should go for the upcoming Mac Pro.
but wouldn't that make it not upgradable? For a desktop work/gaming computer, I'd much prefer having a better upgrade path (at least upgradable RAM and nvme) then a smaller chassis
You are the first one to unbox mac studio on TH-cam
I love the termals of the M1 chips! I have the M1 Max 16" MBP and when I play games with it, it stays cool.
Before M1 I had the lates 16" Intel MBP and it was almost untouchable when I played games with it.
Define "GAMES" ...
also one is mobile SOC suped up and the other is a PC architecture CPU hence the compatibility issues with numerous games/programs.
@@KO_YTB I was wondering about his word selection too.
@@theoneonly5368 calm down fanboy. arm arch may not be a laptop arch pre se but it was build from the ground up for low power consumption first, so there you go. of course, anyone can license an arm arch and build their own chip and blablabla. and no, apple is not the superior computer corp. none is. as with the rest, m1 is great at a lot of tasks and just good at others. but that's true for most chips out there...
@@giornikitop5373 so now saying facts gets you called a fanboy? fml
You play games on Mac?
I find it weird that Apple pulls out all the stops for their own silicon (on cooling) but really limited Intel CPUs thermally in their products. I get the M1 line is a bit better at power per watt, but this could have been done with a copper heat sink with Xeons or K processors a LONG time ago.
as APPLE knew they would go over to ARM, which would stop them from doing proper cooling for intel's cpu, so they could get hot and look bad in tests, that would make ARM's test look faster and better
They did do it with Xeons in the iMac Pro and Mac Pro, several years ago. While the iMac Pro looks like a standard iMac, it has a totally different internal cooling setup and managed to run a Xeon with little fan noise.
Besides, most of their products are laptops and sticking the Studio's cooling setup in a laptop would make for a _very_ different product...
I mean why would they overkill when they can give enough? Afterall r&d ing a pc for another company CPU vs their own CPU is vastly different in terms of resource needed as well as ROI on component. I think Apple silicone is already planned wayy far ahead and their research simply isn't done is the only reason they use intel.
@@andynormancx you do see they basically doubled a Mac mini here and killed off (at least so far) an iMac line to accomidate this cooling tower...right? Right?
Mac Pro Station with Intel were huge, like PC. Intel macbook air had Fan to keep temps down while m1 does nto have it.
I dont buy this "Apple pulls out all the stops for their own Silicon on Cooling" since they clearly arent.
I was waiting for this. Thank you for the work that you do - it has helped me make tech purchase decision for a long time.
I stopped buying Apple products over 10 years now. I like to have full control of my devices and not have anything locked down. Thanks for the review.
I think the winning combination would be M1 MacBook Air + M1 Max Studio instead of m1 max MacBook Pro.
If you already have peripherals this combination actually saves you money, while providing incredible portability (with very respectable performance) and crazy performance at home.
agreed. also my preference. thing is i want a bigger display when mobile. if there was a 15-16” Air then i would buy it and the studio with joy.
until then i will lug around the new 16” MBP
ive got an ipad Pro 13 inch with 16gb. Will get the Macbook Pro 16 inch with 32gb. The Mac Studio will be with the M1 Max processor with 64gb.
iPad Pro I can take with me to the office. The MacBook will be specifically a travel companion (international flights, etc). The Mac Studio will be the home office machine.
The Ultra offers too little for double the price. Better off getting an additional PC desktop machine with that extra $3000.
Ya nailed it. Ive got an Air, and bought studio max. I dont want any of the nitpicky issues of having to use clamshell mode. I also dont like having to drag a 6k machine all over the place. I want a cheap ish machine for the road I can bang around and the heavy boy at home.
@@edamameme1789 wouldn’t it be better to plug your 16 inch MBP to your desk setup’s monitor via thunderbolt/type-C?
Saves money as well as allowing you to instantly resume working on whatever you were doing on your MacBook. The MacBook can also act as an additional display for improved multitasking.
I’m doing all my creative work off the m1 Mac mini. I’m thinking you are right. But after watching this video. I don’t think I need the m1 ultra. The performance gains aren’t that high. I’m just a hobbyist with 3D and the performance gain doesn’t seem that great especially if I go with cloud rendering.
I think you are one of the better tech reviewers out there, so I would love to see you include software development as a use case when reviewing such high-end hardware. Almost all tech reviewers on TH-cam seem to think that video editing is the only profession out there. They don't understand that not everyone out there is going to be editing videos ... I would guess that a lot of potential Mac Studio buyers are programmers, because tests have shown that this thing runs Xcode really well.
Love that Dave actually called out the fact that most apps are still single-core-working apps… performance in most apps is nearly the same whether you’re on the Pro, Max, or Ultra (3.22Ghz freq). 64-cores is great and all but… yeah… doesn’t mean much if you’re not using even a quarter of them, which is why the Air and the base 14” MBP are probably best for 99% of people.
The people who need it know who they are. It's a very different beast to the Air or base Macbook pro. And yes, there are people who will be tempted but don't actually need this, as he pointed out. Even most video producers can do fine with any of the 16Gb+ Macbook Pros (not so much the air), but if you want an M-series desktop, this is it for now. And it has way better cooling and port selections. A no-brainer for many studios.
“Why is the display so thick? Not everyone wants to pay extra for speakers built into the display.” Dave, don’t jinx it for those of us who cherish built-in audio quality and functionality over thinness. I want iMac Pro to be this thick, too. And include every port you can dream of.
Every component should have only one function. The function of the display should be only to display image. As soon as something alien is added into the design, like speakers or usb hubs, the whole system becomes full of compromises, it becomes heavier, therefore less stable on the arm mounts; bulkier, meaning taking up more space on the desk; pricier, meaning that for the same price you will get worse performing device. And the worst is when you completely don't need all of those features, but you have no other choice but overpay for them because there is no alternative.
I'm a musician so I always have pro speakers connected to an external interface and even I don't care about thickness. It's not a phone or tablet, it's meant to stay on a desk. Im glad the super thin obsession is no longer the priority even in laptops, finally!! Function over form.
@@MikeeRogers, smartphones are all-in-one because they simply cannot be anything else. Yet smartphones are terrible at pretty much every task imaginable and are a one big compromise.
1. They are too small for comfortable video/movie watching, unless you connect it to an external display - device created SPECIFICALLY for this task.
2. Terrible for music listening unless you get headphones - device created SPECIFICALLY for audio listening.
3. Web browsing is highly restricted by the screen size and most of web devs have to cut a lot of functionality in order to make their websites useable on smartphones.
4. Gaming is rudimentary not so much from the lack of computing power, but because of lack of input methods and their low precision.
5. Any smartphone camera is still worse than a cheap DSLR because of physical constrains and is only good for casual users and even then, decent photos/videos could be made only in good conditions.
6. Any professional tasks are out of consideration.
7. No flagship phone can hold the charge for an entire day if you fully utilize your phone, meaning many users have to constantly worry about running out of battery if the use their phones "too much".
All of this beacuse of the neccesity to have everything crammed into one spot.
With desktops, however, you have no such neccessity.
Apple sells their new display for 1500$. Imagine if you could spend just 1000$ and have 500$ saved, or you could buy pretty much top quality desktop speakers and place them wherever you want with any device you want resulting in marginally better sound than you can get with any built-in speakers in any monitor.
iMacs are solutions for buisness and you need to be really dumb to by one for personal use, unless you are in a situation of extreme lack of space.
@@MikeeRogers, I definitely didn't mention another type of iMac buyers - people who don't care about money spending efficiency and just want something that works well out of the box and looks good. However, from this comment thread I assumed I am talking to people who value function over form and would prefer to get more for the same price. I guess, my mistake, since you in particular never said so about yourself.
But, you know, it really doesn't make a good conversation when "that was a whole lot of nothing" gets thrown at you after you carefully argument your point of view:/ yes, my reply might have been unnecessarily long, but noone is perfect and I am not used to online discussions.
Very interesting to see benchmarks and render times from a system out in the real world... A little surprising that it isn't just crushing the Cinebench scores from the 12900K or 5950X, and while seeing it match a 3080 is impressive, it probably doesn't bode well for the price (i.e. starting at $3999 means you could build/buy a competitive Windows PC for less even with the insane GPU pricing right now).
I'm sure there are "professionals" who both **can** use the compute power this system has on tap and **want** the form factor... But from my perspective as a "content creator," I'm less concerned with the size of the machine compared to upgradability, and lacking the ability to swap out or add RAM, SSD/HDD (or to just have both internally, bc trust me that's nice), GPU, or CPU... yeah, that's too big of a compromise. I've incrementally upgraded the system I originally built two years ago to scale with my needs, and I still have headroom to grow in all the categories listed above if needed. Oh, and I haven't yet spent $4000.
I'm not convinced you can easily build a quality PC for much less than the price of the Ultra, with similar specs. Put together a 12900K, 60GB of DDR5, a Thunderbolt Z690 board, 1TB of SSD, a 3080 Ti (it is still hard to find a 3080 actually selling for less than a 3080 Ti most days), decent case and PSU. Add it all up and you're above $3,500 and well on the way to $3,999. That'll come down a bit when GPUs finally settle. And the PC will beat the Mac on some stuff, but so with the Mac beat it on plenty of other stuff. And yes, the PC is upgradeable, but the Mac is a Mac and that is what some of us want.
They are a lot closer in price than you'd imagine before you add up all the costs.
@@andynormancx so you just proved their point, it's less, and it is more versatile, especially when you don't have to deal with MacOS (which is a pile of shit)
This!
@@aiexzs if you don't see any value in macOS then you're not going to see any value in any Mac, whatever the price. Which is fine, do you own thing.
@@andynormancx Truth is, the mac is great for video editing, but for 3d work and gaming, the PC is just a superior device. Plus, a PC is upgradable. The ugly truth is that in between mac and pc, the pc will benefit consumers more, since there are far more gamers than there are video editors.
Best reviews on TH-cam, I feel like he is the only high quality honest reviewer
Love the review Dave.
Question - Was the Blender test definitely using the GPU? Because even in 3.1, Blender still defaults to the CPU. You have to manually select it in the prefs, and then the properties panel afterwards.
I’d expect those scores to be much higher if they were on the GPU tbh.
Loved the review either way.
Verge got similar tests. Apple just basically lying when it comes to their GPU performance as usual.
@@RenatoRegalado did you see the premiere performance? Did they fake that too? Is Dave lying to us now?
@@RenatoRegalado Other tests show the GPU performing really well. It increasingly looks like it’s a lack of ray tracing capabilities on the M1 that’s the limiting factor unfortunately.
@@charles_wren_films Its the media engines on the m1 ultra making it really good at video editing
Maybe these workloads are optimized for tensor cores and cuda. I remember some of the blender tests were demoed at nvidia RTX launches.
Dude finally a perfect product for me! As a colorist it's impossible to get such a powerful mac without breaking the bank heavily getting a Mac Pro, this is finally the solution I've been waiting for.
My doubt it's still this or a high end PC. I work mostly with 3D rendering in C4D+Redshift and the Blender tests really threw me off, unfortunately. I'll wait for some benchmarks on RS Metal, but I think I'll have to stick with my big box for a little longer :c
For 3d rendering go with PC. Apple suck here hard.
I have a very small suspicion that Dave didn’t enable GPU rendering in the render settings? It’s very easy to miss as the reasonable expectation is that would be the default. If that isn’t the case though, the 3D performance is really poor. Especially to say that so much of apples marketing material seems to involve 3D artists being showcased now.
Wait for RTX 4000 series and then compare with this, in these times a high end pc will cost 2× more then the mac studio.
@@automaticbutt78 hm, that could be the case, yeah. But given that Metal is pretty recent these results aren't too surprising to me since there haven't been too much optimization for it yet. We'll just have to wait and see, I guess.
@@ruhailali7317 You can do it for about 5 grand, and that includes the stupidly expensive 3090...just sayin
Dave doesn't script his contents, he just wakes up, turns on the camera and shares the experience he has had.
For 3d, viewport performance is more important than render times.. you can always render your stuff in render farms, or sure can wait extra few minutes or an hour for a render, but without a smooth viewport performance, its not worth it..
Well but some people just can't look past GB & rendering times (which btw very few actually mention is due to the dedicated encoders+decoders, not necessarily the CPU/GPU)
Wait, so does this thing boost viewport performance significantly compared to non-apple graphic cards?
@@bubblesbubbles3880 no
Yeah the viewport performance sucks currently with Blender on a M1, but it was always shitty on a Mac.
Hope Apple will improve this now they're a top tier supporter
Very disappointing for 3D
Still absolutely loving my high end iMac Pro, workhorse for professional video editing on a daily basis. Already paid for itself countless times.
Okay. Dave2D you got me with the ultimate badge: the black Apple stickers!
Love the chassis size and form factor. I’ll get the Max version w/ 1TB SSD. Think that’s the sweet spot.
2:58 I would think the 2 rubber rings, peel them up and reveal Apples Torx screws or w/e they are using now.
For all enthusiasts ( i.e. brand indifferent ) the best value prop is
the M1 Max base
with x2 32 LG ultra Fine 4k ( 2 are less than 1 Apple display)
+ a sandisk Extreme pro or 2
Most people already have their choice of keyboard and mouse
Nah that thumbnail is heat, too much fire 🔥🔥🔥🔥🔥Dave not playing with us.
Whoa, the Blender benchmark is even more disappointing than expected. I guess 3D or gaming on Mac is still not going to be a thing.
because its not marketed towards gaming?
Ikr. Hope it improves with future updates
@@xxDxxism They said it's faster than RTX 3090 in "industry standard workloads" so I expected more than just a video editing station.
yeah its really weird how after a year and a half, nothing utilizes the gpu properly except video editing apps.
im not expecting it to match the 3090 but at least come close because if you look at how it scales in some benchmarks you can tell there's a lot to be desired.
I just picked up the base Studio and it's probably more powerful than what I needed for my basic Final Cut and Logic Pro use but I think I'm good to go for the next few years. I can't imagine how fast that Ultra would be!!!
Forget the Mac, Let’s talk about package design!!!!!!!!!!!!!!!!!! I need an unboxing of the box.
LOL!
Dave Is the man when it comes to reviews and what positives and negatives are in product that my maybe thinking of buying.
Last bit is very important. Dont fall into the trap of wanting something vs needing something. If not needed there are always other great things to spend your money on. If you don't care about the money you spend I highly doubt you'd be in the YT comments anyway.
I think it's a beast machine but still very much a Mac which has positive and negative sides to it.
@@theoneonly5368 It might feel to you like that but I don't think it's an "Apple-only" argument. I assure you the same thing applies to other brands.
I don't really get the second part of your response, are you saying that unless I get the best and newest of everything I should just get nothing instead? Most of my purchases are born from a need, I suppose that's bad?
Why do people buy a BMW vs Toyota? Because badge and performance and some people choose to spend their money in the way THEY want. Clearly some people like to buy based on how a product "feels/looks/is marketed" vs what they need (basic transportation). I don't think that is a bad conclusion to spending habits.
This being the best computer in the history of computing? That's plain ridiculous. If that's your take then you're just a fish in the pond.
Decide for yourself whether this particular product is meaningful and useful to you, I think that's the general idea behind my comment which was basically an agreement with Dave's opinion.
@@theoneonly5368 ''the best computer in the history of computing'' 🤡
Had wished for more benchmarks. You only showed 4-5 ones. DaVinice, FCPX, Logic/Ableton, XCode, Browser, etc...
@Dave2D, Intel did *try* to make something that powerful in that small of a chassis, the whole Intel NUC line of products. I don't think they'll be able to compete with a company that can control the experience vertically like Apple can.
3:23 dave plugged it into itself to get unlimited power!
6 months later, Cook will remove the A13 chip from the Studio Display and slot in a M1 Max and call it iMac 27 inch. That's why this display is so thick.
I would like to have the 5K imac. I am ok with a little thicker. The current line is too easy to chip, fracture the glass tho I think it looks nice.
Between 4k and 5k, there isn't a very noticable difference in terms of pixture clarity at 27", but the key here is that 5k is exactly 2x 1440p, which means you can simply double the pixels when it comes to displaying UI. With 4k, you're having to rely on 1.5x scaling which is all kinds of pain for a lot of applications. Especially for someone doing graphics design work. 1.5x sits uncomfortably between not being able to design at 1x scale (everything is too small), and not 2x because it will look to big. For example, if you're trying to draw a 1px line, you can't go 1px because that would look too tiny, and you can't go 2px because that would look too big, and you certainly can't go 1.5px because that would just look blurry.
This is why it's very frustrating to see only 4k panels in the PC space. A straight 2x scale would be so much better than all the weird inbetween scales. Like do you go for 1.25 at 32" 4k? And if you do 2x scale on a 27" 4k monitor, you end up with everything looking nice and crisp, but you're left with only 1080p worth of screen realestate, AND everything looks physically bigger than it's supposed to.
So, it's not about the resolution, it's about the scaling factor you can practically use.
Yep, as someone who has written code for my desktop apps to fix up buggy windows scaling issues, and adapt to many display sizes, you are spot on. In windows 1080p x 2 = 2160p, i.e., 4K, which is 2x the scaling. Hence why most people run 4K displays on windows with a 2x scaling to keep everything nice and crisp. Also many icons sets are optimized for 2k, 4k
The mac world, standardized on different defaults, hence why 5K is better for macs. The question is, for a 27inch display, how much crisper it it?. I can see that menus, etc., are more crisp on my windows machine on a 4K, than on a mac mini using the same 4K monitor, however it is not enough to be a deal breaker. I do not have a 5K monitor to see the difference on the mac, so I can only speculate that it will be crisper on the 5K - maybe someone else can comment.
And of course for media content (photos, etc.,) on a desktop, since these are rarely displayed in exact multiples on there pixel size, this is not relevant. On something like books on a iPad which has well published standards, which apple backs with app guidelines, then this is relevant, but I would think not so much for desktop.
I can't believe it's not HDMI 2.1... 😩 tho i can't wait to rip FCP on that sucker
just use one of the thunderbolt ports then for 4k 120
who needs HDMI 2.1 and for what?!
You have TB4, use that if you really need an 6K display or whatever, omg
@@m-stat9 HDMI 2.1 is more useful on new TV’s for gaming at 120hz or outputing 8k
@@m-stat9 for any new 4K TV or Monitor that need the whole 12bit colordepth with 4:4:4 chroma for HDR. Quite important if you want to master Dolby Vision content.
But now you need another dongle again stealing a precious port on a "Pro" device.
You can never have enough I/O on a Pro device and you always need more. Its a rule
Will, until 2021 we had to be fine with USB-C only. I regard the HDMI as nifty addition for quick connection to most projectors, monitors and PCs. If you need more fancy stuff, use the dongle just like before. I am sure M2 will add HDMI 2.1 it just wasn't possible yet (48Gbps bandwidth)
So, this is what I'm getting from this. Correct me if I'm wrong.
1. The intake on this thing is tiny and has little access to fresh air. The fans must have really high static pressure to even get enough air to cool this thing, and even then I'm kind of thinking they just have super beefy heatsinks because they don't want to make the fans loud, which means that this computer could have thermal issues once dust gets in there. This looks like a classic Apple "We're gonna bake this thing instead of having noise" approach to me.
2. That monitor has way, way too much going on for the features advertised. It's either a complete waste of silicon during a global shortage, or it's going to get new features added in later via an update. I really don't think it's worth it now, particularly without higher refresh rates and with most other monitors being extremely color accurate and pixel dense for less money, but it will be interesting to see if they add more to it post-launch.
3. Apple really hates high refresh rates. When they finally caved and upgraded SOME of their devices to finally have notoriously low-quality high-refresh-rate displays, they turned around and released multiple "pro" devices at 60 Hz and now the most powerful computer they've ever made has an outdated HDMI port that won't even let you use the higher refresh rates of other monitors even if Apple decides to release some.
4. The main problem I'm seeing is a (relative) lack of single-core performance in their newest chip. I mean, it's still pretty good, but most pro software won't even see a need for this many cores already and as Dave mentioned the lack of multithreading in many tasks means this won't be a meaningful upgrade for lots of people. The single-core performance is the new bottleneck here. If they want to make a new Mac Pro, they are in a rough spot where adding more cores is honestly irrelevant and they can't increase single-core performance without a major leap forward on the scale that M1 accomplished already. I just fail to see how a new Mac Pro could be competitive unless it launches with an M2 long before anything else does, or maybe they have another SOC with fewer, higher performance cores that they will connect together into a massive high-core-count CPU using their version of the Infinity Fabric. It just seems like the hope for a new Mac Pro is pretty much dead for another year or two at least.
I would have loved to hear about how the thermals were doing on the Studio. My mac Mini 2018 i7 gets warm just watching youtube videos at 1080p. This was a big concern for this product!
Edit: Oops, he talks about it at 5:00 !!
It's probably not great. Apple has a poor track record when it comes to cooling, and the intake is too small and too removed from the fans to be very effective. Blower fans like this uses also aren't great in general.
@@Bitshift1125 M1 Macbooks Pros are doing quite well; i7 chips never were cool :DD
@@Bitshift1125 You obviously haven't kept up with the track record. First of all, all of their ARM based M series chips, as with their A series chips, are very low power, stay very cool, and keep their performance without throttling in most situations, even over long periods. Yes, back when Apple used Intel products while trying to keep the laptops efficient and small they skimped as much as they could without causing damage to the mac or cpu.
Simple answer is, cooling with their new chips is insanely good, mostly because the chips themselves are insanely good.
@@af4396 Except people noticed less than two months after the M1's release that you can improve their performance by around 20-30% by just replacing their thermal solution. Apple just does not cool processors adequately because they want everything to be as thin and light as possible.
@@Bitshift1125 Now we're not talking about something overheating, but improving the performance on an already impressive chip, which im sure you can do on many laptops and devices by replacing thermal paste as well, except most x86 chips wouldn't have a similar response curve because they're nowhere near as efficient.
This is not the same subject that the concerned person was asking about. The M series run very cool, and very fast. Period.
I would love to get this for my music studio but the problem is that I also need a new computer for live gigs with my MIDI keyboard. So that's why Macbook Pro is what I have to get even though it's more expensive. It's just so handy to be able to put it in a bagpack and go wherever to work/play music.
For the Blender test, were you using the Apple Silicon version or the regular (Intel) version?
I ask because Intel is the default and you have to look for the Apple Silicon version.
That's the thing, it SHOULD give you the download for ARM64 Architecture, rather than x86_64 one, IF that is the issue.
@@white-bunny You would think...
@@beoxsgaming9388 It gives me ARM64 version on my Linux System from official blender download site, so it doesn't seem to be that, anyway. I think that perf is ARM64 one.
And you need to set the bucket/tile sizes larger for Apple Silicon
@@white-bunny Very interesting.
I have an Intel machine myself, but looking at my wife's m1 MacBook Pro it still gives her the Intel version as the default. Again, you would think.... 🤷♂
On my Ubuntu (20.04LTS) machine it gives me the Linux version as the default. Looking at the "Daily Builds" list for Linux there are only x64 builds, not a single ARM64.
I love the style and presentation of your videos. And the impartiality you keep. Ape has built something special here.
Very cool. Apple is on top of their game right now. I use PC, but only really because of gaming and familiarity (also like to be able to have hacky solutions that only exist on windows tho). If apple would get support for gaming for real, id probably switch.
Apples biggest weaknesses continues to be software they put the m1 is and iPad give you nothing to run on it
@@52thephotoshop absolute rubbish some far better software on MacOS than windows. Gaming is a nonstarter though. I personally use Mac for work and have a pc for gaming. Some fantastic third party devs for Mac though.
The monitor is thick because the integrated power supply, from the fact the power supply inside can support the display and 96W USB PD makes me guess there is a 150W power supply inside.
Unless you go to planar transformer / inductor on both the PFC as well as the flyback or LLC section of the power supply, it is really hard to make the power supply thinner than 20mm. Put this power supply behind an LED panel, you got a fairly thick display. The 24 inch Mac is thin purely due to fact apple choose to use a separate adaptor (or brick).
The Mac Studio is the first Apple device in years, decade+, that "had" me stoked about it. Upon announcement I was like Apple lets goooo!!! I'm a 3D motion designer and after seeing the Blender test I'm back to unimpressed. Seems Apple build their systems for the video editors and 2d motion graphic artist. Looks like I'll be sticking with my 17in razer i9 with the 3080ti for the foreseeable future. Such a let down.
it is not good for cycles bc of lack of optimization (the m1 release for blender is still in alpha/beta)
Blender devs could consider updating their app to work with these chips. We’re at the beginning of Apple’s switch away from Intel, so I could see a blender update in the coming twelve months to take advantage of all those graphics cores in there.
Thats the thing, they make such a specialised product that only works well with certain tasks. Its not a windows machine that can handle everything quite equally well.
@@gigago2529 Blender 3.1 is already stable and has Metal support for Cycles.
Yeah Blender still needs to release their polished version for the M1 chips.
Min 4:33 makes it clear for audio production since it uses single threads to process sound.
Nice and early as always Dave! I’m a PC guy but really interested in M1 Ultra to see if there claims were true or not. Well it’s not as powerful as 3090 as apple claims it to be but it’s pretty sweet for the 1% editors and developers.. Thanks for the video btw. Cheers!
Yeah, it's not for gaming brah. This much power in a mac is useless for general consumers when u can't game on it! High end Professionals are super niche, for those this is worth. At this price I would anyday pick up 3090 all day everyday (if I can get it that is)
@@ramlalmaheshwari7505 Tell me you're a kid without telling me you're a kid
Better than 3090 in 2d rendering but if you need it for 3d rendering or gaming, it falls short quickly.
@LostChances Arjun tyagi is right.
You Tell me, r u a sheep or a keyboard warrior? Only 2 possibilities here
@Siddhart Jatav lol, leave it. He himself is a kid typing from his mom's iPhone or something hating cause his parents didn't buy him a decent gaming PC.
The power cable for the studio is removable. Wrap the cable around your hand a few times and pull. You need someone extra to hold the display while your pulling it. It comes out really hard so stand clear.
The reason for just an HDMI 2.0 is that it's meant to be connected to a TV. Most TVs still only have HDMI 2.0. Thunderbolt four supports the DisplayPort 1.4 standard so they are definitely the preferred connection for advanced displays such as the Apple studio display.
Professional Grading Monitors often use HDMI, and you just cant use 4K 60Hz 10bit with full Chroma via HDMI 2.0.
And you definitely cant use it for 4K 60Hz 12bit with full Chroma.
Booth are necessary to master HDR content.
Still no reason to put the newest version of it on there as it's backwards compatible and would future proof it more.
FYI, blender metal support is super early, and they haven't sorted out cpu+gpu rendering well yet. Was your test *just gpu* ?
Loved the review. Well explained some of my questions. The display is way over prices but my next mac might be this. Thanks.
The thumbnails keeps getting better and better
Amazing content as always!
❤️👏🏾 Honest, straightforward, zero fluff to extend vid time and as always, answers our questions before we ask.
Every time I watch a Dave2D video the more I wanna be friends with Dave. Like he seems like such a genuine person and we would have so much to talk about with tech!
Dave your new studio is so gorgeous it makes me want to cry happy tears.
so the biggest downside is that if one of the rams fail, there's no way to replace them. or any of the components for that matter. seems like a huge deal-breaker.
you have to buy a new one
didnt they tell you ?
once it broke you must buy a new one because it is apple ? LMAO
the ram would not fail lmao. it's like ram on your phone. if it fails, the whole motherboard fails.
im not a fan but apple care or even without apple care they would swap it and replace it without any issues also i want to say this to everyone that buys apple products, don''t buy them in a third world country or where the customer service is non-existence. it's not worth it
@@omniyambot9876 and that's exactlly his Point. He can build a system way less than 4k which will perform better and also have an upgrade path
It will not happen to a Machine like this, if so then it would of happen to previous M1 Enabled Macs
That thumbnail though!! 🔥🔥🔥
Spaceship vibes
My guess about the Studio Display thickness is that Apple made it thick so they can still include the speakers and other extras without giving the display a chin. I remember one tech reviewer (forgot who), mentioned this when the M1 iMac was announced - he didn't like the chin and wished the iMac was thicker...
The chin housed the fans, the board and part of the speakers. Also, the iMac has an external power supply, while this monitor has it inside and it probably is a beefier one to charge a macbook and run the bigger display.
I am alone in wishing it was prettier? Was Blender just using the GPU ? Or both GPU+CPU ? From the tests i've seen online GPU+CPU can give better perfomance.
Great review as always. Did you test out the camera on the monitor at all? I’ve read mixed to disappointing reviews on the camera and was hoping to hear your take in this video
o man o man you make literally some of youtube's best thumbnails
I am still wondering why apple is calling this “Modular”
Because you can BYODKM
They are not. They describe a “modular system” meaning separate tower and display rather than an all in one.
Fast, simple, clean and no BS... I love it, and the Mac is great too
What I admire the most from Apple is how their fans with "old" intel systems that underperformed poorly still didn't want to see the alternatives being more capable, with more I/O, affordable storage and everything... now M1 laptops and desktops are VERY good value. I not only admit it, I admire it. But man, people where buying intel macbook airs and "pros"(without even i/o for a professional use, and don't even talk about professional keyboard reliability...) and lie to themselves.
This is wat I like about Dave's channel. He didn't say old ones where awesome and good value/performance. Now he is, and gets excited and so do I.
But don't forget, people, it is not a religion. Don't justify it when they do it wrong!
I don't think the I/O's improved that much. I recently hooked up a 144Hz monitor to my MacBook, and they use DisplayPort. Wouldn't that have been more practical for professionals than HDMI?
@@abubakrakram6208 All modern monitors, especially high refresh rate ones, should support Thunderbolt and HDMI 2.1. Apple unfortunately doesn't support HDMI 2.1 yet, but Thunderbolt should be a requirement, especially considering how it's near standard on most high-end PC laptops.
@@RubmaLione I wish more monitors had Thunderbolt. I miss it from my LG 5K display. Now I have to run 3 cables (power, display, and peripheral hub) where I used to run one.
*Yo is there a big difference between the 32 core gpu and 24 on the Mac Studio? cause the 24core outperformed the 32 on the MacBooks*
Best reviewer along with mrwhosetheboss and MKBHD man, thanks for the awesome videos! I do feel like the “ultra” name isn’t like what it suggests since I saw that it couldn’t give out performance compared to the 3080 and 3090. Still amazing though.
Today I filmed on my bike using 4 gopro camera's running maxed out 5.6K and extra audio from a zoom recorder. I loaded it all on my M1 Mac mini which is the lowest model and into Davinci Resolve, shockingly (No proxies, 4K timeline) it is handling all that with no stuttering at all. I Ordered a Mac Studio... Can't wait for it to arrive!
The ULTRA max's out at around $8,500.
TR 3975WX 32c/64t costs $2800 with 2 tb ram support .( Plus ram+Mobo cost)
but probably even the richest professionals will choose external SSDs
Definitely less than ppl were expecting. Very great price
@@Racko. ?????????? That's like $4000 for a bit more storage 😂😂 what a rip off
@@propersod2390 Alright buddy we get it, you're everywhere just spreading hate, its starting to get lame
So I purchased a 2020 27inch 5k iMac right as they were launching M1 because it didn't seem like the iMac M1 was on the horizon for the near future. I do most of my work in Final Cut, Premiere, and heavy in After Effects. The M1 24in iMac would not have been enough for what I do. I installed 128gb of Ram myself after only purchasing it w/ 8gb. I maxed out the graphics card when I purchased though. My computer struggles a little w/ 4k 120fps footage and when I do heavy rotoscoping work in After Effects, it takes me a while. That being said, when I purchase a new computer....I'm thinking about the money I'm spending today to at least future proof me for the next 4-5 years. I think I'm going to hold off for another 12 months before upgrading because right now, I feel the FOMO of wanting the heavy M1 Ultra. I'd like to see more vex editors review it and make a more informed decision as to which one I actually need realistically. Thanks for the thorough review!
If they could just provide egpu support in that machine that would be really helpful for creative work.
Nah they'll release Mac pro for GPU support. You can't just buy something from apple and make it better. That hurts sales of their other products
Man, I just love the fact that we get to see Dave's videos for free
Who even needs to buy something like this? Superb review as always Dave
Content creators
those who bought $10k mac pros
@@bred4808 Not all of them, considering the Blender workload.
favourite tech tuber , so straightforward and precise !
What I dont like about these pro apple devices that even with that expensive price it can't do everything like gaming,. Blender and game development whereas if a PC might underperform compared to M1 Ultra in some cases but it cam do everything. And also 0 upgradebality
Regarding power consumption and heat. Its not as surprising once you know that ARM is completely different to traditional CPU's. Its a low power architecture that has a way lower number of instructions that the cpu is capable of doing.
this is exactly where i thought apple would go, and its the direction that pc manufacturers need to move in. I did think they would end their mac lineup here though, i wasn't expecting a new mac pro. With this, apple will change how computing is done.
That's what they've been telling us for the last 40 years. Nothing much has changed, I'd never buy Apple, I don't trust them and I don't want them.
looks like a fanboy talking nonsense
@@bobdragon7887 you must be young.
Good GOD what a break from Renee Ritchie review with his Apple loving narrative. So glad Dave also does reviews and does as one of the top best. Fully covered material, handled properly, angles all covered for consumers.
@9:03 THATS THE KEY REASON I love this channel. DAVE still stays down to earth even being around top products. He knows how avg person may consider purchasing even though it’s not in their field. I got my advice there. Saved me tons of money
So this is mostly a video power desktop. To be a true workstation, it needs one very important thing: being able to upgrade. It is simply not possible with this one. So choose wisely your configuration. This could also be a great computer for any task that needs a lot of cores. But for ordinary users, stick to M1-based machines. I know the urge we all have for the fastest - newest, but it is just a very, very expensive urge in the period of potential crisis.
Apple's target consumer for this product does not care about upgradability. Professionals, businesses, and institutions will simply buy the new product and sell these.
What’s the difference between a $4000 quadro GPU upgrade in 5 years vs. a $4000 new mac upgrade in 5 years?
@@thenoobperspective7588 You can add more storage, more RAM, second GPU, other cards, upgrade CPU,...replace part that is not working anymore...
Professionals do upgrade, my 20+ years in the business counts something I think
@@thenoobperspective7588 The 3D performance I guess.
Great review! How was overwatch on it? It looks like you weren’t hitting 3 digit FPS, but is that because of the 5k resolution? Is this a possible contender in the SFF market using a thunderbolt to hdmi 2.1 cable?
He wasn't hitting 3 digit fps because it's a mac 😂 if u want to play games then get a pc. My rtx 2060 is faster in overwatch than this $4000 mac
@@propersod2390 Guess what, there are people who do creative prof work and just game occassionally, then it would be stupid to pay another 2-3 grand just for a 2nd PC
would also like to know how you got overwatch on it 😱
I think this Apple Studio is going to be something that will hold value for years, even the M1 Max version. I think 10 years from now, this will still be expensive on the 2nd hand market, but people are going to be buying them.
That will not be the case since now we are charging ahead and in 10 years there will be an M8 or M10 and it will make this look old and slow.
I wonder if the Studio Display's cable is like the HomePod's cable. It comes out, but you have to give it a good bit of force?
that display isn't worth it, it exists to keep the "apple is luxury brand" vibe
I dig your no BS format.
Apple display quality is always superior. You can get a lot of monitors side by side and Apple almost always gets the best picture.
can't really agree when they almost always source their monitors from LG. It isn't like LG isn't using this tech anywhere else.
That Mac studio looks soo good with the matching background
*it's like, MacStudio was made for D2D*