They didn't even bother to include the 4K result for the 3090, which is the only one that would be close to GPU-bound. It really confuses me how they can mess up their benchmark analysis so badly. They really need to hire someone explicitly to review their benchmarking methodology, because it is severely lacking.
Finally, a REAL review of these new Apple machines. Every other small YouTuber is too worried that Apple won't send them review units for their future videos. THANK YOU, The Verge, for being straight with us.
The machine has Thunderbolt ports on the back, plus USB-C on the front on the M1 Max version and two additional Thunderbolt ports there on the M1 Ultra version. They use the USB-C connector, but they're not just USB-C ports.
Fingers crossed that Apple can resolve the web cam issue but otherwise I have every intention of getting the Studio with the Max chip and the Studio Display. And yes, the term "edge-to-edge" doesn't feel quite right.
@12:03 If that is an HMI light on the left side, you never want an HMI pointing up like that. Pointing them up is bad for the bulbs because of how those lights work and how the bulbs cool down and dissipate heat. The max tilt angle while operating the light should be about 45 degrees up or down. Those bulbs can explode and be very dangerous.
Excellent video, guys - please could we have separate videos from each professional, so we can see exactly what they did with this machine? Personally, I'm interested in its audio capabilities - I use Logic Pro along with Sibelius, a score-writing program. Thanks for the presentation.
*SoCs. The M1 series of processors is a combined CPU/GPU/Neural chip. It is not a GPU, or two GPUs linked together like AMD/NVIDIA's past dual-GPU cards, as was referenced @1:30 into the video. APUs are general-purpose processors that feature integrated graphics, but that term has normally been for AMD chips, so let's just call Apple Silicon chips SoCs.
Hi, @The Verge. Can you please explain why, in the Shadow of the Tomb Raider benchmark (14:32), all of the machines show better FPS at the 4K setting while the 1080p setting performs at a much lower FPS? Is this a mistake, or the actual FPS for these resolutions?
Yeah, weird that the 3090's 4K performance isn't shown; I'd assume they got the labels wrong. Actually, it's pretty impressive that the Ultra is doing about double the 4K FPS of the Max, and better than the Mac Pro at 1440p and 4K.
No Cinebench test, no Blender render test, no 2-3 days of non-stop 3D rendering to check stability, no CAD test to see the viewport limit for triangles the GPU can draw, no Resolve test, no Nuke test, no Fusion test. It's not the new Mac Pro, but those of us who need these machines to "print money" are still interested in pure performance per dollar and stability. 99% of people won't get this for Ps or InDesign. Maybe it's finally a decent alternative to a Mac Pro for Ae.
Best review ever. This is exactly the kind of real-world, detailed info that I need in order to decide what to buy (I'm a video editor still fairly happy with a 2017 5K iMac, but for some projects, I really could use more power).
Any thoughts on the nano-texture glass and how much it affects clarity / colours? Also, any experience with the stand? A lot of money, but I wonder if the adjustability is worth it.
Big fan of the review style where the team finds practical use cases and then explains how it works in each of those scenarios in a thoughtful, articulate way.
I'm definitely one of the nerds who cares about having a monitor that can display macOS at pixel-perfect resolution, but I'm not going to bite on the Studio Display. It's essentially a $700 premium over the UltraFine 5K once you include the height-adjustable stand (which is a necessity!). We're too deep into the HDR era to be buying new displays at such premium prices without it. I'm also spoiled by my M1 16-inch Pro's display, and the lack of ProMotion and HDR isn't going to cut it for $2,000.
Guess you'll be waiting a while until we get Thunderbolt 5, lol. Thunderbolt 4 can only support up to 4K 120Hz. HDR is definitely coming with the new Pro Display, though.
@@chidorirasenganz That's pretty much all I want, along with everyone else. I think apple could do 5k 120hz via DSC though. Regardless, I'm ready to move on from LED. Mini-LED, OLED, QD-OLED, are all things everyone wants now.
@@0741921 There are, but none that Apple would entertain, like using two Thunderbolt ports or HDMI 2.1. It's in the works, yes, but even once it's finalized, it'll be a little while before it's implemented. I think we're at least a year or two out. We only got Thunderbolt 4 in 2020, and Thunderbolt 3 was in use even longer.
Love the dose of Backlon at the end. I ordered mine on the phone while the keynote was still going and I'm glad I won't regret my decision! Thank you for the quick and thorough review with real pros. Also love the new cinematography look; real environments, natural lighting, wider and closer lensing. Go Becca!
I think it's great that they use Adobe for testing, but they need to test Final Cut and DaVinci too, as Adobe has been sluggish in fully optimizing its software for these new chips and seems to have slowly shifted its focus from macOS toward Windows.
@@itsCcallahan I know that pros use the full suite. But there are plenty of pros who use Final Cut and DaVinci, especially when it comes to video editing. Myself included; the video community has seen many professionals moving off Premiere, which, powerful as it is, isn't reliable, with many switching to DaVinci and Final Cut.
If it works well with Adobe apps, it'll most likely work even better with Final Cut and DaVinci, as has been the case with the recent M1 Max MacBook Pros.
Kinda nice to see Apple putting more pressure on Intel and AMD in terms of chip design and innovation. While not a new technique, their fusing of the two chips together without much latency is really impressive.
I think for their desktop line of Macs they should just make their own dedicated GPU instead of an APU-style design, given how strong the CPU side of things is.
There is something wrong with the GPU tests you are doing. (Not saying you are doing something wrong here.) There has to be a bottleneck somewhere, and it's unlikely to be on the chip itself, given the architecture. Most likely the software simply isn't taking advantage of all the GPU cores available, or there are single-core dependencies capping the results. I suspect these results will change as the architecture is better understood.
I think what you're not taking into account is the encoders/decoders included on the chip, which greatly speed things up for certain workflows - a smart way to build the chip for its primary audience. All chip architects do this for a target audience, just as there are specialized chips (co-processors) most people have never heard of that run circles around generic hardware for certain types of applications. The problem is that these are hardware-dependent; with new codecs in the future, the speed advantages may evaporate. As an example, take the modified chips in higher-end cameras, which would be useless in any kind of general computing product.
This review is a lot better than the MacBook Pro review. This is probably the first time they considered that there are more professional workflows than the ones common at The Verge. I remember last time they said the M1 Max GPU was only weaker for gaming (and never considered professional workflows that depend on graphics performance).
"The general consensus was that the M1 Ultra was a bit faster... but not 3000 dollars faster" THANK YOU. I feel like I've watched a dozen videos about the Studio, and none of them have really answered this basic question.
Excellent "review" - not the run-of-the-mill, near-blind awe that the vast majority of "reviewers" spew out. I would have preferred less jargon and an explanation of why it's not way ahead of the M1 MacBook Pros, but still high marks to The Verge for this review.
Why is The Verge testing the limits of what the M1 Ultra can do with a 4K video at 4:2:2 10-bit that even an iPhone can handle? If not 8K, at least test 4K at 4:4:4 12-bit. Meanwhile, Linus Tech Tips is out there running Blackmagic camera footage at 12K, 120fps, 16-bit… I thought the people at The Verge were familiar with MKBHD?
Are the Shadow of the Tomb Raider graphs correct? Wondering why the FPS is much lower at 1080p than at 1440p, and why 4K has the highest FPS… 🤔 Shouldn't they be the other way around?
Great review -- just one caveat: when you use Python programs, you should specify the use cases. Most regular Python programs are single-threaded processes, so it's no wonder you got the same performance on the M1 Max and the Ultra.
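As a tiny illustration of that caveat (a hypothetical script for this thread, not The Verge's actual benchmark): ordinary Python code stays on a single thread unless you explicitly spawn more, so extra cores sit idle no matter how many the chip has.

```python
# A plain CPU-bound Python function; nothing here creates extra threads,
# so however many cores the machine has, only one does the work.
import threading

def count_multiples_of_three(n):
    # pure-Python loop: runs entirely on the calling thread
    return sum(1 for i in range(n) if i % 3 == 0)

count = count_multiples_of_three(10_000)
print(threading.active_count())  # typically 1: just the main thread
```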
I love this video review. So well done. I do have one question, though. At the 14:30 mark there are graphs showing frame rates for Shadow of the Tomb Raider. They show the highest FPS at 4K and the lowest at 1080p. I think the legend is backwards. Wouldn't you be getting the highest FPS at 1080p, and progressively worse as you go to 1440p and 4K?
Great review! Really honest, and I like how you review different professional workflows. I will definitely get the Studio, but I'm not sure about the Ultra.
Best unbiased review I've seen; I subscribed because of it. I am extremely disappointed in the GPU performance you reported. I don't play games, but I do 3D rendering, and I hardly ever see a real-world 3D rendering test. You also have me wondering whether the Ultra is worth buying if the CPU performance gain over the Max is negligible. Thanks again for a great review.
I would like to see more PRO use cases... some simulations (tension "maps", fluid, heat transfer, etc.). Everyone says "it's so powerful!" So why are most advanced things done on Windows, then? I feel like Apple's products are only good for video and picture editing (albeit with very low power consumption).
Their target audience for these pro machines is mostly creatives; that's clear from the trailers too. Their CUDA alternative, Apple Metal, is so hard to code for. It's a mess, tbh. So most devs are better off with Linux or Windows machines at the moment. Hope Apple sorts this out soon.
Great review, but you're missing a repairability and upgradability section. More attention needs to be paid to this: we can't keep making products (desktops and notebooks, at the very least) that can't expand with the increasing amounts of data we produce, or that don't offer an easy way to repair them if they break within a 10-year lifespan. Maybe energy usage would be good too, though I know Apple is winning there at least.
Can we, the people, get a link to the music playlists??? Oh yeah, the review was amazing, probably the best I've seen on YouTube so far. Thank you and good day.
I still use a 27-inch iMac from late 2015 for most of my heavy tasks, and I'm definitely the target audience for the new desktop, as I edit a lot in Final Cut Pro X & Logic. I'm not thrilled that Apple wants $1,500 for essentially the same screen I've been using for nearly the past 7 years. Every year I hold out for something better, but I'd really like to upgrade my workflow. Decisions, decisions...
Thank you. You are the first one to answer my question. I'm going with the M1 Max. Thank you so much :) It would have been helpful, though, if you had mentioned the configuration you used for the M1 Max.
“What else can you glue two of together and double its performance?”
all kinds of protection. okay sorry that was insensitive.
@The Verge: Apple doesn't use "Micro LED" (9:07) on the MacBook Pro and iPad Pro. It uses Mini LED - a small but important distinction. "Words have meanings." - Nilay Patel (March 17, 2022)
The Wii was two GameCubes duct-taped together. Does that count?
chocolate bars?
Depression and anxiety.
“Words have meanings.”
Dieter’s love of semantics has left its mark on The Verge!
Came here to say this!
Dieter and I realized we'd be friends within like 30 seconds of meeting each other because of things like this ;)
@@reckless1280 just wanna say that ur interview with the Wordpress CEO was amazing. You have a knack for getting people to just feel chill.
@@reckless1280 truly the greatest love story of our generation
Apple: Edge of the inner edge is an edge, think differently.
I usually don't comment on videos, but I loved this type of review: really going for the target audience of the product, and well scripted. Congrats to everyone involved, especially Monica, who amazes me as a reviewer in every single piece. If the algorithm is telling you to do shorter videos instead of this, please ignore it!
Have to agree, and I'm not even a Mac user. Yet.... I think the multiple viewpoints added a lot, and the benchmarks weren't dwelled upon.
Great work Verge team!
💯
Could be worded better. They kept saying “fusing two GPUs together” rather than the whole SoC.
Yep, this was an A+ review. Thorough and honest.
Definitely one of the best reviews ever I would say. Great introduction to many of the key people at The Verge was a bonus. Team effort for sure.
I think the most impressive thing in this video is Alex drawing with a mouse (vs. a pen/tablet). 1) who does that? 2) how is he doing it so well?
Some people think pen tablets are hype and learned on mice instead. I think Wacom tablets are too expensive for what they do.
He has mastered the mouse; I wonder if he ever plays FPS games. He could probably frag out.
@@Harvester88 probably not on a Mac, unfortunately, but maybe that’s all about to change.
I think it’s just a matter of practice in fine-tuning the mouse. If he’s been using Macs since OS 9, it’s likely he uses a low DPI setting on the mouse. Back in the day, CRT iMacs came stock with really, really low DPI settings out of the box. In my old digital photo-imaging classes, these were always set really low, and it helps a lot.
You get used to that.
@@Harvester88 I do play a lot of FPS games, lmao. Can't believe all that time was actually productive in the end. Incredible.
This is why The Verge is the best. I like the multiple perspectives. Very comprehensive review. Nicely done.
The Python test really depends on what you do in Python, since the majority of tasks are single-threaded and won't see a benefit from extra cores.
There's no way that display should sell for $1,599. If the 27" iMac were still around, it'd be about time for it to get HDR and a high refresh rate at the same starting price.
Nah, the lack of high refresh is probably not Apple's fault, though. 5K is literally almost TWICE the resolution of 4K, not just a bit more. And since Apple wants things to be simple and uses Thunderbolt for this, they currently can't do it, because TB4 simply doesn't have the bandwidth for 5K at 120Hz.
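To put rough numbers on that Thunderbolt bandwidth point (a back-of-envelope sketch that ignores blanking intervals and protocol overhead, so real requirements are somewhat higher): uncompressed 10-bit RGB at 5K 120Hz exceeds Thunderbolt 4's 40 Gbps link rate, while 4K 120Hz fits, and roughly 3:1 DSC compression would bring 5K 120Hz under the cap.

```python
# Back-of-envelope display bandwidth, ignoring blanking and protocol overhead.
def bandwidth_gbps(width, height, hz, bits_per_pixel=30):
    """Raw pixel data rate in Gbps, assuming 10-bit RGB (30 bits/pixel)."""
    return width * height * hz * bits_per_pixel / 1e9

TB4_LINK_GBPS = 40  # total Thunderbolt 4 link rate

fivek_120 = bandwidth_gbps(5120, 2880, 120)  # ~53.1 Gbps: over the link rate
fourk_120 = bandwidth_gbps(3840, 2160, 120)  # ~29.9 Gbps: fits
fivek_60  = bandwidth_gbps(5120, 2880, 60)   # ~26.5 Gbps: what the Studio Display runs

print(fivek_120 > TB4_LINK_GBPS)       # True: uncompressed 5K 120Hz doesn't fit
print(fivek_120 / 3 < TB4_LINK_GBPS)   # True: ~3:1 DSC would bring it under
```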
They’re selling a product that you can’t buy anywhere else, so there’s that. What company sells a unique product for cheap? That’s basic marketing, people.
Well, it’s meant to compete with the LG UltraFine 5K that retails for $1299. You pay a premium for the Thunderbolt connectivity. Also, it has a great webcam built-in. This is also not meant for the average consumer. Someone who has a lot of money and wants the Apple-only aesthetic will love this monitor.
It also has great built-in speakers.
Idk if I'm off base, but the new Alienware 34" QD-OLED is only $1,300. Why would anyone choose Apple?! QD-OLED is a new technology, three generations beyond the new Apple display (LCD > dimming zones > OLED > QD-OLED). I'm perplexed.
I was very impressed with the thoroughness of this review. 10/10 for delivering an exceptional review covering all the bases and also considering so many workflows and people's opinions. And wow. That camera situation? The way it was presented with all those other Apple cameras was impressive. Not to mention the seamless audio comparison. Well done team.
You could say they covered all their bases, from edge to edge.
@@Merabbit 🤣🤣🤣
Thoroughness?
You may be right with everything else except the GPU part.
1. It has been shown that Geekbench Compute completes before Apple silicon has fully ramped up its clocks.
2. Gaming is neither advertised nor considered by Apple when comparing it with a 3090, because they know very few AAA games, if any, are optimised for Apple silicon, and all of them, including Tomb Raider, run through Rosetta.
There was absolutely no real-world professional use-case comparison with a 3090.
The reason you got almost identical results on the scientific test is that Python, unless explicitly coded otherwise, will only use one thread. So essentially what you tested is the new chip's single-core performance in a real-world application.
It's good to know the single-core performance is better on the new chip (I suspect cooling has a lot to do with that result), but ideally you could complement it with a multicore workload in future reviews. You can do that using Numba if you choose to run Python, or some other language more often used for multicore workloads (such as Java or C++)
:)
PS: I'd be happy to share a basic Numba script you could try out :D
Exactly; without saturating the chips with threads, there's no point.
The Verge isn't really interested in doing their reviews properly or finding out WHY there's a problem in cases like this.
Hi! We ran the npbench Python test, which supports Numba. (Indeed, we all had to install Numba on various machines.) We'll keep poking at it, there's always a chance we did something wrong.
@@reckless1280 A lot of the people in the "really want a 5K display for its @2x resolution" camp are iOS/macOS engineers. Would love to see input from an indie dev, and Swift compile times for an Xcode project in addition to Python, in future reviews.
@@reckless1280 Using Numba doesn't automatically mean multithreading. I've never seen npbench before, but a very quick look at the source code suggests they're not doing parallelization at all, since there's no @jit and no manual thread spawning from what I can tell.
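The single-core vs. multicore distinction this thread keeps circling can be sketched in a few lines. Numba's @jit with prange is one route; this hypothetical example instead uses only the standard library's multiprocessing module, since it needs no third-party install: a plain Python loop runs on one core, while splitting the same range across worker processes can use several.

```python
# Hypothetical multicore sketch using only the standard library.
# serial() runs on one core; parallel() splits the same work across worker
# processes, which is the kind of load that would actually separate an
# M1 Ultra from an M1 Max in a Python benchmark.
import multiprocessing as mp

def partial_sum(bounds):
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def serial(n):
    return sum(i * i for i in range(n))

def parallel(n, workers=4):
    step = n // workers
    chunks = [(w * step, n if w == workers - 1 else (w + 1) * step)
              for w in range(workers)]
    with mp.Pool(workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    # Same answer either way; only the core usage differs.
    print(serial(10_000) == parallel(10_000))  # True
```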
Probably the best review The Verge has ever done. Very in-depth analysis of every relevant part of the products. Congrats to the team!
True that.
Without a doubt. I wasn't expecting this level of detail.
This is so much more professional and informative than most other reviews it’s ridiculous.
But they don't understand the benchmarks. It's a problem.
@@danispringer YouTube isn't letting me reply with a link (my comments with a simple Reddit comment link keep getting autodeleted), so I'm going to quote the Redditor MrTimscampi on this topic, and then my own reply to him.
MrTimscampi: "Indeed, but The Verge's entire methodology is flawed here.
They mention "real world" testing using Puget's benchmarking tools. Puget's own website mentions: "Adobe Photoshop version 20.x, 21.x, 22.x, 23.x. M1 Native version is not yet supported" for the Photoshop one.
The Puget bench for Premiere is still fixing bugs on M1 as recently as the latest version, and the article doesn't even specify which version they used for the test. Since they're running other stuff in Rosetta 2 and not mentioning it, are they even running a native version of Premiere here?
They don't mention fairly important information, like which Python and NumPy versions their NPBench is running on
How is Geekbench configured? Which backend are they using for the compute tests? Why does the Boxx Apexx 4 not have CineBench scores?
For the Tomb Raider part, they mention that "it does emphasize that if you’re running a computing load that relies primarily on a heavy-duty GPU, the Mac Studio is probably not the best choice". There is no mention of it running through Rosetta 2 or the DirectX to Metal translation layer the port uses, which would lead to worse performance overall.
I get that there's no "perfect" benchmarking tool for this, but the entire way the review presents the benchmarks is misleading and lacks vital information to make sense of the numbers.
And yes, it's partly Apple's fault for providing graphics that are misleading by omission. But you'd expect a tech-focused publication like The Verge to actually do their due diligence and properly run (and document) their benchmarks. You'd also expect all the other tech-focused publications to point out the flaws in methodology that are everywhere in that piece.
I don't know, maybe watching too much Gamers Nexus influenced what I'm looking for in benchmarks, but this all feels really amateurish for somebody whose job revolves around benchmarking systems."
My reply: "To add to your list, they didn't include the 4K Shadow of the Tomb Raider results for the RTX 3090 system, despite the fact that the 4K results would be the most GPU-bound and therefore the best way to compare the 3090 with the M1 Ultra's GPU performance with that particular benchmark. In addition, their scientific workload didn't emphasize multithreaded performance at all, and they seemed completely oblivious to the fact that all they had tested there was single-threaded performance."
Nah, it isn't; they called it a giant GPU with double the CPU cores, and things like that…
I feel so happy seeing a bunch of The Verge team making an appearance in this video
yo! Alex from The Verge team here. I had a blast filming with Becca and the team; they're incredible and always knock it out of the park! Using the Mac Studio was also a dream.
Feel free to hit me with any questions, I'll do my best to share what I can. Love ya! - Alex / The Verge
Is there anything you wish was different about the Mac Studio or the Studio display?
Follow-up thought: I wish Apple would offer a Mac mini option with an M1 Pro. Right now there's quite a gap between those two lines. (Mac mini / Mac Studio)
Can you give us some feedback on the noise the studio generates under load? Can you tell when the fans ramp up? Also - do you think the intake being on the underside will cause issues in the long run as it inevitably gets filled with dust? Especially since there is no way to open the Studio and clean it properly.
@@reopru We weren't in the quietest environment when I was using the Studio, so take this with a grain of salt. I'd say it's fairly quiet; it definitely wasn't noticeable with everything else going on, but I didn't spend time with it in a personal office / bedroom situation. I'm not too sure about the vents; it's something to consider, of course, but I'm not usually worried about these things with computers (not the best habit, I know)
@@AlNemec I'm surprised that Apple hasn't made these displays with a higher refresh rate. It seems pretty ridiculous to have such an expensive display that's only 60Hz. It was a really clear display, but personally I'd pick any number of other displays before buying from Apple.
@@alexcas cheers, thank you for sharing your impressions!
Honestly one of the best Verge reviews I've seen in a long time. Balanced, thorough, fun and - most importantly - useful. I'm going to be saving my money for a while I think!
Still absolutely loving my iMac Pro, true workhorse for professional video editing.
iMac Pro, with a fast Intel processor and a fast GPU too. Great for Windows apps, and Windows games too.
M1 means Boot Camp is gone, 100% Windows compatibility is *GONE* .
The Studio Display is a huge let down. Very disappointed.
If it had one more Thunderbolt port, I'd buy it. I'll have to spend that much on a quality designer display and dock anyway, but since I don't care about 5K, I have little interest.
Actually, it may be one of the more accurate displays in that price range, as I’ve done some digging.
I really love the unbiased point of view of the Verge reviews
This was an amazingly helpful and thorough review!! We're so much less stressed about making the decision to buy one after knowing it was tested by professionals doing the same things we plan to use Mac Studio for 👍👍
Thanks for the in-depth review. I just canceled my Studio Display order. I hadn't read the tech specs, and no micro LED/OLED and no HDR makes this a no go.
Seeing Nilay is always fun....
Amazing editing! And great practical review. Instead of just listing specs and comparing with others. Great job!
I am going to miss Dieter so much. He and Josh had so much charisma on mic and video. I feel like The Verge needs a talent bump.
Sorry, forgot about Becca. Becca should review everything until further notice.
Using Tomb Raider as a benchmark makes no sense without considering that it's running emulated. Hope to see it updated and optimized for Apple Silicon.
They didn't even bother to include the 4K result for the 3090, which is the only one that would be close to GPU-bound. It really confuses me how they can mess up their benchmark analysis so badly. They really need to hire someone explicitly to review their benchmarking methodology, because it is severely lacking.
Every YouTube channel's production team went OFF on their Mac Studio reviews. Bravo!
Finally, a REAL review of these new Apple machines. Every other small YouTuber is too worried that Apple won't send them review units for their future videos. THANK YOU, The Verge, for being straight with us.
Btw, a huge miss in your video: the power cable of the Studio Display is not swappable. This is a huge concern for long-term use of this monitor.
The machine has Thunderbolt ports on the back, and USB-C at the front for the M1 Max version and 2 additional Thunderbolt ports for the M1 Ultra version.
It’s the USB-C connector but not just USB-C ports.
Fingers crossed that Apple can resolve the web cam issue but otherwise I have every intention of getting the Studio with the Max chip and the Studio Display. And yes, the term "edge-to-edge" doesn't feel quite right.
Love that the reviewers keep it real. Nice cutting through Apple's marketing hyperbole!!!
@12:03 If that is an HMI light on the left side, you never want to have an HMI pointing up like that. Having them point up is not good for the bulbs because of how those lights work and how those bulbs cool down and dissipate heat. The max tilt angle while operating the light should be about 45 degrees up or down. Those bulbs could explode and be very dangerous.
Excellent video Guys - please could we have separate videos from each professional so that we can see exactly what they did with this machine.
Personally, I am interested in its Audio capabilities - I use Logic Pro along with Sibelius - a Score writing program.
Thanks for the presentation.
*SoCs. The M1 series of processors is a combined CPU/GPU/Neural Engine chip. It is not a GPU, or two GPUs linked together like AMD's and Nvidia's past dual-GPU cards, as was referenced @1:30 into the video. APUs are general-purpose processors that feature integrated graphics, but that term is normally used for AMD chips, so let's just call Apple Silicon chips SoCs.
Hi, @The Verge. Can you please explain why, in the Shadow of the Tomb Raider benchmark (14:32), all the machines show better FPS at the 4K setting while the 1080p setting shows much lower FPS? Is this a mistake, or the actual FPS for these resolutions?
This has to be a mistake from them.
Yeah, it's weird the 3090's 4K performance isn't shown; I'd assume they got the labels wrong. It's actually pretty impressive that the Ultra is doing about double the 4K fps compared to the Max, and better than the Mac Pro at 1440p and 4K.
It's an error for sure. Also, they should have specified that Tomb Raider runs emulated on the Mac Studio. So pretty impressive indeed.
The BEST review format. Please use this format for future reviews, thanks Verge!!!
I love Becca's energy and enthusiasm! 🥰
Great video, is very important to hear the people who actually work with it.
Best review I’ve watched thus far… 👌🏾
No Cinebench test, no Blender render test, no 2-3-day non-stop 3D rendering test to check stability, no CAD test to see the viewport limit for how many triangles the GPU can draw, no Resolve test, no Nuke test, no Fusion test. It's not the new Mac Pro, but still, those of us who need these machines to "print money" are interested in pure performance per dollar and stability. 99% of people won't get this for Ps or InDesign. Maybe it's finally a decent alternative to the Mac Pro for Ae.
We ran cinebench! It's on the site: www.theverge.com/22981815/apple-mac-studio-m1-ultra-max-review
Best review ever. This is exactly the kind of real-world, detailed info that I need in order to decide what to buy (I'm a video editor still fairly happy with a 2017 5K iMac, but for some projects, I really could use more power).
This is such an incredibly well balanced review. I love it!
Any thoughts on the nano-texture glass and how much it affects clarity / colors? Also, any experience with the stand? It's a lot of money, but I wonder if the adaptability is worth it.
Big fan of the review style where the team finds practical use cases and then explains how it works in each of those scenarios in a thoughtful, articulate way.
Love the multi person reviews. Monica is crushing it.
That audio test was probably the best I’ve seen…heard… on a YT review. 👍🏻
Oh guys!!! the monitor specialist guy!! I bloody love him!!! so clear! so straight forward and honest!!! yessssss we love you man!!!
I didn’t know you guys had J. Cole working at the studio!
This review was comprehensive and yet easily digestible. The editing and cinematography is remarkable. Thanks for all your hard work!
This is a really well done review! I'm a graphic designer and the mac studio max is really interesting, but i guess i can wait for another generation.
I'm definitely one of the nerds who cares about having a monitor that can display macOS in pixel-perfect resolution, but I'm not going to bite on the Studio Display. It's essentially a $700 premium over the UltraFine 5K if you include the height-adjustable stand (which is a necessity!). We're too deep into the HDR era to be buying new displays at such premium prices that don't support it. I'm also spoiled by my M1 16-inch Pro's display, and a lack of ProMotion and HDR isn't going to cut it for $2,000.
Guess you'll be waiting a while until we get Thunderbolt 5 lol
Thunderbolt 4 can only support up to 4K 120Hz
HDR is definitely coming with the new Pro Display tho
@@chidorirasenganz That's pretty much all I want, along with everyone else. I think Apple could do 5K 120Hz via DSC, though. Regardless, I'm ready to move on from LED. Mini LED, OLED, QD-OLED are all things everyone wants now.
@@chidorirasenganz I'm sure there are workarounds, but they wouldn't be elegant. That being said, Thunderbolt 5 is in the works, so the wait wouldn't be forever.
@@0741921 There are but none that Apple would entertain. Like using two thunderbolt ports or using HDMI 2.1
It's in the works, yes, but even once it's finalized it'll still be a little while before it's implemented. I think we are at least a year or two out. We just got Thunderbolt 4 in 2020, and Thunderbolt 3 was used even longer.
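The bandwidth claims in this thread can be sanity-checked with rough arithmetic. The sketch below is a simplification (it ignores blanking intervals and DSC compression, which change the real DisplayPort numbers), but it shows why raw 5K 120Hz doesn't fit in a 40 Gbps Thunderbolt 4 link while 4K 120Hz does:

```python
# Back-of-envelope: raw (uncompressed) video bandwidth vs. Thunderbolt 4.
# Simplified numbers: no blanking intervals, no DSC — a rough sketch,
# not exact DisplayPort link math.

TB4_GBPS = 40  # total Thunderbolt 4 link bandwidth

def raw_gbps(width, height, hz, bits_per_pixel):
    """Raw pixel bandwidth in gigabits per second."""
    return width * height * hz * bits_per_pixel / 1e9

modes = {
    "4K 120Hz, 8-bit":  raw_gbps(3840, 2160, 120, 24),
    "5K 60Hz, 10-bit":  raw_gbps(5120, 2880, 60, 30),
    "5K 120Hz, 10-bit": raw_gbps(5120, 2880, 120, 30),
}

for name, gbps in modes.items():
    verdict = "fits" if gbps < TB4_GBPS else "exceeds"
    print(f"{name}: {gbps:.1f} Gbps ({verdict} a {TB4_GBPS} Gbps link)")
```

Uncompressed 5K at 120Hz works out to roughly 53 Gbps, which is why a compression scheme like DSC (or a future Thunderbolt revision) would be needed, as the comments above suggest.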
Same here. Do you have other alternative monitors I should consider?
Love the dose of Backlon at the end. I ordered mine on the phone while the keynote was still going and I'm glad I won't regret my decision! Thank you for the quick and thorough review with real pros. Also love the new cinematography look; real environments, natural lighting, wider and closer lensing. Go Becca!
Colour grading seems a bit more... natural, as well? Or, washed out? It doesn't seem as contrasty as before. New look though.
“Words have meanings.” Such a simple statement. Such a deep, complex idea. Well said Nilay.
One of the best computer reviews I've ever seen. Thank you very much.
Becca needs their own channel ❤️
She has. Just search her name on YouTube.
Truly next-level post-Dieter-era video. Congrats, team. The faith continues.
Absolutely incredible review! Very comprehensive and gorgeous cinematography.
Thank you for a well thought out review without all the hype. Just the facts in Real World daily usage.
I think it's great that they use Adobe for testing, but they need to test Final Cut and DaVinci too. Adobe has been sluggish to fully optimize its software for these new chips and seems to be slowly shifting its focus from macOS to Windows.
But the point is that most pros use the Adobe creative suite
So It makes sense to test it first on a computer marketed for pros
@@itsCcallahan I know that pros use the full suite. But there are plenty of pros who use Final Cut and DaVinci, especially when it comes to video editing. Myself included; the video community has seen many professionals moving off Premiere, as the software, while powerful, is not reliable, and many have been moving to DaVinci and Final Cut.
I think that was the point. If it can be a beast with Adobe, it will be fine with the others.
You’ve missed the point. They haven’t picked Adobe for testing, they’ve picked Adobe because their internal staff are using Adobe everyday.
If it works well with Adobe apps, it'll most likely work even better with Final Cut and DaVinci, as has been the case with the most recent M1 Max MacBook Pros.
Does the display work with Mac mini ?
Kinda nice to see Apple putting more pressure on Intel and AMD in terms of chip design and innovation. While not a new technique, their fusion of the two chips without much latency is really impressive.
I think for their desktop line of Macs they should just make their own dedicated GPU instead of an APU, given how strong the CPU side of things is.
There is something wrong with the GPU tests you are doing. (Not saying you are doing something wrong here.) There has to be a bottleneck somewhere, and it is unlikely to be on the chip because of the architecture. Most likely the software is simply not taking advantage of all the GPU cores available, or there are single-core dependencies capping the results. I suspect these results will change as the architecture is better understood.
I think what you are not taking into account is the encoders/decoders included on chip, which greatly speeds things up for certain workflows - which is a smart way to create the chip for its primary audience. All chip architects do this for a target audience, just as there are specialized chips (co-processors) that most people have never heard of, which runs circles around generic hardware for certain types of applications. The problem is that these are hardware dependent, hence, with new codecs in the future, the speed advantages may evaporate.
As an example, take the modified chips that are in higher end cameras, which would be useless for any kind of general computing product.
Are your Shadow of the Tomb Raider graphs mislabeled? 1080p gets the lowest fps and 4k gets the highest??
Surprised by the lack of mentioning the non removable power cable of the display
Is there a list online of Mac software that makes use of multiple threads?
Super informative. As a premiere user I'm incredibly excited for this computer, but even more excited for the Mac Pro.
The reviewer is a great addition to The Verge
The audio quality of this video is like next level good! Really good! Amazing
Nilay's outro rant was perfect! 😄
Don't bother with the stock Apple stand. Get the VESA mount version. It is a lot more versatile for placement and orientation.
This review is a lot better than MacBook Pro review. This is probably the first time they considered there are more professional workflows than the ones common at verge. I remember last time, they said the M1 Max GPU was only weaker for gaming (and never considered professional workflows dependent on Graphical Performance)
"The general consensus was that the M1 Ultra was a bit faster... but not 3000 dollars faster" THANK YOU. I feel like I've watched a dozen videos about the Studio, and none of them have really answered this basic question.
This was very in depth thank you to everyone who made this vid
I love how you did this video with your own folks giving feedback on it!
Excellent "review", not the run-of-the-mill near-blind awe that the vast majority of "reviewers" spew out. I would have preferred less jargon and an explanation of why it's not way ahead of the M1 MacBook Pros, but still, high marks to The Verge for this review.
Best Caption For A Review.. SimpleAsThat
Hahah Nilay caught me off guard with that "edge-to-edge" rant. Spot on reviews as always Verge team!
Are the benchmarks for Shadow of the Tomb Raider backwards?
How is it running better at 4K than 1080p? 😅
By far in my opinion the best review about the newest Mac 🖥👍🏽
Nilay's passion for displays is endearing.
Why is The Verge testing the limits of what the M1 Ultra can do with a 4K 4:2:2 10-bit video that even an iPhone can handle? If not 8K, at least do 4:4:4 12-bit 4K. Meanwhile, Linus Tech Tips is out there with a Blackmagic camera shooting 12K at 120fps, 16-bit… I thought the people at The Verge were familiar with MKBHD?
I will put this into my home music studio this year!
One of the best reviews from The Verge.
Are the Shadow of the Tomb Raider graphs correct? Wondering why the FPS is much lower at 1080p than at 1440p, and why 4K has the highest FPS…? 🤔 Shouldn't they be the other way around?
Saw that as well, looks strange
Great review -- just one caveat. When you use Python programs, you should specify the use cases. Most regular Python programs are single-threaded processes, and it's no wonder that you got the same performance on the M1 Max and Ultra.
I love this video review. So well done. I do have one question, though. At the 14:30 mark there are graphs showing frame rates for Shadow of the Tomb Raider. They show the highest FPS at 4K and the lowest at 1080p. I think the legend is backwards. Wouldn't you get the highest FPS at 1080p, and progressively worse as you went to 1440p and 4K?
Great review! Really honest, and I like how you review different professional workflows. I will definitely get the Studio, but I'm not sure about the Ultra.
Best unbiased review I've seen. I subscribed because of that. I am extremely disappointed in the GPU performance you reported. I don't play games, but I do 3D rendering, and I hardly ever see a real-world 3D rendering test. You also have me wondering whether the Ultra is worth buying if the CPU performance gain over the Max is negligible. Thanks again for a great review.
I would like to see more PRO use cases... some simulations (tension maps, fluids, heat transfer, etc.). Everyone says "it's so powerful!" So why are most advanced things done on Windows, then? I feel like Apple's products are only good for video and picture editing (though the power consumption is very low).
I think they kinda shot themselves in the foot by not being more open to gaming and similar tasks.
Their target audience is mostly creatives for these pro machines. Clear from trailers too.
Their CUDA alternative, Apple Metal, is so hard to code for. It's a mess, tbh. So most devs are better off with Linux or Windows machines at the moment. Hope Apple sorts this out soon.
What about a HARDWARE update to the included webcam?
Great review, but you're just missing a repairability and upgradability section. More attention needs to be put on this: we can't carry on making products (at least desktops and notebooks) that don't at the very least expand with the increasing amounts of data we produce, and that don't offer an easy way to repair them if they break within a 10-year lifespan. Maybe energy usage would be good too, though I know Apple is winning there at least.
Can we the people get a link to the music playlists??? Oh yeah, the review was amazing, probably the best I've seen on YouTube so far. Thank you and good day.
This is by far the most in-depth review of the product within two days, I'd guess. Great work to the team. I'm highly impressed; keep it up.
Nilay's jacket game is always on point. Props
The presenter is great! Good job!
The Verge is really good at what they do.
I’m here dreaming of a Less CPU cores but more GPU cores equipped MacBook which can satisfy Mac gamers too.
Weird that no one understood the meaning of “Relative Performance” in their graph and expected it to be better than a 3090 😅
Amen. The only thing more frustrating than a visually misleading marketing chart is a reviewer that does not bother to read it properly.
As someone who has just ordered the M1Max Studio but said hell no to the same monitor as my 6yr old iMac, this review makes me extremely happy.
I still use a 27-inch iMac from late 2015 for most of my heavy tasks and definitely would be the target audience for the new desktop as I edit a lot on Final Cut Pro X & Logic. I'm not thrilled that Apple wants $1,500 for essentially the same screen I've been using over nearly the past 7 years. Every year I hold out for something better but I'd really like to upgrade my workflow. Decisions decisions...
If you haven't splurged already, there are ways to DIY your old iMac into a display!
Thank you. You are the first one to answer my question. I am going with the M1 Max. Thank you so much :) It would have been helpful if you could have mentioned the M1 Max configuration you used.
Use Affinity Photo instead of Photoshop; the software alone makes any Mac faster, smoother, and more interactive.
The industry is on Adobe