I'm a little puzzled about Apple's claims. On the CPU side, it makes sense. The M1 is using a much more efficient architecture compared to the x86 that AMD and Intel are using. But that's not how GPUs work, right?
I’m pretty sure that we can’t tell that much, since most apps and engines are not even optimised for the ARM architecture, much less this SoC… Till now, it was mostly Rosetta 2 doing most of the work, so it doesn’t really mean much against other PCs.
It’s also using a more efficient architecture for the GPU. Normally you’ll have the CPU feeding data to the GPU, and the GPU storing it in its own memory. This is why high-end GPUs have higher bandwidths, because this is a limiting factor. These new chips don’t need to do this: the memory is unified, and the CPU and GPU can share memory directly. This obviously requires massive changes in the application. Right now what we can see is that the M1 Macs have very limited graphics performance because Rosetta can’t use this trick; it emulates the previous architecture by copying data from CPU memory to GPU memory (in this case they’re the same). This essentially halves the throughput, and that’s why performance is so poor.
It's kind of the stepped-up version of Smart Access Memory. The M1 series is the first modern implementation of a consumer unified direct-access memory architecture. Just like how the most effective/efficient mining GPUs are really the best memory bus implementations, these unified CPU/GPU/DRAM chips are going to start eating the modular systems' lunch as long as they can get a large enough memory pool.
@@ezicarus8216 shhhh, they might realize that the imaginary system reserved memory for the igpu is actually a thing also.... x86 consoles go as far back as the original xbox... which yea it was shared memory for the gpu/cpu
GPUs have way more to them than raw performance. For example, Nvidia's OptiX is way smarter, and therefore better for rendering, than CUDA on the same card. Think of how in games RT cores are incomparable to raw performance for ray tracing, even though they consume less die space and power.
These recent reviews with Anthony hosting are so damn high quality that I can't wait for some of the LTT Lab content to drop in the next year. It's gonna be absolutely sick.
One thing to note: you can’t configure the M1 Max chip with 16GB of memory, so if you don’t NEED 32GB, it’s actually a $600 difference to go from the base M1 Pro to the base M1 Max. That's $200 for the chip upgrade itself and $400 for the memory upgrade.
Yeah when I bought my M1 Pro 16 in November I wanted to spring for the max but like you said it quickly became a grand difference in price and I’m really happy with the Pro.
I would not buy a laptop with 16GB of memory. I know it's use-case dependent, but I'm constantly running up against 16GB on laptop and desktop. Granted, it's usually when doing 3D workflows or container development, but it does feel like even more casual use and gaming workloads are going to be pushing up against 16GB soon enough that the cost of the upgrade is worth it to keep the computer relevant longer. Like, I'm using 10.6GB of memory right now just to have like 20 tabs of Firefox/Chrome and Spotify open.
@@QuakerAssassin yeah it’s def application specific. I bought my Mac for just working with Lightroom and Photoshop and to work with raw files and it’s amazing for that. But I really don’t render anything or game so it works great for me as far as productivity goes.
This "the answer may surprise you. It sure surprised me" thing is starting to be a distinctive mark of Anthony's videos and I like it. Love the energy!
I thought he was going to say Apple was true to their word and that their marketing accurately reflected their products; that would have been shocking.
@@cristhiantv they’ll probably say some delusional things like ”pro users always have their laptop plugged in anyway so power consumption isn’t an issue”.
@@sqlevolicious you don’t know his life, do u? Also power consumption is important if you’re rendering videos on the go…. But you’re gonna probably reply something telling us how stupid we are just by looking at your comments before… so don’t mind answering, have a good day
If you want to actually see the promised performance gains: Use it for software development. Build times went from 4 minutes (2019, 16" MBP, max spec) to
I used to respect the guy but I'm not sure what to think about him or LTT at this point. If they don't address this I'm unsubscribing.
- th-cam.com/video/g1EyoTu5AX4/w-d-xo.html M1 MAX - 32 CORE BENCHMARKS v RTX 3080 & 3090 Will Blow Your Mind!
- th-cam.com/video/OMgCsvcMIaQ/w-d-xo.html M1 Max 32 Core GPU v 165W RTX 3080 Laptop
- th-cam.com/video/JM27aT9qhZc/w-d-xo.html 16" MacBook Pro vs RTX 3080 Razer Blade - SORRY Nvidia..
- th-cam.com/video/YX9ttJ0coe4/w-d-xo.html 16" M1 Max MacBook Pro vs. My $6000 PC
The list keeps going. These results have been out for a while too, so LTT really have no excuse. They didn't use optimized software, they didn't compare laptop performance while on battery, and they didn't use readily available GPU benchmarking software that's already proven to place the M1 Max around the 3080 level. They need to explain.
Something I think was missing was battery life under load. A key part of Apple’s claims was sustained performance even on battery and at much greater efficiency. So I’m curious how the gaming comparisons would look if you capped framerates to 60 or 30 across machines and compared battery life then. You showed Apple exaggerated how close they were in raw performance, and now I want to know how much they exaggerated on efficiency.
If you have such a workhorse, why use it on battery, where it would die in less than 4 hours IF it started at 100% battery? Seems like an extremely unrealistic scenario.
I'm really tired of mobile parts being called the same name (eg: 3080) as their exponentially more powerful discrete counterparts. They're fundamentally different parts I feel
I mean, they’re up to twice as powerful on desktop, but that’s plenty to mislead consumers. AMD and Apple aren’t doing that, though. Just Nvidia. I take issue with your use of the word “discrete” here - the 3080 laptop GPU is still discrete graphics because it’s not on-die with the CPU. Still, I take your point, and I second it.
@@djsnowpdx That's a fair distinction. Is there a category to describe desktop + workstation + server GPUs? The only thing I can think of is 'PCIe GPUs', vs mobile GPUs and iGPUs. There's also the distinction of the specially made rackmount-only versions, like the A100, which, although they use PCIe, are not PCIe-socketable, which further muddies things.
Would be interesting if they used teraFLOPS as a unit of measurement to estimate GPU performance. :) Now, it's not the best unit to use, but FLOPS can show 32-bit precision calculations per second.
Not only is it not the best, teraflops is quite possibly the worst measurement to use, since performance per FLOP can differ so much between generations and architectures. The only thing it's good for is marketing number shows (also relative estimated performance within one GPU family of the same generation, but that's beside the point).
@@ZerograviTea. Wow. I didn't know it was the worst. So, what is the best unit of measurement for GPU performance? GPU bandwidth (GB/s), throughput, or something totally different?
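For reference, the headline teraFLOPS figure is usually just shader ALUs × 2 ops per clock (a fused multiply-add) × clock speed. A quick sketch of that arithmetic, using commonly cited shader counts and clocks; the exact clocks, especially for the laptop 3080, vary by configuration and are assumptions here:

```python
# Rough sketch of how the marketing "TFLOPS" number is derived.
# FP32 TFLOPS = shader ALUs * 2 ops per clock (fused multiply-add) * clock (GHz) / 1000
# ALU counts and clocks below are commonly cited figures, not measurements.

def fp32_tflops(alus: int, clock_ghz: float) -> float:
    return alus * 2 * clock_ghz / 1000

print(f"M1 Max 32-core GPU : {fp32_tflops(4096, 1.296):.1f} TFLOPS")  # Apple quotes ~10.4
print(f"RTX 3080 Laptop GPU: {fp32_tflops(6144, 1.545):.1f} TFLOPS")  # boost clock varies a lot with TGP
```

Which is also why it's mostly a marketing number: it says nothing about memory bandwidth, drivers, or how well a real renderer actually feeds those ALUs.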
@@Prithvidiamond Every laptop, you say? You do know that there are laptops with desktop CPUs and desktop GPUs? I mean, they are absolutely huge and barely able to be transported, but they are still laptops, and they will be 2 to 3 times more powerful than M1 Macs for the same price. It's not a fair comparison, but you might want to lower your expectations of Apple's claims.
@@Natsukashii1111 Laptop and portable computer aren't the same. A MacBook is a laptop. Some Clevos that you are talking about are "portable" computers with which you can do everything as long as you have a desk and a power socket. Without those two it's a bigass brick good for nothing.
It's no wonder all of the reviews were so glowing when these laptops came out. It's because almost all of them focus exclusively on video editing and the Adobe Suite. "Benchmarking" is oftentimes just video render times, and it's frustrating because, as you can clearly see, that doesn't paint a good picture overall. The Zephyrus is what, at least $1k less? And it performs largely the same, at the cost of battery life? I guess efficiency is a good thing, but these laptops are only really good for very specific purposes, and I question whether they entirely deserved the ubiquitous glowing reviews when they dropped.
If you also consider programming, the M1 Pro and Max outshine the competition. Android projects and Java projects build significantly faster than on even top-end machines running Linux. Python and TensorFlow builds are also faster, although there, somehow, the M1 Pro trains and builds ML models faster than the M1 Max for some reason. So in the departments of media creation and programming these laptops are truly top of the class.
Apple's gig has never been good value. I would actually consider buying it for the hardware if not for the OS lock-in. $1k for weight/battery life/build quality? Sure, why not.
@@Lodinn This is why, despite its many downsides, I still kind of like the 2019 16-inch MacBook Pro with the updated keyboard and i7. Boot Camp gives it longevity, and since it runs x86, it runs all modern-day apps. Obviously the efficiency isn't nearly there, but all the other MacBook perks are, which makes it a rather nice machine. Outclassed for sure by these last few years of laptops by orders of magnitude, but hey, until Razer or Microsoft can get the build quality down as well as Apple has, it's an attractive option.
@@aritradey8334 That's fair! I haven't seen too many benchmarks in the programming world, I guess, which I feel is telling when it comes to the reviewer landscape. With that being said, I remember some of the Hardware Unboxed review, and now this one, and they are such a stark contrast to the uniform praise these received upon launch. Great machines for sure, especially for those who work in the areas they excel at. I guess I'm just rather exhausted at all of the review outlets only reviewing things for videography, simply because that's what they do. Their reviews shouldn't be a general "review" and should be more of a "videographer review", so that those who don't watch/read 18 reviews like a lot of us here who do this for fun don't get the wrong ideas.
It did make me wonder, and reminded me of how Volkswagen optimized their software for specific use cases. I considered an M1 briefly for a Linux laptop but then quickly reconsidered - if for nothing else, the keyboard - and went for a ThinkPad Ps. I don't think these Macs are good as general-purpose computers. They are fine for the same tasks a Chromebook is also good for, or for the special video editing stuff. Seems quite niche; lucky them they can sell it with marketing.
Great perspective, appreciate the continued, in-depth coverage on these. I also appreciate what feels like an objective, enthusiastic investigation of the tech, neither a takedown nor blind exaltation, thank you so much for your work!
I’m a video editor, and I have used Mac and PC for a long time. I recently built a nice PC and I game too much on it lol, so now I’m thinking of getting the M1 Max for portability. Glad to hear it’s a beast at what I need it for. This is definitely not for everyone.
It definitely is a beast, especially with software that has native Apple silicon support. If you game, unfortunately there isn’t any game that natively supports it yet; if there were, you’d get close to the 3080’s performance at far greater efficiency. The biggest advantage of these chips is the performance you get on the go versus any other laptop. The MacBooks just smoke them there, and if you travel a lot, getting a MacBook over the others is going to be a no-brainer. Just remember that you’d have to sacrifice playing some AAA titles, though if Apple themselves release some AAA games for the Mac, I’m sure more game devs would see the potential in the Mac and port titles to it. That possibility definitely exists, but it’s going to be a gamble.
Interestingly, Max Tech did a response video to this revealing a surprising and very concerning set of anomalies in the data presented, suggesting either serious issues with the testing methodology or massive pro-Intel bias. Either way, an update is urgently needed from the Linus team to respond to those observations and recover lost credibility.
@@skadi7654 No, they misrepresented data for whatever reason. Others have proven the reality; although IMO LTT were raising an important and valid concern about these laptops, they did it in a very sketchy and either underhanded or unprofessional way. See the Max Tech response for more details.
They pretty much had to for this M1 chip anyway. Can't really run widely compatible APIs if you're going to do specialised hardware and also claim it slays a top-of-the-line dGPU while using less than half the power. They just don't tell you that the software to actually get the claimed performance isn't widely available (yet).
@@MLWJ1993 Just wait until the community implements OpenGL using Metal, similar to MoltenVK. It's not really "specialized hardware", it's just a graphics API, that's how every GPU works. That's why OpenGL support is still ubiquitous on non-Apple GPUs, even though they're architecturally much more geared towards Dx12 and Vulkan, which are very similar to Metal (in fact, Metal itself is barely anything more than a deliberately incompatible clone of Vulkan because Apple is still Apple). The M1 CPU may be awesome at clearing up decades-long inefficiencies of the x86 architecture, but the GPU world has long progressed way beyond that. Apple has no such advantage there. The only reason they are even remotely competitive in a performance per watt benchmark is TSMC's 5nm node, to which they currently have exclusive access, but from an architectural standpoint they have a lot of catching up to do with both AMD and Nvidia.
@@DeeSnow97 The M1 just sucks for "community anything", though, since Apple doesn't really do much of anything to let "the community" pick up their slack. Most of the time they specifically go down the path where they'd like "the community" to be able to do absolutely nothing, like basic servicing of a device...
I would love one day to see Deep Learning Benchmarks as well ... as a DL practitioner, looking forward to the comparison for both CPU and GPU workloads.
@Christos Kokaliaris You can get these notebooks with 500nits, 4k 120Hz displays if you are willing to spend the cash. Personally I use external monitors.
@@MrGeometres If you run stuff in the cloud, nothing beats a $900 MacBook Air. You get a wonderful display, great touchpad, nice keyboard. At some point you have to run stuff in the cloud if you are doing serious business. It does not make sense to put thousands of dollars into workstations that don't run most of the time and don't scale at all.
Unfortunately, the answer on HDMI 2.1 adapters is currently no, for software reasons. I think if you guys make a video on it that could get Apple’s attention to finish it
I'm interested if the laptops were connected to power. Also interested what the battery percentages would be at the end of the test with all laptops disconnected from power, and how hard the fans blew.
I think it's pretty clear that Macs run much better on battery power than most PCs. At least until the latest Intel and AMD chips are properly put to the test.
@@angeloangibeau5814 I disagree heavily. The point of laptops is portability, but that doesn't mean I will use them unplugged. Battery life is good but not as important as Apple makes it out to be; it's not as critical as it is on phones. When I am using my laptop for more than an hour, it's usually on a desk, and almost all the places I visit with a desk have an outlet.
Anthony, your screen-presence has improved so much from your debut. You’ve clearly gotten much more comfortable in front of the camera, and you provide a wonderfully logical insight (pun intended) into the things you present. I know you’re going by a script, but surely you contribute, and you make it yours.
Great review but I'm curious about differences between pro and max for development benchmarks i.e. code compilation. This is generally a very large use case for these macbooks.
Depends on what you're compiling; if your stuff can compile on Mac and is not using GPU acceleration, then the difference is minimal/non-existent. The efficiency cores on Intel next year will be very interesting, and AMD is finally moving to 5nm (though that is supposedly end of year), so it will be very interesting to see the performance jump with that, including the new cache stacking. It's great getting past the stagnation. I'm probably upgrading at the end of next year and will move from a laptop (i7 9750H, it's 3 years old now) to a PC since I moved continents, and things like Rider and VS Code having remote support means I can just have the home PC host the stuff (which I do often enough on my NUC if I need to run something overnight).
Check Alexander Ziskind's YouTube channel for many, many development benchmarks done on the M1/Pro/Max machines; most videos are very short and to the point. In general, CPU-bound work sees very little difference between the Pro and Max chips; you end up seeing more differences caused by the number of cores available on the different versions than by the kind of chip. In some cases, especially single-threaded ones like some JavaScript tests, an MBP16 running a maxed-out i9 might beat the numbers, but if the workflow is multithreaded the M1 chips do it better. Unless your workflow really needs more than 32GB of RAM, a 10-core M1 Pro is probably the "sweet spot" for development at the moment.
My friend is a senior engineer for Apple and he does both iOS and MacOS compiling. He got a Pro for himself and they gave him a Pro for work too because the Max isn't necessary for their developers for the most part. Only certain developers would get allocated a Max but he hasn't heard of any devs getting them.
The lack of Vulkan, CUDA, or OpenCL support on Macs is absolutely killing multi-platform compatibility for even professional workloads, and games have taken a giant leap backwards.
That is Apple's doing: they just remove and destroy industry standards like OpenCL and OpenGL / CUDA (they never supported the most powerful GPUs, which are Nvidia's). On Linux and Windows, when you get a new standard, they let you keep using the old one; it does not just get removed, which would destroy a lot of software. You can still run 32-bit apps on Windows and Linux very well, and that is how it must be done. Apple is just typically arrogant and does not care about its users. That is the reason why they have not had more than 10% market share globally, not once in the 44 years the company has existed.
@@nigratruo x86 is stagnant and needs a complete reboot... but no one has the guts for it... Apple did, and they now have quite powerful machines that use little power... Perfect? Not yet... but way better for what they are meant for, and on top of that they can game decently... but again, not perfectly... yet. But the extra power of the M1 chips? Especially the Pro and the Max? Well, they could (should) be interesting for game devs to tap into.
The progression of Anthony and how much better/more confident he has become on camera should be an inspiration for everyone to practice confidence in social settings (which is even harder on camera, when you're staring into a lens instead of talking to people).
@@davide4725 “Thanks TSMC” You sound like the kind of guy who loves bringing up John Lennon’s wife beating tendencies every time someone mentions they like the Beatles lmao
I am also loving the progression for the ARM space. What really excites me isn't the CPU or GPU in these, its the optimizations they made to make ARM that competitive. They're getting asic-like performance for a lot of low-level stuff.
@@davide4725 I find it funny how you called the other guy "kid" while here you are having absolutely no knowledge of how R&D, design, audit, documentation, subcontracting and manufacturing work in the tech industry. "Thank TSMC" lol. Kid, please.
Blender 3.1's metal support is very nice. I still don't think it beats out some of the higher end RTX cards, but it still performs very well, even in the alpha stages
Things I still want to see covered:
1) How much can the USB-C take? 8 hubs fully loaded with all the native monitors going, plus X extra monitors using DisplayLink, while running a USB-connected NAS and a 10Gb Ethernet dongle.
2) eGPU support? If not, what happens if you try it? What if you try to force the Nvidia or AMD drivers with Rosetta?
3) Wipe one of the systems and use it as a daily driver for a week, but this time refusing to install Rosetta. How do the performance numbers change without the emulator running or even installed?
I have the M1 Max. It's the first Apple computer I have owned, and I am nothing but impressed... Sure, I could find something I don't like about it, but... I could show you a list of complaints with my last laptops that are far worse. How efficient it is does have a lot of value. My last laptop was $2,000 when I purchased it from Lenovo. I needed a GoPro for a project, realized its memory was full, and it killed my laptop battery before it could get the footage off. Even Chrome would noticeably kill battery life. Having a laptop that is useless without being plugged in sucks.
I like your tests and I am not an Apple Fanboy, but your results here are very different from most of the other Tech TH-cam channels that have tested these MacBooks
which other tech channel results differ from this? post a real tech channel, not a fanboy channel. before you post make sure that channel does a variety of reviews not only praising apple products.
@@truthseeker6804 All that I've seen, actually. Here are some:
- th-cam.com/video/g1EyoTu5AX4/w-d-xo.html M1 MAX - 32 CORE BENCHMARKS v RTX 3080 & 3090 Will Blow Your Mind!
- th-cam.com/video/OMgCsvcMIaQ/w-d-xo.html M1 Max 32 Core GPU v 165W RTX 3080 Laptop
- th-cam.com/video/JM27aT9qhZc/w-d-xo.html 16" MacBook Pro vs RTX 3080 Razer Blade - SORRY Nvidia..
- th-cam.com/video/YX9ttJ0coe4/w-d-xo.html 16" M1 Max MacBook Pro vs. My $6000 PC
The list keeps going. These results have been out for a while too, so LTT really have no excuse. They didn't use optimized software, they didn't compare laptop performance while on battery, and they didn't use readily available GPU benchmarking software that's already proven to place the M1 Max around the 3080 level.
@@andremessado7659 So I watched the first video, and the M1 Max actually lost to the laptop and the desktop in the export-times chart, but it did well in timeline playback; that's literally the same as this video in the DaVinci Resolve section at 5:28. In the second video, the gaming laptop totally destroyed the M1 Max on power, not on battery. I skipped the third biased Max Tech Apple fanboy channel video. Regarding the fourth video, the M1 Max lost in all the charts except the 6K BRAW export, which is interesting because the first link you posted had a faster export speed than the M1 Max on the GPU. So, in summary, from the first, second and fourth videos, the M1 Max does best in video playback on an editing timeline, but loses to the 3080 or 3090 in video exporting, stabilization, rendering, benchmarks, everything else.
I just auditioned for an animation job. I was put on a last-gen Intel iMac, fired up Blender and put a normal map on one surface in the scene, and the GPU almost caught fire and the whole macOS GUI dropped to 0.5fps, I'm not sh1tting you!!!
M1 can throw a lot of weight around as a DAW host, especially running Logic and AS-native plugins. It's reportedly less well suited to realtime audio tasks (like recording live guitars through software amp sims at low latency in a busy session), but it absolutely pummels at mixing and composing tasks that don't require super-low RTL figures under load. The 32GB Max variant will benefit a serious composer who wants all of the orchestral libraries and soft synths loaded at once, although all that GPU will be drastically underutilized in the same scenario.
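For context on why low-latency tracking is the hard case: round-trip latency (RTL) grows with the audio buffer size, and a busy session forces you to raise that buffer. A rough back-of-the-envelope sketch, where the 2 ms of converter/driver overhead is just an illustrative assumption:

```python
# Back-of-the-envelope round-trip latency for tracking through software amp sims.
# Real RTL adds converter/driver overhead on top of the input and output buffers,
# so the overhead figure below is a placeholder, not a measured value.

def round_trip_latency_ms(buffer_samples: int, sample_rate_hz: int, overhead_ms: float = 2.0) -> float:
    one_way_ms = buffer_samples / sample_rate_hz * 1000
    return 2 * one_way_ms + overhead_ms  # input buffer + output buffer + converters/driver

for buf in (32, 64, 128, 256):
    print(f"{buf:>3} samples @ 48 kHz -> ~{round_trip_latency_ms(buf, 48_000):.1f} ms RTL")
```

A busy mix that forces 256-sample buffers lands well above the roughly 5 ms feel threshold most players want when monitoring through the box, which is exactly the scenario the comment is describing.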
I know it’s not the same as a real thorough test, but most benchmarks agree that the M1 (any variant) run virtually identical both plugged and unplugged.
Absolutely. One of the main use cases for a laptop is while unplugged. The first test they should do is fully charged and unplugged performance testing, then while charging, and then when fully charged but plugged in. Large differences in performance can result in various situations, and only testing while plugged in (or unplugged) can skew results to the tester's desires. I think Apple's claims might be correct IF the laptop they were comparing against did poorly while unplugged, so Apple's results would look more impressive.
Watching Anthony go from absolutely HATING being on camera to being so much more comfortable that he cracks me the eff up with an intro like that! Bravo Anthony! 👏 👏 👏 I almost spit out my coffee lol'ing at that. Great work.
@@bear2507 The illusion is that Apple claimed the performance is about the same as an RTX 3080, when the M1 barely beats the RTX 3060 and isn't even close to the RTX 3080, and I mean the mobile RTX GPUs. An RTX is a GAMING GPU, so when they made that claim people were obviously going to think about its gaming performance. They should have compared it to a professional GPU like a Quadro instead of being either brave or stupid enough to compare it to an RTX.
@@foxley95 Yeah, I’ll go tell my research lab to shut down our datacenter with hundreds of 3080s, because some kid on YouTube said these GPUs are for games only and not generic compute. The comments are full of children who have never touched anything outside Minecraft, but have an opinion on everything hahah
So much of this is really about optimization in code. For those of us who lived through the changes from Carbon to Cocoa to Metal, and from Motorola to PPC and then to Intel, one of the things that happened after a giant change in architecture was that, over time, as software got updated, the Macs would get faster. Even Apple's own OS is still hitting Rosetta. The review is still fair, but in a year the results from the same hardware will most likely be significantly different.
Steam drains the battery on my 16” Mac(M1 Max) faster than running Windows on Arm(Parallels) + ECAD(Altium) or Keysight ADS for EM field solving. Yeah… Just having the Steam launcher running, not even with a game going. Oh well, i never intended to game on the Mac anyways since I have a gaming PC… but in terms of work, the Mac can do everything I need it to do in portable form factor, while maintaining all day battery life.
@@LiLBitsDK I was just backing up his point that unoptimized things can run really bad no matter the device. Like in my case, something as trivial as the Steam launcher
Exactly. That's another crazy thing about M1. It will just get faster as we get updates. Normally machines will be slower as they age since software gets more complicated.
That's the first time I heard that someone is preferring the silver color. I also got a silver one and, looking around online, it seems like I'm way in the minority with that decision.
I agree with what you say: M1 max is literally only for professional video editors, which is a super ultra niche market, for everyone else, it's not worth it.
I think it'd be more accurate to say media professionals and developers in general. It's absolutely fantastic for professional audio production and software development. Silent the vast majority of the time, and it can easily handle on-location and remote tasks with its awesome battery life, with full power whether plugged in or not. The high-impedance-capable headphone jack and best-sound-in-a-laptop ever don't hurt either.

I think it's important to compare Apples to Apples here (pun intended). They're not designed for gamers, they are designed for professionals. As an equal Windows and macOS user, my experience with these has been top-notch. For pros, Apple has hit a home run here IMHO.

Also, I think the performance per watt here should not be ignored, and I don't believe this was mentioned - add that factor to the benchmarks and you'd see some very different results. Energy costs money and affects the environment. And a hot, noisy laptop isn't particularly enjoyable to use day in and day out.
Super niche. Because let's face it, the m1 air can do 4k editing. How many editors need to edit 12 simultaneous 4k streams? Most youtube viewers don't even watch in 4k yet rofl. I really wish it performed better at 3d design.
@@wykananda for audio professionals most of them were good with an older generation macbook with high memory configuration tho. also for non video editing/audio professionals, macos is really really difficult to use. even more so with arm. basic stuff like a volume mixer and any sign of useful window management are absent out of the box. what is the point if you are spending such a premium to get a sub par experience with non video editing/audio professionals.
@@pupperemeritus9189 Hi pupper. I'm not sure I understand your comments. Sadly, the previous Macbook laptop generations were all limited to16gb of ram - so high-memory configs were simply not possible. Moving to the ARM architecture did not change the underlying operating system, MacOS, it simply made the laptop hardware run faster, smoother, quieter, and for much longer on a single battery charge. As for the difficult-to-use / sound control / window management - the latest Windows and MacOS are both more than reasonably user-friendly and well-equipped in all these areas - these OSs have both been around for many years and generations now and it shows. As a multi-OS power-user I could nit-pick plenty at both OSs here and there for sure though. However, in my experience, for the countless newbies that I've trained and continue to help, MacOS has to get the nod for getting productive and comfortable more quickly with less frustration and confusion and less problems over the long haul. Let's face it, both operating systems are DEEP. They're both very capable and stable at this stage but either will take time and effort to learn to get the most out of them. Curiously, my current "go to" Windows-based laptop is a 2015 Macbook Pro running Boot Camp - ironically, it's easily the best Windows laptop I've ever owned - cool, quiet, fast, stable, good battery life, well-built, expandable - and, of course, it runs MacOS like a champ too. I'll likely get another 3-4 good years out of it before I hand it down the line. IMO, the 2015 MBP was the best overall professional laptop ever made for Windows, MacOS, or Linux until now. While I can run the ARM version of Windows on the latest MBP via Parallels and so on, I'll have a new laptop King if-ever/when Microsoft Windows gets fully up to ARM speed and these new killer Macs can boot into it natively.
Yup. Honestly, no idea why it took LTT so long to get these videos out. All this information is widely known by now. Seems like a huge miss on their part for being so late to the game on these. If they didn't receive the products in time then sure, that's fine, but it's also LTT.... Surely they could have worked it out.
As time goes on, I'm starting to realize that Max Tech tends to only or mostly show the advantages of the M1*. You have to watch other channels to find out, for instance, about this screen's only ~90% Adobe RGB coverage (which is bad for semi-professional/professional Photoshop editing) and the very slow screen response times (35-100 ms) - see Hardware Unboxed for these two - th-cam.com/video/p2xo-hDCgZE/w-d-xo.html . Or whatever it is here.
@@ContraVsGigi Adobe RGB? lol, a lot of professionals don't need or want 100% Adobe RGB coverage because they're working in an sRGB or Display P3 workspace. Non-issue. 90% is actually a very good result for Adobe RGB anyway.
@@DriveCancelDC Personally not a fan of the guy or his channel but i'll give him credit for shitting out a buttload of videos when the M1 Pro and Max Dropped. He was on it from day 1. It's been almost 2 months and Linus is only just putting out a video now? I expected better honestly.
So, the short of it I'm getting is that Intel has just been intentionally selling bunkus laptop CPUs until the M1 came out and then they're like "Oh, you wanted a serious mobile chip?"
What about software compilation, data analysis and heavy crunching like that? Can you 🙏 test compiling Linux or some similar workflow for the 16” review? Pretty please 🥺 It’s a lot more relevant for someone like me.
Slight correction at 3:44. The closed captions currently display: "with middle GPU rendering and support" This should be changed to: " with Metal GPU rendering support"
8:57 To make matters worse for Mario Sunshine, the starting level is the easiest level to run on lower-end hardware. So the fact that it hovered around the 70s and 60s is not looking good for the M1 Max. However, it may just be due to the rendering API being wonky on Macs.
Thx. This made me reconsider buying a MacBook, after using one for mobile purposes for 4 years. I work as a freelance architect and I fell in love with Twinmotion, a UE-based real-time renderer. Path tracing is not a thing on Mac and it really sucks for the price. It might get a Metal update, but I need it like now and can’t wait another year. Gonna get a Windows notebook again I guess. Also the price difference is ridiculous.
Using Apple laptops for UE wouldn't make much sense until Epic natively supports them, and that will take at least a year (because they first have to release UE5).
I'm using a MacBook and honestly, it's overpriced as hell. You should go with a high-end laptop if you're willing to pay the same price for better performance. There's the Razer Blade Stealth 13 for around the same price; it's a thin-and-light with just better performance most of the time.
My pick would be gaming laptop. For the extra GPU power and cooling for UE. I have Lenovo Legion and when I have to upgrade again I will buy the same brand again. It is a bit clunky but it stays cool all day long with i5 on turbo and nvidia 2060.
One thing not mentioned when doing the benchmarks: how do all the laptops (MacBooks and Zephyrus) perform while only on battery? Yes, battery life is great, but how is the horsepower of the CPU/GPU affected when running apps on battery? I think some surprises might arise.
After rewatching this review, I went ahead and bought the base model of the 14 inch m1 pro. I will be doing more cpu than gpu heavy work but I didn't think the 2 extra cores was worth the money
I’d really love a software dev take on this. For my use case fast cpu, good battery life and 64gb of ram are compelling - but are distinctly not video rendering.
Developer here, I wouldn't buy any of these besides the base-level MacBook non-pro. You can literally code on a Raspberry, unless you're compiling something crazy-complex like an entire browser you're not going to feel the difference, so why pay extra for literally nothing? A USB-A port would have been a compelling addition, but oh well.
Other developer here. Never found myself desperate for a usb A port while developing but have definitely found a use for better cpu and ram. Not sure what serious developers are developing on trash hardware tbh.
@@JackiePrime Web, for example. I don't develop on trash hardware because I can afford better equipment, but if I still had my old FX-8320 it wouldn't slow me down in any way. Peripherals are way more important at that point. Also, every single hardware debugger uses USB-A, and even if you just want hobbyist stuff have fun hunting down a USB mini-B (not micro-B) to USB-C cable just because you can't use the included mini-B to A. But it does make sense, if you only develop for iOS (which is literally the only reason I've ever considered buying a Mac) then you won't run into any of those issues, and Xcode being a hot mess does necessitate a faster CPU and more RAM. But there's a lot more to development than just Apple's walled garden, and if you step out of it it's a lot more important to be able to mess with any device you want to.
Also a developer here: the GPU on the Max is absolutely useless and 64GB of RAM is overkill for my line of work. 32GB of RAM and the 10-core Pro is plenty; I plan to keep it for about 4 to 5 years.
Another developer here. I have the M1 Max with 64GB, the 32-core GPU and a 1TB SSD. While this setup is overkill, I can afford it, and it feels good not having to worry about performance while working. On the technical side, running WebStorm and other IDEs, multiple Node apps, multiple Docker containers, Electron apps that suck like Slack, etc. takes a toll on any computer. If you can afford it, especially since software engineering is a well-paid job, plus the resale value down the line, why not?
Normally I would completely agree. This seems very skewed, and the apps selected seem designed to show these in a poor light. Was it purposeful? Guess time will tell, but I believe this video will not age well. However, PC fans will point to this sole video as why the new MacBook Pros suck, despite an overwhelming number of other reviewers showing the performance in a different light, several of which are also typical PC reviewers.
@@BootStrapTurnerVideography you mean max tech right? A guy who literally said that M1 max MacBook is just as fast as 5950x desktop with RTX 3090. Yeah that guy is totally not biased at all. I think what Anthony wanted to point out here is that those apple marketing slides for M1 max were very very misleading.
Not on the 14-inch. If needed, yes on the 16-inch. Best to stick with the Pro on the 14-inch, or if needed the Max with the 24-core GPU. The 32-core is voltage-limited in the 14-inch.
Benchmarks in C4D/redshift don't tell the full story. You need to go into redshift's render settings and manually increase bucket size to 256/512, then you'll see a 25%+ improvement in render times.
I’m a software dev who also edits in Resolve and does some Blender in my spare time, and I went for a 10/16-core M1 Pro with 32GB RAM. I don’t regret it: I don’t intend to game on it, Blender Metal support is coming, and oh boy, that Xcode build time is just fabulous! It’s not just the extra CPU; the faster SSD and memory bandwidth make a huge difference, easily cutting my build times in half. Picking an M1 Max would just have been wasting battery life for me, as the test shown in the video is the best case; the drop in daily workloads is more like 30%.
@@StefanUrkel a huge one! Used both on M1 16Gb, very decent performance but swap usage was way too high and memory pressure often in the yellow area. 32Gb is way way better!
Thanks Anthony. Now I have to decide whether the better memory is worth it, or whether it’s not good enough and I’ll have to push code off to the compute cluster anyway. In the future, could you guys run some memory-intensive data analysis code, like something that inverts large matrices in memory in Python? That’s a good reason to get these if it means I can avoid the hassle of pushing code to a network machine and can run it on my laptop and play with it.
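For what it's worth, a minimal sketch of the kind of in-memory workload that comment describes might look like this; the matrix size and repeat count are arbitrary choices, and an 8000×8000 float64 matrix is roughly 512 MB, so a couple of working copies put real pressure on RAM:

```python
# Minimal memory-heavy benchmark sketch: time the inversion of a large dense matrix.
import time
import numpy as np

def invert_benchmark(n: int = 8000, repeats: int = 3) -> None:
    rng = np.random.default_rng(0)
    a = rng.standard_normal((n, n))                 # ~512 MB at n=8000, float64
    np.fill_diagonal(a, a.diagonal() + n)           # diagonally dominant -> safely invertible
    for i in range(repeats):
        start = time.perf_counter()
        np.linalg.inv(a)                            # allocates another n*n result array
        print(f"run {i + 1}: {time.perf_counter() - start:.2f} s")

if __name__ == "__main__":
    invert_benchmark()
```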
This is what I was looking for. Apple made very bold claims about their SoC, and although it's still impressive, it's not the "PC killer" they make it appear to be. The thing is that they match the price of better laptops (raw-performance-wise). Of course, if you live in Apple's garden this is still better than anything, and if you use mostly ARM-native apps you'll love this, but it's not really worth it for many PC users to change to Mac when the performance claims Apple made are not true. We'll see what happens next year and in 2023 when the chip shortage "ends"; things could be very different by then for hardware and software.
Not in GPU performance, sure. But the apple silicon is still game changing in cpu performance, efficiency, and battery life, not to mention the build quality, track pad, speakers/mic, and screen quality improvements you get over the vast majority of windows laptops. Of course it's not a "PC killer", but for anyone who's not a gamer and looking to spend $1k-$3k on a laptop, it's very hard to justify getting anything else unless you absolutely need windows for something.
People treated these things like the second coming of Jesus. It's interesting to watch them shave their claims and astonishment back to the regular levels we have for every new tech release.
@@leonfrancis3418 What I hate is that the M1 Mac in both generations is actually really innovative, but Apple makes it seem like life will never be the same at every freaking keynote, so the actually good stuff like this still feels like it's falling short.
One of the main use cases for a laptop is while unplugged. Large differences in performance can result in various situations, and only testing while plugged in (or unplugged) can skew results to the testers desires. I think Apple's claims are probably true IF the laptop they were comparing against did poorly while unplugged, so Apple's results would look more impressive. I would bet the power hungry high end PC laptops would all make comparisons while plugged in.
And THAT should be a huge point actually. We're talking about Laptops here. They're meant to be at least partly used without being plugged in. And most Windows Laptops are SO much slower on battery while Macbooks just run the same and still last literally 3 times longer.
@@Raja995mh33 Yes, but when you do huge workloads you can have a workstation or a desktop in the PC world; in the M1 world that is not possible. Also, a desktop graphics card or CPU is more powerful than a laptop one and can destroy the M1 Pro/Max even more. Mac users have no alternative to what Apple gives them. PC users have many options. th-cam.com/video/rQJTkWWkc0g/w-d-xo.html&ab_channel=TallyHoTech
@@inwedavid6919 Oh wow dude, a desktop is more powerful than a laptop. No shit, Sherlock. We're talking about laptop vs laptop here! NOT desktop vs. laptop. That is the whole point. And the point is also that you CAN do these heavy workloads on a MacBook without a problem, and it will do them just as fast, not take like 10 times longer like many Windows laptops that like to throw around big numbers that only count while plugged in.
@@Raja995mh33 Is your country that starved for power outlets though? I've literally never had a problem with laptops; they even have enough performance & battery life unplugged for the small commutes in between places that have a power outlet available to me 🤔
@@ezicarus8216 Just because you may not do that doesn't mean no one does. Many people just don't do that BECAUSE their Windows laptop becomes useless at these tasks when it runs on battery. I know a lot of people who work and do heavy workloads on the go, often on battery, and all of them use MacBooks for that. And even when you're at home or whatever, it's nice to know that you can just unplug your machine and work a while in the living room or wherever for at least 2-3 hours without any bottlenecks or throttling.
I love LTT, but didn’t you admit to forgetting WoW runs natively in your first MacBook Pro vid? I suppose you forgot it again? Or are you just selecting results? Also, why step up to the M16 for this one, in comparison to your last vid where you compared a 14-inch laptop, and not when you test the 16-inch version? And finally, how about rerunning tests that are native across all platforms and then running the laptops through those tests unplugged to see how real-world use would be?
So if he runs WoW and it does well, does that mean the laptop is now good for gaming? Dude, put your bias aside; this test is to show this laptop isn't good for gaming, even if it does well in one game. Gamers aren't only playing one or two games. If I'm buying a laptop it should run all apps. Is macOS a phone OS or a computer OS?
@@truthseeker6804 no it’s not a gaming laptop. But saying it can’t play any games well is not true. Some people who use laptops for things other than gaming still like to play one or two games. Additionally there are other reviews out there by PC guys with fairer comparisons. If you’re going to compare something compare like tests. Asking me to high jump against an Olympic high jumper is just silly and so is running bench marks on machines that aren’t natively supported across all platforms when there is a myriad of benchmarks that are.
@@baldwanderer8188 Yes, some people play one or two games on PC, and I'm an example, but definitely not everyone plays the same one or two games. If he only did a WoW test and the MacBook did well, many people would conclude it's great for gaming and would do similarly in their games, which it isn't. So a test should show as much of a device's weakness as possible. If you want a fanboy channel that only shows the pros, there's the Max Tech channel for that. Asking you to high jump against an Olympic jumper is a good test, because if I was in the market looking to buy a jumper I'd want to see where I get the best bang for the buck. Don't forget these MacBooks are priced the same as, and even more than, a lot of these Windows laptops that can do more.
I’m primarily interested in the M1 for ML workloads. While it offers potential for edge applications, most enterprise ML is on RHEL using the x86_64 architecture, so the Intel-CPU Macs end up being the better choice for a development workstation (Windows just ain’t it, and find me an IT department willing to deploy native *nix laptops. I’ll wait). That dynamic will probably shift over time, but I’ll probably treasure my Intel MacBooks for longer than I should. All this is to say, LTT: love your content, but I would appreciate reviews with a data scientist's perspective.
Sorry for such a straightforward question. My work is based on Java most of the time. Are the compiling times better on M1? Right now I'm on 10th gen i7 10700k. Should I upgrade to alder lake or M1?
The CPU stuff on numpy seems really promising. But training on the GPU seems rather meh; at least in computer vision the performance improvements from using tensor cores are just way too large. Seems like performance per watt is roughly on par with a full-power 3090 - not exactly an efficient GPU. But IMHO any GPU-heavy workload is much better suited to remote servers than a laptop.
Well, I suddenly had an idea... Have you tried testing the workloads without external power? I think Apple might have published benchmarks made on battery power only (since it's supposed to be a laptop and that is the use case aimed at). I would just be curious if that changes the ratio... I mean, sure you can have a 3080 in a laptop... But can it deliver 3080 power only running on battery?
Of course; just switch the performance profile to max and it will burn through the battery faster, but deliver. But in the end, all laptops overheat; they are not really for professional usage. You buy (or custom build) a desktop for that, or do what 90% of people do: a PC with 2 RTX 3080s, which is insane performance.
@@nigratruo A lot of big gaming laptops actually can't perform as well on battery. A lot of them can't handle full power delivery without the external brick plugged in. Unsure which laptops are affected by this tho
The answer to your question is a huge no. Unplugged, the 3080 gets destroyed, but there's no change in the Apple. So if you judge a laptop in terms of being disconnected, then it is really no contest at all. But that would not make this video as much fun for the PC guys. Honestly, if you are just going to leave a laptop plugged in, then why get a laptop at all? Desktop all the way. But for those who want professional-capable laptops, the Apple is a good way to go, even though they cost quite a bit if you spec them up.
@@justinbuettner8547 Huh? Are you really this clueless? Of course you can switch the performance profile on a windows machine to full performance, no powersaving and then you get the full power, plugged in or not. This is not your typically dumbed-down OS like MacOS for people that are clueless about technology. And only in your Apple fanboy deluded dreams this pathetic Apple GPU can best the most powerful consumer GPU on the planet currently. The M1 chip overheats like crazy if you run the CPU and GPU full tilt, SoC is a really dumb idea, to put all the massive heat generation in one tiny little spot.
@@justinbuettner8547 I use my laptop for video editing and it's almost always plugged in. Why buy a laptop then? Well so I can carry it from my home to the workplace and back again, also on trips, both business and leisure. I just plug it in to the external monitor (one at home and other at the work place) and a wall socket and I'm ready to go. And for the price difference I can afford a good 4k colour calibrated monitor which is important to me.
It's bad because software support is bad. For anyone who already has a bunch of x86 music software/VSTs/VSTis, using M1s is a pain in the butt, and that's pretty much all professional music producers. To use it you'd have to start from scratch and ignore pretty much all the software that already exists, or you'd have to wait for those bright times when there are decent enough options on Apple silicon to replace every piece of software that producers use now.
@@rawdez_ Not true. I produce music and I’d consider myself probably somewhere in the intermediate level. I upgraded to a 2020 M1 MacBook Pro this summer and I’ve had very few, if any, plugins outright not work. Sure, we’re still waiting for a lot of plugins run natively on Apple Silicon, but in the meantime I’ve had no issues with running many plugins through Rosetta and the M1 handles them like a champ. As for what I actually use - both of my DAWs, Ableton Live 11 and FL Studio 20 as well as plugins such as Serum, FabFilter, Omnisphere, Waves, iZotope, Arturia, Native Instruments, etc. all run flawlessly.
@@mike.d99 I've said "professional music producers", not "hobby musicians" as far as writing music goes you literally can write music using only wave editor, check Burial. so basically you can do it online in a browser on a $100 smartphone. still doesn't mean that M1 macs are actually good for music production. but yes, you can write pretty much anything on any modern device, M1 with its limitations isn't the worst option. just can be annoying AF for people who are used to x86 software and straight up bad option for professionals who make money and need EVERYTHING to work, "very few, if any, plugins outright not work" isn't acceptable. what would you do if that one piece of software that you own, know how to use and actually need for your current project doesn't work properly, produce glitches or is unstable?
Tailoring the chip to videomakers is essentially marketing genius, since it'll satisfy all the TH-cam reviewers
They’ll satisfy every mac user LOL, that was the point of every Macbook for years
The difference is this year's Macs are even better at being Macs than all of the previous Macs, and that is not a bad thing
@@dominikhanus9320 well they've been crap since 2016
@@Jo21
Yes, but these are simply not.
There are a bunch of people with priorities that this device is tailored for, and they are going to be incredibly happy with it.
The same way there are people who are going to love Asus G15 and people who will tell you that for their priorities it’s complete trash.
Lol
@@dominikhanus9320 The problem is the price though. It’s just not a good value.
Man, that single photo of Linus has gotten more mileage than all of the cars I've ever owned combined.
Lol
But not as much mileage than you mom lolololol get rekt kid
@@maddrone7814 Bro how old are you
@@maddrone7814 nice, keeping it old school.
mozly not as old as your mom lolololol get rekt kid
2:54 That's a bold choice to use a 90's brick phone as a pointing device instead of literally _anything_ else. What a delightfully weird repurposing of e-waste, I fully support it.
Therapist: Gangthony isn't real, he can't hurt you.
Gangthony: 'MAXED OUT MACS ARE MAD MACS WITH M1 MAX TO THE MAXXX'
It’s like 2000’s cringe all over again lol
No offense I love LTT and I get it’s ironic
Thugthony
Max max super max max super super max max.
Gangthony > Punk Linus
@@randomrdp3356 Max Super Max Max Super Super Max Max Max
i love that the 5950X testbench is so much faster that it just goes off the charts
i know it's not a fair comparison, it just looks funny
Well, anyone doing serious work who can afford a $3000 MacBook can afford, and should use, a $3000 desktop, and can eat the added peripheral cost
It was just to remind fan boys it's still a laptop and not a monstrosity
Isn't it a fair comparison? I've seen a lot of guys "reviewing" or showcasing their M1 Max Macs as equal or better than a desktop! A desktop which they don't have or had a 5 year old one with middling specs! 🤣
I've also gotten comments from people who bought the propaganda and tell me it has RTX 3080 (not mobile) performance! 🤣
I’m just sad that there was no Davinci on Linux AMD performance comparison in the graph. Genuinely interested in how it compares either way.
It is absolutely fair. If I'm going to spend $3000 on a computer, I need all the comparisons.
I wish you guys did more testing in audio production, I'd be curious to see how Logic pro and Ableton Live run with lots of VSTS on the M1 Max
THIS!! They need to focus more on the audio production side
Alas, for sound is often forgotten but always essential
It's a no-go if you're serious about music production. Rosetta is literal trash to run VSTs on, and more than 70% of the 3rd-party plugins are just not compatible on M1; they glitch out and have performance issues. The audio interface doesn't show up sometimes, and if you have to change your choice of VSTs based on the machine, it's not a good machine to begin with.
@@clickbaitpro How come in other videos I've watched, people have little to no problem with their VSTs on the new Macs?
@@clickbaitpro You're dead wrong about pretty much all of that. There are very very few plug ins that cannot run in Rosetta, and the hit at 10% is far less than the CPU advantage that the M1 provides. NI are lagging, but NI have traditionally lagged, they're pretty much the worst at adopting to anything new. That said, Kontakt runs in Rosetta etc.
You guys should really add model training to your benchmark. Find a basic TensorFlow notebook and see how M1 fares against a neural workload.
Please please please!
Especially compared to Nvidia's Tensor cores on 20 and 30 series cards, which are both crazy powerful and have great platform support
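For anyone who wants to try something like this themselves, here's a minimal sketch of the kind of run being suggested: a tiny Keras CNN trained on MNIST, timed end to end. It assumes TensorFlow is installed for your platform (e.g. tensorflow-macos plus tensorflow-metal on Apple silicon, or the standard CUDA build on an Nvidia laptop); it's a rough throughput check, not a rigorous benchmark.

import time
import tensorflow as tf

# Load MNIST and normalize; add a channel dimension for the Conv2D layer.
(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_train = (x_train / 255.0)[..., None].astype("float32")

# Small CNN, just enough work to exercise whatever device TF picks up.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=(28, 28, 1)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

start = time.perf_counter()
model.fit(x_train, y_train, batch_size=128, epochs=3, verbose=1)
print(f"3 epochs took {time.perf_counter() - start:.1f}s on "
      f"{tf.config.list_physical_devices()}")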
I clicked on the video expecting very good comparisons in diffrent scenarios and such. All i got is dolphin emulator and some random bench data.
How disappointing.
@@OG_ALviK check hardware unboxed, LTT is like fast food despite being the biggest tech channel out there
100%, no idea wtf the dolphin review was. Odd at best.
I dunno if/how you're coaching Anthony to host these things, whatever you're doing, keep doing it. He just keeps getting better and better for every video, and he was good enough with a decent margin to begin with!
Bro Anthony is the best!!! I just wanna be best friends with him 😂
Plot twist:Anthony is coaching Linus so that he doesn't drop thousands of dollars worth of electronics every show.
i think he is just smart enough to do it naturally
I was originally put off by him but I now love the man and need him to make more videos. He's great!
Anthony is the best ❤️
I was wondering why the rtx had an arrow head on its bar graph, while the others were normal rectangles. Then I realized the rtx was so much higher than the others that it was being truncated so you could even compare the other bars 😂
^this
Yes, because it makes good sense to compare one of the most expensive desktop configs you can buy to these LAPTOPS. There are not enough eyeballs available in the world to roll for this asinine comparison…
@@ryanw8664 whatever it takes for them to make the mac look like a piece of trash. Honestly what trashy review
@@HuyTran-sb2ql malding comment
I wouldn’t doubt yourself so quickly. Max Tech did a response video to this revealing a surprising and very concerning set of anomalies in the data presented in this video suggesting either serious issues with testing methodology or massive pro Intel bias. An update urgently needed by the Linus team to respond to those observations and recover lost credibility
I'm a little puzzled about Apple's claims. On the CPU side, it makes sense. The M1 is using a much more efficient architecture compared to the x86 that AMD and Intel is using. But that's not how GPU's work, right?
I’m pretty sure that we can’t tell that much since Most apps and engines are not even optimised for ARM architecture much less this SoC…
Till now, It was mostly Rosetta 2 which was doing most of the work, however it doesn’t really mean much against other PC’s
It’s also using a more efficient architecture for the GPU.
Normally you’ll have the CPU feeding data to the GPU, and the GPU storing in it’s memory. This is why high end GPUs have higher bandwidths, because this is a limiting factor.
This new chips don’t need to do this, the memory is unified, and CPU and GPU can share memory directly. This obviously requires massive changes in the application.
Right now what we can see, is that the M1 macs have very limited graphics performance because Rosetta can’t use this trick, it emulates the previous architecture by copying data from CPU memory to GPU memory (in this case they’re the same). This essentially halves the throughput, and that’s why performance is so poor.
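If it helps to picture it, here's a crude Python illustration of why that staging copy hurts: the "discrete GPU" path is simulated by copying the buffer before doing the work, while the "unified memory" path operates on the shared buffer in place. Purely illustrative; the real cost on hardware obviously depends on bus bandwidth and driver behaviour.

import time
import numpy as np

# One big buffer standing in for a frame's worth of data.
data = np.random.rand(4096, 4096).astype(np.float32)

# "Unified memory" style: operate directly on the shared buffer.
start = time.perf_counter()
for _ in range(50):
    out = data * 2.0
in_place = time.perf_counter() - start

# "Discrete GPU" style emulated in software: stage a copy first, then operate.
start = time.perf_counter()
for _ in range(50):
    staged = data.copy()   # stands in for the CPU -> GPU transfer
    out = staged * 2.0
with_copy = time.perf_counter() - start

print(f"in place: {in_place:.2f}s, with staging copy: {with_copy:.2f}s")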
It's kind of a stepped-up version of Smart Access Memory. The M1 series is the first modern implementation of a consumer unified direct-access memory architecture. Just like how the most effective/efficient mining GPUs are really the best memory bus implementations, these unified CPU/GPU/DRAM chips are going to start eating the modular systems' lunch as long as they can get a large enough memory pool.
@@ezicarus8216 shhhh, they might realize that the imaginary system reserved memory for the igpu is actually a thing
also... x86 consoles go as far back as the original Xbox, which, yeah, used shared memory for the GPU/CPU
GPUs have way more than raw performance. For example, Nvidia's OptiX is way smarter, and therefore better for rendering, than CUDA on the same card. Think of RT cores in games: their raytracing can't be matched by raw performance, even though they consume less die space and power.
These recent reviews with Anthony hosting are so damn high quality that I can't wait for some of the LTT Lab content to drop in the next year. It's gonna be absolutely sick.
Read this before video started. Then the first 3 seconds hit me…
Yeah
LTT labs content won't be in the form of videos, they'll be mostly print media, articles and posts on their website. He said so himself.
@@vijeykumar7429 no. He said most of it. Can’t imagine them spending so much money without making use of the information in videos.
this guy's voice alone is 10000x better than Linus's IMO.
One thing to note: you can't configure the M1 Max chip with 16GB of memory, so if you don't NEED 32GB, it's actually a $600 difference to go from the base M1 Pro to the base M1 Max. That's $200 for the chip upgrade itself and $400 for the memory upgrade.
Nope, it is currently not possible to buy an M1 Max Macbook Pro 14 with 16GB unified memory. 32GB is the lowest option on Apple's site.
Yeah when I bought my M1 Pro 16 in November I wanted to spring for the max but like you said it quickly became a grand difference in price and I’m really happy with the Pro.
@@stewardappiagyei6982 ... that's what he said...
I would not buy a laptop with 16GB of memory. I know it's use-case dependent, but I'm constantly running up against 16GB on laptop and desktop. Granted, usually it's when doing 3D workflows or container development, but it does feel like even more casual use and gaming workloads are going to be pushing up on 16GB soon enough that the cost of the upgrade is worth it to keep the computer relevant longer. Like, I'm using 10.6GB of memory right now just to have like 20 tabs of Firefox/Chrome and Spotify open.
@@QuakerAssassin yeah it’s def application specific. I bought my Mac for just working with Lightroom and Photoshop and to work with raw files and it’s amazing for that. But I really don’t render anything or game so it works great for me as far as productivity goes.
This "the answer may surprise you. It sure surprised me" thing is starting to be a distinctive mark of Anthony's videos and I like it. Love the energy!
To the max. Haha ditto!
same here, I enjoy Anthony's vids
ok
I thought he was going to say Apple was true to their word and that their marketing accurately reflected their products; that would have been shocking.
@@dmt1994 Yeah, same
I’d be interested in the power consumption comparison during these tests
What nvidia doesn’t want you to hear.
I dont think you will ever see this in this channel.. the other machines would look like crap
@@cristhiantv they’ll probably say some delusional things like ”pro users always have their laptop plugged in anyway so power consumption isn’t an issue”.
Power consumption affects nothing in your life and costs next to nothing extra, unless you live in a shithole without reliable power.
@@sqlevolicious you don’t know his life, do u? Also power consumption is important if you’re rendering videos on the go…. But you’re gonna probably reply something telling us how stupid we are just by looking at your comments before… so don’t mind answering, have a good day
If you want to actually see the promised performance gains:
Use it for software development.
Build times went from 4 minutes (2019, 16" MBP, max spec) to
I used to respect the guy but i'm not sure what to think about him or LTT at this point. If they don't address this i'm unsubscribing.
- th-cam.com/video/g1EyoTu5AX4/w-d-xo.html M1 MAX - 32 CORE BENCHMARKS v RTX 3080 & 3090 Will Blow Your Mind!
- th-cam.com/video/OMgCsvcMIaQ/w-d-xo.html M1 Max 32 Core GPU v 165W RTX 3080 Laptop
- th-cam.com/video/JM27aT9qhZc/w-d-xo.html 16" MacBook Pro vs RTX 3080 Razer Blade - SORRY Nvidia..
- th-cam.com/video/YX9ttJ0coe4/w-d-xo.html 16" M1 Max MacBook Pro vs. My $6000 PC
The list keeps going. These results have been out for a while too so LTT really have no excuses. They didn't use optimized software, they didn't compare laptop performance while on battery, and they didn't use readily available GPU benchmarking software that's already proven to place the M1 Max around the 3080 level. They need to explain.
Something I think was missing was battery life under load. A key part of Apple’s claims was sustained performance even on battery and at much greater efficiency. So I’m curious how the gaming comparisons would look if you capped framerates to 60 or 30 across machines and compared battery life then. You showed Apple exaggerated how close they were in raw performance, and now I want to know how much they exaggerated on efficiency.
Well, to actually make a comparison, the PC Laptops would need to deliver full performance on battery, which they can't.
@@andreasbuder4417 it would need a laptop that costs as much as the Macs. The Zephyrus was half the cost of the Macs here
If you have such a workhorse, why use it on battery, where it would die in less than 4 hours IF it started at 100%? Seems like an extremely unrealistic scenario.
@@Natsukashii1111 It is the same price? At least in Denmark
wait, my mistake, it is the same price as the low-end M1 Pro 14 inch (2,600 dollars), therefore cheaper than the M1 Max 14 inch (5,000 dollars)
I'm really tired of mobile parts being called the same name (eg: 3080) as their exponentially more powerful discrete counterparts. They're fundamentally different parts I feel
I mean, they’re up to twice as powerful on desktop, but that’s plenty to mislead consumers. AMD and Apple aren’t doing that, though. Just Nvidia.
I take issue with your use of the word “discrete” here - the 3080 laptop GPU is still discrete graphics because it’s not on-die with the CPU. Still, I take your point, and I second it.
Technically the 3060 is different, has more cuda cores than the desktop variant and that's why they are actually comparable
That's their intention
@@djsnowpdx That's a fair distinction. Is there a category to describe desktop + workstation + server GPUs? The only thing I can think of is 'PCIe GPUs', vs mobile GPUs and iGPUs. There's also the distinction between the specially-made rackmount-only versions, like the A100, which although they use PCIe, are not PCIe-socketable, which further muddies things
@@gustavrsh Probably right, might just be an upselling tactic
I really like the confidence Anthony has grown over time standing in front of the camera :)
would've been interesting to also include a G15, seems like a fair competitor (3070 and a pretty good ryzen chip, and about 7 hours of battery life)
I mean this comparison is fair because the Zephyrus is cheaper. So it actually isn't fair to the Zephyrus if anything.
Would be interesting if they used TeraFLOPS as a unit of measurement to determine estimated GPU performance. :) Now it's not the best unit to use, but the FLOP can show 32-bit precision calculations per second.
Not only not the best, Teraflops is quite possibly the worst measurement to use, since for every generation and architecture performance per flop can differ so much.
The only thing it's good for is marketing number shows (also relative estimated performance within one GPU family of the same generation, but that's beside the point).
Lenovo Legion would be better because it has a MUX switch
@@ZerograviTea. Wow. I didn't know it was the worst. So, what is the best unit of measurement for GPU performance? GPU bandwidth (GB/s), throughput, or something totally different?
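For what it's worth, the marketing TFLOPS figure is just arithmetic: shader ALUs x clock x 2 (a fused multiply-add counts as two FLOPs per cycle). A rough back-of-the-envelope sketch, using approximate publicly quoted specs rather than anything measured:

def peak_fp32_tflops(alus: int, clock_ghz: float, flops_per_cycle: int = 2) -> float:
    # Theoretical peak only; real workloads rarely come close.
    return alus * clock_ghz * flops_per_cycle / 1000.0

# ~4096 ALUs at ~1.3 GHz for the 32-core M1 Max GPU (approximate figures)
print(f"M1 Max 32-core GPU : ~{peak_fp32_tflops(4096, 1.296):.1f} TFLOPS")
# ~6144 CUDA cores at ~1.5 GHz for a full-power RTX 3080 Laptop GPU (approximate)
print(f"RTX 3080 Laptop GPU: ~{peak_fp32_tflops(6144, 1.545):.1f} TFLOPS")

Which is exactly why it's a poor comparison metric across architectures: nothing in that formula says how well the chip actually keeps those ALUs fed.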
In laptop comparisons I believe having separate benchmarks for plugged AND unplugged scenarios would shine more light on Apple claims.
This, the mac slaughters every laptop on battery lol!
@@Prithvidiamond every laptop, you say? You do know there are laptops with a desktop CPU and desktop GPU? I mean, they are absolutely huge and barely transportable, but they are still laptops, and they will be 2 to 3 times more powerful than M1 Macs for the same price.
It's not a fair comparison but you might want to lower your expectations on Apple claim.
@@Natsukashii1111 "on battery"
yeah but why would you do something resource intensive on battery....
@@Natsukashii1111 Laptop and portable computer aren't the same.
A Macbook is a laptop. Some of the Clevos you're talking about are "portable" computers with which you can do everything as long as you have a desk and a power socket. Without those two it's a bigass brick good for nothing.
The professional presentation and eloquent voice of my favorite Linus Media Group personality make this review very entertaining and informative!
So what Macs did you get?
"M1 max"
Yeah I know you got M1 Macs, but what model
"M1 Max..."
**flips desk**
lol!
Can we talk about the B-roll camera shots? Seems like they’re trying some new techniques here and I love it!
They're so clean!
@@charredolive Unlike Linus' humor. Lol ... :)
All jokes aside, LTT quality has up ticked in the last few months. :) I like this new style a lot.
It would have been nice to see the 2020 Intel MacBook Pro's included in these graphs.
It's no wonder all of the reviews were so glowing when these laptops came out: almost all of them focus exclusively on video editing and the Adobe suite. "Benchmarking" is oftentimes just video render times, and that's frustrating because, as you can clearly see, it doesn't paint a good picture overall. The Zephyrus is what, at least $1k less? And it performs largely the same, at the cost of battery life. I guess efficiency is a good thing, but these laptops are good for only really very specific purposes, and I question whether they entirely deserved the ubiquitous glowing reviews when they dropped.
If you also consider programming, the M1 Pro and Max outshine the competition. Android projects and Java projects build significantly faster than on even the top-end machines running Linux. Python and TensorFlow builds are also faster, although somehow the M1 Pro trains and builds ML models faster than the M1 Max for some reason. So in the departments of media creation and programming these laptops are truly top of the class.
Apple's gig has never been good value. I would actually consider buying it for the hardware if not for the OS lock-in. $1k for weight/battery life/build quality? Sure, why not.
@@Lodinn This is why, despite its many downsides, I still kind of like the MacBook 16in 2019 with the updated keyboard and i7. Boot Camp gives it longevity, and since it runs x86, it runs all modern day apps. Obviously efficiency isn't nearly there, but all the other MacBook perks are, which makes it a rather nice machine. Outclassed for sure by these last few years of laptops by orders of magnitude, but hey, until Razer or Microsoft can get the build quality down as good as Apple has, it's an attractive option.
@@aritradey8334 That's fair! I haven't seen too many benchmarks in the programming world, which I feel is telling when it comes to the reviewer landscape. With that being said, I remember some of the Hardware Unboxed review, and now this one, and they are such a stark contrast to the uniform praise these received upon launch. Great machines for sure, especially for those who use them for the areas they excel at. I guess I'm just rather exhausted by all of the review outlets only reviewing things for videography, simply because that's what they do. Their reviews shouldn't be a general "review" and should be more a "videographer review", so that those who don't watch/read 18 reviews like a lot of us here who do this for fun don't get the wrong ideas.
It did make me wonder, and reminded me of how Volkswagen optimized their software for specific use cases. I briefly considered an M1 for a Linux laptop but then quickly reconsidered, if for nothing else then the keyboard, and went for a Thinkpad P series. I don't think these Macs are good as general purpose computers. They are fine for the same tasks a Chromebook is also good for, or for the special video editing stuff. Seems quite niche; lucky them they can sell it with marketing.
Great perspective, appreciate the continued, in-depth coverage on these. I also appreciate what feels like an objective, enthusiastic investigation of the tech, neither a takedown nor blind exaltation, thank you so much for your work!
I’m a video editor , I have used Mac and pc for a long time. Recently built a nice PC and I game too much on it lol so now I’m thinking of getting the M1 Max for portability. Glad to hear it’s a beast at what I need it for. This is definitely not for everyone
It definitely is a beast especially if it has a native support for Apple silicon. If you game unfortunately there isn’t any game that natively supports it yet, if there was then you’d get close to 3080’s performance for far greater efficiency. The biggest advantage of these chips are the performance power you get on the go versus any other laptop on the go. The MacBooks just smoke them there and if you travel a lot getting a MacBook over the others is going to be a no brainer. Just remember that you’d have to sacrifice playing some AAA game titles though but if Apple themselves release some AAA games for the Mac, I’m sure more game devs would see the potential in the Mac and port titles to them. That possibility definitely exists but it’s going to be a gamble.
@@almuel you should watch the video.......... it is Rtx 2060 rather than 3080
Interestingly, Max Tech did a response video to this revealing a surprising and very concerning set of anomalies in the data presented, suggesting either serious issues with testing methodology or massive pro-Intel bias. Either way, an update is urgently needed from the Linus team to respond to those observations and recover lost credibility
@@skadi7654 no, they misrepresented data for whatever reason. Others have proven the reality, but although IMO LTT were raising an important and valid concern about these laptops, they did it in a very sketchy and either underhanded or unprofessional way. See the Max Tech response for more details.
Same situation, 3080 gaming desktop but wanted a m1 for portability. You getting the m1 pro max or m1 pro ?
Apple's deprecation of OpenGL support is nasty.
They pretty much had to for this M1 chip anyway. Can't really run widely compatible API's if you're going to do specialised hardware & also claim it slays a top of the line dGPU while using less than half the power. They just don't tell you that the software to actually get the claimed performance isn't widely available (yet).
@@MLWJ1993 Just wait until the community implements OpenGL using Metal, similar to MoltenVK. It's not really "specialized hardware", it's just a graphics API, that's how every GPU works. That's why OpenGL support is still ubiquitous on non-Apple GPUs, even though they're architecturally much more geared towards Dx12 and Vulkan, which are very similar to Metal (in fact, Metal itself is barely anything more than a deliberately incompatible clone of Vulkan because Apple is still Apple).
The M1 CPU may be awesome at clearing up decades-long inefficiencies of the x86 architecture, but the GPU world has long progressed way beyond that. Apple has no such advantage there. The only reason they are even remotely competitive in a performance per watt benchmark is TSMC's 5nm node, to which they currently have exclusive access, but from an architectural standpoint they have a lot of catching up to do with both AMD and Nvidia.
@@DeeSnow97 well, Apple couldn’t “just wait.” They had a product they were ready to sell.
@@djsnowpdx lol, what a horrible take, Apple could have just kept on supporting OpenGL and not sold an incomplete product
@@DeeSnow97 The M1 just sucks for "community anything though" since Apple doesn't really do much of anything to have "the community" fix up their slack. Most of the time they specifically go down the path where they like "the community" to be able to do absolutely nothing. Like doing basic servicing of a device...
I love the way that Mario Sunshine is used as a benchmark here lmao
Only thing that Macs can run
I would love one day to see Deep Learning Benchmarks as well ... as a DL practitioner, looking forward to the comparison for both CPU and GPU workloads.
I know! The code to get a simple run of MNIST going is just a couple blocks of copy paste.
Get a workstation grade laptop. (Dell Precision / Thinkpad P-series)
@Christos Kokaliaris You can get these notebooks with 500nits, 4k 120Hz displays if you are willing to spend the cash. Personally I use external monitors.
@@MrGeometres if you run stuff in the cloud, nothing beats a 900 dollar MacBook Air. You get a wonderful display, a great touchpad, a nice keyboard. At some point you have to run stuff in the cloud anyway if you are doing serious business. It doesn't make sense to put thousands of dollars into workstations that don't run most of the time and don't scale at all.
Unfortunately, the answer on HDMI 2.1 adapters is currently no, for software reasons. I think if you guys make a video on it that could get Apple’s attention to finish it
sure, cause they're apple's favorite reviewers.
yes, because Apple is well known to take into consideration what people outside Apple are saying /s
I'm interested if the laptops were connected to power. Also interested what the battery percentages would be at the end of the test with all laptops disconnected from power, and how hard the fans blew.
I think it's pretty clear that Macs run much better on battery power than most PCs. At least until the latest Intel and AMD chips are properly put to the test.
That is probably how Apple got theirs to look so good in their comparisons... they unplugged the PCs...
@@petereriksson6760 lol u right. so I guess Apple actually makes laptops...while PCs are designed be lost in house fires..got it.
@@petereriksson6760 that's exactly what they've done and there's nothing wrong with that because laptops are meant to be used unplugged!
@@angeloangibeau5814 I disagree heavily. The point of laptops is portability, but that doesn't mean I will use them unplugged.
Battery life is good but not as important as Apple makes it out to be. It's not as critical as it is on phones.
When I'm using my laptop for more than an hour, it's usually on a desk, and almost every place I visit that has a desk also has an outlet.
Anthony was the best decision LTT has made recently. Congrats to the both of you!
this test is pure crap. They should be sued by Apple for misinformation and lies.
@@nnnnnn3647 wut? They can't show results which they measured?
Or are you gonna say they should have only used software that works better on macs?
@@damara2268 They should use software that people really use./
@@nnnnnn3647 ok
@@nnnnnn3647 cope and seethe
Blender 3.0 now runs natively on M1, so that could be a nice comparison.
Anthony, your screen-presence has improved so much from your debut. You’ve clearly gotten much more comfortable in front of the camera, and you provide a wonderfully logical insight (pun intended) into the things you present. I know you’re going by a script, but surely you contribute, and you make it yours.
The actual video to the sponsor sequence transitions are always smooth af ngl
Great review but I'm curious about differences between pro and max for development benchmarks i.e. code compilation. This is generally a very large use case for these macbooks.
They use the same CPU, so while the extra bandwidth (and cache?) may make a difference, it's unlikely to be a huge one.
Depends on what you're compiling, if your stuff can compile on mac and is not using GPU acceleration, then the difference is minimal/non-existent.
The efficiency cores on Intel next year will be very interesting, and AMD finally moving to 5nm, though that is supposedly end of year, should bring a nice performance jump too, including with the new cache stacking. It's great getting past the stagnation.
I'm probably upgrading at the end of next year, moving from a laptop (i7 9750H, it's 3 years old now) to a PC since I moved continents, and things like Rider and VS Code having remote support mean I can just have the home PC host the stuff (which I do often enough on my NUC if I need to run something overnight).
Check Alexander Ziskind youtube channel for many, many development benchmarks done to the M1/Pro/Max machines, most videos are very short and to the point.
In general, CPU-bound work sees very little difference between the Pro and Max chips; you end up seeing more differences caused by the number of cores available on the different versions than by the kind of chip. In some cases, especially single-threaded ones like some JavaScript tests, an MBP16 running a maxed out i9 might beat the numbers, but if the workflow is multithreaded the M1 chips do it better.
Unless your workflow really needs more than 32GB of RAM a 10 core M1 Pro is probably the "sweet spot" for development at the moment.
My friend is a senior engineer for Apple and he does both iOS and MacOS compiling. He got a Pro for himself and they gave him a Pro for work too because the Max isn't necessary for their developers for the most part. Only certain developers would get allocated a Max but he hasn't heard of any devs getting them.
The lack of Vulkan, CUDA, or OpenCL support on Macs is absolutely killing multi-platform compatibility for even professional workloads, and games have taken a giant leap backwards.
That is Apple's doing: they just remove and destroy industry standards like OpenCL and OpenGL / CUDA (they never supported the most powerful GPUs, which are Nvidia's). On Linux and Windows, when you get a new standard, they let you keep using the old one; it doesn't just get removed, which destroys a lot of software. You can still run 32-bit apps on Windows and Linux very well, and that is how you must do it. Apple is just typically arrogant and does not care about its users. That is the reason they have never had more than 10% market share globally, not once in the 44 years the company has existed.
@@nigratruo x86 is stagnant and needs a complete reboot... but no one has the guts for it... Apple did, and they now have quite powerful machines that use little power... perfect? not yet... but way better for what they are meant for, and on top of that they can game decently... but again not perfectly... yet. but the extra power of the M1 chips? especially the Pro and the Max? well, it could (should) be interesting for game devs to tap into
Anthony is a wonderful personality and knows how to mix humour and information supremely well. Love his Mac content!
Anthony, you are nailing reviews recently! Your voice acting/narration is SO professional :) great stuff mate
the progression of Anthony and how much better/confident he has become on camera should be an inspiration for everyone to practice confidence in social setting (which is even worse on camera when you're staring into a lens instead of talking to people)
I am happy Apple is making great arm processors, and I’m also happy Anthony did the review for this episode again. Keep up the great work guys.
Apple makes nothing, thank TSMC
@@davide4725 “Thanks TSMC”
You sound like the kind of guy who loves bringing up John Lennon’s wife beating tendencies every time someone mentions they like the Beatles lmao
@@parkerdavis7859 Cute assumptions kid. Good bye now...
I am also loving the progression for the ARM space. What really excites me isn't the CPU or GPU in these, its the optimizations they made to make ARM that competitive. They're getting asic-like performance for a lot of low-level stuff.
@@davide4725 i find it funny how you called the other guy "kid" while you clearly have no idea how R&D, design, audit, documentation, subcontracting and manufacturing work in the tech industry.
"Thank TSMC" lol. Kid please.
Loved the reference, in the intro, to the open air unboxings Linus made many years ago :))
Honestly, i love every review Anthony does. His voice is like butter on a subwoofer
0 dislikes! Great vid LTT👍👍👍
Starting to hate this joke. YouTube should make the dislikes public again.
Blender 3.1's metal support is very nice. I still don't think it beats out some of the higher end RTX cards, but it still performs very well, even in the alpha stages
Things I still want to see covered:
1) How much can the USB-C take? 8 hubs fully loaded with all the native monitors going, plus X extra monitors using DisplayLink, while running a USB-connected NAS and a 10Gb Ethernet dongle
2) eGPU support? If not, what happens if you try it? What if you try to force the Nvidia or AMD drivers with Rosetta?
3) Wipe one of the systems and use it as a daily driver for a week, but this time refuse to install Rosetta. How do the performance numbers change without the emulator running or even installed?
Cant wait to come back to this video 10 years from now.
Whats up future me :)
I love you
Gotta love how you included a 500W desktop system in all the benchmarks where the Macs otherwise dominated ;-)
check 6:01
Anthony, with that last ad transition one can only conclude that you have achieved your final form as a true artist. A poet!
I have the M1 Pro Max. First Apple computer I have owned. And I am nothing but impressed... Sure I could find something I don't like about it. But... I could show you a list of complaints with my last laptops that are far worse. How efficient it is does have a lot of value. My last laptop was $2,000 when i purchased it from Lenovo. And I needed a Go Pro for a project. realized the memory was full and it killed my laptop battery before it could get the footage off. Even Chrome would noticeably kill battery life. Having a laptop that is useless without a being plugged in sucks.
I'm so glad these new macbooks have proper thermals rather than the "bluetooth heatsink" of models prior...
I know right. It honestly looks like the inside of my gaming laptop in there.
The cinematography in this video is amazing.
What's hilarious is that if you read reviews of the Zephyrus it's constantly referred to as over priced and under powered 😂
I like your tests and I am not an Apple fanboy, but your results here are very different from most of the other tech YouTube channels that have tested these MacBooks
Careful, he might delete this comment.
which other tech channel results differ from this? post a real tech channel, not a fanboy channel. before you post make sure that channel does a variety of reviews not only praising apple products.
@@truthseeker6804 How about Matthew Moniz? He did a video and he is not biased
@@truthseeker6804 All that I've seen actually. Here are some
- th-cam.com/video/g1EyoTu5AX4/w-d-xo.html M1 MAX - 32 CORE BENCHMARKS v RTX 3080 & 3090 Will Blow Your Mind!
- th-cam.com/video/OMgCsvcMIaQ/w-d-xo.html M1 Max 32 Core GPU v 165W RTX 3080 Laptop
- th-cam.com/video/JM27aT9qhZc/w-d-xo.html 16" MacBook Pro vs RTX 3080 Razer Blade - SORRY Nvidia..
- th-cam.com/video/YX9ttJ0coe4/w-d-xo.html 16" M1 Max MacBook Pro vs. My $6000 PC
The list keeps going. These results have been out for a while too so LTT really have no excuses. They didn't use optimized software, they didn't compare laptop performance while on battery, and they didn't use readily available GPU benchmarking software that's already proven to place the M1 Max around the 3080 level.
@@andremessado7659 so I watched the first video, and the M1 Max actually lost to the laptop and desktop in the export-times chart, but it did well in timeline playback, which is literally the same as this video's DaVinci Resolve section at 5:28.
in the second video, the gaming laptop totally destroyed the M1 Max on power, not on battery.
I skipped the third video from the biased Max Tech Apple fanboy channel.
regarding the fourth video, the M1 Max lost in all the charts except the 6K BRAW export, which is interesting because the first link you posted showed a faster-than-M1-Max export speed on the GPU.
so in summary, from the first, second and fourth videos, the M1 Max does best in video playback on an editing timeline, but loses to a 3080 or 3090 in video exporting, stabilization, rendering, benchmarks, everything else.
I just auditioned for an animation job. I was put on a last-gen Intel iMac, fired up Blender, put a normal map on one surface in the scene, and the GPU almost caught fire and the whole macOS GUI dropped to 0.5fps. I'm not sh1tting you!!!
M1 can throw a lot of weight around as a DAW host, especially running Logic and AS-native plugins. It's reportedly less well suited to realtime audio tasks (like recording live guitars through software amp sims at low latency in a busy session) but it absolutely pummels at mixing and composing tasks that don't require super-low RTL figures under load. The 32GB Max variant will benefit a serious composer who wants all of the orchestral libraries and soft synths loaded at once, although all that GPU will be drastically underutilized in the same scenario.
I think the focus with these computers is probably performance while being unplugged, which is something I really wish they had tested.
I know it’s not the same as a real thorough test, but most benchmarks agree that the M1 (any variant) run virtually identical both plugged and unplugged.
@@AlejandroLZuvic yeah, but intel laptops are not
You clearly didn't watch the video until the end, lol
Absolutely, One of the main use cases for a laptop is while unplugged. The first test they should do is fully charged and unplugged performance testing, then while charging, and then when fully charged but plugged in. Large differences in performance can result in various situations, and only testing while plugged (or unplugged) can skew results to the testers desires. I think Apple's claims might be correct IF the laptop they were comparing against did poorly while unplugged, so Apple's results would look more impressive.
at around 90°C, as shown by Anthony's testing, I don't see unplugged rendering being viable.
Watching Anthony go from absolutely HATING being on camera to being so much more comfortable that he cracks me the eff up with an intro like that! Bravo Anthony! 👏 👏 👏 I almost spit out my coffee lol'ing at that. Great work.
So much for faster than a 3060, lol.
The media engine seems to be giving the GPU the illusion of more performance than it's actually capable of.
Yeah, it's all smoke and mirrors. People that actually believed these claims..I mean..first time? 🤣
@@shibbychingching4845 I mean no where did I hear apple say it was a gaming machine. The illusion is what you keep telling yourself.
@@bear2507 the illusion is that Apple claimed the performance is about the same as an RTX 3080, when the M1 barely beat the RTX 3060; it's not even close to the RTX 3080, and I mean the mobile RTX GPUs. An RTX is a GAMING GPU, so when they made these claims people would obviously think about its gaming performance. They should have compared it to a professional GPU like a Quadro instead of being either brave or stupid enough to compare it to RTX
@@foxley95 the M1 doesn't have the capability to beat even 1% of Quadro GPUs in 3D tasks
@@foxley95 yeah, I'll go tell my research lab to shut down our datacenter with hundreds of 3080s, because some kid on YouTube said these GPUs are for games only and not generic compute. The comments are full of children who have never touched anything outside Minecraft but have an opinion on everything hahah
So much of this is really about code optimization. For those of us who lived through the changes from Carbon to Cocoa to Metal, and from Motorola to PPC and then to Intel, one of the things that happened after each giant change in architecture was that, over time, as software got updated, the Macs would get faster. Even Apple's own OS is still hitting Rosetta. The review is still fair, but in a year the results from the same hardware will most likely be significantly different.
Steam drains the battery on my 16” Mac(M1 Max) faster than running Windows on Arm(Parallels) + ECAD(Altium) or Keysight ADS for EM field solving. Yeah… Just having the Steam launcher running, not even with a game going.
Oh well, i never intended to game on the Mac anyways since I have a gaming PC… but in terms of work, the Mac can do everything I need it to do in portable form factor, while maintaining all day battery life.
@@Cat-kp7rl but you CAN game on a mac, but I am surprised what I can push out of my OG 13" M1 Air
@@LiLBitsDK I was just backing up his point that unoptimized things can run really bad no matter the device. Like in my case, something as trivial as the Steam launcher
Exactly. That's another crazy thing about M1. It will just get faster as we get updates. Normally machines will be slower as they age since software gets more complicated.
@@LiLBitsDK Yep. I was running Diablo 3 without issues. It heats up my laptop from the same year like crazy
you guys really need to do code compilation tests, that is honestly all that I'm interested in.
That's the first time I heard that someone is preferring the silver color. I also got a silver one and, looking around online, it seems like I'm way in the minority with that decision.
Hope they get eGPUs up and running for the M1 chip soon. Imagine the possibilities.
Absolutely MAD MAX intro! And that was the smoothest and most hilarious segue ever to a sponsor! Well done haha
I agree with what you say: M1 max is literally only for professional video editors, which is a super ultra niche market, for everyone else, it's not worth it.
I think it'd be more accurate to say media professionals and developers in general. It's absolutely fantastic for professional audio production and software development. Silent the vast majority of the time, and it can easily handle on-location and remote tasks with its awesome battery life, with full power whether plugged in or not. The high-impedance-capable headphone jack and best-sound-in-a-laptop ever doesn't hurt either. I think it's important to compare Apples to Apples here (pun intended). They're not designed for gamers, they are designed for professionals. As an equal Windows and MacOS user, my experience with these has been top-notch. For pros, Apple has hit a home run here IMHO. Also, I think the performance-per-watt here should not be ignored, and I don't believe this was mentioned: add that factor to the benchmarks and you'd see some very different results. Energy costs money and affects the environment. And a hot, noisy laptop isn't particularly enjoyable to use day in and day out.
Super niche. Because let's face it, the m1 air can do 4k editing. How many editors need to edit 12 simultaneous 4k streams? Most youtube viewers don't even watch in 4k yet rofl. I really wish it performed better at 3d design.
@@wykananda for audio professionals, most of them were fine with an older-generation MacBook with a high memory configuration though. Also, for people who aren't video editing/audio professionals, macOS is really, really difficult to use, even more so on ARM. Basic stuff like a volume mixer and any sign of useful window management are absent out of the box. What is the point of spending such a premium to get a subpar experience if you're not a video editing/audio professional?
@@pupperemeritus9189 Hi pupper. I'm not sure I understand your comments. Sadly, the previous Macbook laptop generations were all limited to16gb of ram - so high-memory configs were simply not possible. Moving to the ARM architecture did not change the underlying operating system, MacOS, it simply made the laptop hardware run faster, smoother, quieter, and for much longer on a single battery charge. As for the difficult-to-use / sound control / window management - the latest Windows and MacOS are both more than reasonably user-friendly and well-equipped in all these areas - these OSs have both been around for many years and generations now and it shows. As a multi-OS power-user I could nit-pick plenty at both OSs here and there for sure though. However, in my experience, for the countless newbies that I've trained and continue to help, MacOS has to get the nod for getting productive and comfortable more quickly with less frustration and confusion and less problems over the long haul. Let's face it, both operating systems are DEEP. They're both very capable and stable at this stage but either will take time and effort to learn to get the most out of them. Curiously, my current "go to" Windows-based laptop is a 2015 Macbook Pro running Boot Camp - ironically, it's easily the best Windows laptop I've ever owned - cool, quiet, fast, stable, good battery life, well-built, expandable - and, of course, it runs MacOS like a champ too. I'll likely get another 3-4 good years out of it before I hand it down the line. IMO, the 2015 MBP was the best overall professional laptop ever made for Windows, MacOS, or Linux until now. While I can run the ARM version of Windows on the latest MBP via Parallels and so on, I'll have a new laptop King if-ever/when Microsoft Windows gets fully up to ARM speed and these new killer Macs can boot into it natively.
@@wykananda i appreciate your patient reply
Max Tech already showed weeks ago that the 14” throttles the M1 Max chip compared to the 16” due to much smaller cooling.
Yup. Honestly no idea why it took LTT so long to get these videos out. All this information is widely known by now. Seems like a huge miss on their part for being so late to the game on these. If they didn't receive the products in time then sure, that's fine, but it's also LTT.... Surely they could have worked it out.
As time goes on, I'm starting to realize that Max Tech tends to only or mostly show the advantages of M1*. You have to watch other channels to find out, for instance, about this screen's only ~90% Adobe RGB coverage (this is bad for semi/professional Photoshop editing) and the very slow screen response times (35-100 ms) - see Hardware Unboxed for these 2 - th-cam.com/video/p2xo-hDCgZE/w-d-xo.html . Or what it is here.
Max Tech is an annoying surface level dweeb that only posts reviews to get clicks.
@@ContraVsGigi AdobeRGB? lol, a lot of professionals don't need or want 100% AdobeRGB coverage because they're working in an SRGB or Displayp3 workspace. Non-issue. 90% is actually a very good result for AdobeRGB anyway.
@@DriveCancelDC Personally not a fan of the guy or his channel but i'll give him credit for shitting out a buttload of videos when the M1 Pro and Max Dropped. He was on it from day 1. It's been almost 2 months and Linus is only just putting out a video now? I expected better honestly.
[Looks at video][Slaps it]: This bad boy can include more than 3 ads!
Love sponsors.
So, the short of it I'm getting is that Intel has just been intentionally selling bunk laptop CPUs until the M1 came out, and then they're like "Oh, you wanted a serious mobile chip?"
@@TotallySlapdash oh interesting, you think Apple made Intel look bad all those years for the sole purpose of making their chips look good?
What about software compilation, data analysis and heavy crunching like that? Can you 🙏 test compiling Linux or a similar workload for the 16” review? Pretty please 🥺 It's a lot more relevant for someone like me
Slight correction at 3:44.
The closed captions currently display: "with middle GPU rendering and support"
This should be changed to: " with Metal GPU rendering support"
8:57 To make matters worse for Mario Sunshine, the starting level is the easiest level to run on lower-end hardware. So the fact that it hovered around the 70s and 60s is not looking good for the M1 Max. However, it may just be due to the rendering API being wonky on Macs
I am playing zelda on this thing and it runs great. but man, comparing apple to windows on games, it's like comparing mercedes and a toyota lmao.
I thought you'd add battery life tests with gaming on the M1 Maxes...
Andy's on screen acting has improved so much. Proud of you bro
Thx. This made me reconsider buying a MacBook, after using one for mobile purposes for 4 years. I work as a freelance architect and I fell in love with Twinmotion, a UE-based real-time renderer. Path tracing is not a thing on Mac and it really sucks for the price. It might get a Metal update, but I need it like now and can't wait another year. Gonna get a Windows notebook again I guess. Also the price difference is ridiculous.
Using Apple laptops for UE wouldn't make much sense until Epic natively supports them, and that will take at least a year (because they first have to release UE5).
I'm using a macbook and honestly, overpriced as hell. You should go with a high end laptop if you're willing to pay the same price for better performance. There's the Razer Blade Stealth 13 for around the same price, it's a thin and light with just better performance most of the time.
Lenovo has some amazing options in their Legion range
@@Lius525 not gonna happen I think
My pick would be a gaming laptop, for the extra GPU power and cooling for UE. I have a Lenovo Legion, and when I have to upgrade again I will buy the same brand again. It is a bit clunky but it stays cool all day long with the i5 on turbo and an Nvidia 2060.
One thing not mentioned when doing the benchmarks: how do all the laptops (MacBooks and Zephyrus) perform while only on battery? Yes, battery life is great, but how is the horsepower of the CPU/GPU affected when running apps on battery? I think some surprises might arise.
After rewatching this review, I went ahead and bought the base model of the 14 inch m1 pro. I will be doing more cpu than gpu heavy work but I didn't think the 2 extra cores was worth the money
I’d really love a software dev take on this. For my use case fast cpu, good battery life and 64gb of ram are compelling - but are distinctly not video rendering.
Developer here, I wouldn't buy any of these besides the base-level MacBook non-pro. You can literally code on a Raspberry, unless you're compiling something crazy-complex like an entire browser you're not going to feel the difference, so why pay extra for literally nothing? A USB-A port would have been a compelling addition, but oh well.
Other developer here. Never found myself desperate for a usb A port while developing but have definitely found a use for better cpu and ram. Not sure what serious developers are developing on trash hardware tbh.
@@JackiePrime Web, for example. I don't develop on trash hardware because I can afford better equipment, but if I still had my old FX-8320 it wouldn't slow me down in any way. Peripherals are way more important at that point.
Also, every single hardware debugger uses USB-A, and even if you just want hobbyist stuff have fun hunting down a USB mini-B (not micro-B) to USB-C cable just because you can't use the included mini-B to A.
But it does make sense, if you only develop for iOS (which is literally the only reason I've ever considered buying a Mac) then you won't run into any of those issues, and Xcode being a hot mess does necessitate a faster CPU and more RAM. But there's a lot more to development than just Apple's walled garden, and if you step out of it it's a lot more important to be able to mess with any device you want to.
Also a developer here. The GPU on the Max is absolutely useless for my line of work and 64GB of RAM is overkill. 32GB RAM and the 10-core Pro is plenty; I plan to keep it for about 4 to 5 years.
Another developer here. I have the M1 Max with 64GB, the 32-core GPU and a 1TB SSD. While this setup is overkill, first, I can afford it, and it feels good not having to worry about performance while working. On the technical side, running WebStorm and other IDEs, multiple Node apps, multiple Docker containers, Electron apps that suck like Slack, etc. takes a toll on any computer. If you can afford it, especially since software engineering is a well-paid job, plus the resale value down the line, why not?
Anthony is such a smooth talker
I honestly feel like we're witnessing the makings of a legend. This guy's reviews are legit!
Anthony’s an OG at LTT for forever - great to see him getting screen time and having a bit more confidence presenting.
Seriously I think he's my fav LTT presenter now, even beating out Linus. lol
this test is pure crap. They should be sued by Apple for misinformation and lies.
Normally I completely agree. Seems very skewed, and like the apps selected were designed to show these in a poor light. Was it purposeful? Guess time will tell, but I believe this video will not age well. However, PC fans will point to this sole video as why the new MacBook Pros suck, despite an overwhelming number of other reviewers showing the performance in a different light, several of which are also typically PC reviewers.
@@BootStrapTurnerVideography you mean max tech right? A guy who literally said that M1 max MacBook is just as fast as 5950x desktop with RTX 3090.
Yeah that guy is totally not biased at all.
I think what Anthony wanted to point out here is that those apple marketing slides for M1 max were very very misleading.
not on the 14inch. if needed, yes on the 16in. best to stick with the Pro on the 14in, or if needed Max with the 24core GPU. the 32 core is voltage limited in the 14in.
I love how you had to change the graphs to accommodate the Intel/Nvidia notebook
Benchmarks in C4D/redshift don't tell the full story. You need to go into redshift's render settings and manually increase bucket size to 256/512, then you'll see a 25%+ improvement in render times.
Interesting! Thanks for the info!
Andy is such a great addition to the team for these videos.
I'm a software dev who also edits in Resolve and does some Blender in my spare time, and I went for the 10/16-core M1 Pro with 32GB RAM. I don't regret it:
I don't intend to game on it, Blender Metal support is coming, and oh boy, that Xcode build time is just fabulous! That's not just the extra CPU; the faster SSD and memory bandwidth make a huge difference, easily cut my build times in half.
picking an M1 Max would have just been wasting battery life for me, as the test shown in the video is the best case; the drop in daily workloads is more like 30%
how much difference do you think 32gb vs 16gb ram on that M1 Pro makes for Blender and Resolve? using 14" or 16"?
@@StefanUrkel a huge one! I used both on an M1 with 16GB: very decent performance, but swap usage was way too high and memory pressure was often in the yellow area.
32GB is way, way better!
Thanks Anthony. Now I have to decide whether the better memory is worth it or whether it’s not good enough and I’ll have to push code off to the compute cluster anyway..
In the future, could you guys run some memory intensive data analysis code, like something that inverts large matrices in memory in python? That’s a good reason to get these if I have to avoid the hassle of pushing code to a network machine and can run it on my laptop and play with it.
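If it's useful, here's roughly the kind of thing I mean: a quick NumPy sketch (matrix size and the BLAS build it links against, e.g. Accelerate vs OpenBLAS, are up to you; scale n up to actually stress the memory):

import time
import numpy as np

# Invert a large random matrix and check the result; n = 10_000 means each
# float64 matrix is ~800 MB, so a few of them plus LAPACK workspace adds up.
n = 10_000
a = np.random.rand(n, n)

start = time.perf_counter()
inv = np.linalg.inv(a)
elapsed = time.perf_counter() - start

residual = np.abs(a @ inv - np.eye(n)).max()
print(f"inverted {n}x{n} in {elapsed:.1f}s, max residual {residual:.2e}")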
My Mac's maxed out to the max, with the M1 Max for maximum MacBook maxims, like "What's good for my Mac is good for the Pro".
God I love this guy as a linus replacement when he's gone/too exhausted to do videos/building his sound setup..
This is what I was looking for. Apple made very bold claims about their SoC, and although it's still impressive, it's not the "PC killer" they make it appear to be, and the thing is that they price it to match better laptops (raw performance wise). Of course, if you live in Apple's garden this is still better than anything, and if you use mostly ARM-native apps you'll love it, but it's not really worth it for many PC users to switch to Mac when the performance claims Apple made are not true. We'll see what happens next year and in 2023 when the chip shortage "ends"; things could be very different by then for hardware and software.
Not in GPU performance, sure. But the apple silicon is still game changing in cpu performance, efficiency, and battery life, not to mention the build quality, track pad, speakers/mic, and screen quality improvements you get over the vast majority of windows laptops. Of course it's not a "PC killer", but for anyone who's not a gamer and looking to spend $1k-$3k on a laptop, it's very hard to justify getting anything else unless you absolutely need windows for something.
it's more the efficiency that makes their new SoC great
People treated these things like the second coming of Jesus.
It's interesting to watch them shave their claims and astonishment back to the regular levels we have for every new tech release.
@@leonfrancis3418 what I hate is that the M1 Macs in both generations are actually really innovative, but Apple makes it seem like life will never be the same at every freaking keynote, so the actually good stuff like this still feels like it falls short
One of the main use cases for a laptop is while unplugged. Large differences in performance can result in various situations, and only testing while plugged in (or unplugged) can skew results to the testers desires. I think Apple's claims are probably true IF the laptop they were comparing against did poorly while unplugged, so Apple's results would look more impressive. I would bet the power hungry high end PC laptops would all make comparisons while plugged in.
Hey Anthony, WoW is one of the few M1-native games out there. Would love to see the resolution cranked so we can get comparisons with PC GPUs
I hope this man has his own office. Love his reporting style
You should at least mention that the Zephyrus won't reach those scores without being plugged in.
And THAT should be a huge point actually. We're talking about Laptops here. They're meant to be at least partly used without being plugged in. And most Windows Laptops are SO much slower on battery while Macbooks just run the same and still last literally 3 times longer.
@@Raja995mh33 Yes, but when you do huge workloads you can have a workstation or a desktop in the PC world; in the M1 world that is not possible. Also, a desktop graphics card or CPU is more powerful than a laptop one and can destroy the M1 Max even more.
Mac users have no alternative to what Apple gives them. PC users have many options.
th-cam.com/video/rQJTkWWkc0g/w-d-xo.html&ab_channel=TallyHoTech
@@inwedavid6919 Oh wow dude, a Desktop is more powerful than a Laptop. No shit Sherlock. We're talking about Laptop vs Laptop here! NOT Desktop vs. Laptop. That is the whole point.
And the point is also that you CAN do these heavy workloads on a Macbook without a problem, and it will do them just as fast and doesn't need like 10 times longer like many Windows laptops that like to throw around big numbers that only count while plugged in.
@@Raja995mh33 Is your country that starved for power outlets though? I've literally never had a problem with laptops; they even have enough performance & battery life unplugged for the small commutes in between places that have a power outlet available to me 🤔
@@ezicarus8216 Just because you may not do that doesn't mean no one does. Many people just don't do that BECAUSE their Windows Laptop becomes useless at doing these tasks when it runs on battery.
I know a lot of people that do work and also do heavy workloads on the go etc. and often on battery and all of them use Macbooks for that.
And even when you're at home or whatever, it's nice to know that you can just unplug your machine and work a while in the living room or wherever for at least 2-3 hours, and you won't have any bottlenecks or throttling.
I love LTT but didn’t you admit to forgetting WoW runs natively in your first MacBook Pro vid, I suppose you forgot it again? Or just selecting results? Also why step up to the M16 for this one in comparison to your last vid when comparing a 14 inch laptop and not when you test the 16 inch version? And finally how about rerunning tests that are native across all platforms and then run the laptops with these tests unplugged to see how real world use would be.
so if he runs WoW and it does well, does that mean the laptop is now good for gaming? Dude, put your bias aside. This test is to show this laptop isn't good for gaming, even if it does well in one game. Gamers aren't only playing one or two games. If I'm buying a laptop it should run all apps. Is macOS a phone OS or a computer OS?
@@truthseeker6804 no it’s not a gaming laptop. But saying it can’t play any games well is not true. Some people who use laptops for things other than gaming still like to play one or two games. Additionally there are other reviews out there by PC guys with fairer comparisons. If you’re going to compare something compare like tests. Asking me to high jump against an Olympic high jumper is just silly and so is running bench marks on machines that aren’t natively supported across all platforms when there is a myriad of benchmarks that are.
oh mac can run one game, nice
@@baldwanderer8188 Yes, some people play one or two games on PC, and I'm an example, but definitely not everyone plays the same one or two games. If he only did the WoW test and the MacBook does well, many people are going to conclude it's great for gaming and would do similarly in their games, which it isn't. So a test should show as many weaknesses of a device as possible. If you want a fanboy channel that only shows the pros, there's the Max Tech channel for that. Asking you to high jump against an Olympic jumper is a good test, because if I was in the market looking to buy a jumper I'd want to see where I get the best bang for my buck. Don't forget these MacBooks are priced the same as, or even more than, a lot of these Windows laptops that can do more.
@@23fx23 not only one game, probably 3 games. 😁😂
Therapist: Five-screen Linus isn't real, he can't hurt you
Five-screen Linus: 2:00
I'm primarily interested in the M1 for ML workloads. While it offers potential for edge applications, most enterprise ML is on RHEL using the x86_64 architecture, so the Intel CPU Macs end up being the better choice for a development workstation (Windows just ain't it, and find me an IT department willing to deploy native *nix laptops. I'll wait). That dynamic will probably shift over time, but I'll probably treasure my Intel MacBooks for longer than I should.
All this is to say, LTT: love your content, but would appreciate reviews with a data scientist's perspective.
Sorry for such a straightforward question. My work is based on Java most of the time. Are compile times better on M1? Right now I'm on a 10th-gen i7-10700K. Should I upgrade to Alder Lake or M1?
The CPU stuff on numpy seems really promising. But training on the GPU seems rather meh; at least in computer vision, the performance improvement from using tensor cores is just way too large. Seems like performance per watt is roughly on par with a full-power 3090 - not exactly an efficient GPU. But IMHO any GPU-heavy workload is much better suited to remote servers than a laptop.
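If anyone wants to sanity-check the numpy side on their own machines, here's the rough kind of matmul timing I'd run (just a sketch; the matrix size, repeat count, and the GFLOP/s estimate are arbitrary choices of mine, not anything from the video):

```python
# Rough numpy matmul timing sketch: run it on an M1 and on an x86 box and compare.
# N and REPEATS are placeholders; bump N until one run takes a few seconds.
import time
import numpy as np

N = 4096          # matrix dimension (placeholder)
REPEATS = 10      # number of timed multiplications

a = np.random.rand(N, N).astype(np.float32)
b = np.random.rand(N, N).astype(np.float32)

np.matmul(a, b)   # warm-up so the first timed run isn't penalized

start = time.perf_counter()
for _ in range(REPEATS):
    np.matmul(a, b)
elapsed = time.perf_counter() - start

# Roughly 2*N^3 floating-point ops per matmul -> very rough GFLOP/s estimate
gflops = (2 * N**3 * REPEATS) / elapsed / 1e9
print(f"{elapsed / REPEATS * 1000:.1f} ms per matmul, ~{gflops:.0f} GFLOP/s")
```

It only exercises whatever BLAS your numpy build links against, so it's more of a ballpark comparison than a proper benchmark.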
@@NaryVynnsark The M1 Pro 10-core model would be better, but wait for the Mac mini or iMac Pro.
Was this sponsored by Intel? Nudge nudge, wink wink.
Am I the only one who notices Anthony is nothing short of professional at this point? I mean, his confidence is on another level in this video.
Well I suddenly had an idea...
Have you tried testing the workloads without external power?
I think Apple might have run its benchmarks on battery power only (since it's supposed to be a laptop and that is the use case it's aimed at)
I would just be curious if that changes the ratio...
I mean, sure you can have a 3080 in a laptop... But can it deliver 3080 power only running on battery?
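If anyone wants to actually check the ratio themselves, something like this quick sketch run once plugged in and once on battery would at least show whether the score drops (the workload and the iteration count are arbitrary placeholders I picked, nothing official):

```python
# Tiny CPU-bound loop to compare plugged-in vs. on-battery performance.
# Run it twice (once on AC, once on battery) and compare the times.
import time

def count_primes(limit):
    """Naive prime count; deliberately CPU-heavy, not meant to be clever."""
    count = 0
    for n in range(2, limit):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

start = time.perf_counter()
count_primes(100_000)   # placeholder workload size
print(f"{time.perf_counter() - start:.2f} s")
```

It's single-threaded, so it mostly shows clock-speed differences; a GPU or all-core load would show the power-limit gap much more dramatically.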
Of course: just switch the performance profile to max. It will burn through the battery faster, but it will deliver. In the end, though, all laptops overheat; they are not really for professional usage. You buy (or custom build) a desktop for that, or do what 90% of people do: get a PC with 2 RTX 3080s, which is insane performance.
@@nigratruo A lot of big gaming laptops actually can't perform as well on battery. A lot of them can't handle full power delivery without the external brick plugged in. Unsure which laptops are affected by this though.
The answer to your question is a huge no. Unplugged, the 3080 gets destroyed, but there's no change on the Apple. So if you judge a laptop by how it runs disconnected, then it is really no contest at all. But that would not make this video as much fun for the PC guys. Honestly, if you are just going to leave a laptop plugged in, then why get a laptop at all? Desktop all the way. But for those who want professionally capable laptops, the Apple is a good way to go, even though they cost quite a bit if you spec them up.
@@justinbuettner8547 Huh? Are you really this clueless? Of course you can switch the performance profile on a Windows machine to full performance with no power saving, and then you get the full power, plugged in or not. This is not your typically dumbed-down OS like macOS for people who are clueless about technology. And only in your Apple-fanboy deluded dreams can this pathetic Apple GPU best the most powerful consumer GPU on the planet right now. The M1 chip overheats like crazy if you run the CPU and GPU at full tilt; an SoC is a really dumb idea, putting all that massive heat generation in one tiny little spot.
@@justinbuettner8547 I use my laptop for video editing and it's almost always plugged in. Why buy a laptop then? Well, so I can carry it from home to the workplace and back again, and also on trips, both business and leisure. I just plug it into the external monitor (one at home and another at the workplace) and a wall socket and I'm ready to go. And for the price difference I can afford a good 4K colour-calibrated monitor, which is important to me.
I wonder how music production fares on M1 Max. That's what I personally see professionals using Mac OS for.
It's bad because software support is bad. For anyone who already has a bunch of x86 music software/VSTs/VSTis, using an M1 is a pain in the butt, and that's pretty much all professional music producers.
To use it you'd have to start from scratch and ignore pretty much all the software that already exists, or wait for those bright times when there are decent enough options on Apple silicon to replace every piece of software that producers use now.
@@rawdez_ Not true. I produce music and I'd consider myself somewhere around the intermediate level. I upgraded to a 2020 M1 MacBook Pro this summer and I've had very few, if any, plugins outright not work. Sure, we're still waiting for a lot of plugins to run natively on Apple Silicon, but in the meantime I've had no issues running many plugins through Rosetta, and the M1 handles them like a champ. As for what I actually use: both of my DAWs, Ableton Live 11 and FL Studio 20, as well as plugins such as Serum, FabFilter, Omnisphere, Waves, iZotope, Arturia, Native Instruments, etc. all run flawlessly.
@@mike.d99 I said "professional music producers", not "hobby musicians".
As far as writing music goes, you can literally write music using only a wave editor; check out Burial. So basically you can do it online in a browser on a $100 smartphone.
That still doesn't mean M1 Macs are actually good for music production.
But yes, you can write pretty much anything on any modern device, and the M1 with its limitations isn't the worst option. It's just annoying AF for people who are used to x86 software, and a straight-up bad option for professionals who make money and need EVERYTHING to work; "very few, if any, plugins outright not work" isn't acceptable. What would you do if the one piece of software that you own, know how to use, and actually need for your current project doesn't work properly, produces glitches, or is unstable?
Anthony = The full truth with no bs. Getting better and better every time.