I almost didn't click this video because Linus was in the middle and not on the left of the thumbnail. Edit: to worst part is, I didn't watch the video when I commented. I came back to it a few hours later.
It arm based processor it similar to apple m series chip so technically it will work same as m series chips with long battery life with high preformance
This has been a weird two weeks as these reviews have started coming out. I pre-ordered an X1E Surface Laptop soon after the announcement because I actually work at a chip company that makes ARM processors, so I was excited to experiment with it. It came in on June 18th and I've been running this machine into the ground trying different applications and use cases out. It's been so much fun getting to find out all of these quirks before reviews are out. I definitely hope situations like this don't happen in the future with little to no review embargo periods before other high-profile tech releases, but in this specific case, it was a lot of fun for me.
It's a stellar laptop. It's my go-to for watching videos in my reading nook and I've tested a bunch of software that I use regularly on it. I also let my friend borrow it during our D&D sessions (we play on FoundryVTT which has some visual effects that can cause weaker hardware to chug) and he said it was pretty much perfect. I've taken it with me on a trip already where it lost pretty much no battery while asleep in my bag and in general the battery has been far better than any Windows laptop I've ever owned and it only really gets hot while gaming in my testing. All of the software I've run so far has worked, if not with a few minor visual glitches here and there. I'm using ARM64 compiled software when possible (like Photoshop and VScode), but x86 emulation has worked perfectly fine and the performance hit isn't really noticeable. I really wish my company would let me use this laptop for work, because it's got plenty of CPU compute for using VScode for remote development with plenty of tabs and documentation open. The NPU only really does anything when you're explicitly using AI features (at least for now without Windows Recall), but the AI features I have used have been fun to play with and super snappy, but the only genuinely useful one I think is the camera options like the background blur which looks much better than standard software background blur, and the eye contact effect that makes it look like you're looking directly into the lense instead of below as you would in a standard video call, and it's much more subtle and far less creepy than the Nvidia solution in the past. Games run ok and this is the only circumstance in which I've had compatibility issues. There's also a significant performance increase while plugged in (like 1.5-2x) so that's definitely recommended. Some games like Hellblade: Senuas Saga and Monster Hunter World wouldn't launch, but everything else I've tried has. Doom 2016 ran great on Medium settings and the resolution scaled down to 1080p, but changing the render resolution scale in game caused unplayable artifacting. Baldur's Gate 3 runs great at Medium or Low considering the hardware (especially when in underground dungeons), but definitely struggles in Act 3 with all the NPCs and geometry in the city, and AMD FSR is pretty much required, but looks fine in Performance mode. Rocket League runs at 70-80 fps at max settings with full resolution, but you might need to drop the settings to get to 120 Hz like some of these laptops support. Basically, the CPU is awesome and general usability is top notch. The GPU is pretty weak and has driver issues in games, but that's to be expected on a new on-die GPU at this price point. The NPU doesn't have much in the way of applications yet, so it's hard to evaluate performance there. Software compatibility is very good, but do expect some hiccups at least for now. Game compatibility is hit or miss and performance is a bit all over the place right now.
Given the performance and the extended battery life, Snapdragon X feels like a good pitch to university students. I'd buy it a few generations down the line when they refine the other aspects of the chip.
@@anivicuno9473Honestly you could manage something like that with a vm. Now that you're saying it, i wonder if vm software and windows built-in hyper-v work fine on these cpus.
he's gotta protect those business relations with those large corporations so they keep him in the loop. LTT reviews have become a very unreliable source of information.
@@soundspark ASIO driver is not supported for ARM anyway, the translation layer is lack support for this driver too. So yeah, only thing to do right now is just wait or get laptop using x86_64 or Macbook. Which is really bummer
@@HorrorTroll2611 I hope they rework the audio driver all together because right now it's a mess, 6 different API and none of them work, and the one that work lacking in features
Stress tests really show the weakness of these processors. They are very snappy when cold but once they are saturated with heat from a stress test then the clock speeds plummet. Even the x84 variant drops down to clock speed of about 2.2GHz under a full load stress test. Thermals are about 78-80C in these conditions and for some reason the fans don’t really spin up fast when they reach these temps. If it is a thermal restriction then hopefully a bios patch to allow for faster fan speeds might fix it. But that clock speed drop really hurts its performance.
@@username8644 only on the MacBook Air as that no no cooling fan. The MBP’s do reach throttling limits but they don’t actually throttle noticeably.You might lose 100-200MHz but definitely not the 1-1.2GHz drop you find with these Qualcomm processors.
@@peterstainburn2871 Maybe. The main issue with these laptops though that nobody is talking about, is that ram and drive are soldered, and nobody is offering more than 16gb of ram and 1tb drive.
On my M2 Pro MacBook Pro, I have a utility called Macs Fan Control which allows me to set up my own fan curve -- or just turn the fans on full blast whenever I want. I have it set to go full blast when the CPU hits 65 degrees C. I do not get the throttling issues that others have reported. I wonder if there is (or soon will be) a fan-control utility that lets you set up your own fan curve on these Snapdragon laptops.
I tried the Samsung Book4 Edge 14" and, by god, we did not have the same experience with the Qualcomm processors. The Samsung Book4 Edge is loaded with x86/x64 Samsung bloatware and the battery life is absolutely terrible -- 8 hours at most, but regularly just getting 6 hours. Throw that one at the labs team, I'll be standing by with popcorn.
@@prekatori A galaxy phone, the classic watch, and a tablet I only kept for a week. The phone and watch were both great pieces of hardware, but their bloatware totally ruined my experience. Even now my watch wants me to pay with Samsung pay, back up with their back up software, use their special store, etc. I hate their attempt to lure you into their shitty little ecosystem. ... sorry I have a lot of feelings about this haha
Can't believe you didn't do performance tests on battery... where's the heavier workloads unplugged? where's the battery life on heavier workloads? where's performance per watt comparisons? I'm seeing Qualcomm pushing higher power budgets, but throttle more aggressively in battery compared to M chips. The same performance plugged/unplugged is a huge deal on current macbooks. You won't be able to determine that from a battery test that is only a youtube playback
Me is that him and Qualcomm keep putting BG3 in the benchmarks, like that game is like Doom 2016 and can run well on a potato. Put Cyberpunk or GTA V in there even in its lowest settings, and it is a whole different story, as I have seen. Other PCs at this price point game much better if your looking for that even with integrated graphics. Were talking Mobile RTX 4050s at this price point, which to me makes it a bit of a hard sell, even if the battery life is good.
Tech reviewers do their audience a huge disservice when they don’t test both plugged in and unplugged, especially when there’s literally 20+ years of experience that shows it makes a significant difference on Windows laptops.
I had the Samsung Galaxy Book Edge4 14" model, and I can tell you that under a gaming load it will go all the way down to under two hours like every other laptop. The performance at that power level is still impressive, but I really struggle with fine-tuning that kind of thing so for my actual daily use, it'll be about the same battery life as my old laptop so I really don't see the point in upgrading.
One thing a lot of people don't mention is the standby time. I have the Lenovo Slim 7x, which I received a week ago. When I took it out of the box, I charged it to 100%. I have not had to charge it this whole week. It is still at 35% at the last time I suspended it, and it may use 1-3% battery overnight. So not only is it insanely efficient during use, but you also don't need to worry about it dying on you while you are away. I'll never use an x86 laptop again. In fact, I wish there was an ARM desktop chip with discrete GPU support. Thanks Qualcomm!
the terrible sleep is thanks to windows modern standby (S0). i am lucky to own a modern laptop that does classic standby (S3), and the loss of charge while sleeping is very small and predictable.
I'm getting a similar experience with mine. I haven't run into any compatability issues so far, but I'm a coder and not a youtuber so I have zero interest in most of the things they want to test on them.
If you install Linux with WSL on Windows, the performance is 3x compared to installed Windows. I swear, it's true. (th-cam.com/video/nwGcv5Dxp8o/w-d-xo.html)
Considering Raspberry PI's (which is the SBC standard at this point) have ARM-based CPUs, I've expect some Linux distros to behave very well and even have a large support on most common packages.
@@FrankignoEstudiantes not sure if I don't understand what you're trying to say or you don't understand what you're saying. WSL is just a Hyper-V virtual machine with a modified and pre-loaded kernel (the same for every distro) and some (heavy) modifications to other subsystems (filesystem, audio, graphics) to allow for things like achieving better filesystem performance under WSL, enabling graphical applications via FreeRDP and enabling audio playback via PulseAudio, besides many other things of course. WSL will never be faster than the underlying Windows system, nor will it be faster than a native Linux install. WSL has to do some (or many?) translations between the WSL kernel and the Windows kernel, thus requiring much more CPU cycles than direct communication to the native Windows kernel. Even if raw speeds (data transfers for example) within WSL compared to native Windows suggests that WSL might be faster in this regard, there is still more CPU cycles involved and thereforce latency is higher. And compared to native Linux.... same thing. No kernel-to-kernel translations needed and thus no latency overhead.
I need the Linux gaming test too! I really want to see the comparisons honestly. x86 Linux/Windows Gaming vs ARM Windows Gaming, even though these PC's really aren't gaming machines... they have a gaming machine price! Edit: Like I said x86 Linux vs ARM Windows, both use compatibility layers, both have Anti Cheat issues, which games better?
I'd just like to say, the way you've changed graphs, made it a lot easier to understand them. I really like the little white pulsing dots, showing the results of the laptops you were talking about. I also like the colours of the FPS graphs as it took your vision to the 1% lows which are the most important thing but still had the average shown in muted grey.
Yeah, easier to see. Would be more better if they use a different column or color for the things they are testing, like the wattage difference in the snapdragon laptops graph.
@@GlobalWave1 Based on the specs... the snapdragon GPU isn't bad. The issue most likely is immature drivers and it reminds me a lot of Intel ARC's growing pains.
@@AwperationZ 2 and a half years late snapdragon x, Intel been pumping out drivers this long and than you release a product that wasn't done cooking in the oven shunn.
I went with the surface pro 11/ X Plus combo, and I've had a lot of fun with it, trying out different applications and games that I would expect to struggle. I've been pleasantly surprised with how much stuff "just works". Don't get me wrong, there have been glitches, but battery life has been excellent, and sleep/wake is absolutely flawless so far.
How’s the system responsiveness when you’ve got lots of tabs open and the RAM loaded up? One of my main issues with my SP9 is that my tab hoarding can bog down the machine pretty quick, which I’ve experienced a lot less of on an M2 MacBook Air
@@goinginzane probably depends as well on what browser you use, as if you use chrome, they have awful ram management. Furthermore, it tends to just be worse performance wise.
@@Dwivil agreed, it's one of the main reasons I've switched to Edge in addition to other useful features like vertical tabs, workspaces, etc. With its browser market share dominance, Google has really fallen asleep at the wheel while other browsers like Edge, Brave, Arc, etc have been innovating.
@@goinginzane I have gotten it to 14/16gb ram usage, with chrome running a twitch stream on a secondary monitor, running Rufus to create a Bootable usb, and all the normal programs you'd have open alongside, as well as about a dozen tabs in another chrome window looking through ebay listings, and it hasn't skipped a beat. Cpu usage was hovering around 10-15% with spikes up to 70 from time to time. HWmonitor showed cool temps too, like 55C on the hottest core. In another session, i loaded up snowrunner, and it grabbed my cloud save, and I picked up the campaign right where I left off, at native resolution, low details, and playable frame rate. I am pleasantly surprised.
I did the same and swapped in a 1 TB SSD. So far, so good. I had to work away from an outlet for most of the day Monday unexpectedly. I enabled Energy Saver (which is still peppy for office productivity work) and set the display refresh rate to 60 Hz. I had that "wait, what ... the battery is still at 100%?!" experience I've only had on MacBooks in the past.
Impressive test results on Snapdragon X Elite chips. Love that we're seeing more competition in the processors market especially in terms of power efficiency and performance.
I had quite an interesting issue with Snapdragon/other mobile chip laptops is that some device drivers have issues with them because of the ARM based system. The laptops themselves are great, but when our shop needed to download label printer software onto them, we found that those printers weren't compatible with ARM. That was a strange 3 hour troubleshooting process because I had no prior experience with ARM, and didnt realize standard x64/x86 drivers just didnt work on them.
Hopefully Microsoft can do some of the heavy lifting to get better compatibility. If there's one thing you can't criticize Microsoft with x86 windows it's that.
@@aronseptianto8142 Technically, you could try emulating x86 and then running a second windows instance on it and passing through the USB device, but that would be extremely slow and power hungry. Other than that, the only solution is using an x86 device, or waiting for an ARM driver.
Yeah, I'm not sure these are ready for current businesses. Most run some kind of label or check printer in my experience (been a computer tech/consultant for 29 years) and I don't think they can just emulate drivers since they talk to the hardware directly in some cases.
It is a great start. After this video though there are a few things I have: 1. Windows: People touch on it but do not cover it enough. The first iterations of ARM based WIndows as just terrible. The complete re-do MS took to make this current iteration work better was intensive and in a relatively short time frame. The team who worked on this have done AN AMAZING JOB. Work to do but its good! 2. Discussions not covered still are how ARM can work with GPU's. Not their built in or mobile stuff. If ARM is going to be the "future"? then how is a desktop setup looking. How external upgradable memory is going to work, how the GPU is going to work, will current GPU's and their setup going to work and if not what will need to change here. I really want to see someone like LTT tackle these things in more depth on a video.
the ARM architecture itself does not prevent discrete GPUs from working. In fact, several server and workstation ARM CPUs explicitly support Nvidia enterprise GPUs, and AMD GPU support is being worked on by the Linux community. But none of these drivers have made their way to Windows so far. It partially also depends on how good Qualcomm's PCIe implementation is, because some ARM manufacturers (e.g. Broadcom) half ass their PCIe controllers on the assumption that the feature would only ever be used for storage, networking, or USB ports.
As for ram upgradability, the clambering to go faster is driving soldered ram as the "norm" for the future. I wouldn't be surprised if I'm 10 years all your desktop is a mainboard,power supply, cooler, ssd and video card. With the rtx 1million (big number has to go up. And they will certainly rebrand by then) taking up most of the space in an otherwise shockingly small build.
These were rushed out of the oven and were not fully cooked, They are gonna flop onto the floor because if you buy this over priced laptop, You're an early adopter 😂
I can imagine the battery life would be insane considering the more lightweight nature of Linux _especially_ Gentoo. Hopefully Linux on ARM gets more battery optimization.
depends on qualcomm support. my amd laptop cores used to only throttle down to 1200Mhz when inactive. after smth amd pushed in 6.1, only then they would throttle to 400Mhz
There won’t be proper Linux support for months, sadly. You can search for Qualcomm’s announced support roadmap but it looks like they are targeting 6.11 for upstreaming of most things.
It honestly makes me concerned about gaming. I would LOVE to see a Mac Pro with an integrated GPU that works alongside the M chip. The same goes for any PC or Mac that's using an SoC. I'd rather sacrifice some battery for extra performance. But I know that some of these chips aren't designed with a graphics card in mind, they're kinda like all-in-ones, but could it theoretically be possible for them to support "real" GPUs? Also, the lack of upgradability makes me extremely worried about future-proof laptops. I think we're (kinda) screwed. (PLEASE PROVE ME WRONG)
I genuinely wish you're wrong about the upgradability aspect. But it seems like the future is buying product, then buying new product instead of upgrading your old one. I love how all these companies talk so much about the environment all while making some of the most perishable products to date. Apple is the worst in this regard. That 2023 Mac Pro, a waste of silicon and sooner rather than later to be the bane of landfills
Theoretically a dGPU should be no problem. There are 8 PCIe Gen4 lanes that are currently unused on the chip, and they wouldn't just put those on there if there was no plan to use them in the future XD. As for upgradeability with any luck CAMM will catch on, but we'll see, it isn't looking great. -AC
"It honestly makes me concerned about gaming", if you want to game to anything that's not Windows x86 you will always suffer of some forms of compatibility issues or small or big performance loss due to compatibility layers. It's the same thing for gaming on Linux. Microsoft doesn't want people to go away from Windows, and doesn't want to give away directx either, to be able to run games natively on ARM you would need a build compiled specific for ARM and gaming companies are not even interested on doing what they consider a porting. That's not an Apple move, Microsoft doesn't make Hardware, qualcomm does, the interest is from Qualcomm not Microsoft. Gaming is and will always be on Windows x86 like it has been in the last 20 years, Apple switched to ARM instantly, they forced other companies to rebuild the software for ARM bc the penalty would have been to not sell the software on millions of laptops, Microsoft isn't forcing anyone and does not have a monopoly on the hardware like Apple has either, so don't see ARM PCs as PCs where you can play, it will never be the case
I doubt that Windows will be as prominent in the future as it was in the past 20 years, but yeah it will take time to dethrone it. As soon as governments will realize the benefits of open source and FOSS will start to emerge in education it will be a downfall. I believe time is against them.
@@incandescentwithragekeep in mind, This is just the first gen cpu Qualcomm has launched for the Pc department compared to Various generation of Amd power efficient Cpus.
@@OrRaino Of course. I have no loyalty to any brand, but can you remember 1st gen Ryzen? It wasn't released being a little bit worse than Intel. My experience with ARM is it's great for high efficiency and low performance. You push the clocks and power envelope.. it's neither cheaper nor faster than x86. They succeed in the cloud space with a ridiculous high number of low performance cores per socket. That's their market. Efficient when slow, trailing edge spec but on modern process nodes.
@@shalokshalomin most amd mobile chips, the sweet spot is between 15-30 watt in cpu perf. as long as the vendor able to supply ~25 watt in battery, they will not suffer battery peformance,
Love how LTT says good things for the compatibility in this video, in the live stream a couple of days later they try live and almost everything fails... Well done LTT /facepalm
It feels like the lab is more of a cash cow than an actual lab where things get tested. There are only a few actually good laptop reviewers and they're none of the large channels. Josh is good, Alex Ziskind is very good (especially if you are a software engineer and aren't interested in the basic run of the mill photoshop tests.
Thew. I thought I was in some alternate reality. Reviewers and one coworker were finding problems running there applications. This isn't as bad as Windows on ARM was before, but still dealbreakers. They were sponsored too by Qualcomm within this month when they made a video about Snapdragon processors, so how could they not be biased? They would be inclined to keep their sponsor happy.
@@roccociccone597If you genuinely think the lab is profitable, you're you're actually quite dumb, The building alone probably cost close to as much as LTT has made off the majority of their sponsorships over the last 6 months. That is to say nothing about all of the people and equipment they have in that building, none of this stuff is cheap. Also, laptops are probably the hardest technology to test and that's ignoring software compatibility because you're using weird laptops, quite frankly, I think they should have just put a disclaimer at the beginning of this video saying you should only buy these laptops if you want to risk early adopter pain
Good video, though I would've loved to see Linux tested on these laptops. x86, especially Intel, has historically had great Linux compatibility ootb, but thanks to boards like the raspberry pi, arm has worked on Linux well for ages. You can even do some basic stuff through box86 and box64. Would love to see some Linux nerd representation on these laptops :)
Arm CPU support is typically good on Linux, but ARM systems typically require proprietary firmware. X86 PCs were designed with much more open architecture. It will depend on the CPU and system designers to either create the firmware, or have it added to the kernel.
the naming scheme is terrible tho, and the fact they launched with the lowest version of solid as well as people won't think "wow this X1E-78-100 is underperforming!" but instead just "damn the X Elite is worse than marketed!"
10:21 That mic quality difference is game changing Can genuinely see some business people getting these just for better call quality + the better battery life Though I wonder if the modern standby issue still exists on these laptops...
At 3:24, your test methodology mentions Windows Battery Saver Mode. I’m assuming this means the MacBooks were using low power mode as well? Edit: Also, thinking about it later: MacOS is still likely doing initial background checks and syncing with iCloud/indexing, since you mention it was pulled from the shelf first time being used
@@TalesOfWar I won’t claim that they’re Apple haters or have an anti-Apple agenda, but it’s just something they pay less attention to and are less educated about. That’s fine. Nobody *has* to pay attention to Apple. The only problem is that when you include them in videos, you project absolute confidence. Which is where Linus goes on rants that are completely inaccurate from a factual perspective (such as complaining about Apple taking so long to introduce window tiling when Microsoft held a patent on it). Another example of this confidence beyond knowledge is his paint rant. Which while having some good points and complaints, anyone in the paint industry laughs at some of his complete backwards understanding of some things (like thinking satin is flatter than eggshell)
yeah and they also didn't include the macs in the audio/webcam test, although the test itself also isn't too helpful since they all most likely use different mics and webcams anyway and therefore aren't directly comparable
@@jbnelson >Which is where Linus goes on rants that are completely inaccurate from a factual perspective (such as complaining about Apple taking so long to introduce window tiling when Microsoft held a patent on it). Yeah that's my main complaint I have with Linus, he doesn't know much about Macs, at all, seems like he refuses to learn (which fair, he's a busy dude) but then don't go ranting about them if you don't know stuff about them, they're just mostly Windows users who now have to put apple in these videos because their offerings don't suck balls anymore, but haven't taken the time to educate themselves properly it seems, although this is an assumption I have, not sure if this is factual or not, so apologies if I'm wrong
i have a Dell XPS 13 here for testing. the fact that I changed the power on windows to performance, and unplugged the power cord and the performance didnt change, was amazing. a bunch of tests went so well, so i can recommend the xps 13 (apart from pricing maybe).
Why do most Battery tests revolve in testing TH-cam playback? That's when the iGPU kicks in. You can have tests that try real work that create CPU spikes: 1. Opening files/launching programs. 2. Crunching some numbers/compiling/programming. These will give you better battery tests.
@@CanIHasThisName it really isn't though. It's actually because if you heavily load these chips with actual work they last about as long as the intel and AMD chips. But that would mean that Qualcom lied about their numbers. So better hide it behind a futile test.
A few other channels have gotten much different results than here. Specifically Just Josh found the Qualcomm chips underperforming both Intel and AMD when manufacturers put different chips in the same chassis. He didn’t have a Vivobook S15 so it’s conceivable that this specific model of laptop just does a better job with the chip. He had models with both the 78 and 80 SKU and found that there is no difference after normalizing power consumption.
@@Workaholic42 So what does that mean? is NEON functionally equivalent to AVX2? If that's true, would that mean they just haven't implemented it in emulation yet? Can NEON run fast enough to handle AVX workloads with the added overhead of translation?
@@CoreyKearney that means that ARM processors also have SIMD instructions that are even more modern than AVX2 and work just fine (see also Apple M processors). It is certainly possible to emulate AVX2, but obviously difficult - only Microsoft knows the details 😉
It was a good idea, many people said they would use it (including Linus). And it's inevitable, sooner or later not just Windows but all other modern OSes will have it.
@@MadafakinRioThe idea of a feature that brings back the stuff you did in the past on your computer is a good idea.... If it wasn't implemented in the most awful way possible. Other modern OSs would never adopt such a thing because the people who use those OSs care a lot more about their privacy.
@@marschallblucher6197 it wasn't even that bad. People were overreacting as usual. It just should have been encrypted from the start (as they said it would) and it would have been fine. As for the other OSes, just you wait bro. I'm certain Google is already cooking something up for Android since AI is their whole thing now. And even apple jumped on the bandwagon, and a similar 3rd party app already exists on macos. You are heavily overestimating how much people care about privacy. If you do, that's great, but most don't that much. It's not as if the current OSes are ultra secure and private, yet everyone is using them.
3:24 this graph should have included the watt/hour rated capacity of the battery of each laptop, not just the chip in it, that makes it so much more clear why it lasted that long, even for those with the same chip like the asus is rated for 75wh, and the m3 14' is 69wh, that could very much be the extra 2h showed in this graph lol, not just the chip, show me how much watts those chips use too, not just ok bigger battery lasted lounger "duh"
The HP Omnibook X has a 59Wh battery (this info was in the video) so it isn't just "bigger battery lasted longer". Besides battery life typically the display efficiency and BOIS tuning have just as much of an impact as the outright capacity. -AC
@@LinusTechTips nice you got my point the graph sounds even more impressive knowing the HP laptop at the top has the same size battery as the MacBook, thanks for the reply
For a laptop this is very exciting but what I'm more excited about is the potential to rival x86 handhelds such as the steam deck with not only a better form factor but also with a far better battery life but we'll have to wait and see how far they go with this x86 translation.
This looks like marketting fluff, lacks clarity and details. Things I want to see - 1) How does the x86 programs work, that doesn't have a native arm port take notepad++, sublime editor, Visual studio. Even eclipse IDE is not available on ARM AFAIK 2) How does the productivity tools like MS office work compared to the x64 counterparts 3) Does it support installing old xp era 32 bit applications natively? 4) Obviously managers and CXOs can switch to windows on arm, what are the challanges 5) A simple demo of installing some of the x86 / x64 applications on windows ARM edition seriously LTT, what gives? What's the point of just repeating marking talking points, it would hardly take 1 day to do few of these tests
I have personally tried Windows 11 on ARM64 (though on my phone (look up "windows on miatoll" if you dont believe me) not an X elite device) and any x86-64 app i threw in just worked, because of emulation (look up "how x86 emulation works on windows on arm" for info from MS). Office usage is the exact same as is on x86-64 because they have ARM64 releases. As for XP x86 apps, that also wont be a problem because again, x86-64 emulation.
That's why he said the detailed video will come after 1 month of rigorous testing , do you guys even watch the video or just directly come to the comment box to whine away?
So I bit the bullet and got a SP11, it is alright I guess. most of the x86 programs "just work", occasionally there's an app saying unable to install, but it was very scarce, I think macrim was the only one. the x86 installation process is exactly the same as a normal PC, you wouldn't tell the difference unless go into task manager. programming is obviously case-by-case, but I expect most active platforms to catch up very soon. VS is native though. Sublime worked fine. some problems: WSA/android studio just crashes, not surprising given that development of WSA had stopped, but what a shame you can not run android apps natively on an arm processor that is built by qualcomm. ram usage pretty high, 12gb/16 during normal work load. some stability issues with third party SSDs, might be surface only
This is Apple’s fault for having confusing and overpriced options in their lineup but for the same $1999 you can get an M3 pro with 18GB of ram. It’s still not quite price comparable to the Qualcomm stuff but would be interesting to see how the different parts of the M3 lineup compare with Qualcomm
Well, of you count the 7-8 free major OS updates (as 7-8 purchased Windows is quite costly, for the same amount of new features), and how well the Macbook keep their price (the 4 years old Macbook M1 still $699, so you can sell about your 4 years old laptop for $600.. which is only $400 lower like the initial $999 purchased value..), so in a long term even the $1999 M3 Pro can be cheaper like the $1400 X Elite (which will lose about the 40% of it's price just in the first year, compared to the 4 years for M1, because once any of the manufacturers start to sell it with discojnted price, all others and even the used ones will lost a lot of their initial price, and they must lower their price too..)
@@TamasKiss-yk4st Except you typically don't have to buy a new Windows license when (and if) they release a new version? People who bought Windows 7 Ultimate were able to upgrade all the way to Windows 10 (maybe 11, not sure), for "free"
Hey ! Great video ! It would be great to have an idea of how well linux distributions would support these new chips ... if it does support them at all.
Worth noting that with an aarch64 platform you’re going to end up compiling your own software for it more often. However in my experience this has been pretty straightforward in most circumstances.
It’s harder to skip sponsorship clips when the creator has taken the fort to clearly outline where they start and where then end and keeping them as brief and to the point as this. Keep it up LTT.
Well, I'm not so hyped seeing that 5 of my current external devices are not supported on arm due to missing drivers, and the vendors are not willing to provide any.
Well that's the magic of Windows Prism. It works like a charm just like Windows regular releases lmao. Joke aside, their is a big difference between Apple moving to Arm and Microsoft. Apple: it's obligatory or you're dead meat, banned for life. Microsoft: I know a lot of developers won't do the switch soon. Oh, jolly good! We can port everything for them! Time for magic Windows Prism sauce with cheerios and lucky charms 😎 The user: Ah fudge I trusted MS and ended up with a overhyped, overpriced laptop that runs like crap sometimes. Those uberexcited for Arm, Qualcomm. I don't see this compability issues be fixed, at least 5 years from now. It's pointless to pay high and have that machine be sometimes good other times mediocre. It's at least 1,200 us. It's just too expensive to be experimental with our hard-working money.
Hilariously, it would probably have a better chance running Linux on the same hardware, especially with the amount of open-source drivers increasing with Windows blunders driving people to Linux
@@MrPtheMan I couldn't agree more, I currently have a Lenovo Yoga 7 with an AMD 6800U that cost me around 900€ and will outperform those Qualcom chips while still being 100% compatible with every soft- and hardware which leaves the increased battery life - which I personally fixed by having a USB-C PD 100W power bank in my backpack.
@@UNSCPILOT I was also very excited about Linux support but Qualcomm has postponed that to a later date. Maybe the community can make it work now that the devices are out there. But for sure this would improve the driver situation, I also have a linux machine where even ancient analogue capture cards from the late 90s run without issues.
@belzebub16 I totally agree. Idk why everyone got the fever to triple down on Arm X Elite. And when I mean everyone, I mean all OEMs going ape crazy with premium parts and MS doing a belly dance. There's a reason why Apple was successful. A) Apple users are Apple users and will always buy Apple things at any cost. B) Apple launched M1 series... almost 5 years ago. x86 performance was much lower than today. Hence, M1 vs Intel was a astronomical gain. Today the landscape of Intel and AMD is so different. This generation are really putting tears to arm performance, gains while being x86. So then, what's the magical advantage of the Qualcomm chip? Sh*&t ton of problems with app compability? Double f that. Like the first comment stated, DRIVERS for arm lol. No manufacturer that already sold their products or older devices will take the time to make new drivers for their external devices. So then, Mr. Snappy Snappy Elite watccha going to do with all tham fast USB ports? You saaaaay 40Gbps lol for mice and keyboard? Ohh jolly Qualcomm, you are magnificent, thanks! C) Lastly, the grand one: Apple is an ecosystem, that means included encoders, decoders, accelerators, on their chips. Just very potent for pro apps that professionals really invest top dollar. Meh, X Elite is just a CPU, not all of that I mentioned above. Hence why I stated earlier that I considered the pricing stupid... It's like grabbing a Lamborghini engine and throwing inside of a Civic and then trying to sell it for 8.3 million! Hey it's a Lamby... kinda, well not really, well close your eyes and pretend you are sitting in one.
To clarify it's not a first generation product, it's more like a 4th or 5th. Before that one there were Snapdragon cx 3, 2 and 1 and Snapdragon 850, but they had very little volume. My location sells 2 models of cx gen 3.
Why is this relevant? The important thing here is that Snapdragon is actually equivalent at lower prices! Ofc it's not ready yet but it's obvious that future generations will be actual buying options... The same way we buy vfm Snapdragon android phones we will be able to buy Snapdragon Windows laptops. That's the highlight here!
@@shsu2020 Because there's a lot of cheerleading about how amazing Qualcomm is for nailing it in one, and how the rough edges can be forgiven because it's the first time they've done this and their second generation is going to be absolute fire now that they've got experience from having chips in the field, and how this proves that the extinction of x86 is immanent not just in the thin-and-light form factor but in gaming laptops/mobile workstations and potentially even desktops. The fact that it's actually a 4th generation product and still not ready yet paints a less exciting picture for Windows-on-ARM, where this isn't an Apple Silicon moment and is just another step on a long and difficult grind where the architectures will coexist for many, many years.
@@jameslake7775 no cheerleading at all. It will need at least 2 years or even more to be usable. But the possibility of laptops that are powerful and efficient at lower prices is really exciting. Remember how embarrassing was the first results from M1?
It's nice, but also a pro and a con. Since the webcam chip is part of the soc, everyone is shipping that. No one could put in a better (or cheaper) webcam. It's just what you get.
I am sceptical, this sounds like the typical situation where one power brings up something big, and the other power is forced to find a quick and easy fix for the problem. this cant be as good as it seems. I wait for a year or two like I did with the M1
The funny thing is, they're not first generation products. Snapdragon has had chips in Windows devices for years at this point. And Windows has existed for ARM since 2012, 2011 if you include Microsoft's announcement of the Surface RT. So we're about 12-13 years into Windows on ARM and it's still buggy and incomplete. But nobody mentions it.
What about the touchscreen issue I've seen noted in some other videos. Performance seems to be *boosted* sometimes when simply touching the screen. Theorized to be something built into the chip architecture from mobile phones.
I want to see someone install linux on those machines and see how many ways worth of battery they get. Linux on ARM has been around for a long time, I guess it should be more robust than Windows on ARM.
I'd say they are about the same on desktop. Linux got older and better support on random embedded boards but these manage peripherals in kernel instead of normally having an on-board system firmware like BIOS/UEFI on PCs that handle low-level tasks. Meanwhile Microsoft among several others steer UEFI, which is what those "Arm PCs" use and is also relatively new to both Windows and Linux.
The only issue I'm aware of is that not all the drivers are upstreamed into the kernel yet, so while the CPU and GPU nominally work, peripherals don't because for whatever reason the USB controller isn't finished yet.
even tested under standard conditions, I am wary of those battery tests if there's a 50W profile present lol. I've been burned so many times by Windows deciding that an important meeting / presentation is a great time to start updating itself and/or drivers, and it just burning through the battery life in an hour until a prompt shows up in the middle of the talk that the battery is low
I’m hoping to start seeing these SoC’s on arm based windows mini pcs in a relatively short time, or mobile on desktop mini-itx boards for power efficient NAS and homelab setups.
Would have been nice to see these snapdragon's tested under Linux as well - as Linux has been supporting more varied CPU options very well for a long time and though I'd expect a bit more manual effort required by the user to really get the best out of it Linux can often do better on battery life and performance vs Windows (however windows tend to come with better out of the box experiences). But this is new 'weird' hardware...
Linux support is not there yet. Need to wait for kernel 6.11, so give it a few months. The nice thing is that Qualcomm themselves are working on it, so when we do get support, it should be solid.
Am I the only one that doesn't think the webcam looks better? Only + that it's higher resolution beside that it looks soo over processed, nearly cartoony, while intel one looks natural
@@Unknown_Genius my eyes are fine, how about yours? I have a low prescription, and only need to wear glasses while driving at night just for safety sakes.
What do you mean by statement "Gone are my sponsor obligations to Qualcom. Now I can say everything I want"? That all what you have said until now were just lies instructed by Qualcom?
I mean it's not like this video was anything better. Unlike what he wants you to believe, he's veeeeery heavily influenced by these companies. HE LITERALLY WAS SPONSORED TWO WEEKS AGO. Even if this video isn't sponsored he wont speak his mind because those b2b relationships are more important than his integrity.
As an IT MSP, I'm all in. Better battery life, better performance and lower cost, most apps for normal business users are native or web based. Total win
Weird Apple comparisons. Battery life shows Snapdragon battery Wh but not Apples (Its 50% more battery on the Snapdragon for 40% more battery life compared to MBA). Weird spin on the video. Saying it has good emulation when they have had a decade of experience getting ARM to Windows and these chips launched with way worse emulation than Apple's Macs did day 1. Acting like the battery life is amazing when in reality it is only comparable to its only rival on the same architecture.. and most reviews I have seen that battery life gets cut in half when running emulated apps, where the competitor only takes a 10% hit during emulation. The GPU is downright terrible when compared to it's only rival on ARM (Basically it goes head to head with the 4 year old M1 Base chips and still manages to lose by a little bit). CPU performance is just bad overall regardless of architecture. Seems like another flop from Qualcomm, yet its being hyped for some reason. Don't see the reason for this to exist. All you need is basic functionality and long battery life?, Mac exists, ARM Chromebooks exist. Need lots of software options? x86 exists. This is just the worst of either end. There is no real advantage to this chip. Consumers need another ARM option for hardware and they need Microsoft to take the compatibility seriously on the software end. Instead both companies just put all their effort into marketing and lies. Also this is the "Elite" series. Are we talking about total laptop build comparisons and based on price?... or are we actually supposed to be looking at the chip performance here, because where is M Pro and M Max? Elite is the highest offering they have for a desktop OS, with Plus underneath it and if the leak is reliable, there is a basic Snapdragon X under the Plus. So X=M, X Plus=M Pro, and X Elite=M Max.
I mostly agree with you, the comparisons are very weird and seem to be influenced by the hype. The numbers just don't seem to add up to the X elite being revolutionary despite TH-camrs insisting they do (it uses the same power as M3 Pro but performs worse than M3 in a lot of cases, not to mention that M3 is last generation at this point). The one use case it seems to be really good for is a decent but relatively power efficient CPU in a Windows laptop, which didn't really exist before.
If they were 800 bucks they’d be competitive in consumer. But the reality is that these are only going to be really bought by corporate buyers who are overpaying even with bulk discounts. Next gen might be better. I love efficiency as much as the next guy, but I am never away from an outlet and needing my computer for more than an hour or two tops, and I travel between continents regularly.
More gains to efficiency than just your maximum time away from a power source - electricity costs money too, so getting the same amount of work done for less, or for most folks idling at a lower power draw can be well worth it even if the machine costs a bit more - ask yourself how much that extra 10-20 Watt power consumption average will cost you over a couple of years the device can be expected to last. Also the models tested are clearly in the more premium thin and light product class that all cost at least in the ballpark of 2-3 times your budget no matter the CPU - when the bargain basement fantastic plastic cheapo laptop with one of these chips is produced then you can compare it to that price range of machine.
@@foldionepapyrus3441let's say on average you save 25 watt hours of energy a day. If used everyday, that's....18.25kwh in two years. You saved $2.56 in energy at 14 cents a kwh. I spend on avg $225 a month for energy, which includes an ev. That's not even a rounding error.
@@foldionepapyrus3441 20W * 8h/day * 365days/year * 3years lifetime * $0.2/kWh * 1kW/1000W = $35 So actually, energy consumption doesn't matter much for laptops in terms of cost.
@@foldionepapyrus3441 Pure coping. Electricity is cheap as dirt, if you would want to save on that - turn off your room fan/AC/stop using microwave first, they eat WAY more then 10-20 watts.
Most of this is surface level testing compared to Just Josh's video. Highly recommend anyone to watch his video on these Qualcomm Snapdragon X1 chips. He gives you details on almost every commonly used applications for regular users, developers on Linux, and content creator softwares with benchmarks. The BEST laptop reviews in my opinion ❤
I'm a windows user, but the specific battery life feature about Mac isn't just the battery life, but the fact that it's almost the same when you close the lid, and open it after several hours. That ain't never gonna happen to windows.
Hey, I've got the Galaxy Book 4 Edge 14, the one with the more powerful Snapdragon X elite. You can literally close the lid and open it the next day and it'll only have like 2 or 3% less battery. It's way better than any other Windows laptop I've ever had.
I want to see you compare the M3 Max MacBook Pro not the slowest M3 MacBook Pro. Not because I assume a massive difference, just curious how different the performance would be between the best Apple has vs the best Qualcomm has to offer. Did you choose the lowest end model of MacBook Pro in order to control for price?
They also had the windows laptops on power saving mode but likely didn't do the same on the Mac. I always recommend Safari to any Mac user, followed by Firefox if on Windows or Linux - never Chrome or any Chromium browser. I find it very hard to believe the MacBook only got 12 hours of very light usage when I get 10+ hours with an M1 Pro (drawing more power, with less efficiency cores) and only 83% battery health.
@@Toma-621 That's fair honestly. Linux has a lot of drawbacks for me compared to macOS, just not nearly as many as Windows - personally I'd rather use wine for apps like photoshop if I couldn't use macOS than suffer through using Windows. But to each their own.
That was not a fair battery comparison, linus. Various things matter like how system and CPU draw and manage power under different work loads. Playing youtube videos and saying Elite lasts more than others is not the right test.
It doesn't matter though. Its hard to do specific tests that are possible on all machines and OSes. A video play is not perfect, but not that bad either to form an opinion.
Why would it not be fair? All of the laptops were doing the same thing. Yes maybe it's not very useful because nobody just watched youtube for 15 hours, but it gives you a rough estimate.
I bought the Samsung Galaxy Book4 Edge with the new processor. Performance and battery life are phenomenal. I ended up returning the laptop though because prism/emulation just isn't there yet. So many x86 apps still won't work on it. It's unfortunate because these chips are Awesome!
for the next couple of months when comparing ARM and X86, please include the power draw in benchmarks. i feel like this is ARM's only reason to exist. i also would've liked to see the wattage in the overlays in your livestream.
Yeah I'd much rather take the "bad" Intel webcam over the vaseline filter. Neither of them are great, but the Qualcomm one just makes you look like a horrifying human doll.
1:58 You can take the XPS there and mod the bios to unlock undervolting, then undervolt the CPU and IGPU along with frequency tuning to get like 4x the battery no joke. I'd like to see if the arm chips can compare to that since the intel chips just seem to go for benchmark scores instead of actual battery. I get 20 hours of battery in browser and a few other apps like obsidian with the mods (on not even the highest capacity battery)
Yeah, you can shave A LOT of volts from intel processors, i just tired it and 0.15 undervolt is working no issues (from base voltage), and that AFTER rasing the clocks.
I am actually considering these chips for a linux laptop: 1. Qualcomm announced they have upstreamed the drivers 2. Due the open ecosystem a lot of our applications are already ported to arm 3. We have better and faster x86 emulators
Linus these laptops are anything but first generation and anything but "largely flawless". Microsoft has been trying to make Windows on ARM a thing since 2011 and they still are under cooked and not market ready. If Apple released something in this poor of a state back in 2020 with their M1 chips people would've gotten out their pitchforks and screamed bloody murder. There still is a very long way to go before these things are actually good. But why would I expect an actually critical look at something that you've been previously sponsored to showcase, can't risk those b2b relations am I right?.
@@ootheboss9307 did you have problems where the keyboard would just stop working or you couldn't adjust screen brightness? Yes M1 wasn't perfect but unlike M$ it was Apple's first Arm based MacOS version. And the fact they successfully moved architectures before gave more credence to the fact they'll pull it off successfully. Microsoft had 13 years to make it usable (Surface RT was announced in 2011) and here we are over a decade later, still having issues like random black screens and non functioning keyboards. M$ still hasn't managed to roll out their Snapdragon dev kits, something that should've happened BEFORE the release of these products, Apple has understood this for decades at this point. The fact that Lunar Lake dev kits are already out in the wild while these Snapdragon ones aren't is very telling. The efficiency of these laptops is greatly overstated, when you do actual work on your laptop the battery barely lasts longer than intel or AMD and is miles behind Apple's laptops. And with all three having new chips right around the corner these snapdragon laptops will be nothing but over hyped rubbish, like previous Windows attempts to make Windows on ARM actually decent. Arm isn't a magic bullet to get rid of issues, it's simply an opportunity to improve the state of the art for laptops. Microsoft pretends like its issues with customers is related to x86 when in reality it's their own incompetence in handling Windows as a platform, regardless of whether it's on x86 or ARM. Wendell fro Level1 Techs is very right about this.
I've had a Samsung Galaxy Book 5S with a Qualcomm 8cx for a few years now. Even that old version was impressive for what it was. It could run Steam games surprisingly well as long as I avoided FPS games, which is pretty easy for me since I don't play them much. I'm looking forward to getting one of these new Snapdragon X laptops.
The Handbrake example, is a poor one for the Macintosh M series. To my knowledge handbrake is not optimized to use the built-in hardware encoders... Or the choice of encoding is certainly not use the hardware encoder on the M2.
They're using x264 software encoding on specific presets for everything which keeps things consistent vs comparing hardware encoders which are way faster but can output very different quality and bitrates because they're all different. It's been a while since I used Handbrake but I think there's an option for using hardware encoding via "videotoolbox" on Mac but then you can't really make a good comparison anymore. It'll be interesting to see results for M3 Pro or even the new M4 when that chip finally comes to Mac.
I almost didn't click this video because Linus was in the middle and not on the left of the thumbnail.
Edit: to worst part is, I didn't watch the video when I commented. I came back to it a few hours later.
Underrated comment
🤣🤣🤣
I did click on it because he was in the middle and not the left. #LinusMiddleGang rise up
Well played
I almost didnt click this video because the handcuffs were fluffy
Please show benchmarks while
1. Plugged + High Performance Profile
2. Unplugged + High Performance Profile
3. Unplugged + Low Power Mode
Yes! Unplugged performance is so important
Do it yourself
@@romangeneral23 haha
It arm based processor it similar to apple m series chip so technically it will work same as m series chips with long battery life with high preformance
@@kamlesh_gdNot necessarily
This has been a weird two weeks as these reviews have started coming out.
I pre-ordered an X1E Surface Laptop soon after the announcement because I actually work at a chip company that makes ARM processors, so I was excited to experiment with it.
It came in on June 18th and I've been running this machine into the ground trying different applications and use cases out. It's been so much fun getting to find out all of these quirks before reviews are out.
I definitely hope situations like this don't happen in the future with little to no review embargo periods before other high-profile tech releases, but in this specific case, it was a lot of fun for me.
What are your thoughts on this laptop compared to x86 ones?
Would love to hear your thoughts comparing this to the regular x86.
You have thoughts?
thought (4) ?
It's a stellar laptop. It's my go-to for watching videos in my reading nook and I've tested a bunch of software that I use regularly on it.
I also let my friend borrow it during our D&D sessions (we play on FoundryVTT which has some visual effects that can cause weaker hardware to chug) and he said it was pretty much perfect.
I've taken it with me on a trip already where it lost pretty much no battery while asleep in my bag and in general the battery has been far better than any Windows laptop I've ever owned and it only really gets hot while gaming in my testing.
All of the software I've run so far has worked, if not with a few minor visual glitches here and there.
I'm using ARM64 compiled software when possible (like Photoshop and VScode), but x86 emulation has worked perfectly fine and the performance hit isn't really noticeable. I really wish my company would let me use this laptop for work, because it's got plenty of CPU compute for using VScode for remote development with plenty of tabs and documentation open.
The NPU only really does anything when you're explicitly using AI features (at least for now without Windows Recall), but the AI features I have used have been fun to play with and super snappy, but the only genuinely useful one I think is the camera options like the background blur which looks much better than standard software background blur, and the eye contact effect that makes it look like you're looking directly into the lense instead of below as you would in a standard video call, and it's much more subtle and far less creepy than the Nvidia solution in the past.
Games run ok and this is the only circumstance in which I've had compatibility issues. There's also a significant performance increase while plugged in (like 1.5-2x) so that's definitely recommended. Some games like Hellblade: Senuas Saga and Monster Hunter World wouldn't launch, but everything else I've tried has. Doom 2016 ran great on Medium settings and the resolution scaled down to 1080p, but changing the render resolution scale in game caused unplayable artifacting. Baldur's Gate 3 runs great at Medium or Low considering the hardware (especially when in underground dungeons), but definitely struggles in Act 3 with all the NPCs and geometry in the city, and AMD FSR is pretty much required, but looks fine in Performance mode. Rocket League runs at 70-80 fps at max settings with full resolution, but you might need to drop the settings to get to 120 Hz like some of these laptops support.
Basically, the CPU is awesome and general usability is top notch.
The GPU is pretty weak and has driver issues in games, but that's to be expected on a new on-die GPU at this price point.
The NPU doesn't have much in the way of applications yet, so it's hard to evaluate performance there.
Software compatibility is very good, but do expect some hiccups at least for now. Game compatibility is hit or miss and performance is a bit all over the place right now.
The audio recording difference was insane!
Was curious with how it compared to the M processors which of course wasn't included lol
@@25rangebomb Good idea! I am sure they just forgot it though
There are plenty of videos showing that , watch those shill@@25rangebomb
It's mostly rather hard to do apples to apples @@25rangebomb
Absolutely! While the image certainly was better, the increased microphone clarity was much more impressive.
Given the performance and the extended battery life, Snapdragon X feels like a good pitch to university students. I'd buy it a few generations down the line when they refine the other aspects of the chip.
Right up until some random program that your prof needs you to install for homework or a lab doesn't run
@@anivicuno9473 Or you need to run an X86 VM. Basically unusable on Apple Silicon.
@@anivicuno9473matlab crashing all the time
@@anivicuno9473Honestly you could manage something like that with a vm. Now that you're saying it, i wonder if vm software and windows built-in hyper-v work fine on these cpus.
@@anivicuno9473This exactly. Good luck running solidworks or matlab on a cell phone processor...
I learn more from the comments than the video. Thanks guys.
he's gotta protect those business relations with those large corporations so they keep him in the loop. LTT reviews have become a very unreliable source of information.
Same here.
Finally i can play clash of clans on it
Did someone say hog rider?
Finally
@@wearefamily6820
Hog rideeeeeer!
I mean you already could with the Google Play pc app, it even has dedicated keybinds so it's not just touch/mouse
Boycott Supercell. Greedy enemies of their creators and players.
Apparently, none of the major music production software can run on ARM, despite the promise of "full compatibility" via a translation layer.
Does the software have DRM that uses kernel-mode drivers?
@@soundspark ASIO driver is not supported for ARM anyway, the translation layer is lack support for this driver too. So yeah, only thing to do right now is just wait or get laptop using x86_64 or Macbook. Which is really bummer
@@HorrorTroll2611 I hope they rework the audio driver all together because right now it's a mess, 6 different API and none of them work, and the one that work lacking in features
Logic does.
Lol. I can run Logic Pro, Studio One, Pro Tools, and Reaper on ARM already. No problem on latency. Where did you get that fake information.
Stress tests really show the weakness of these processors. They are very snappy when cold but once they are saturated with heat from a stress test then the clock speeds plummet. Even the x84 variant drops down to clock speed of about 2.2GHz under a full load stress test. Thermals are about 78-80C in these conditions and for some reason the fans don’t really spin up fast when they reach these temps. If it is a thermal restriction then hopefully a bios patch to allow for faster fan speeds might fix it. But that clock speed drop really hurts its performance.
Same problem as the M chips in apple laptops.
@@username8644 only on the MacBook Air as that no no cooling fan. The MBP’s do reach throttling limits but they don’t actually throttle noticeably.You might lose 100-200MHz but definitely not the 1-1.2GHz drop you find with these Qualcomm processors.
@@peterstainburn2871 Maybe. The main issue with these laptops though that nobody is talking about, is that ram and drive are soldered, and nobody is offering more than 16gb of ram and 1tb drive.
Extended high speed will also have a huge battery life cost, I wonder if that was part of why they didn't bother.
On my M2 Pro MacBook Pro, I have a utility called Macs Fan Control which allows me to set up my own fan curve -- or just turn the fans on full blast whenever I want. I have it set to go full blast when the CPU hits 65 degrees C. I do not get the throttling issues that others have reported. I wonder if there is (or soon will be) a fan-control utility that lets you set up your own fan curve on these Snapdragon laptops.
I tried the Samsung Book4 Edge 14" and, by god, we did not have the same experience with the Qualcomm processors. The Samsung Book4 Edge is loaded with x86/x64 Samsung bloatware and the battery life is absolutely terrible -- 8 hours at most, but regularly just getting 6 hours. Throw that one at the labs team, I'll be standing by with popcorn.
Hmm the Samsung one has the xe80 compared to the xe78 in the video, maybe their boosting takes a lot of power? Plus Samsung bloatshit
I don't understand how people can like Samsung products. That has been my experience on every product of theirs I've ever had.
You said it, bloatware sucks
@@DanKaschel I believe that their are mainly liked for their mobile products (phones, watches, tablets). Which products did you have experience with?
@@prekatori A galaxy phone, the classic watch, and a tablet I only kept for a week. The phone and watch were both great pieces of hardware, but their bloatware totally ruined my experience. Even now my watch wants me to pay with Samsung pay, back up with their back up software, use their special store, etc. I hate their attempt to lure you into their shitty little ecosystem.
... sorry I have a lot of feelings about this haha
Can't believe you didn't do performance tests on battery... where's the heavier workloads unplugged? where's the battery life on heavier workloads? where's performance per watt comparisons? I'm seeing Qualcomm pushing higher power budgets, but throttle more aggressively in battery compared to M chips. The same performance plugged/unplugged is a huge deal on current macbooks. You won't be able to determine that from a battery test that is only a youtube playback
Me is that him and Qualcomm keep putting BG3 in the benchmarks, like that game is like Doom 2016 and can run well on a potato. Put Cyberpunk or GTA V in there even in its lowest settings, and it is a whole different story, as I have seen. Other PCs at this price point game much better if your looking for that even with integrated graphics. Were talking Mobile RTX 4050s at this price point, which to me makes it a bit of a hard sell, even if the battery life is good.
Tech reviewers do their audience a huge disservice when they don’t test both plugged in and unplugged, especially when there’s literally 20+ years of experience that shows it makes a significant difference on Windows laptops.
Watch some other videos , there are many at this point , on battery the performance is like 98% of plugged in performance
I had the Samsung Galaxy Book Edge4 14" model, and I can tell you that under a gaming load it will go all the way down to under two hours like every other laptop. The performance at that power level is still impressive, but I really struggle with fine-tuning that kind of thing so for my actual daily use, it'll be about the same battery life as my old laptop so I really don't see the point in upgrading.
@@JonaB03 samsung laptops are trash , internals are trash , they overheat , you don't put rocket engines in scooter and expect eacape velocity
One thing a lot of people don't mention is the standby time. I have the Lenovo Slim 7x, which I received a week ago. When I took it out of the box, I charged it to 100%. I have not had to charge it this whole week. It is still at 35% at the last time I suspended it, and it may use 1-3% battery overnight.
So not only is it insanely efficient during use, but you also don't need to worry about it dying on you while you are away. I'll never use an x86 laptop again. In fact, I wish there was an ARM desktop chip with discrete GPU support. Thanks Qualcomm!
Ampere Altras. ;)
the terrible sleep is thanks to windows modern standby (S0). i am lucky to own a modern laptop that does classic standby (S3), and the loss of charge while sleeping is very small and predictable.
@@victortitov1740 This Snapdragon is in the connected standby and still getting great battery life.
I'm getting a similar experience with mine. I haven't run into any compatability issues so far, but I'm a coder and not a youtuber so I have zero interest in most of the things they want to test on them.
how is the performance on battery? does running intensive tasks like rendering cut the battery by a ton?
Now I need someone to test Linux on that thing.
If you install Linux with WSL on Windows, the performance is 3x compared to installed Windows. I swear, it's true. (th-cam.com/video/nwGcv5Dxp8o/w-d-xo.html)
@@FrankignoEstudiantesLinux as in not WSL I guess, also curious to know how major distros run on bare metal
Considering Raspberry PI's (which is the SBC standard at this point) have ARM-based CPUs, I've expect some Linux distros to behave very well and even have a large support on most common packages.
i smell this comment mile away
@@FrankignoEstudiantes not sure if I don't understand what you're trying to say or you don't understand what you're saying. WSL is just a Hyper-V virtual machine with a modified and pre-loaded kernel (the same for every distro) and some (heavy) modifications to other subsystems (filesystem, audio, graphics) to allow for things like achieving better filesystem performance under WSL, enabling graphical applications via FreeRDP and enabling audio playback via PulseAudio, besides many other things of course.
WSL will never be faster than the underlying Windows system, nor will it be faster than a native Linux install. WSL has to do some (or many?) translations between the WSL kernel and the Windows kernel, thus requiring much more CPU cycles than direct communication to the native Windows kernel. Even if raw speeds (data transfers for example) within WSL compared to native Windows suggests that WSL might be faster in this regard, there is still more CPU cycles involved and thereforce latency is higher.
And compared to native Linux.... same thing. No kernel-to-kernel translations needed and thus no latency overhead.
where is the ultimate battery test ? :)
Idk
😂😂😂😂😂
I need the Linux gaming test too! I really want to see the comparisons honestly. x86 Linux/Windows Gaming vs ARM Windows Gaming, even though these PC's really aren't gaming machines... they have a gaming machine price! Edit: Like I said x86 Linux vs ARM Windows, both use compatibility layers, both have Anti Cheat issues, which games better?
Chrome tabs
On your channel :)
I'd just like to say, the way you've changed graphs, made it a lot easier to understand them. I really like the little white pulsing dots, showing the results of the laptops you were talking about. I also like the colours of the FPS graphs as it took your vision to the 1% lows which are the most important thing but still had the average shown in muted grey.
I really liked the white dots as well!
Yeah, easier to see. Would be more better if they use a different column or color for the things they are testing, like the wattage difference in the snapdragon laptops graph.
if they actually appeared on the screen for more than a frame, it would be better.
Digital Foundry tested games in general & games suggested for the laptop, the results are was below Steam Deck.
Yeah it’s so terrible for games.
Even higher end M series SOC’s are decent at running games natively.
They have no games for snapdragon arm as of now , all fames are x86 based emulated by snapdragon chip , atleast give it some time to catch up bruh!
@@GlobalWave1 Based on the specs... the snapdragon GPU isn't bad. The issue most likely is immature drivers and it reminds me a lot of Intel ARC's growing pains.
@@somnambulist6636 Or perhaps look at emulators. I know Dolphin has an ARM64 JIT that works with Windows and Mac.
@@AwperationZ 2 and a half years late snapdragon x, Intel been pumping out drivers this long and than you release a product that wasn't done cooking in the oven shunn.
I went with the surface pro 11/ X Plus combo, and I've had a lot of fun with it, trying out different applications and games that I would expect to struggle. I've been pleasantly surprised with how much stuff "just works". Don't get me wrong, there have been glitches, but battery life has been excellent, and sleep/wake is absolutely flawless so far.
How’s the system responsiveness when you’ve got lots of tabs open and the RAM loaded up? One of my main issues with my SP9 is that my tab hoarding can bog down the machine pretty quick, which I’ve experienced a lot less of on an M2 MacBook Air
@@goinginzane probably depends as well on what browser you use, as if you use chrome, they have awful ram management. Furthermore, it tends to just be worse performance wise.
@@Dwivil agreed, it's one of the main reasons I've switched to Edge in addition to other useful features like vertical tabs, workspaces, etc. With its browser market share dominance, Google has really fallen asleep at the wheel while other browsers like Edge, Brave, Arc, etc have been innovating.
@@goinginzane I have gotten it to 14/16gb ram usage, with chrome running a twitch stream on a secondary monitor, running Rufus to create a Bootable usb, and all the normal programs you'd have open alongside, as well as about a dozen tabs in another chrome window looking through ebay listings, and it hasn't skipped a beat. Cpu usage was hovering around 10-15% with spikes up to 70 from time to time. HWmonitor showed cool temps too, like 55C on the hottest core. In another session, i loaded up snowrunner, and it grabbed my cloud save, and I picked up the campaign right where I left off, at native resolution, low details, and playable frame rate. I am pleasantly surprised.
I did the same and swapped in a 1 TB SSD. So far, so good. I had to work away from an outlet for most of the day Monday unexpectedly. I enabled Energy Saver (which is still peppy for office productivity work) and set the display refresh rate to 60 Hz. I had that "wait, what ... the battery is still at 100%?!" experience I've only had on MacBooks in the past.
Impressive test results on Snapdragon X Elite chips. Love that we're seeing more competition in the processors market especially in terms of power efficiency and performance.
I had quite an interesting issue with Snapdragon/other mobile chip laptops is that some device drivers have issues with them because of the ARM based system. The laptops themselves are great, but when our shop needed to download label printer software onto them, we found that those printers weren't compatible with ARM. That was a strange 3 hour troubleshooting process because I had no prior experience with ARM, and didnt realize standard x64/x86 drivers just didnt work on them.
don't leave us hanging, how did you solve it
Drivers will almost certainly be the compatibility issue with the longest tail
Hopefully Microsoft can do some of the heavy lifting to get better compatibility. If there's one thing you can't criticize Microsoft with x86 windows it's that.
@@aronseptianto8142 Technically, you could try emulating x86 and then running a second windows instance on it and passing through the USB device, but that would be extremely slow and power hungry. Other than that, the only solution is using an x86 device, or waiting for an ARM driver.
Yeah, I'm not sure these are ready for current businesses. Most run some kind of label or check printer in my experience (been a computer tech/consultant for 29 years) and I don't think they can just emulate drivers since they talk to the hardware directly in some cases.
What TDP were the AMD and Intel chips running at? Especially since you used the lower powered 7840u and not the H or HS with comparable power draw.
It is a great start.
After this video though there are a few things I have:
1. Windows: People touch on it but do not cover it enough. The first iterations of ARM based WIndows as just terrible. The complete re-do MS took to make this current iteration work better was intensive and in a relatively short time frame. The team who worked on this have done AN AMAZING JOB. Work to do but its good!
2. Discussions not covered still are how ARM can work with GPU's. Not their built in or mobile stuff. If ARM is going to be the "future"? then how is a desktop setup looking. How external upgradable memory is going to work, how the GPU is going to work, will current GPU's and their setup going to work and if not what will need to change here.
I really want to see someone like LTT tackle these things in more depth on a video.
the ARM architecture itself does not prevent discrete GPUs from working. In fact, several server and workstation ARM CPUs explicitly support Nvidia enterprise GPUs, and AMD GPU support is being worked on by the Linux community. But none of these drivers have made their way to Windows so far.
It partially also depends on how good Qualcomm's PCIe implementation is, because some ARM manufacturers (e.g. Broadcom) half ass their PCIe controllers on the assumption that the feature would only ever be used for storage, networking, or USB ports.
As for ram upgradability, the clambering to go faster is driving soldered ram as the "norm" for the future. I wouldn't be surprised if I'm 10 years all your desktop is a mainboard,power supply, cooler, ssd and video card. With the rtx 1million (big number has to go up. And they will certainly rebrand by then) taking up most of the space in an otherwise shockingly small build.
AMD and Intel also have different wattage modes if you show wattage on snapdragon you should do the same on the other cpus!
These were rushed out of the oven and were not fully cooked, They are gonna flop onto the floor because if you buy this over priced laptop, You're an early adopter 😂
True, although most likely Intel's performance wouldn't catch up to the lower nanometre process used by amd and snapdragon at lower wattage
Kinda surprised no compilation benchmarks, would love to use one of these on gentoo will full package support and great battery life
I can imagine the battery life would be insane considering the more lightweight nature of Linux _especially_ Gentoo.
Hopefully Linux on ARM gets more battery optimization.
depends on qualcomm support. my amd laptop cores used to only throttle down to 1200Mhz when inactive. after smth amd pushed in 6.1, only then they would throttle to 400Mhz
There won’t be proper Linux support for months, sadly. You can search for Qualcomm’s announced support roadmap but it looks like they are targeting 6.11 for upstreaming of most things.
If I don't compile gentoo each month, I feel bad
@@marschallblucher6197 The truth is, Linux always has worse battery life out of the box.
Once again half-baked misleading conclusions about poorly tested hardware. Well done Linus!
It honestly makes me concerned about gaming. I would LOVE to see a Mac Pro with an integrated GPU that works alongside the M chip. The same goes for any PC or Mac that's using an SoC. I'd rather sacrifice some battery for extra performance. But I know that some of these chips aren't designed with a graphics card in mind, they're kinda like all-in-ones, but could it theoretically be possible for them to support "real" GPUs? Also, the lack of upgradability makes me extremely worried about future-proof laptops. I think we're (kinda) screwed. (PLEASE PROVE ME WRONG)
I genuinely wish you're wrong about the upgradability aspect. But it seems like the future is buying product, then buying new product instead of upgrading your old one. I love how all these companies talk so much about the environment all while making some of the most perishable products to date. Apple is the worst in this regard. That 2023 Mac Pro, a waste of silicon and sooner rather than later to be the bane of landfills
Theoretically a dGPU should be no problem. There are 8 PCIe Gen4 lanes that are currently unused on the chip, and they wouldn't just put those on there if there was no plan to use them in the future XD. As for upgradeability with any luck CAMM will catch on, but we'll see, it isn't looking great. -AC
@@LinusTechTips Thanks for the info!
"It honestly makes me concerned about gaming", if you want to game to anything that's not Windows x86 you will always suffer of some forms of compatibility issues or small or big performance loss due to compatibility layers. It's the same thing for gaming on Linux. Microsoft doesn't want people to go away from Windows, and doesn't want to give away directx either, to be able to run games natively on ARM you would need a build compiled specific for ARM and gaming companies are not even interested on doing what they consider a porting. That's not an Apple move, Microsoft doesn't make Hardware, qualcomm does, the interest is from Qualcomm not Microsoft. Gaming is and will always be on Windows x86 like it has been in the last 20 years, Apple switched to ARM instantly, they forced other companies to rebuild the software for ARM bc the penalty would have been to not sell the software on millions of laptops, Microsoft isn't forcing anyone and does not have a monopoly on the hardware like Apple has either, so don't see ARM PCs as PCs where you can play, it will never be the case
I doubt that Windows will be as prominent in the future as it was in the past 20 years, but yeah it will take time to dethrone it. As soon as governments will realize the benefits of open source and FOSS will start to emerge in education it will be a downfall.
I believe time is against them.
I don’t know, it still looks like the amazing AMD 7840U chip that’s now over a year old is still amazing
Yeah the 7840U at 28w is getting higher performance per watt than the Snapdragon at 35w..
Battery performance?
@@incandescentwithragekeep in mind, This is just the first gen cpu Qualcomm has launched for the Pc department compared to Various generation of Amd power efficient Cpus.
@@OrRaino Of course. I have no loyalty to any brand, but can you remember 1st gen Ryzen?
It wasn't released being a little bit worse than Intel.
My experience with ARM is it's great for high efficiency and low performance.
You push the clocks and power envelope.. it's neither cheaper nor faster than x86.
They succeed in the cloud space with a ridiculous high number of low performance cores per socket. That's their market.
Efficient when slow, trailing edge spec but on modern process nodes.
@@shalokshalomin most amd mobile chips, the sweet spot is between 15-30 watt in cpu perf. as long as the vendor able to supply ~25 watt in battery, they will not suffer battery peformance,
Love how LTT says good things for the compatibility in this video, in the live stream a couple of days later they try live and almost everything fails... Well done LTT /facepalm
Another low effort laptop review from the boys at LTT. Disappointing
It feels like the lab is more of a cash cow than an actual lab where things get tested. There are only a few actually good laptop reviewers and they're none of the large channels. Josh is good, Alex Ziskind is very good (especially if you are a software engineer and aren't interested in the basic run of the mill photoshop tests.
Thew. I thought I was in some alternate reality. Reviewers and one coworker were finding problems running there applications. This isn't as bad as Windows on ARM was before, but still dealbreakers. They were sponsored too by Qualcomm within this month when they made a video about Snapdragon processors, so how could they not be biased? They would be inclined to keep their sponsor happy.
@@roccociccone597If you genuinely think the lab is profitable, you're you're actually quite dumb, The building alone probably cost close to as much as LTT has made off the majority of their sponsorships over the last 6 months. That is to say nothing about all of the people and equipment they have in that building, none of this stuff is cheap. Also, laptops are probably the hardest technology to test and that's ignoring software compatibility because you're using weird laptops, quite frankly, I think they should have just put a disclaimer at the beginning of this video saying you should only buy these laptops if you want to risk early adopter pain
He's literally sponsored by Qualcomm. This is deceptive sponsored garbage
Good video, though I would've loved to see Linux tested on these laptops. x86, especially Intel, has historically had great Linux compatibility ootb, but thanks to boards like the raspberry pi, arm has worked on Linux well for ages. You can even do some basic stuff through box86 and box64. Would love to see some Linux nerd representation on these laptops :)
Theres also FEX and i think a few more x86 emulators
+1 Would love to just see if you can easily get linux running with audio/networking/webcam functioning out of the box.
Arm CPU support is typically good on Linux, but ARM systems typically require proprietary firmware. X86 PCs were designed with much more open architecture. It will depend on the CPU and system designers to either create the firmware, or have it added to the kernel.
@@Rigel_Z It would be nice, but the answer is almost certainly that it will not be easy. It depends on whether Qualcomm and Asus care enough.
@@tschorsch But considering it's Qualcomm, it shouldn't be too dissimilar to their other chips which all run Linux kernels.
the naming scheme is terrible tho, and the fact they launched with the lowest version of solid as well as people won't think "wow this X1E-78-100 is underperforming!" but instead just "damn the X Elite is worse than marketed!"
Not sure the "Day 1" claims are upheld by the graphics performance across most benchmarks & applications. Lots of work to do.
10:21 That mic quality difference is game changing
Can genuinely see some business people getting these just for better call quality + the better battery life
Though I wonder if the modern standby issue still exists on these laptops...
Nope. I have one. The standby drain is like 1-3% per night, consistently. It's incredible for a Windows laptop.
At 3:24, your test methodology mentions Windows Battery Saver Mode. I’m assuming this means the MacBooks were using low power mode as well?
Edit: Also, thinking about it later: MacOS is still likely doing initial background checks and syncing with iCloud/indexing, since you mention it was pulled from the shelf first time being used
Ya, more clarity is needed
@@TalesOfWar I won’t claim that they’re Apple haters or have an anti-Apple agenda, but it’s just something they pay less attention to and are less educated about.
That’s fine. Nobody *has* to pay attention to Apple. The only problem is that when you include them in videos, you project absolute confidence. Which is where Linus goes on rants that are completely inaccurate from a factual perspective (such as complaining about Apple taking so long to introduce window tiling when Microsoft held a patent on it).
Another example of this confidence beyond knowledge is his paint rant. Which while having some good points and complaints, anyone in the paint industry laughs at some of his complete backwards understanding of some things (like thinking satin is flatter than eggshell)
yeah and they also didn't include the macs in the audio/webcam test, although the test itself also isn't too helpful since they all most likely use different mics and webcams anyway and therefore aren't directly comparable
Yeah, I was wondering the same. The battery life results for the MacBooks don’t quite add up.
@@jbnelson >Which is where Linus goes on rants that are completely inaccurate from a factual perspective (such as complaining about Apple taking so long to introduce window tiling when Microsoft held a patent on it).
Yeah that's my main complaint I have with Linus, he doesn't know much about Macs, at all, seems like he refuses to learn (which fair, he's a busy dude) but then don't go ranting about them if you don't know stuff about them, they're just mostly Windows users who now have to put apple in these videos because their offerings don't suck balls anymore, but haven't taken the time to educate themselves properly it seems, although this is an assumption I have, not sure if this is factual or not, so apologies if I'm wrong
i have a Dell XPS 13 here for testing. the fact that I changed the power on windows to performance, and unplugged the power cord and the performance didnt change, was amazing. a bunch of tests went so well, so i can recommend the xps 13 (apart from pricing maybe).
Price?
Why do most Battery tests revolve in testing TH-cam playback?
That's when the iGPU kicks in.
You can have tests that try real work that create CPU spikes:
1. Opening files/launching programs.
2. Crunching some numbers/compiling/programming.
These will give you better battery tests.
Because that’s the most common usecase.
@@CanIHasThisName it really isn't though. It's actually because if you heavily load these chips with actual work they last about as long as the intel and AMD chips. But that would mean that Qualcom lied about their numbers. So better hide it behind a futile test.
@@roccociccone597 You might want to get a reality check on that.
A few other channels have gotten much different results than here. Specifically Just Josh found the Qualcomm chips underperforming both Intel and AMD when manufacturers put different chips in the same chassis. He didn’t have a Vivobook S15 so it’s conceivable that this specific model of laptop just does a better job with the chip. He had models with both the 78 and 80 SKU and found that there is no difference after normalizing power consumption.
How's the battery life on linux for the snapdragon? Software/kernel level power management can make quite a big difference os to os.
Lacking AVX2 is pretty rough. CPU features-wise, that’s like having a pre-Sandy Bridge i7. That’s well over a decade ago.
Note that the lacking AVX2 support applies only to *emulated* apps. Natively, NEON rocks.
@@Workaholic42 So what does that mean? is NEON functionally equivalent to AVX2? If that's true, would that mean they just haven't implemented it in emulation yet? Can NEON run fast enough to handle AVX workloads with the added overhead of translation?
@@CoreyKearney that means that ARM processors also have SIMD instructions that are even more modern than AVX2 and work just fine (see also Apple M processors). It is certainly possible to emulate AVX2, but obviously difficult - only Microsoft knows the details 😉
i never thought i'd see a modern cpu get blown out of the water by my old 4790k
@@Cooe. NEON already has wide support with both autovectorizers and manually vectorized libraries, what are you talking about?
I’d be curious to see if a Snapdragon X board is made for Framework laptops.
Yeah I'd love to see an X elite main board for framework. With that and the new risc v board we'd have every major ISA on framework
Someone could design and build one, but this one is almost certainly designed with a different layout.
I highly doubt it. Qualcomm is in for the big bucks and Framework wouldn’t take this kind of exclusivity deal
Upcoming AMD CPU will be lot more powerful and also bit more efficient than last gen AMD
@@TednTin Is it though? Linus said they are the same, but different, but still the same. AI is the only difference, mainly.
U MIGHT NOT SEE THIS COMMENT AND MAYBE U'RE NOT THE PERSON TO ASK THIS, BUT CAN U TEST THOSE PROCESSORS WITH LINUX, PLEASE?
I agree. I would love to see some Lennox performance with it. I know that endeavor OS actually does have a arm port.
That there are people at Microsoft who thought Recall was a good idea to begin with feels outright disturbing.
We are getting closer and closer to cyber dystopia.
Lol it was people at the NSA, not Microsoft. M$ just sold the idea
It was a good idea, many people said they would use it (including Linus). And it's inevitable, sooner or later not just Windows but all other modern OSes will have it.
@@MadafakinRioThe idea of a feature that brings back the stuff you did in the past on your computer is a good idea....
If it wasn't implemented in the most awful way possible. Other modern OSs would never adopt such a thing because the people who use those OSs care a lot more about their privacy.
@@marschallblucher6197 it wasn't even that bad. People were overreacting as usual. It just should have been encrypted from the start (as they said it would) and it would have been fine.
As for the other OSes, just you wait bro. I'm certain Google is already cooking something up for Android since AI is their whole thing now. And even apple jumped on the bandwagon, and a similar 3rd party app already exists on macos. You are heavily overestimating how much people care about privacy. If you do, that's great, but most don't that much. It's not as if the current OSes are ultra secure and private, yet everyone is using them.
10:07 the sequin pillow 😂😂 top tier set prop right there. Great review y’all!
Watching this after Josh new release video this hour. Ooo..
I watched this before watching Josh's video days ago. I think Josh's criticisms are exaggerated.
@@AA-db9cb i don't think you watched Josh's video, or watched it attentively. It was posted a day ago, not "days ago".
@@randy89555 i was referring to linus' video.
Yeah it's disappointing especially after LTT video from 4 days ago boasting about the new lab!
@@randy89555 I was talking about Linus' video.
I wanna see a factorio benchmark with those arm chips
Its gone be shit... Even the creator has done some testing on arm and uuu its not good...
factorio is CPU bottleneck right? it's been a moment...
pretty sure the chip more than matches up to your factorio skills... 😉
@@PrograError yes happens slowly over time wen the base get bigger and bigger and bigger and bigger (THE FACTORY MUST GROW)
3:25 little weird that they turned on Windows' battery saver mode but not Low Power Mode on mac?
Apple from Temu needs all the cards stacked in its favor, or it will fall apart quite fast.
Yeah, that seems like a huge oversight. 10 hrs for the M2 MacBook Air is about half what I would expect, given my usage of it.
Here goes Linus again, getting himself in trouble. Ego sometimes trumps logic...
3:24 this graph should have included the watt/hour rated capacity of the battery of each laptop, not just the chip in it, that makes it so much more clear why it lasted that long, even for those with the same chip
like the asus is rated for 75wh, and the m3 14' is 69wh, that could very much be the extra 2h showed in this graph lol, not just the chip, show me how much watts those chips use too, not just ok bigger battery lasted lounger "duh"
That doesn't matter. What matters is how long the laptop can last on battery power.
@@techzolutethe video is about chips, not specific laptops, so it would make more sense to directly compare power consumption.
The HP Omnibook X has a 59Wh battery (this info was in the video) so it isn't just "bigger battery lasted longer". Besides battery life typically the display efficiency and BOIS tuning have just as much of an impact as the outright capacity. -AC
@@LinusTechTips nice you got my point the graph sounds even more impressive knowing the HP laptop at the top has the same size battery as the MacBook, thanks for the reply
@@LinusTechTips It should have been included nevertheless, it's a big enough factor that it should obviously be included in a chart.
I want to see you guys switch for a month
For a laptop this is very exciting but what I'm more excited about is the potential to rival x86 handhelds such as the steam deck with not only a better form factor but also with a far better battery life but we'll have to wait and see how far they go with this x86 translation.
This looks like marketting fluff, lacks clarity and details. Things I want to see -
1) How does the x86 programs work, that doesn't have a native arm port take notepad++, sublime editor, Visual studio. Even eclipse IDE is not available on ARM AFAIK
2) How does the productivity tools like MS office work compared to the x64 counterparts
3) Does it support installing old xp era 32 bit applications natively?
4) Obviously managers and CXOs can switch to windows on arm, what are the challanges
5) A simple demo of installing some of the x86 / x64 applications on windows ARM edition
seriously LTT, what gives? What's the point of just repeating marking talking points, it would hardly take 1 day to do few of these tests
I have personally tried Windows 11 on ARM64 (though on my phone (look up "windows on miatoll" if you dont believe me) not an X elite device) and any x86-64 app i threw in just worked, because of emulation (look up "how x86 emulation works on windows on arm" for info from MS).
Office usage is the exact same as is on x86-64 because they have ARM64 releases.
As for XP x86 apps, that also wont be a problem because again, x86-64 emulation.
That's why he said the detailed video will come after 1 month of rigorous testing , do you guys even watch the video or just directly come to the comment box to whine away?
so ask for a livestream where they demo this :)
So I bit the bullet and got a SP11, it is alright I guess.
most of the x86 programs "just work", occasionally there's an app saying unable to install, but it was very scarce, I think macrim was the only one.
the x86 installation process is exactly the same as a normal PC, you wouldn't tell the difference unless go into task manager.
programming is obviously case-by-case, but I expect most active platforms to catch up very soon. VS is native though. Sublime worked fine.
some problems:
WSA/android studio just crashes, not surprising given that development of WSA had stopped, but what a shame you can not run android apps natively on an arm processor that is built by qualcomm.
ram usage pretty high, 12gb/16 during normal work load.
some stability issues with third party SSDs, might be surface only
@@youbangsun5606 do you experience lags or jitter due to high base ram usage ? Like multitasking or opening 4-5 chrome tabs etc?
This is Apple’s fault for having confusing and overpriced options in their lineup but for the same $1999 you can get an M3 pro with 18GB of ram. It’s still not quite price comparable to the Qualcomm stuff but would be interesting to see how the different parts of the M3 lineup compare with Qualcomm
Well, of you count the 7-8 free major OS updates (as 7-8 purchased Windows is quite costly, for the same amount of new features), and how well the Macbook keep their price (the 4 years old Macbook M1 still $699, so you can sell about your 4 years old laptop for $600.. which is only $400 lower like the initial $999 purchased value..), so in a long term even the $1999 M3 Pro can be cheaper like the $1400 X Elite (which will lose about the 40% of it's price just in the first year, compared to the 4 years for M1, because once any of the manufacturers start to sell it with discojnted price, all others and even the used ones will lost a lot of their initial price, and they must lower their price too..)
@@TamasKiss-yk4st Except you typically don't have to buy a new Windows license when (and if) they release a new version? People who bought Windows 7 Ultimate were able to upgrade all the way to Windows 10 (maybe 11, not sure), for "free"
@@lucassilvas1 That person also doesn't mention you essentially have to pay a subscription service to apple for updates every few months.
This is false.. Apple updates have been free for a decade. Any of their subscriptions are not needed to use a Mac.
@@jasonvors1922 Also, Windows licenses regularly sell for < $5 in the gray market if you're short on cash, and then there's always the seven seas
Where is your battery life test methodology? Why is it not linked in the video description?
This is the comparison I've been waiting for! Thanks!
Hey ! Great video ! It would be great to have an idea of how well linux distributions would support these new chips ... if it does support them at all.
They don't. SoC support is there, but getting it to boot on these laptops needs some work per specific laptop (that's in the works though)
Worth noting that with an aarch64 platform you’re going to end up compiling your own software for it more often. However in my experience this has been pretty straightforward in most circumstances.
@@Xaeroxe Not used to that currently, so I might not migrate right away. But thanks for the heads up !
It’s harder to skip sponsorship clips when the creator has taken the fort to clearly outline where they start and where then end and keeping them as brief and to the point as this. Keep it up LTT.
Well, I'm not so hyped seeing that 5 of my current external devices are not supported on arm due to missing drivers, and the vendors are not willing to provide any.
Well that's the magic of Windows Prism. It works like a charm just like Windows regular releases lmao.
Joke aside, their is a big difference between Apple moving to Arm and Microsoft.
Apple: it's obligatory or you're dead meat, banned for life.
Microsoft: I know a lot of developers won't do the switch soon. Oh, jolly good! We can port everything for them! Time for magic Windows Prism sauce with cheerios and lucky charms 😎
The user: Ah fudge I trusted MS and ended up with a overhyped, overpriced laptop that runs like crap sometimes.
Those uberexcited for Arm, Qualcomm. I don't see this compability issues be fixed, at least 5 years from now. It's pointless to pay high and have that machine be sometimes good other times mediocre. It's at least 1,200 us. It's just too expensive to be experimental with our hard-working money.
Hilariously, it would probably have a better chance running Linux on the same hardware, especially with the amount of open-source drivers increasing with Windows blunders driving people to Linux
@@MrPtheMan I couldn't agree more, I currently have a Lenovo Yoga 7 with an AMD 6800U that cost me around 900€ and will outperform those Qualcom chips while still being 100% compatible with every soft- and hardware which leaves the increased battery life - which I personally fixed by having a USB-C PD 100W power bank in my backpack.
@@UNSCPILOT I was also very excited about Linux support but Qualcomm has postponed that to a later date.
Maybe the community can make it work now that the devices are out there.
But for sure this would improve the driver situation, I also have a linux machine where even ancient analogue capture cards from the late 90s run without issues.
@belzebub16 I totally agree. Idk why everyone got the fever to triple down on Arm X Elite. And when I mean everyone, I mean all OEMs going ape crazy with premium parts and MS doing a belly dance.
There's a reason why Apple was successful.
A) Apple users are Apple users and will always buy Apple things at any cost.
B) Apple launched M1 series... almost 5 years ago. x86 performance was much lower than today. Hence, M1 vs Intel was a astronomical gain. Today the landscape of Intel and AMD is so different. This generation are really putting tears to arm performance, gains while being x86. So then, what's the magical advantage of the Qualcomm chip? Sh*&t ton of problems with app compability? Double f that. Like the first comment stated, DRIVERS for arm lol.
No manufacturer that already sold their products or older devices will take the time to make new drivers for their external devices. So then, Mr. Snappy Snappy Elite watccha going to do with all tham fast USB ports? You saaaaay 40Gbps lol for mice and keyboard? Ohh jolly Qualcomm, you are magnificent, thanks!
C) Lastly, the grand one: Apple is an ecosystem, that means included encoders, decoders, accelerators, on their chips. Just very potent for pro apps that professionals really invest top dollar. Meh, X Elite is just a CPU, not all of that I mentioned above.
Hence why I stated earlier that I considered the pricing stupid... It's like grabbing a Lamborghini engine and throwing inside of a Civic and then trying to sell it for 8.3 million! Hey it's a Lamby... kinda, well not really, well close your eyes and pretend you are sitting in one.
To clarify it's not a first generation product, it's more like a 4th or 5th. Before that one there were Snapdragon cx 3, 2 and 1 and Snapdragon 850, but they had very little volume. My location sells 2 models of cx gen 3.
This. These reviews always curiously skip the Surface Pro X.
Why is this relevant?
The important thing here is that Snapdragon is actually equivalent at lower prices!
Ofc it's not ready yet but it's obvious that future generations will be actual buying options... The same way we buy vfm Snapdragon android phones we will be able to buy Snapdragon Windows laptops.
That's the highlight here!
No, remember it is a first generation Nuvia design.
@@shsu2020 Because there's a lot of cheerleading about how amazing Qualcomm is for nailing it in one, and how the rough edges can be forgiven because it's the first time they've done this and their second generation is going to be absolute fire now that they've got experience from having chips in the field, and how this proves that the extinction of x86 is immanent not just in the thin-and-light form factor but in gaming laptops/mobile workstations and potentially even desktops.
The fact that it's actually a 4th generation product and still not ready yet paints a less exciting picture for Windows-on-ARM, where this isn't an Apple Silicon moment and is just another step on a long and difficult grind where the architectures will coexist for many, many years.
@@jameslake7775 no cheerleading at all. It will need at least 2 years or even more to be usable.
But the possibility of laptops that are powerful and efficient at lower prices is really exciting. Remember how embarrassing was the first results from M1?
The webcam difference HOLY COW
Ngl, it's genuinely better than Apple's shtty overprocessed AI resampled image quality.
It's nice, but also a pro and a con. Since the webcam chip is part of the soc, everyone is shipping that. No one could put in a better (or cheaper) webcam. It's just what you get.
@@LucasHolt Well when it comes to laptops, there's *only* worse webcams. So it's a win. It'd get upgraded every cycle like phones.
I am sceptical, this sounds like the typical situation where one power brings up something big, and the other power is forced to find a quick and easy fix for the problem. this cant be as good as it seems. I wait for a year or two like I did with the M1
Very cool. For my home computer, however, I'll stick with x86 for full compatibility. I wonder how VM Workstation would run on Arm?
Linus is being sponsored here people don't be fooled. I don't trust he's reviews anymore!!!! This is a first generation product!!!
The funny thing is, they're not first generation products. Snapdragon has had chips in Windows devices for years at this point. And Windows has existed for ARM since 2012, 2011 if you include Microsoft's announcement of the Surface RT. So we're about 12-13 years into Windows on ARM and it's still buggy and incomplete. But nobody mentions it.
As long as i can get Linux on ARM to get to work on one of them, i am really looking forward to these.
probably gonna be a bit later but hopefully not too long of a wait 👍
Apparently Tuxedo announced ARM laptops. I really have high hopes for this.
Nope. The bootloader is locked.
Same! but according to another commentor, the firmware isnt in the kernel yet
@@somebodysomewhere8217 really??? please no..
Can we have a test against the M3 Pro? The camera looks fantastic.
Dear writer of the video,
Best reference ever at 2:09. The Melee community sends our respect.
What about the touchscreen issue I've seen noted in some other videos. Performance seems to be *boosted* sometimes when simply touching the screen. Theorized to be something built into the chip architecture from mobile phones.
That is less a CPU issue and more a firmware issue.
@@Hathos9 of course, but I was wondering how it might affected some of these performance numbers.
I want to see someone install linux on those machines and see how many ways worth of battery they get.
Linux on ARM has been around for a long time, I guess it should be more robust than Windows on ARM.
I'd say they are about the same on desktop. Linux got older and better support on random embedded boards but these manage peripherals in kernel instead of normally having an on-board system firmware like BIOS/UEFI on PCs that handle low-level tasks. Meanwhile Microsoft among several others steer UEFI, which is what those "Arm PCs" use and is also relatively new to both Windows and Linux.
The only issue I'm aware of is that not all the drivers are upstreamed into the kernel yet, so while the CPU and GPU nominally work, peripherals don't because for whatever reason the USB controller isn't finished yet.
They're internationally locking it. Some exclusivity shit with Microsoft. But I know the community will prevail.
even tested under standard conditions, I am wary of those battery tests if there's a 50W profile present lol. I've been burned so many times by Windows deciding that an important meeting / presentation is a great time to start updating itself and/or drivers, and it just burning through the battery life in an hour until a prompt shows up in the middle of the talk that the battery is low
This is the earliest I’ve ever been to an LTT video
same
same
Welcome to the party
wow, so interesting
Same
I’m hoping to start seeing these SoC’s on arm based windows mini pcs in a relatively short time, or mobile on desktop mini-itx boards for power efficient NAS and homelab setups.
Tomorrow's video: Linus apologizes for "misleading" laptop buyers.
there is no misleading in buying guide youll have to mention smth anyways
waiting to install linux in it
Would have been nice to see these snapdragon's tested under Linux as well - as Linux has been supporting more varied CPU options very well for a long time and though I'd expect a bit more manual effort required by the user to really get the best out of it Linux can often do better on battery life and performance vs Windows (however windows tend to come with better out of the box experiences). But this is new 'weird' hardware...
No linux.. no buy.
Linux support is not there yet. Need to wait for kernel 6.11, so give it a few months. The nice thing is that Qualcomm themselves are working on it, so when we do get support, it should be solid.
What the hell is the point of a laptop that simply can't run half the applications you throw at it?
Half a point is much sharper than a full point.
That's what they said about mac
Am I the only one that doesn't think the webcam looks better? Only + that it's higher resolution beside that it looks soo over processed, nearly cartoony, while intel one looks natural
Yep! The Qualcomm webcam looked a fair bit worse to my eyes!
You should probably go in for an eye exam.
@@arzsupra You may want to spend less time on social media if you don't think it looks over processed and filtered to death.
@@Unknown_Genius my eyes are fine, how about yours? I have a low prescription, and only need to wear glasses while driving at night just for safety sakes.
they looked comparable. The Sounds was better though.
What do you mean by statement "Gone are my sponsor obligations to Qualcom. Now I can say everything I want"? That all what you have said until now were just lies instructed by Qualcom?
Apparently, even all he said after the obligations were gone, is also BS.
I mean it's not like this video was anything better. Unlike what he wants you to believe, he's veeeeery heavily influenced by these companies. HE LITERALLY WAS SPONSORED TWO WEEKS AGO. Even if this video isn't sponsored he wont speak his mind because those b2b relationships are more important than his integrity.
As an IT MSP, I'm all in. Better battery life, better performance and lower cost, most apps for normal business users are native or web based. Total win
Weird Apple comparisons. Battery life shows Snapdragon battery Wh but not Apples (Its 50% more battery on the Snapdragon for 40% more battery life compared to MBA).
Weird spin on the video. Saying it has good emulation when they have had a decade of experience getting ARM to Windows and these chips launched with way worse emulation than Apple's Macs did day 1. Acting like the battery life is amazing when in reality it is only comparable to its only rival on the same architecture.. and most reviews I have seen that battery life gets cut in half when running emulated apps, where the competitor only takes a 10% hit during emulation. The GPU is downright terrible when compared to it's only rival on ARM (Basically it goes head to head with the 4 year old M1 Base chips and still manages to lose by a little bit). CPU performance is just bad overall regardless of architecture.
Seems like another flop from Qualcomm, yet its being hyped for some reason. Don't see the reason for this to exist. All you need is basic functionality and long battery life?, Mac exists, ARM Chromebooks exist. Need lots of software options? x86 exists. This is just the worst of either end. There is no real advantage to this chip. Consumers need another ARM option for hardware and they need Microsoft to take the compatibility seriously on the software end. Instead both companies just put all their effort into marketing and lies.
Also this is the "Elite" series. Are we talking about total laptop build comparisons and based on price?... or are we actually supposed to be looking at the chip performance here, because where is M Pro and M Max? Elite is the highest offering they have for a desktop OS, with Plus underneath it and if the leak is reliable, there is a basic Snapdragon X under the Plus. So X=M, X Plus=M Pro, and X Elite=M Max.
I mostly agree with you, the comparisons are very weird and seem to be influenced by the hype. The numbers just don't seem to add up to the X elite being revolutionary despite TH-camrs insisting they do (it uses the same power as M3 Pro but performs worse than M3 in a lot of cases, not to mention that M3 is last generation at this point).
The one use case it seems to be really good for is a decent but relatively power efficient CPU in a Windows laptop, which didn't really exist before.
It's not hype. I own one, and the performance and battery life are amazing.
It’s not better than Apple but maybe good enough so Windows die hards don’t have to migrate to Mac anymore- I think that’s the real play
I haven’t even been able to seriously consider a Windows laptop for the last 5 years
On an M1 pro MBP here
Also using power saving mode on x elite but not using low power mode on mac at least
Definitely would love to see more testing, especially when It comes to Photo/Video editing, cause it'd be a great alternative to a MacBook.
I’m impressed with the camera. Tks for adding the comparison side by side Linus
Linus, i thought you weren't doing those weird ass thumbnails anymore
He isn't doing the open fly catcher mouth so that is a plus... Unless I was served the thumbnail version without it, and the other one has it...
@@tinncan fair enough
we clicked on the vid less than 10 mins after it went live. That video wasnt for us lol
Super interesting inside baseball though.@@MrGigs94
To be fair ,the chains and tape were actually part of the intro.
If they were 800 bucks they’d be competitive in consumer. But the reality is that these are only going to be really bought by corporate buyers who are overpaying even with bulk discounts.
Next gen might be better.
I love efficiency as much as the next guy, but I am never away from an outlet and needing my computer for more than an hour or two tops, and I travel between continents regularly.
More gains to efficiency than just your maximum time away from a power source - electricity costs money too, so getting the same amount of work done for less, or for most folks idling at a lower power draw can be well worth it even if the machine costs a bit more - ask yourself how much that extra 10-20 Watt power consumption average will cost you over a couple of years the device can be expected to last. Also the models tested are clearly in the more premium thin and light product class that all cost at least in the ballpark of 2-3 times your budget no matter the CPU - when the bargain basement fantastic plastic cheapo laptop with one of these chips is produced then you can compare it to that price range of machine.
@@foldionepapyrus3441let's say on average you save 25 watt hours of energy a day. If used everyday, that's....18.25kwh in two years. You saved $2.56 in energy at 14 cents a kwh.
I spend on avg $225 a month for energy, which includes an ev. That's not even a rounding error.
Worst part is these are more expensive for no good reason - the soc is cheaper than Intel or amd.
@@foldionepapyrus3441 20W * 8h/day * 365days/year * 3years lifetime * $0.2/kWh * 1kW/1000W = $35
So actually, energy consumption doesn't matter much for laptops in terms of cost.
@@foldionepapyrus3441 Pure coping. Electricity is cheap as dirt, if you would want to save on that - turn off your room fan/AC/stop using microwave first, they eat WAY more then 10-20 watts.
Most of this is surface level testing compared to Just Josh's video. Highly recommend anyone to watch his video on these Qualcomm Snapdragon X1 chips. He gives you details on almost every commonly used applications for regular users, developers on Linux, and content creator softwares with benchmarks. The BEST laptop reviews in my opinion ❤
Please try doing that "Trying any game on the Snapdragon X Elite" live stream again!
I'm a windows user, but the specific battery life feature about Mac isn't just the battery life, but the fact that it's almost the same when you close the lid, and open it after several hours. That ain't never gonna happen to windows.
Hey, I've got the Galaxy Book 4 Edge 14, the one with the more powerful Snapdragon X elite. You can literally close the lid and open it the next day and it'll only have like 2 or 3% less battery. It's way better than any other Windows laptop I've ever had.
@@senna476 And then you can run benchmarks to show good results as most apps are buggy
I want to see you compare the M3 Max MacBook Pro not the slowest M3 MacBook Pro. Not because I assume a massive difference, just curious how different the performance would be between the best Apple has vs the best Qualcomm has to offer. Did you choose the lowest end model of MacBook Pro in order to control for price?
I assume they’re running chrome on the Mac’s which invalidates battery life results.
nobody on mac uses chrome?
@@Ryan-093the chrome port on macOS is absolutely terrible, and gives you about half the battery life of Safari.
@@DRMCC0YIssue is that Safari is not an option for most organizations.
@@Hathos9 That's fine, it's just the clarification that's needed.
They also had the windows laptops on power saving mode but likely didn't do the same on the Mac.
I always recommend Safari to any Mac user, followed by Firefox if on Windows or Linux - never Chrome or any Chromium browser.
I find it very hard to believe the MacBook only got 12 hours of very light usage when I get 10+ hours with an M1 Pro (drawing more power, with less efficiency cores) and only 83% battery health.
I need one of these videos but with DAW compatibility on the ARM architecture, Ableton, Reaper, ProTools, etc.
@@Echinder Ableton doesn't work at all according to just josh
Linus, it's a pleasure to see you +/- 10 years later, and you are still the same great person. Real Linus :) Have a nice day!
How long do you think it'll be before there is a git-hub repository that removes copilot entirely
It's a 1 line command.
Clue: It's an Appx package
It's called installing Linux.
@@thebuddercweeper Until Photoshop runs natively on Linux AND includes all the driver support for it, I'm going to have to decline
@@Toma-621 That's fair honestly. Linux has a lot of drawbacks for me compared to macOS, just not nearly as many as Windows - personally I'd rather use wine for apps like photoshop if I couldn't use macOS than suffer through using Windows. But to each their own.
That was not a fair battery comparison, linus. Various things matter like how system and CPU draw and manage power under different work loads. Playing youtube videos and saying Elite lasts more than others is not the right test.
Perhaps not, but it is a very realistic simulation of how laptops are generally used.
It doesn't matter though. Its hard to do specific tests that are possible on all machines and OSes. A video play is not perfect, but not that bad either to form an opinion.
Why would it not be fair? All of the laptops were doing the same thing. Yes maybe it's not very useful because nobody just watched youtube for 15 hours, but it gives you a rough estimate.
I bought the Samsung Galaxy Book4 Edge with the new processor. Performance and battery life are phenomenal. I ended up returning the laptop though because prism/emulation just isn't there yet. So many x86 apps still won't work on it. It's unfortunate because these chips are Awesome!
2:15 dude just switched from Mini to Wambo
"The lab running our most rigorous tests". Yeah we played 10 hours of crab rave on TH-cam till it died, so the battery life is great!
what's hilarious is that these new chips are barely any more power efficient if you actually stress test them.
for the next couple of months when comparing ARM and X86, please include the power draw in benchmarks. i feel like this is ARM's only reason to exist. i also would've liked to see the wattage in the overlays in your livestream.
that webcam footage really didnt look any better imo, just like it had some smoothing filters.
Exactly, same crap like on smartphones, "beautifying" faces by heavy blurring etc...
Yeah I'd much rather take the "bad" Intel webcam over the vaseline filter. Neither of them are great, but the Qualcomm one just makes you look like a horrifying human doll.
Most people would vastly prefer the Qualcomm image. But it's cool if you're an exception
@@Creepus_Explodus Vaseline filter - ymmd :)
@@DanKaschel it would be OK if those filters could be deactivated.. but let me guess: if course not...
1:58 You can take the XPS there and mod the bios to unlock undervolting, then undervolt the CPU and IGPU along with frequency tuning to get like 4x the battery no joke. I'd like to see if the arm chips can compare to that since the intel chips just seem to go for benchmark scores instead of actual battery. I get 20 hours of battery in browser and a few other apps like obsidian with the mods (on not even the highest capacity battery)
Yeah, you can shave A LOT of volts from intel processors, i just tired it and 0.15 undervolt is working no issues (from base voltage), and that AFTER rasing the clocks.
I am actually considering these chips for a linux laptop:
1. Qualcomm announced they have upstreamed the drivers
2. Due the open ecosystem a lot of our applications are already ported to arm
3. We have better and faster x86 emulators
Linus these laptops are anything but first generation and anything but "largely flawless". Microsoft has been trying to make Windows on ARM a thing since 2011 and they still are under cooked and not market ready. If Apple released something in this poor of a state back in 2020 with their M1 chips people would've gotten out their pitchforks and screamed bloody murder. There still is a very long way to go before these things are actually good. But why would I expect an actually critical look at something that you've been previously sponsored to showcase, can't risk those b2b relations am I right?.
M1 wasn't perfect either??? Sure the snapdragon chips are missing some useful things but don't act like M1 had all native applications on day 1
@@ootheboss9307 did you have problems where the keyboard would just stop working or you couldn't adjust screen brightness? Yes M1 wasn't perfect but unlike M$ it was Apple's first Arm based MacOS version. And the fact they successfully moved architectures before gave more credence to the fact they'll pull it off successfully. Microsoft had 13 years to make it usable (Surface RT was announced in 2011) and here we are over a decade later, still having issues like random black screens and non functioning keyboards.
M$ still hasn't managed to roll out their Snapdragon dev kits, something that should've happened BEFORE the release of these products, Apple has understood this for decades at this point. The fact that Lunar Lake dev kits are already out in the wild while these Snapdragon ones aren't is very telling.
The efficiency of these laptops is greatly overstated, when you do actual work on your laptop the battery barely lasts longer than intel or AMD and is miles behind Apple's laptops. And with all three having new chips right around the corner these snapdragon laptops will be nothing but over hyped rubbish, like previous Windows attempts to make Windows on ARM actually decent.
Arm isn't a magic bullet to get rid of issues, it's simply an opportunity to improve the state of the art for laptops. Microsoft pretends like its issues with customers is related to x86 when in reality it's their own incompetence in handling Windows as a platform, regardless of whether it's on x86 or ARM. Wendell fro Level1 Techs is very right about this.
Hooray for videos not on floatplane. Now I get to pay for premium service while also watching ads on TH-cam
uBlock Origin / ReVanced:
@@commander3494On a TV?!
@@akyhne SmartTube
It's on fp, seems to have come out about the same time as well
@@bacon.cheesecake it wasn't. It showed up a good 20 minutes after this was uploaded.
I've had a Samsung Galaxy Book 5S with a Qualcomm 8cx for a few years now. Even that old version was impressive for what it was. It could run Steam games surprisingly well as long as I avoided FPS games, which is pretty easy for me since I don't play them much. I'm looking forward to getting one of these new Snapdragon X laptops.
The Handbrake example, is a poor one for the Macintosh M series. To my knowledge handbrake is not optimized to use the built-in hardware encoders... Or the choice of encoding is certainly not use the hardware encoder on the M2.
They're using x264 software encoding on specific presets for everything which keeps things consistent vs comparing hardware encoders which are way faster but can output very different quality and bitrates because they're all different. It's been a while since I used Handbrake but I think there's an option for using hardware encoding via "videotoolbox" on Mac but then you can't really make a good comparison anymore.
It'll be interesting to see results for M3 Pro or even the new M4 when that chip finally comes to Mac.
it is likely not for Qualcomm either?