To be expected: they finally got to a 3nm process. What we see in the desktop segment is insanity because it's essentially an ancient 10nm architecture pushed to the MAX. It's super impressive for what it is and what it does, but just like with pushing anything way past its limit, the efficiency just isn't there. With just a mild overclock you can make a 13700K draw in the low 400W range, which is insane to be honest, but thankfully Intel watts are a lot easier to cool since the heat is much more spread out compared to AMD CPUs, for example. Up until now AMD had a huge advantage. Take Ryzen 9000: at ONLY 1.0V some samples can do 4.9GHz Prime95-stable, let that sink in. Yet at 4.7GHz it matches the multithreaded performance of a max-OC Ryzen 7000 at 5.6GHz all-core. The Intel chip, on the other hand, would probably be limited to potato speeds at 1.0V.
@@auritro3903 I am talking about their integrated GPUs, not graphics cards. I know it sounds crazy, but looking at the timeline, this is in fact the truth. As far as the GPU goes, both Intel and AMD are making the same mistake: not supporting game developers the way Nvidia did, and instead trying to fix things on their end after a game is released. That's why they both, especially Intel, see significant driver boosts. At the right price that's not a bad thing, but it basically means the GPU is poorly optimized at launch.
Where is the biggest revenue source for laptops? It's office workers who work mainly in Office and cloud-native apps. I think Intel has got the right balance here, so they have saved themselves for the time being. Looking at the improvements over the last year, I think they are on track for a great comeback.
Yeah I agree, orgs don't really care too much how much corporate devices cost as long as it's not stupid expensive, and the Lunar Lake chips seem to be perfect for office use.
@@innosanto Literally not a single office worker I work with would notice an improvement from more multithreading; however, they'd geek out over the battery life.
I agree. For office use a 1.2kg laptop sounds like a good deal. The main thing I expect is good battery life, which it delivers. What I want to see is how it works in real-life use, like 15+ browser tabs plus some office applications open, to see the actual battery usage.
Damn, Cyberpunk on medium above full HD with average fps above 60 on an integrated GPU?! Glorious age to be living in. I'm really looking forward to not having to get a dedicated GPU anymore in a couple of years.
SAME! One of my favorite things about newer chips is that their iGPUs are significantly better. I think we kind of have to thank Apple silicon for that. If more demanding games can run at 1080p medium graphics at 60fps on an iGPU, I'll be happy. Me and many other people aren't gaming all the time, but having that iGPU performance for the occasional use is awesome while maintaining top-notch battery life.
Honestly, with everyone focusing on being more battery efficient, I'm less concerned with seeing how long these computers last on battery. What I do want to see, more than anything else, is how much battery drain there is in standby mode. Being able to close the lid of my laptop, walk away for a day, come back to a battery that isn't dead or close to dead, and keep using it off the charger is more important to me.
While standby should work well, if you're walking away from your computer for any significant amount of time you should be using hibernate, not sleep; that would solve the issue. That, and it's better for your computer.
@@Kyt2024 So I've had multiple x86 laptops in the past with significant battery drain in sleep, and I did have them switch to hibernate. But with the Snapdragon X processors, I got a Surface and haven't had that kind of battery drain at all. I can close the laptop lid, walk away from it, and return to it the next day with only 3 or 4 percent of battery drain in that timeframe. Everyone is talking about how power efficient their processors are, and that's great, but I'd like to see how power efficient they are in standby/sleep.
It's still garbage because of Windows modern shartby (standby) sucking up power. The CPUs are AMAZINGLY efficient in reviews, but Microsoft and stock garbage OS configs are clearly ruining things. There's no reason this should be the case when I've got a 2020 Windows 10 Ryzen 4600H laptop that takes forever to drain, because it's Windows 10 without trash running and wasting battery at idle :/
@@Sunyup804 yeah I’ve seen similar and I do agree it should work. Just wanted to say if you’re walking away from your computer for a day, it shouldn’t be in sleep. Hibernate is just better because it gives your computer the opportunity to fully power down its components and it’s only marginally slower to resume use vs sleep.
@@hyenadae2504 Yeah, you're right about Windows Modern Standby being a battery drain, but it still stands that with my Snapdragon X laptop there is barely any battery drain on standby. I still want to see how both AMD and Intel are responding to that, as this is a genuine issue that needs to be addressed.
It really amazes me how people, especially in the tech space, DO NOT UNDERSTAND HOW PERCENTAGES WORK. Processor 258V has 8 total threads. Processor 155H has 22 total threads. You want to round down to 20 threads for easy math? OK, that means the 155H has 150% more threads, NOT 250% more. 100% more means DOUBLE; when you are talking about "more", you don't start at 100%. If you were making $8 an hour and your pay increased 100%, you are now making $16 an hour, i.e. double. This isn't hard. 250% more would be 3.5x. 200% more and 2x are not the same thing. IDK if this is a failure of the school system or what.
I'm guessing he probably meant to say "250% *as many* threads." He did misspeak, but I don't know if it was a big enough error to deserve this massive rant.
You’re right, but to clarify for those confused: it depends how you phrase it. It's a 150% increase, i.e. ((20-8)/8)*100, but 20 is 250% of 8, since (250/100)*8 = 20. 250% is the multiplier, while 150% is the percent difference.
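For anyone still unsure, the two phrasings from this thread can be checked with a few lines of Python (a quick illustration of the arithmetic, nothing more):

```python
# Percent increase vs. percent-of (multiplier), using the thread counts
# from the comments above (8 threads vs. ~20 threads, rounded for easy math).

def percent_increase(old, new):
    """How much bigger `new` is than `old`, as a percentage gain ("X% more")."""
    return (new - old) / old * 100

def percent_of(part, whole):
    """`part` expressed as a percentage of `whole` ("X% as many")."""
    return part / whole * 100

old_threads, new_threads = 8, 20
print(percent_increase(old_threads, new_threads))  # 150.0, i.e. "150% more"
print(percent_of(new_threads, old_threads))        # 250.0, i.e. "250% as many"
```

Both statements describe the same 8-to-20 jump; only the wording differs.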
I would have liked to see *some* CPU benchmarks in the balanced modes. Pushing those chips to 28 or 30W takes them out of their sweet spot, so comparing them against their lower power modes, especially in MT benchmarks, would have been interesting.
Yeah that will be another video for another time since we actually struggled to find laptops in our inventory that hit a nominal 30W, let alone under 20W. We're working on getting some in though.
@@HardwareCanucks Do you think it's "fair" that these laptops (in battery life tests) have higher-res 120Hz screens versus the M3's 1440p-tier 60Hz panel? Even at the same general brightness, the Lunar Lake laptop will be wasting a lot more display *and* GPU power, possibly pushing frames and refreshing the screen, unless it has true VRR (i.e. FreeSync) for dynamic refresh in framerate-locked applications. I'm not sure how this applies to OLEDs, but I hope there are more tests and other, non-OLED models reviewed soonish :)
Beg to differ. This product is redundant and moves the industry backwards. It's an incredible loss for Intel shareholders, customers, margins, Qualcomm, the domestic foundry, and the US government's CHIPS Act, because they aren't using their own foundry. They are entering a new space and competing with Qualcomm, and every other portable is ARM based; the hot tower case has its place, but that's not in a tablet/portable PC, and customers choosing between the two is hard. MSFT switched the Surface to Qualcomm and recovered flagging sales; Apple switched; it's physics. Staying in this space, multicore x86 drains the battery and gets hot, while ARM-based CPUs scale easily to many cores, so these single-core tests are irrelevant. Intel is trading at 2000 levels, and if this is how it's sold, it's squeezing margins. amd64/x86 is legacy; this means more work for OEMs, needs a fan, and means more work for software developers and device drivers. So it's basically attacking Qualcomm, and Qualcomm could do a takeover and no one would block it. The only triumph is the GPU, so I don't need Nvidia and Intel. I hold shares in both and don't care if I lose it all on a short-term call; I'm sure I will. This is against progress. The CHIPS Act was meant to prevent having to use offshore foundries. Qualcomm will have to buy Intel's chip design business so Intel can focus on the foundry and make chips for others. Sad management, given they can't even call this an Intel chip: it's made in Taiwan to compete with AMD chips. The geopolitics is too codependent. Playing older games under emulation would work if Intel/Qualcomm combined their teams, or at least didn't do so much redundant work.
Having had my Meteor Lake laptop now for 6+ months, I can definitely say the insane number of cores and threads isn't needed. I might prefer a 4+4 configuration if that means better battery life and better responsiveness. Sometimes the thing feels like a slug because the power profile often limits the cores to 800MHz, and I hate how laggy everything feels when it's in that mode. But if I enable the higher power profiles for a responsive experience, I get much higher fan noise. Honestly, I might even prefer a smaller configuration like 2+4 with a cut-down GPU for everyday office use.
@@lorenzospadaro6077 Web browsing, YouTube, Netflix, day trading, monitoring my Twitch feed, e-mails, and of course some software development: i.e. I compile code on the Xeon and then deploy and perform remote debugging on the laptop from time to time, mostly to check performance and compatibility on mobile.
Still crazy to see the M3 running most of these tests at 11W... If only they added a 90Wh battery, we'd really get two-day battery life. But good job Intel for finally getting something good for laptops!
Which is why testing laptops with power parity is nonsense. Since each manufacturer can write their own BIOS, they each have different behaviours despite the same silicon. Overall implementation is way more important as it is reflected in battery size which affects weight and so on.
Excellent review, thank you folks! Lunar Lake is indeed a step in the right direction for Intel in the mobile market segment and great inspiration for the higher-tier Intel mobile and desktop CPUs. It's also cool how Intel took inspiration from Apple's Mx design with the on-package memory. Still, when it comes to pure efficiency and performance per watt, Apple silicon reigns supreme over all.
Why are MacBooks being exempted from those gaming tests you're running? If they can't run it - they should appear there with "DNS"/"DNF" or something. You shouldn't reward them for not being able to do certain tasks x86 processors can easily do.
The MacBook isn't in the game tests since we have no reliable metric for game performance logging (frametimes, framerates over time) on Apple devices. We're currently working on one though.
A few days ago, I bought the new Asus TUF A14 (Ryzen HX 370, 32GB DDR5, RTX 4060) for $1350 on eBay. I could use the extra processing power, but I can TOTALLY see how Lunar Lake will be a new favorite for businesses that mostly use cloud-native apps. I'm just not sure if the value is there for consumers, $1500 is a hard sell. The general public is better off saving $500+ and getting an Intel 155H laptop that performs better and doesn't lose too much in terms of battery life.
Tbf, a Snapdragon device would be a more compelling option specifically for business users that work in the cloud. Great battery life, touchpads, displays, touchscreen, camera; they look and feel amazing. The price is surprisingly decent for an ultrabook, and you don't even need the higher-SKU processor. The only downside is compatibility, which you don't care about when working from the cloud. They're plain cheaper for the same if not better experience.
I would say they are off to a decent start here. If they can get that "rentable unit" technology off the ground, their processors going forward are going to be pretty awesome, actually. Rentable Units are their proposed replacement for HT; they should be more efficient and work better with their hybrid core architecture. It doesn't excuse their major issues with Raptor Lake, but it is a bright spot.
Hardware Canucks is starting to become my favourite laptop review channel. I don't understand why most other channels review a laptop on its own but don't compare it to other laptops. How are people going to choose a laptop among so many options if you're not going to compare it with other laptops?
@@HardwareCanucks btw, you did QC dirty here by not saying which benchmarks are native and which are emulated. The performance will spike when devs port to ARM. Cinebench R23 will never come to ARM as it's deprecated, and saying Intel does much better and destroys QC here as if by some miracle is bad journalism; it's emulation vs native... Also, you tested a 78 SKU which doesn't have the ST boost and thus loses to Intel; if you used an X1E-80 SKU, Intel only reaches parity. It's a bit misleading, to say the least.
Great review; the only drawback is the MT perf because of the 8C/8T CPU. I mean, this makes a lot of sense because Lunar Lake is mainly for thin & light laptops. People who really care about or work on 3D/video (locally) should get something more powerful (Ryzen AI HX or Arrow Lake). As for me, I work on cloud servers mostly. I don't really care about MT perf that much; 8C is enough. I much prefer thin & light, battery life, and an overall snappy experience.
Thanks for such an in-depth review! How are the thermals on the Aura Edition for day-to-day tasks like watching Netflix, YouTube and office work? Do the fans spin up in these scenarios, and does it get warm/hot on the lap?
Agree about the missed opportunity regarding the battery size. A larger battery could really have underlined the architecture's low power consumption, and would also have been a great sales pitch. One issue though: when flying in the EU, a battery between 100Wh and 160Wh has to be approved by the airline, and batteries over 160Wh are usually not allowed on a commercial flight. So a 100Wh battery would probably be the max; I don't know the international rules, etc.
I run my 6800H at 16W most of the time. It's definitely not meant to be run like this, but it actually works well. Would be very interested to see your same benchmarks with that kind of power
So, inferring from the heavy load testing, should we expect 2-3 hours on heavy games and some more on indies? I've been looking for good battery testing on Lunar Lake that measures gaming runtime. Previous laptops, and even handhelds, are always 1-2 hours. Would appreciate something like that, as I haven't been able to find videos or discussion about it.
Appreciate the content and benchmarking. After all the snapdragon hype (and I did get one of the machines and happy with it regardless) I’ve realised many companies weaponise strange benchmark results. In the same way snapdragon called out the M3 but in reality M3 is just better (when it has 16GB RAM). I say this after hearing the battery test oddity (not getting up to 24 hours) Overall though…. great improvements for Intel. Great year to get a laptop.
I am happy to see the mobile SOC space be so fiercely competitive, but I am disappointed to see all 4 SOC manufacturers abandon upgradable memory. I hope in the future, at least one of these manufacturers supports SODIMM or CAMM modules.
Intel isn't; it's only Lunar Lake where on-package memory forces them to. On the higher-powered Arrow Lake CPUs releasing in a couple of months it'll be replaceable again, because every little bit of efficiency doesn't matter so much anymore at higher power.
It's actually all about power and space savings, not to mention latency improvements. Having the memory so close to "home" has tangible benefits in all those areas. I also think that CAMM is a pipe dream for thin & lights. Gaming laptops? Maybe. But right now the SoC approach will remain king for ultraportables.
How do you think the best Mac achieves almost 10x the memory bandwidth of the best gaming PCs? Modularity comes at a price, also in the laws of physics.
Do you think when will Surface laptops get Lunar lake, if ever? I need a laptop before the middle of next year and was gonna get Snapdragon X but after seeing Lunar lake performance, I’m waiting
Hard to judge battery run times with different screens. OLED will definitely take a lot more power than the CPU. Lunar Lake looks pretty good. I want as many core/threads as possible though, and probably shouldn't buy a new laptop any time soon.
This comment might come off as a bit unfair, but the M4 series chips are a pretty large bump vs the M3, to put it mildly. I'm not really sure how anyone competes with the M-series chips going forward; the benchmark numbers coming off the iPad Pro M4 are staggering. That being said, these are nice gains from Intel, and I look forward to seeing some of the desktop offerings.
If you use Applebench 6.3, which rigged the M4 result by introducing the SME extension that nobody uses, then yeah, the M4 is a bump, but on other benchmarks it's very close.
@@drinkwater9891 Damn, well, these glorified phone chips are kicking Intel desktop and consumer AMD chips in the teeth. Must be an old guy stuck on x86. Also, might we remember Nvidia and AMD plan on making ARM chips in the next few years, if I'm not mistaken.
I've been "Team Red" since the Ryzen 7 4800H, but I'll be going Intel in my next laptop (in a few months). I've enjoyed AMD, but the poor quality video encoding/decoding just puts me off. The things that are most important for my use case: 1. Battery life. 2. Excellent iGPU performance at low wattage (not for gaming, just smooth desktop and scrolling). 3. Runs cool and quiet. 4. Good video transcoding. 5. Thunderbolt port (or ports). 6. Okay performance (I don't care about bleeding edge multi-core performance, so long as performance is "good enough").
Can somebody tell me in plain English whether a developer (containers, backend apps) should go for Lunar Lake (better single core) or Meteor Lake (better multi core)? Overall Lunar looks better, but I guess development tasks would benefit more from multicore performance?? I don't know. PLEASE HELP
Yes, they should be better with higher MT, but you should buy a Zen 5 instead if you don't want to buy Lunar Lake. And you should wait for real laptops, not these Starbucks PCs; if not LNL, then Zen 5.
@@reiniermoreno1653 AMD (Zen 5) laptops are too costly where I live. I don't know why, but they're charged at a premium. Sorry, but I didn't get you. "Should be better with higher MT": do you mean this for Meteor Lake? Do you mean to say ML will be better than AL for my use case??
One bit of information that was missed in this review: while it's true that the Windows PCs from Intel and AMD had larger batteries, they also had 120Hz OLED displays, which consume almost 50% more power, especially during web browsing. So despite having a 30-40% larger battery than the M3 Air, these chips were at a disadvantage.
So there are a few things here we should actually add to the charts. All devices are set to a fixed brightness output and refresh rate (60Hz). That levels the playing field by a lot.
Yep, it's difficult to compare CPU performance on laptops because of the wide variety of components. This video really only gives a general sense of the performance, which is fairly competitive. We'd need the same laptop with just a CPU change to have accurate data and make judgements on which CPU is more powerful or more efficient.
@@deansmits006 PCWorld's test uses the Dell XPS 13 for Snapdragon, Meteor Lake and Lunar Lake testing, with pretty much everything matched. I think that's about as close as you can get right now. Sadly, there's no Dell Ryzen laptop, so it's not in the comparison, but still. Strong showing from Intel.
These benchmarks show how vast the difference between Apple and the others is. The M3 uses a third of the power of the top-of-the-line Intel and AMD chips, yet it stays competitive. This is a great step forward for Intel, but calling this chip perfect is coping.
@@ayusharya1397 Software optimization too. MacOS and its ecosystem are written to work on limited/specific hardware, nothing else (more efficient), but Windows and its ecosystem are written to work on a broad variety of hardware choices (less efficient).
Those tests with 20+ hours of runtime use the native media player in Windows 11, which is much more efficient than VLC or other legacy media players. I use it, despite it lacking features, to preserve battery life.
@@Nanerbeet Well, I've been doing this for 50 years and it sure beats punch cards. The AMD HX 370 has 12 cores and 24 threads. I have a 2TB Crucial T500 and I can run 2 external monitors. I'm running a local SQL Server and usually 2 copies of Visual Studio. Frankly Teams uses the most resource ATM (especially when sharing).
Looks like the HX 365 competes pretty well against 258V. I think the HX 370 will look even better against the 288V since the former adds CPU and GPU cores while the latter just clocks them up.
I own almost all Intel hardware, but laptops are AMD's market. Ryzen has been the number one option for gaming laptops, and now with the new HX 370 the thin-and-lights are still better.
@@otozinclus3593 It varies according to so many different factors, like BIOS settings, power profiles, and the cooling solution. But on the ASUS Zephyrus G16 with a 4070, great laptop.
It will be great if Intel can get their foundry to put out these chips on 18A instead of relying on TSMC. This change to ASML was never going to be smooth, but so many parts in Meteor Lake were Intel, so that bodes well.
Will they be releasing another architecture with a higher performance ceiling? Because a dual-architecture strategy might actually be a good idea now. It's been a long time since the Atom era, and lower performance isn't nearly as much of a limitation now that the performance floor is much higher.
They need a matte version, or to use the anti-reflective coating from the ASUS G16 370 AI... It looks like they didn't apply an anti-reflective coating on this laptop at all.
The fact that this battery life is achieved by going back to just 4 main cores plus another 4 for light work is understated when the efficiency is being discussed... but when the multi-core performance benchmarks come up, it is then overstated. It's not a bad chip, don't get me wrong, but I just can't justify the hype. Great iGPU though, that's great to see.
The question is what's better for the average user: better power efficiency with worse multi-core, or better multi-core with worse power efficiency. I'll wager the first; most things you do won't take advantage of the number of cores in a Ryzen.
@@zachb1706 I was raising the point that many outlets' presentations did not explain that the better battery life is a result of the decrease in the number of performance cores.
The battery life was not achieved by the lower core count, but by an architectural overhaul, a better node from TSMC, on-package memory, and the replacement of the VRMs.
It seems like Intel is somewhat back. This looks like a compelling offering from them if priced attractively. Intel has somewhat of a history with over-hyping and under-delivering. I'm convinced of the capability now, but what about availability...last time they announced something it took months for it to finally start appearing in products that I was interested in...
I agree with intel's new strategy with lunar lake. I have been wishing for so long before even the M1 that all chip makers start focusing on efficiency over raw performance. Finally it has started! Yay. Battery life and a cool to the touch device (think about your lap with a laptop on top in a warm climate just browsing the web) is invaluable for many (most?) people. I am a software developer and the amount of insanely inefficient code I see out there in the industry is way too much. Lack of raw compute is mostly not the main culprit in causing lag and slowdowns in day to day computer usage. All this is only my 2 cents though. Business be booming without it too evidently. Now only if Apple was crazy about FOSS, it would have been perfect...
I’d love to see a 258v processor in a laptop with a 4080 GPU that could be totally turned off when not required….ie, when you are doing something like Blender or Resolve (where the 258v is relatively weak), you could enable the 4080 for performance when you really required it, but if just doing general surfing, you could disable the 4080 to improve battery life. Kinda best of both worlds.
So the Meteor Lake H series is out of the question for basic office and STEM-focused data analysis projects? I had almost decided on the Ultra 5 125H, and now I'm confused about whether I should wait another few months for the 226V. 😭
No surprise with the stellar battery life. It's basically the M3 in concept. Video playback is not very indicative, since they all have their own very efficient decode engines. Light tasks and heavy tasks are what I am interested in. What I'm curious about is Steel Nomad on the M3: how is it next to Lunar Lake and Strix Point?
Looking at the Zenbook S14 and waiting for a deal. Here in Germany the 32GB/Ultra 7 258V version costs €1699. I really don't understand why everything got so expensive; it always used to be the same as the US...
Intel should have used TSMC-fabricated SoCs/processors a long time ago. If they had, they would likely be in a much stronger position in the CPU market today.
So the tl;dr is that Apple is still King in runtime and speed, regardless of being plugged in or on battery. Still nice to see Intel catch up, they are finally an option for mobile devices again.
Most of the benchmarks are actually under a sustained load. For example Cinebench we use the 10 minute warmup, our Creator focused tests are quite long (especially Resolve and Premiere)...so I'd think we put enough load on that M3 to even out the playing field.
Testers use the fanless MB Air to give Intel a chance, because if they used the same base M3 chip in the MB Pro 14 it would make Intel look even worse, which would upset these Windows-dominated channels. The reality is a fair test would be the MB Pro 14 M3, which could be had for $1099 recently; the 16GB version is similar in price to these Vivobooks. Then of course there's the M3 Pro MB Pro 14, and in a few weeks the M4 MB Pro 14!
@@HardwareCanucks would be good to see the two stacked up at that lower TDP for Intel. But based on the charts Apple is equaling Intel's perf at 1/3rd the power right now which is crazy.
@@xskeetskeetskeetx That's the TDP, not what the CPU is actually drawing. The only time you can be certain it's drawing 30W is in benchmarks like Cinebench that stress all cores, and the M3 is trounced there. Not to mention, if you lowered its power limits to the M3's, it would probably still score well because of how steeply power scales with clocks.
@@kirby21-xz4rx So does Qualcomm what's your point? X86 is inherently less efficient than ARM, Intel is trying to prove otherwise. So far they haven't.
If that's what you want, then you will probably go with AMD, because Intel doesn't pair dedicated GPUs with Lunar Lake laptops, and even the iGPU is better for such tasks. You should be able to find HX 370 laptops with a 4050/4060 at the same price as Lunar Lake laptops.
@@mariuspuiu9555 Thank you for the response. The reason I want a laptop with an iGPU: although I'm an architecture student at the moment, I also work as a freelance graphic designer and video editor. Normally my heavy work is on the graphic design side, brochures and large portfolio files; the video editing part is just reels and small advertisement videos at 1080p max. So the laptops available in my area, within my price range, with great color reproduction for my freelance work normally have iGPUs, and the ones with an RTX 4050 or 3050 Ti have terrible sRGB and Adobe RGB gamut coverage.
ARM has had very little time and effort put into making it compete with x86 systems, which have had decades of consumer use and competitive innovation behind them. And yet ARM, with comparatively much less time to refine itself, is already showing it can be competitive in the computer market. It just needs to develop a little further; it needs more adoption. ARM is a good idea and has so much room to grow. I believe in Snapdragon's mission, but unfortunately it still needs time in the oven. It's not yet matured enough for my liking, but I'm glad they're making the moves and pushing for its adoption. This is the catalyst that will help ARM continue advancing and, one day, overtake x86 when x86 peaks or plateaus. Maybe next generation I'll consider an ARM PC.
I would like to see, for example, how many Cinebench runs/Wh or Blender frames/Wh the processors are doing, for a true performance-per-watt comparison! You could measure the power at the outlet and subtract the overall idle system power to get only the CPU power. Screens, batteries' true capacities, motherboards, and network cards all influence the results too heavily in these simple runtime tests...
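A rough sketch of that runs-per-watt-hour idea in Python. The methodology (wall power minus idle, times duration) comes from the comment above; all the numbers below are hypothetical illustrations, not real measurements:

```python
# Performance-per-energy metric sketch: benchmark throughput divided by the
# energy attributable to the load (wall power minus system idle power).
# Every value here is made up for illustration.

def runs_per_wh(runs_completed, duration_s, wall_power_w, idle_power_w):
    # Energy drawn by the load over the run, converted from watt-seconds to Wh
    load_energy_wh = (wall_power_w - idle_power_w) * duration_s / 3600
    return runs_completed / load_energy_wh

# Hypothetical example: 12 Cinebench runs in 30 minutes,
# 45W measured at the outlet, 8W measured at idle
print(round(runs_per_wh(12, 1800, 45.0, 8.0), 2))  # 0.65 runs per Wh
```

The same function would work for Blender frames/Wh by passing a frame count instead of a run count.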
Where are you finding a Yoga Slim 7i with an OLED? Is it a country thing? On their US website and on Best Buy I am only seeing it available with an IPS and no OLED but would love that as a possibility.
Spot on. 👍🏽 Reviewers (& everyone else) need to recalibrate; forget about no. of threads & clock speeds. This is about quiet & cool running, all day battery life, no significant performance drop off on battery and premium features (great screens, speakers, webcam, keyboard, memory speeds, Wi-Fi 7, Bluetooth 5 etc). A high quality user experience in other words. Two predictions: this frees up OEMs to push the envelope on form factor design evolution. And it will throw a very harsh spotlight on how shite Windows OS is and all the bloatware running on these devices. 😏
Intel CPUs only having the RAM on the CPU package seems like a problem to me, since they max out at 32GB. It would be good to have LPCAMM2-type RAM as an optional way to go beyond that: have 32GB on the package, and maybe 16GB or 32GB more on an LPCAMM2 card if you want more.
It's some sort of 3D cache... having the RAM with the CPU is the way to go, for sure. I've expected this for a long time. I hope AMD will offer this on desktop CPUs: one CCD with 8 to 10 cores and one CCD of cache or RAM. It's not complicated for them to do, so I don't understand why they still haven't released this type of setup. Intel did it first, but on laptops...
When you consider Lunar Lake is for thin-and-light laptops doing mostly browsing and office tasks, it's perfect. Blender and everything else is what Arrow Lake H is for.
The recurring issue now with Intel is that things get so delayed that, instead of being "wow" when they finally come out, they're relegated to the middle of the pack or barely an incremental improvement over the competition.
Small update about the Lenovo Yoga Slim 7i Aura Edition. The OLED version in this video WILL NOT be available in North America. Instead we get a 120Hz IPS screen with touch (the OLED does not have touch functionality) and essentially the same overall specs as the OLED version.
Whoever thought north america should only get oled touch panel should be fired. Not everyone wants touch.
There is a way to be objective while ignoring power parity: you check price parity lol
If CPU A is significantly faster than CPU B but consumes double the power, I don't care, as long as they both cost the same (maybe I care about battery life, but again, if they cost the same I can measure the difference in battery life and performance and decide if I want to sacrifice some of one for the other).
FYI - airlines do have limits on the size of lithium-ion batteries you can bring with you, even inside laptops.
@@FroggyTWrite and it's 100 watt-hours PER BATTERY.
Yet a lot of laptops have a lot less (around the 70Wh mark).
It also means that a laptop can have TWO separate batteries of 100 watt-hours each.
Last but not least, they could sell laptops with bigger batteries and just mark them as not suitable for air travel.
Not everyone travels on an airplane so frequently that they need their laptop during the flight.
Most people don't travel by plane at all, or only once every few years, yet almost ALL of us want a bigger battery, or at least wouldn't say no if offered one.
Or they could make the excess-capacity battery removable so that you can still use the laptop on a plane (in the past there were laptops with a removable battery, laptops with multiple removable batteries, and I think also ones with a non-removable battery plus a slot to add a second one).
This "they don't put in bigger batteries because you can't bring the laptop on a plane" line is just hearsay the companies don't comment on and let circulate, because it suits their greed and leaves them free to cheap out on battery life.
@@sagarsubedi NA does not get an OLED touch panel. Read it again.
Intel and power savings? What a time to be alive
thanks to TSMC😂😂
Desktop post 12th Gen 💀
To be expected; they finally got to a 3nm process. What we see in the desktop segment is insanity because it's essentially an ancient 10nm architecture pushed to the MAX. It's super impressive for what it is and what it does, but just like with pushing anything way past its limit, the efficiency just isn't there. With just a mild overclock you can make a 13700K draw in the low 400W range, which is insane to be honest, but thankfully Intel watts are a lot easier to cool since the heat is spread out a lot more compared to AMD CPUs, for example.
Up until now AMD had a huge advantage. Take Ryzen 9000, for example: at ONLY 1.0V some samples can do 4.9GHz Prime95-stable, let that sink in. Yet at 4.7GHz it matches the multithreaded performance of a max-OC Ryzen 7000 at 5.6GHz all-core. On the Intel chip, on the other hand, at 1.0V you'd probably be limited to potato speeds.
14900K @400W XDDD
Is the new intel chips on tsmc 3nm?
Intel doing well with integrated graphics? My god the times are changing. In a good way
Nothing new really; when the 155H came out it also had the most powerful iGPU, but it got beaten by AMD a few months later.
@@iokwong1871 The only place where the older ARC GPU beat AMD was synthetic benchmarks, it was dogshit in gaming. Now, that seems to be changing.
@@auritro3903 I am talking about their integrated GPUs, not graphics cards. I know it sounds crazy, but looking at the timeline, this is in fact the truth. As far as the GPUs go, both Intel and AMD are making the same mistake: not supporting game developers the way Nvidia does, and instead trying to fix things on their end after a game is released. That's why they both, especially Intel, see significant driver boosts. At the right price that's not a bad thing, but it basically means the GPU is poorly optimized at the beginning.
Didn’t intel invent integrated graphics?
Where is the biggest revenue source for laptops? It's office workers who work mainly in Office and cloud-native apps. I think Intel has got the right balance here, so they have saved themselves for the time being. Looking at the improvements over the last year, I think they are on track for a great comeback.
Yeah I agree, orgs don't really care too much how much corporate devices cost as long as it's not stupidly expensive, and the Lunar Lake chips seem perfect for office use.
Meteor Lake chips are very good processors, with stronger multicore performance than Lunar Lake. Lunar Lake is for light laptops.
@@innosanto yeah, that's the point. 90% of office/student laptops are thin and lights.
@@innosanto Literally not a single office worker I work with would notice an improvement from more multithreading; however, they'd geek out over the battery life.
I agree. For office use a 1.2 kg laptop sounds like a good deal.
The main thing I expect is good battery life which it delivers.
What I want to see is how it works in real life use. Like browser tabs 15+, some office application open. To see the actual battery usage.
Damn, Cyberpunk on medium above full HD with average fps above 60 on an integrated GPU?! Glorious age to be living in; I'm really looking forward to not having to get a dedicated GPU anymore in a couple of years.
Absolutely - these things are fine for casual and indie gaming.
SAME! One of my favorite things about newer chips is that their iGPUs are significantly better. I think we kind of have to thank Apple silicon for that. If more demanding games can run 1080p medium graphics at 60fps on an iGPU, I'll be happy. I and many other people aren't gaming all the time, but having that iGPU performance for occasional use is awesome while maintaining top-notch battery life.
Honestly, with everyone focusing on being more battery efficient, i'm less concerned with seeing how these computers last on battery life. What I do want to see is how much battery drain there is when in standby mode now more than anything else. Being able to close the lid of my laptop, walk away for a day and come back and not have my battery be dead or close to dead and be able to use it off the charger is more important to me.
While standby should work well, if you're walking away from your computer for any significant amount of time you should be using hibernate not sleep; it would solve the issue. That and it's better for your computer.
@@Kyt2024 So i've had multiple x86 laptops in the past where there was significant battery drain in sleep and i did have them switch to hibernate. But with the Snapdragon X processors, i got a surface and haven't had that kind of battery drain at all. I can close my laptop lid, walk away from it and return to it the next day and only 3 or 4 percent of the battery drain in that timeframe. Everyone is talking about how power efficient their processors are and that's great, but I'd like to see how power efficient they are in standby/sleep.
It's still garbage because of Windows Modern Shartby (Standby) sucking up power. The CPUs are AMAZINGLY efficient in reviews, but Microsoft and stock garbage OS configs are clearly ruining things. There's no reason this should be the case when I have a 2020 Windows 10 Ryzen 4600H laptop that takes forever to drain, because it's Windows 10 without trash running and wasting battery at idle :/
@@Sunyup804 yeah I’ve seen similar and I do agree it should work. Just wanted to say if you’re walking away from your computer for a day, it shouldn’t be in sleep. Hibernate is just better because it gives your computer the opportunity to fully power down its components and it’s only marginally slower to resume use vs sleep.
@@hyenadae2504 Yeah you're right about Windows Modern standby being a battery drain, but it still stands with my Snapdragon X laptop, there is barely any battery drain on standby. I still want to see how both AMD and Intel are responding to that as this is a genuine issue that needs to be addressed.
It really amazes me how people, especially in the tech space, DO NOT UNDERSTAND HOW PERCENTAGES WORK. Processor 258V has 8 total threads. Processor 155H has 22 total threads. You want to round down to 20 threads for easy math, ok; that means the 155H has 150% more threads, NOT 250% more. 100% more means DOUBLE; when you are talking about "more", you don't start at 100%. If you were making $8 an hour and your pay increases 100%, you are now making $16 an hour, or double. This isn't hard. 250% more would be 3.5x. "200% more" and "2x" are not the same thing. IDK if this is a failure of the school system or what.
258v is a bigger number, so why does the 155v have more threads…?
AMD an Intel trying to outdo each other with weird numberings?
@@mikaelbiilmann6826 1 and 2 only represent core ultra's generation. They have reduced the cores this year.
@@narutokunn Oh, ok. Thanks. 🙏
I'm guessing he probably meant to say "250% *as many* threads." He did misspeak, but I don't know if it was a big enough error to deserve this massive rant.
You're right, but to clarify for those confused: it depends how you phrase it. It's a 150% increase, ((20-8)/8)*100 = 150, but 20 is 250% of 8, (250/100)*8 = 20. 250% is the multiplier, while 150% is the percent increase.
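The distinction in this thread is easy to show in code. A minimal sketch, using the rounded thread counts from the comments above (8 and 20); the function names are illustrative, not from any library:

```python
# "Percent increase" vs "percent of" (the multiplier), as discussed above.

def percent_increase(old: float, new: float) -> float:
    """How much bigger new is than old, as a percentage of old."""
    return (new - old) / old * 100

def percent_of(old: float, new: float) -> float:
    """new expressed as a percentage of old (the multiplier)."""
    return new / old * 100

threads_258v, threads_155h = 8, 20  # rounded-down thread counts from the thread

print(percent_increase(threads_258v, threads_155h))  # 150.0 -> "150% more threads"
print(percent_of(threads_258v, threads_155h))        # 250.0 -> "250% as many threads"
```

So "250% as many" and "150% more" describe the same 8-to-20 jump; "250% more" would be 8 * 3.5 = 28 threads.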
I would have liked to see *some* CPU benchmarks in the balanced modes; pushing those chips to 28 or 30W takes them out of their sweet spot, so comparisons against their lower power modes, especially in MT benchmarks, would have been interesting.
Yeah that will be another video for another time since we actually struggled to find laptops in our inventory that hit a nominal 30W, let alone under 20W. We're working on getting some in though.
@@HardwareCanucks Do you think it's "Fair" that these laptops (in battery life tests) have a higher res and 120hz screen versus the M3's 1440P tier 60hz panel? Even at the same general brightness, the Lunar Lake laptop will be wasting a lot more display *and* GPU power possibly pushing frames and refreshing the screen, unless it has true VRR (ie Freesync) for dynamic refresh at locked framerate applications. I'm not sure how it applies to OLEDs, but I hope there's more tests and other models that aren't OLED reviewed soonish :)
@@hyenadae2504 But M3 air has a much smaller 52.5Wh battery while the lunar lake laptops have 70+Wh batteries so it's fair.
@@hyenadae2504 They could have set all the laptops at 60 hz as well.
@@Son37Lumiere We did.
It's an incredible win for Intel: great battery life, better compatibility for apps, far better gaming performance, solid price point.
Standby time is not good
@@soberanisfam1323 Even the drain when shut down is not good. Don't expect standby to be better.
Why can't they at least match the standby and boot times of the Windows-on-ARM laptops? @@soberanisfam1323
Beg to differ. This product is redundant and moves the industry backwards. It's an incredible loss for Intel shareholders, customers, margins, Qualcomm, the domestic foundry, and the US government's CHIPS Act, because they aren't using their own foundry. They are entering a new space and competing with Qualcomm, and every other portable is ARM-based, so while the hot tower case has its place, it's not in a tablet/PC portable. Choosing between the two is hard for customers.
MSFT switched Surface to Qualcomm and recovered flagging sales. Apple switched. It's physics: staying in this space, multicore drains the battery and gets hot. ARM-based CPUs scale easily to many cores; these single-core tests are irrelevant.
Intel is trading at 2000 levels, and if this is what's sold, it squeezes margins. amd64/x86 is legacy; this means more work for OEMs, needs a fan, and more work for software developers and device-driver writers, so it's basically attacking Qualcomm, and they could do a takeover and no one would block it. The only triumph is the GNU side, so I don't need Nvidia and Intel. I hold shares in both and don't care if I lose it all on a short-term call; I'm sure I will. This is against progress.
The CHIPS Act was meant to prevent having to use offshore foundries. Qualcomm will have to buy Intel's chip design business so Intel can focus on the foundry and make chips for others. Sad management, given they can't even call this an Intel chip: it's made in Taiwan to compete with AMD chips. The geopolitics is too codependent.
Playing older games under emulation should work if Intel/Qualcomm combined their teams, or at least didn't do so much redundant work.
This shows how amazing Intel Quick Sync decoders are. The best for video editing.
Yes bro.
Thanks for the video and just ordered the Zenbook S14.
Having had my Meteor Lake laptop now for 6+ months, I can definitely say the insane number of cores and threads isn't needed. I might prefer a 4+4 configuration if that means better battery life and better responsiveness. Sometimes the thing feels like a slug because the power profile often limits the cores to 800mhz, and I hate how laggy everything feels when it's in that mode. But if I enable the higher power profiles for a responsive experience, I get much higher fan noise. Honestly I might even prefer a smaller configuration like 2+4 with a cut-down GPU for everyday office use.
I would recommend taking a look at Throttlestop. You can change clock speeds manually and set up custom performance profiles.
what do you use your laptop for? gaming, wen browsing etc?
@@lorenzospadaro6077 Web browsing, TH-cam, Netflix, Day Trading, Monitoring my Twitch Feed, e-mails, oh and of course some software development i.e. I compile code on the Xeon and then I deploy and perform remote debugging on the laptop from time to time, mostly to check performance and compatibility on mobile.
@@GholaTleilaxu Yes.
8 cores is the new minimum, they shouldn't give us less
Lunar Lake seems like Intel are doing an Apple Silicon competitor for thin and lights. Which is not a bad idea.
nice dp
That's what they're trying
Congrats for being the first with a good review of Lunar Lake!
Thanks!
Still crazy to see the M3 running most of these tests at 11W... If only they added a 90Wh battery, we'd really get two days of battery. But good job Intel for finally getting something good for laptops!
I'm more wondering where the 11W comes from. As far as I know, it can draw 15W continuously (without a fan) and 22W over a short performance period
Which is why testing laptops with power parity is nonsense. Since each manufacturer can write their own BIOS, they each have different behaviours despite the same silicon. Overall implementation is way more important as it is reflected in battery size which affects weight and so on.
Great review. Pretty sold on these Lunar Lake devices... time for a laptop upgrade!
Excellent review, thank you folks! Lunar Lake is indeed a step in the right direction for Intel in the mobile market segment and great inspiration for the higher-tier Intel mobile and desktop CPUs. It's also cool how Intel took inspiration from Apple's Mx design with the on-package memory. Still, when it comes to pure efficiency and performance per watt, Apple silicon reigns supreme over all.
Why are MacBooks being exempted from those gaming tests you're running? If they can't run it - they should appear there with "DNS"/"DNF" or something. You shouldn't reward them for not being able to do certain tasks x86 processors can easily do.
The MacBook isn't in the game tests since we have no reliable metric for game performance logging (frametimes, framerates over time) on Apple devices. We're currently working on one though.
A few days ago, I bought the new Asus TUF A14 (Ryzen HX 370, 32GB DDR5, RTX 4060) for $1350 on eBay. I could use the extra processing power, but I can TOTALLY see how Lunar Lake will be a new favorite for businesses that mostly use cloud-native apps. I'm just not sure if the value is there for consumers, $1500 is a hard sell. The general public is better off saving $500+ and getting an Intel 155H laptop that performs better and doesn't lose too much in terms of battery life.
Tbf, a snapdragon device would be a more compelling option specifically for business users that work in cloud.
Great battery life, touchpads, displays, touchscreen, camera; they look and feel amazing. The price is surprisingly decent for an ultrabook, and you don't even need the higher-SKU processor. The only downside is compatibility, which you don't care about when working from the cloud.
They're plain cheaper for the same if not better experience.
Qualcomm should be the one acquired by one of the RGB. Elite x is pure DOA
I would say they are off to a decent start here. If they can get that "rentable unit" technology off the ground, their processors going forward are going to be pretty awesome actually. Rentable units is their proposed replacement for HT and it should be more efficient, and work better with their hybrid core architecture. Doesn't excuse their major issues with Raptor Lake, but it is a bright spot.
This is a smart bet. I’m all for it.
Actually good. I'm interested in buying one.
ofc it depends on what system being used and everything else, but generally jh7 is the safest, in high quality, its easy to recommend
Lunar Lake is actually getting good now. I might reinvest in Intel.
you're a week too late...
intel sucks
The question here is how much profit/volume Lunar Lake will produce, because Intel might still be limited by TSMC's wafer capacity.
@@HardwareCanucks Not more than AMD is limited really, AMD uses 4N for everything which is massively booked by Nvidia too
their fab is still holding them back tho
Can't wait to see benchmarks of the Ultra 9 288V.
Hardware Canucks is starting to become my favourite laptop review channel. I don't understand why most other channels review a laptop on its own but don't compare it to other laptops. How the hell are people going to choose a laptop among so many options if you're not going to compare it with the alternatives?
Thanks man!
@@HardwareCanucks btw, you did QC dirty here by not saying which benchmarks are native and which are emulated. The performance will spike when devs port to ARM. Cinebench R23 will never be ported to ARM as it's deprecated, so saying Intel does much better and destroys QC here as if by some miracle is bad journalism; it's emulation vs native...
Also, you tested the 78 SKU, which doesn't have the ST boost and thus loses to Intel; if you had used the X1E-80 SKU, Intel would only reach parity. It's a bit misleading, to say the least.
Great review!!
great video, very informative and well presented
If this all translates to arrow lake on desktop we are in for a good year.
1985, 32bit 386's release was timely against 32bit MC68020 competition.
1995, Pentium Pro's release was timely against RISC-based Advanced Computing Environment (ACE).
Great review,
the only drawback is the MT perf because of the 8C/8T CPU. I mean, this makes a lot of sense because Lunar Lake is mainly for thin-and-light laptops. People who really care about or work on 3D/video (locally) should get something more powerful (Ryzen AI HX or Arrow Lake).
For me, I work on cloud server mostly.
I don't really care about MT perf that much; 8C is enough.
I much prefer thin-and-light, battery life, and an overall snappy experience.
Thanks for such an in-depth review! How are the thermals on the Aura Edition for day-to-day tasks like watching Netflix, YouTube, and office work? Do the fans spin up in these scenarios, and does it get warm/hot on the lap?
For those kinds of tasks you can pop it into Adaptive Mode and it's perfectly quiet.
Agree about the missed opportunity regarding the battery size; a larger battery could really have underscored the architecture's low power consumption and also been a great sales pitch. One issue though: when flying in the EU, a battery between 100Wh and 160Wh has to be approved by the airline, and batteries over 160Wh are usually not allowed on a commercial flight. So a 100Wh battery would probably be the max; I don't know the international rules etc.
I run my 6800H at 16W most of the time. It's definitely not meant to be run like this, but it actually works well. I'd be very interested to see these same benchmarks at that kind of power.
Thanks for review.
If I mostly do office tasks and would like to play games sometimes (GTA V, Hogwarts Legacy), which one would you recommend? Aura or Zenbook?
My Lenovo Legion 7i Pro with the 155H: (CB23 = 17,908 multi; 1,819 single) (CB24 = 962 multi; 106 single)
So, inferring from the heavy-load testing, we should expect 2-3 hours in heavy games and a bit more in indies? I've been looking for good battery testing on Lunar Lake that covers gaming time. Previous laptops and even handhelds always manage only 1-2 hours. I'd appreciate something like that, as I haven't been able to find videos or discussion about it.
Appreciate the content and benchmarking.
After all the Snapdragon hype (and I did get one of the machines and am happy with it regardless), I've realised many companies weaponise strange benchmark results, in the same way Snapdragon called out the M3 when in reality the M3 is just better (when it has 16GB RAM).
I say this after hearing the battery test oddity (not getting up to 24 hours)
Overall though…. great improvements for Intel. Great year to get a laptop.
I am happy to see the mobile SOC space be so fiercely competitive, but I am disappointed to see all 4 SOC manufacturers abandon upgradable memory. I hope in the future, at least one of these manufacturers supports SODIMM or CAMM modules.
Intel isn't; it's only on Lunar Lake where on-package memory forces them to. On the higher-powered Arrow Lake CPUs releasing in a couple of months it'll be replaceable again, because every little bit of efficiency doesn't matter so much at higher power levels.
It's actually all about power and space savings, not to mention latency improvements. Having the memory so close to "home" has tangible benefits in all those areas. I also think that CAMM is a pipe dream for thin-and-lights. Gaming laptops? Maybe, but right now the SoC approach will remain king for ultraportables.
CAMM2 seems to have missed the train; I think it was a price problem.
How do you think the best Mac achieves almost 10x better memory bandwidth than the best gaming PCs? Modularity comes at a price, also in laws of physics.
@@kazioo2 What are you on about no tf they don't
Do you think when will Surface laptops get Lunar lake, if ever? I need a laptop before the middle of next year and was gonna get Snapdragon X but after seeing Lunar lake performance, I’m waiting
How does this compare with Windows on Arm (e.g. Microsoft Surface Laptop)?
Hard to judge battery run times with different screens. OLED will definitely take a lot more power than the CPU. Lunar Lake looks pretty good. I want as many core/threads as possible though, and probably shouldn't buy a new laptop any time soon.
Please include a GAMING battery life benchmark! (Should be very simple: just leave the offline game running).
I would like to see desktop CPUs come with integrated RAM AND DIMM slots for the best of both worlds.
Intel have better integrated gpu and better battery life ?
I knew it. I messed up the timeline 😭
This comment might come off as a bit unfair, but the M4-series chips are a pretty large bump over the M3, to put it mildly. I'm not really sure how anyone competes with the M-series chips going forward; the benchmark numbers coming off the iPad Pro M4 are staggering. That being said, these are nice gains from Intel, and I look forward to seeing some of the desktop offerings.
If you use Applebench 6.3, which rigged the M4 result by introducing the SME extension that nobody uses, then yeah, the M4 is a bump, but on other benchmarks it's very close.
Apple’s chips are insane from a performance to efficiency standpoint.
The M4 is going to be next level.
The biggest advantage they have is x86 software compatibility and OS flexibility.
m4 arm based so who cares, throw it in bin like snapdragon, glorified phone chips for grandma
@@drinkwater9891 Damn well these glorified phone chips are kicking intel desktop and consumer AMD chips in the teeth.
Must be an old guy stuck on X86.
Also might we remember Nvidia and AMD plan on making arm chips in the next few years if I’m not mistaken.
Definitely a better balance than Snapdragon, but I'd still go for AMD Zen 5, for my needs, if I was going for a new one today.
I've been "Team Red" since the Ryzen 7 4800H, but I'll be going Intel for my next laptop (in a few months). I've enjoyed AMD, but the poor-quality video encoding/decoding just puts me off. The things that are most important for my use case:
1. Battery life.
2. Excellent iGPU performance at low wattage (not for gaming, just a smooth desktop and scrolling).
3. Runs cool and quiet.
4. Good video transcoding.
5. Thunderbolt port (or ports).
6. Okay performance (I don't care about bleeding edge multi-core performance, so long as performance is "good enough").
Can somebody, in plain English, tell me if a developer (containers, backend apps) should go for Lunar Lake (better single-core) or Meteor Lake (better multi-core)?
Overall Lunar Lake looks better, but I guess development tasks would benefit more from multicore performance??
I don't know. PLEASE HELP
Yes, they should be better with the higher MT, but you should buy Zen 5 instead if you don't want Lunar Lake, and you should wait for real laptops, not these Starbucks PCs; if not LNL, then Zen 5.
@@reiniermoreno1653 AMD (Zen 5) laptops are too costly where I live. I don't know why, but they are sold at a premium.
Sorry, but I didn't get you.
"Should be better with higher MT" - do you mean Meteor Lake? Are you saying ML will be better for my use case??
great video, but do you also have any info on "skin" temperature? aka heat you feel on hands and lap?
That will be rolled into each individual laptop review
@@HardwareCanucks thanks!
One bit of information that was missed in this review: while it's true that the Windows PCs from Intel and AMD had larger batteries, they also had 120Hz OLED displays, which consume almost 50% more power, especially during web browsing. So despite having 30-40% larger batteries than the M3 Air, these chips were at a disadvantage.
So there are a few things here we should actually add to the charts: all devices are set to a fixed brightness output and refresh rate (60Hz). That evens the playing field by a lot.
However, if the fixed brightness is bright enough, OLED can still consume more power, right? @@HardwareCanucks
Yep, it's difficult to compare CPU performance on laptop bc of the wide variety of components. This video really only gives a general sense of the performance, which is fairly competitive. We'd need the same laptop with just a CPU chipset change to have accurate data and make judgements on which CPU is more powerful or more efficient
It's still oled though.
@@deansmits006 PCWorld's test uses the Dell XPS 13 for Snapdragon, Meteor Lake, and Lunar Lake testing, with pretty much everything matched. I think that's about as close as you can get right now.
Sadly there's no Dell Ryzen laptop, so it's not in the comparison, but still, a strong showing from Intel.
I’m actually amazed by this SoC
Would love to see this in an HP Aero-like laptop: 13", drawing only about 10-15W, passive or quiet cooling, less than 1.2kg, and costing less than $1000.
HWInfo64 now has an ARM64 version which works with Snapdragon X Elite to get temperatures on the system.
These benchmarks show how vast the difference between Apple and the others is. The M3 uses a third of the power of the top-of-the-line Intel and AMD chips, yet stays competitive.
This is a great step forward for Intel, but calling this chip perfect is coping.
Apple's true advantage is Rosetta. It's just SO EASY for developers to run under emulation at a limited cost.
Intel run the holy x86
It's not Rosetta.
It's the Apple silicon.
Every app you tested was native ARM.
@@HardwareCanucks Are you implying that the lunar lake laptops are also running through emulation? Apple silicon is simply better just admit it.
@@ayusharya1397 Software optimization too. macOS and its ecosystem are written to work on limited, specific hardware and nothing else (more efficient), while Windows and its ecosystem are written to work on a broad variety of hardware (less efficient).
Those tests with 20+ hour runtimes use the native media player in Windows 11, which is much more efficient than VLC or other legacy media players. I use it despite its lack of features, to preserve battery life.
It’s funny tho when you put it into a comparison with Apple’s M3. But for Intel it’s a HUGE step forward, good job.
As a programmer, I need lots of processing threads. I'm glad I got the HX 370.
What application are you using for programming? We'd love to add some relevant benchmarks for it.
@@HardwareCanucks hi, please add an LLVM compiler build benchmark. I'll be happy to forward any build scripts and configurations you might need.
As a programmer, I would never use a laptop for compiling code or even trying to write it.
@@Nanerbeet It matters for students and personal projects; not everyone can afford a server. 🙂
@@Nanerbeet Well, I've been doing this for 50 years and it sure beats punch cards. The AMD HX 370 has 12 cores and 24 threads. I have a 2TB Crucial T500 and I can run 2 external monitors. I'm running a local SQL Server and usually 2 copies of Visual Studio. Frankly Teams uses the most resource ATM (especially when sharing).
What to get for python, java and rust coding?
Ryzen HX 370.
Looks like the HX 365 competes pretty well against 258V. I think the HX 370 will look even better against the 288V since the former adds CPU and GPU cores while the latter just clocks them up.
I want it's benchmarks against the Zen 5 AI HX equivalent, please 🥺
Its right there. The Ryzen Ai 9 HX 365.
I own almost all Intel hardware, but laptops are AMD's market.
Ryzen has been the number 1 option for gaming laptops, and now with the new HX 370 the thin-and-lights are even better.
On the CPU side, the HX 370 performs worse in gaming than the 7840H.
@@otozinclus3593 It varies according to so many different factors, like BIOS settings, power profiles, and the cooling solution.
But in the ASUS Zephyrus G16 with a 4070 it's a great laptop.
@@otozinclus3593 Only if the 7840 is running at 58W and the HX 370 at 38W or lower, and even then I think the 370 is slightly faster on average.
It would be great if Intel could get their foundry to put out these chips on 18A instead of relying on TSMC. This transition to ASML's tooling was never going to be smooth, but so many parts in Meteor Lake were Intel-made, so that bodes well.
Will they be releasing another architecture with a higher performance ceiling? Because a dual-architecture strategy might actually be a good idea now. It's been a long time since the Atom era, and lower performance isn't nearly as much of a limitation now that the performance floor is much higher.
Yes. Check out our Lunar Lake Explained video. Arrow Lake H will scale downwards to ~40W.
Rather than increasing the battery, they should also put some focus on good speakers.
Even phones have better speakers.
I don't feel like those prices are nearly as good as being suggested.
I'll remind you Copilot+ laptops launched a few months ago for $1500...with Windows on ARM.
They need a matte version or the anti-reflective coating from the ASUS G16 370 AI... it looks like they didn't apply any anti-reflective coating on this laptop at all.
The fact that this battery life is achieved by going back to just 4 main cores plus another 4 for light work is understated when the efficiency is being praised... but then the multi-core story is overstated when the performance benchmarks come up. It's not a bad chip, don't get me wrong, but I just can't justify the hype. Great iGPU though; that's great to see.
The question is what's better for the average user: better power efficiency with worse multi-core, or better multi-core with worse power efficiency.
I’ll wager the first, most things you do won’t take advantage of the number of cores in a Ryzen.
@@zachb1706 I was raising the point that many outlets' presentations didn't explain that the better battery life is a result of the reduced number of performance cores
@@zachb1706 Snapdragon managed to get those two, why couldn't Intel?
@@YTAcct283 Because Snapdragon has poor GPU performance along with app compatibility issues.
The battery life was not achieved by the lower core count, but by an architectural overhaul, a better node from TSMC, on-package memory, and the reworked VRM
It seems like Intel is somewhat back. This looks like a compelling offering from them if priced attractively.
Intel has somewhat of a history with over-hyping and under-delivering. I'm convinced of the capability now, but what about availability...last time they announced something it took months for it to finally start appearing in products that I was interested in...
I agree with Intel's new strategy with Lunar Lake. I have been wishing since even before the M1 that all chip makers would start focusing on efficiency over raw performance. Finally it has started! Yay.
Battery life and a cool to the touch device (think about your lap with a laptop on top in a warm climate just browsing the web) is invaluable for many (most?) people. I am a software developer and the amount of insanely inefficient code I see out there in the industry is way too much. Lack of raw compute is mostly not the main culprit in causing lag and slowdowns in day to day computer usage.
All this is only my 2 cents though. Business be booming without it too evidently.
Now if only Apple were crazy about FOSS, it would be perfect...
I'd love to see a 258V processor in a laptop with a 4080 GPU that could be totally turned off when not required. I.e., when doing something like Blender or Resolve (where the 258V is relatively weak), you could enable the 4080 for performance when you really need it, but if just doing general surfing, you could disable the 4080 to improve battery life. Kind of the best of both worlds.
It's not gonna happen; Lunar Lake was designed not to use a dGPU. You'd have to wait for Arrow Lake.
@@reiniermoreno1653 …..ironically, which might not actually need a dGPU
To be honest Lunar lake just makes the M3 look great. Give that thing a fan and it gets even better in the sustained loads.
The M3 is ARM-based, so who cares? Throw it in the bin like Snapdragon; they're glorified phone chips.
@@drinkwater9891 ARM is the future
@@drinkwater9891 ah yes, the idiot who thinks that everyone in the world just uses their laptop for playing games. Classic
@@drinkwater9891 lol lots of people care about it more than the snapdragon.
I am fully satisfied with my Apple M4, a true beast.
Intel Core i9-13900HX still going strong 🎉
So is the Meteor Lake H series out of the question for basic office and STEM-focused data analysis projects?
I had almost decided on the Ultra 5 125H and now I'm confused whether I should wait another few months for the 226V. 😭
No surprise with the stellar battery life; it's basically the M3 idea. Video playback is not very indicative since they have their own very efficient decode engines. Light tasks and heavy tasks are what I'm interested in.
What I'm curious about is Steel Nomad on the M3. How does it compare to Lunar Lake and Strix Point?
Looking at the Zenbook S14 and waiting for a deal. Here in Germany the 32GB/Ultra 7 258V version costs €1699. I really don't understand why everything got so expensive; it used to be the same as US pricing...
Intel should have used TSMC-fabricated SoCs/processors a long time ago.
If they had, they would likely be in a much stronger position in the CPU market today.
So the tl;dr is that Apple is still King in runtime and speed, regardless of being plugged in or on battery.
Still nice to see Intel catch up, they are finally an option for mobile devices again.
Would've loved to see a test under sustained load, as the MBA is well known for its throttling.
No shit. That’s what a fanned system like MBP is for.
Most of the benchmarks are actually under a sustained load. For example Cinebench we use the 10 minute warmup, our Creator focused tests are quite long (especially Resolve and Premiere)...so I'd think we put enough load on that M3 to even out the playing field.
The MBA even competing in any capacity with these turbine fanned power guzzlers is astounding, Apple truly has the best CPUs on the planet currently.
Testers use the fanless MacBook Air to give Intel a chance, because if they used the same base M3 chip in the MacBook Pro 14 it would make Intel look even worse, which would upset these Windows-dominated channels. The reality is that a fair test would be the MacBook Pro 14 M3, which could be had for $1099 recently; the 16GB version is similar in price to these Vivobooks. Then of course there's the M3 Pro MacBook Pro 14, and in a few weeks the M4 MacBook Pro 14!
All I can see is that the M3 is still way ahead of everything in efficiency.
Yeah, but Intel did well with this one by focusing on efficiency. The only thing is that it's supposed to hit peak perf/watt between 9 and 17W.
@@HardwareCanucks would be good to see the two stacked up at that lower TDP for Intel. But based on the charts Apple is equaling Intel's perf at 1/3rd the power right now which is crazy.
@@xskeetskeetskeetx That's the TDP, not what the CPU is actually drawing. The only time you can be certain it's drawing 30W is in benchmarks like Cinebench that stress all cores, and the M3 is trounced there. Not to mention if you lowered its power limits to match the M3 it would probably still score as well due to exponential power scaling.
@@xskeetskeetskeetx because Apple uses ARM, not x86 😂
@@kirby21-xz4rx So does Qualcomm, what's your point? x86 is inherently less efficient than ARM, and Intel is trying to prove otherwise. So far they haven't.
Finally some proper competition
Well done to Intel. That Apple M3 chip is pure magic tho
How are these for 3D rendering apps and architecture apps in general?
If that's what you want, then you'll probably go with AMD, because Intel doesn't offer dedicated GPUs in Lunar Lake laptops, and even the iGPU is better for such tasks. You should find HX 370 laptops with a 4050/4060 at the same price as Lunar Lake laptops.
@@mariuspuiu9555 thank you for the response. The reason I want a laptop with an iGPU is that although I'm an architecture student at the moment, I also work as a freelance graphic designer and video editor. Normally my heavy work is on the graphic design side, with brochures and large portfolio files; the video editing simply consists of reels and small advertisement videos at 1080p max. The laptops available in my area within my price range with great color reproduction for my freelance work normally have iGPUs, while the ones with an RTX 4050 or 3050 Ti have terrible sRGB and Adobe RGB gamut coverage.
ARM has had very little time and effort put into it to make it compete with x86 systems, which has had decades of consumer use and competitive innovation put into it. And yet, ARM, with comparatively much less time to refine itself, is already showing that it can be competitive in the computer market. It just needs to develop a little further, it needs more adoption.
ARM is a good idea and has so much room to grow. I believe in Snapdragon's mission, but unfortunately it still needs time in the oven. It's not yet mature enough for my liking, but I'm glad they're making the moves and pushing for its adoption. This is the catalyst that will help ARM continue advancing and, one day, overtake x86 when x86 peaks or plateaus.
Maybe next generation I’ll consider an ARM PC.
I would like to see, for example, how many Cinebench runs per Wh or Blender frames per Wh the processors deliver, for a true performance-per-watt comparison. You could measure power at the outlet and subtract overall idle system power to get only CPU power. Screens, batteries' true capacity, motherboards, and network cards all influence the results too heavily in these simple runtime tests...
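The per-watt-hour comparison suggested above boils down to a small calculation; here is a minimal sketch, where all the numbers (wall power, idle power, run count) are hypothetical placeholders rather than real measurements:

```python
# Sketch of the runs-per-Wh metric suggested above.
# All figures below are hypothetical, not measured results.

def runs_per_wh(runs_completed: int, wall_power_w: float,
                idle_power_w: float, duration_s: float) -> float:
    """Normalize benchmark output by CPU-attributable energy.

    wall_power_w -- average power at the outlet during the run
    idle_power_w -- average wall power with the system idle
    duration_s   -- total length of the benchmark run in seconds
    """
    cpu_power_w = wall_power_w - idle_power_w      # isolate CPU draw
    energy_wh = cpu_power_w * duration_s / 3600.0  # W * s -> Wh
    return runs_completed / energy_wh

# Example: 5 Cinebench runs, 45 W at the wall, 8 W idle, 10 minutes
score = runs_per_wh(5, 45.0, 8.0, 600.0)
print(f"{score:.2f} runs/Wh")  # prints "0.81 runs/Wh"
```

Subtracting idle power is only an approximation (fans and VRM losses scale with load too), but it removes most of the screen/motherboard noise the comment is worried about.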
Non-upgradable memory? Is the CPU upgradable?
How were the temperatures and noise levels in these tests between the 6 products? Especially vs AMD.
How can we tell how much VRAM these have? Do they have any VRAM?
How is that 11w M3 keeping up with all these 20+W CPUs?
Its ecosystem is far more optimized
Where are you finding a Yoga Slim 7i with an OLED? Is it a country thing? On their US website and on Best Buy I am only seeing it available with an IPS and no OLED but would love that as a possibility.
It's a regional thing. When we put the video together it seemed like OLED would be the ONLY option available, but now it seems there's IPS
@@HardwareCanucks Lucky!
Spot on. 👍🏽
Reviewers (& everyone else) need to recalibrate; forget about no. of threads & clock speeds. This is about quiet & cool running, all day battery life, no significant performance drop off on battery and premium features (great screens, speakers, webcam, keyboard, memory speeds, Wi-Fi 7, Bluetooth 5 etc). A high quality user experience in other words. Two predictions: this frees up OEMs to push the envelope on form factor design evolution. And it will throw a very harsh spotlight on how shite Windows OS is and all the bloatware running on these devices. 😏
I'd love to see it in NUC form factor with adequate, QUIET cooling.
Intel CPUs only having the RAM on the CPU package seems like a problem to me, since they max out at 32GB. It would be good to have LPCAMM2-type RAM as an optional way to go beyond that: 32GB on the package and maybe 16GB or 32GB more on an LPCAMM2 module if you want more.
it's some sort of 3D cache...
having the ram with the CPU is the way to go for sure.
I expected this for a long time.
I hope that AMD will offer this on desktop CPUs:
1 CCD with 8 to 10 cores and 1 CCD of cache or RAM.
It wouldn't be complicated for them to do, so I don't understand why they still haven't released this type of setup.
Intel did it first, but on laptops...
It's not 3D cache, it's just RAM on the package. This is not new.
I am up for this. Great comeback by Intel. Considering Intel for my first laptop. Although, what can I say, AMD is also not far behind.
Was wondering which one to go for. I feel like the 2nd-gen Ultra 7 is the sweet spot, but I'm not sure. Can anybody suggest?
Absolutely I think the 258V / 256V is the sweet spot in this lineup.
@@HardwareCanucks thank you so much for replying! Definitely will go for the Ultra 7 258v
When you consider that Lunar Lake is for thin-and-light laptops used mostly for browsing and office tasks, it's perfect. For Blender and everything else there's Arrow Lake H.
The recurring issue with Intel now is that things get so delayed that instead of being "wow", by the time they finally come out they're relegated to the middle of the pack, or are barely an incremental improvement over the competition.
If Microsoft/Windows goes back to x86, I will definitely switch to Apple...