@@kockoklosharcheto There's no way this is cheap enough to do that. dGPU-less laptops are cheaper because they don't have any dGPU at all, while this still has what is basically a dGPU, just built into the CPU. I'm certain that 40 CUs is more than they had in any of their RX 7600-derived laptop dGPUs, so that's the _minimum_ price comparison.
You don't need to statically assign VRAM in the BIOS. The driver can dynamically allocate more memory and release it again once it's no longer needed. There are only a few, quite specific cases (e.g. as a workaround to bypass VRAM capacity checks in badly written software) where it makes sense to do so.
I am PRAYING they bring this to a desktop socket. I don't care if it's on AM5 now or next gen on AM6, I want to make a travel PC the size of a Pop-Tart box with this thing so bad
Then stop praying for AMD to do something stupid. Strix Halo needs a 256-bit memory bus with LPDDR5X-8000 in order to get enough bandwidth to work properly. Dual-channel DDR RAM is only 128-bit, and most DDR5 has a much lower transfer rate than 8000 MT/s. Putting this GPU on AM5 would cripple it, so it would be a waste of money and silicon. Maybe we'll see some heavily cut-down model with about 16-20 CUs on AM5, but that obviously wouldn't perform anywhere near the 40CU version. Anyway, if you have a desktop motherboard, you can just get a graphics card.
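The bandwidth argument is easy to sanity-check: peak theoretical bandwidth is just bus width in bytes times transfer rate. A quick sketch:

```python
# Peak theoretical memory bandwidth = (bus width in bytes) * (transfer rate in MT/s).
def bandwidth_gbs(bus_bits: int, mts: int) -> float:
    return bus_bits / 8 * mts / 1000  # GB/s

print(bandwidth_gbs(256, 8000))  # Strix Halo, 256-bit LPDDR5X-8000 -> 256.0 GB/s
print(bandwidth_gbs(128, 6000))  # dual-channel DDR5-6000 on AM5    -> 96.0 GB/s
```

Roughly 2.7x the bandwidth of a typical AM5 setup, which is why a socketed version would starve the 40CU GPU.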
@PersonalBurrito It's not that, it's the fact that the graph is deliberately confusing. The 4070 shown on the graph runs at up to 65W with dynamic boost, way behind the full-powered 4070 at 140W. Jarrod's testing indicates that the 65W 4070 in the Omen Transcend 14 performed worse than a full-powered 4060.
You can unlock the TDP in G-Helper. The provided charger is rated for 140W; in my testing it sustains around 130W. Using a 160W GaN charger, which can actually sustain a 140W load for at least an hour, with G-Helper (which also unlocks power draw from 3rd-party chargers), the ACRNM can easily draw 140W with the GPU and CPU combined, but you have to cool it externally unless you want to fry something. It works for benchmarks though.
The best use case for this CPU is small computers like thin-and-light laptops and mini PCs; this CPU alone surely has a premium tax attached because it's a niche chip. If someone makes a slim 14"/16" laptop with 64GB of RAM under $1600, it would sell really, really well, though with every $100 they add to that price, interested customers drop off a cliff.
The name of this processor is idiotic; AMD and good marketing just don't mix. But on paper this is a gamechanger for Windows laptops. And I don't mean just games, but the whole slim laptop market.
I would like to see mini PCs with that chip. Sharing the same memory between CPU and GPU provides much better utilization, since GPU memory won't be sitting idle when you're not using the GPU.
They released a new eGPU with TB5 and didn't give the new Flow a TB5 port to take advantage of it. I'd love to see someone do a bottleneck test between TB4 and TB5 (could probably be done with a 4080 and a 4070 Blade 18) to see how much difference it makes.
@lowpoul5552 Not really, you can use a dedicated controller. Many Asus ProArt laptops have TB4 and Ryzen CPUs; many AMD motherboards have TB4 as well.
AMD needs hardware encoding/decoding that matches Intel Quick Sync and/or Nvidia's 50 series. That would allow video editors and content creators to move to full AMD platforms. Especially in laptops, it would be golden.
This unified memory can be had with up to 128GB, which will be incredible for AI workloads. Apparently it can run faster than a desktop 4090 when the model is too large for the 4090's VRAM of only 24GB. Many models now are larger than 24GB!
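A rough way to see why: weight memory is roughly parameter count times bytes per parameter. A quick sketch (sizes approximate; ignores KV cache and runtime overhead, and the model sizes are just common examples):

```python
def weights_gb(params_billion: float, bits_per_param: int) -> float:
    """Approximate model weight memory in GB (ignores KV cache and overhead)."""
    return params_billion * 1e9 * bits_per_param / 8 / 1e9

# Check common model sizes against a 24GB 4090 and 128GB unified memory.
for name, params in [("7B", 7), ("34B", 34), ("70B", 70)]:
    for bits in (16, 4):
        gb = weights_gb(params, bits)
        print(f"{name} @ {bits}-bit: ~{gb:.0f} GB | fits 24GB VRAM: {gb <= 24} | fits 128GB: {gb <= 128}")
```

Even 128GB can't hold a 70B model at FP16 (~140GB), but a 4-bit quant (~35GB) fits comfortably where a 24GB card has to spill to system RAM.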
I have a Zephyrus G16 with the AMD Ryzen AI HX 370 (Radeon 890M integrated graphics), and I was already very surprised by how well it can game on the integrated graphics, but gaming on the Nvidia RTX 4070 is still way better (about 50-60 more fps than integrated graphics in FFXIV at high settings).
Maybe. But they've gone on record saying they'd like to maintain the battery life of the original Steam Deck when designing its successor, and I'm not too sure a 45-120W CPU would give them that. While shoving a bigger battery into it is an option, we're discussing a gaming handheld; there's only so much weight that can be added.
AMD compared that iGPU to a 50W dGPU. The Steam Deck's combined TDP is 15W. 16 cores just for a handheld would be a waste too. The best outcome would be if Valve could work with ARM and use a Snapdragon Elite, which would be cheaper and could run on 5-10W. Mobile chips are improving much faster.
@@ShojiVR Depends on what games you're playing, I suppose. I'm on Alan Wake 2 and it's by far the best graphics I've seen in a game to date; a 4080 laptop can just squeeze out 60-80fps with everything on high and max ray tracing with the help of DLSS frame gen. If you're only playing battle royales and multiplayer shooters, then sure, you just want a steady 120+fps and the graphics barely get noticed.
@@ISSHINASHINA7 He meant that a laptop with said CPU will be more expensive when it's released than what you purchased; this chip isn't even on the market yet. And as far as I know it doesn't have RT cores, something you'll be glad to have with every new game that launches now; some developers are even requiring RT hardware for upcoming games to launch.
I have a suggestion: when you test the laptop, instead of showing a boring graph and blabbering, just show us real footage with an fps meter. I have been watching your videos for a long time, and every time I have to search for another good YouTuber actually showing game footage so I can watch the video and enjoy it.
The performance in that power envelope is not surprising to me. RDNA actually delivers a big chunk of its performance at incredibly low power usage; it's just that squeezing out that extra 20% of performance drops efficiency off a cliff. I've undervolted my RX 6800 to 930mV and underclocked it from 2200 to 1800MHz, which retains 80%-ish of the performance, and the power consumption is literally sub-100W. It's insane.
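That behaviour matches the first-order CMOS dynamic power model, P ∝ f·V². A rough sketch (the ~1.05V stock voltage is an assumed illustrative value, not a measured one; the 930mV / 1800MHz figures are from the comment above):

```python
# First-order CMOS dynamic power: P scales with frequency * voltage^2.
# Stock point of 2200MHz @ ~1.05V is an assumed baseline for illustration.
def rel_power(freq_mhz: float, volts: float, base_freq: float = 2200, base_volts: float = 1.05) -> float:
    return (freq_mhz / base_freq) * (volts / base_volts) ** 2

print(f"clock kept: {1800 / 2200:.0%}")            # ~82%, matching the '80%-ish' perf
print(f"power kept: {rel_power(1800, 0.93):.0%}")  # ~64%, before leakage savings
```

Performance tracks clock roughly linearly while dynamic power falls with f·V², which is why a modest underclock plus undervolt keeps most of the performance at a fraction of the power.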
The 4080 in my Lenovo typically rides around 110-130W, with boost at an even 175W. I don't understand the boost watts though; it doesn't seem to use them when they're needed most, it just boosts immediately and then never again. It would be nice if it saved the boost for when the frame rate dips. Playing Alan Wake 2 lately, there have been a couple of times where the fps dipped from 70 to 45 for several minutes in demanding scenes; it would have been nice to see the GPU spike up to 175W to try to maintain 60+fps.
I love integrated graphics, but at the same time it's sad that the soldered RAM issues we have in modern laptops are making it more difficult to enjoy. Even if you just want a laptop with 32GB RAM, most manufacturers already try to rob you.
Looks like AMD is still stuck on 4nm. I expected them to move to 2nm by now. I have a Ryzen 9 7940HS that maxes out at 5.2GHz. This thing is a beast. The new AMD CPUs are not a significant upgrade other than the extra cores.
Besides the performance claims, which are still to be independently confirmed, I find the flexible assignment of RAM an incredibly valuable feature on its own. At least 12GB should be the minimum VRAM these days (per extensive testing by Hardware Unboxed and others), so any GPU limited to 8GB is imho likely soon to become a no-go.
Imagine if AMD made this APU with 8 cores / 16 threads instead of 16/32, so the thermal limit for the GPU is higher and the cost is lower; they could sell mid-range laptops that crush every entry-level gaming laptop on price. Why do they always pair the big GPU with their top-end APUs, leaving less headroom for graphics? It seems market segmentation is too important to them, and they don't want to capitalize on the efficiency advantage a GPU-oriented APU could bring.
Ryzen AI Max 385. Annoyingly though, the GPU is cut down to 32 CUs. It will still be pretty good with 32 CUs compared to 40, but it would be cool if there were an 8-core SKU with the full 40CU 8060S.
@nathangamble1 Curious to see how the 385 would stack up against a PS5 Pro performance-wise (yes, I know the CPU in it is incomparably better than the Pro's, but still)
When the AI Max 395 does release, could you do some AI tests? I would like to know how the 96 GB of RAM benefits local LLM performance. If you haven't done local AI tests before, I'd be happy to help you get started with something basic.
I honestly hope the reviews seal the deal for me. This is my dream system that might move me away from a traditional desktop. (Currently using an older laptop with a monitor, but I've built three desktops in the past.)
@@Fluffball555 I don't have real-world testing, but assuming the laptop allows itself to pull full TDPs on both, it will have the power on hand because it can draw from the battery for peak performance. The battery will only deplete slowly because it's not powering the base functions of the laptop, just the performance peaks, and the AC adapter can top the battery off when the power draw dips. But 65W is really low, since the motherboard, RAM, screen, fans etc. all need power too.
I have a Lenovo Legion Pro 5, the 2023 version with an Intel i7-13700HX, 32 gigs of DDR5 RAM, and a 4070 with 8GB of VRAM. You're telling me this is better than that? That's crazy if it's true.
If we could get handhelds that ran like my laptop, I would buy one, so that's promising for the future, because I'd prefer a handheld. I didn't get one because I just wasn't sure the technology was there yet. But seeing this is pretty cool, man; I'm definitely going AMD this next go-around. I bought this before the big shenanigans with Intel.
What I'm hoping for is these integrated GPUs get really efficient and powerful so that we can play games while on battery. I don't like the Steam Deck mostly because it is meant for playing games only. I would like a 2-in-1 laptop that I could game on (on battery) and also watch movies and do some web browsing
That might work for other subjects, but we already know how good AMD's APUs are. The 780M already matches a desktop GTX 1650 with only 12 RDNA 3 compute units, and the new Strix Halo has up to 40 RDNA 3.5 CUs. Then take into account that the iGPU can use up to 96GB of RAM instead of 4GB, and it's easy to see how good it will be. Also, this is being compared to a cut-down mobile 4070 with a low TDP; it's nowhere near a desktop 4070.
Those numbers for Cyberpunk on the 4070M look really weird unless they're running everything maxed out with no RT, because I just ran a 1080p high benchmark without upscaling and got better than that on a 4060.
Yep, but like he mentioned, that specific 4070M is only running at 50W. Pretty low, as most laptops running them are set to draw a max of 90-115W. It beats a severely gimped 4070, not one running at its normally available configuration. The one in my laptop usually draws 80-90W, which is considerably more.
💵 Save money on your next gaming laptop with our daily deals: gaminglaptop.deals
2:02 Actually the Ryzen AI Max+ 395 was not tested in the Z13, if you look at footnote 5 from the AMD PDF in the description it mentions an AMD reference board was used. Still though, they mention similar TDP, so I would expect performance to be relatively similar.
Yeah Radeon 8060S has 100W mode too.
To be fair, the laptop 4070 is nowhere near its desktop counterpart; I think you even made a comparison on your channel. You'll be lucky if it matches desktop 4060 performance levels. It's definitely worse than a 4060 Ti, and that was like the worst GPU release in history.
AMD needs to fire whoever named this chip
why?
at this point everyone doing this shit lol
@@Berserkism The name is a mouthful to an absurd degree and it sounds ridiculous. AI Max for the maximum AI POWER... now let's somehow add a + onto something that's already supposed to be the max! So it's AI Max Plus! Surprised they didn't call it AI Max Plus Ultra, because the chip goes beyond plus ultra, of course. Then 395, which is likely a model number or SKU, but to a consumer 395 just sounds like a random number. I imagine this name came from some businessman in a board meeting and not from AMD's actual technical naming team.
@@tot0N if we fire them all, it stops.....maybe....
for real every time I say the name on video I get it wrong like 10 times and have to read it out each time
Integrated Graphics ❌
Dedicated integrated graphics 🔆
What's the difference?
It's called a SoC like on a console or phone
finally a ps5 mobile
@user-bb6qv The normal Ryzen AI 9 is also an SoC. Most modern desktop and laptop CPUs qualify as SoCs.
It's on the same package, it's integrated. It's just much bigger and has phat bandwidth
Ryzen AI LLM Ultra Nano Super Plus Mega
You forgot Hyper Fighting
no "XTREME"?
@@si.ari.06 It's not 2013
don't forget Max Pro HX 299X3D+
@@kevin.malone true, how about "PRO" or "Premium"
then a CPU beats my 1050ti GPU
even my 3060
That's been the story a long time ago
Well...kinda... The GPU part of an APU beats it. There's still a GPU in there, more so than ever.
I'm being pedantic though, I get your point and yeah it's pretty wild to see.
But I think Radeon integrated has had the 1050ti kicked for a bit now honestly
I woke up in a new bugatti 🗣🗣🗣🗣🗣
Actually, the snapdragon 8 gen 2 mobile chip is equivalent to 1050ti😂
At the low low cost of 3000 usd.
This was my thought as well. What kind of nvidia powered laptop could you get for the same money as this?
@@John-uh4rv for 3k you could get a 4080 and still have money for a SSD and RAM upgrade, then some left over for lunch lol. Or if a 4090 is on sale that. For 3k tho, you're better off building a pc
@@ArthurRamirezJ You could get a 4080 laptop for under 2k, 3k is the retail price of the alienware m18 4090💀
@@ArthurRamirezJ Nah, this thing is an AI beast. Day 1 purchase
it's 2000
Keep in mind these slides say "up to", not "average" FPS.
Marketing never miss an opportunity to be evil
Did you watch till the end? Either way, up to goes for both.
@enihi It's called being smart and not exposing yourself to lawsuits. But no let's cry about evil marketing people, as if you wouldn't do the same in their position.
i mean it's 50W 4070, like 1/3 of its full potential so...
"at similar TDP"
How would a 5090 perform at 50 watts?
Not good at all. You're shaving off over 100W of its actual TGP. You would be better off just buying a lower-tier laptop at this point.
The 5090 has 21,000+ cores to power, so it will probably be worse. The 5090 is extremely power hungry even when idle (source: Gamers Nexus efficiency testing).
@paincake2595 There's a 5090 laptop GPU inbound that is basically a desktop 5080 and has a power limit below 200W. That might be what this person is talking about.
Idk if a desktop 5090 could even function at 50W.
Yeah I was confused when I heard the news at first because that is a much bigger win from Strix Halo than I was expecting. Then I saw the power limit of the Nvidia GPU and it made sense.
It's a fair comparison though because that's what Nvidia products can do in that type of form factor and power draw. If those things are important to you then Strix Halo products are going to be head and shoulders above the competition.
If you just need raw performance and 16GB or 24GB is enough VRAM you'll still want Nvidia.
@ understandable.
That's so promising. I look forward to laptops that use a single chip with powerful integrated GPU. If AMD can do what Apple has done with the M series, this will be a huge win for windows laptops.
💯
also, windows laptops have expandable RAM, so team-windows might have the upper-hand soon enough... The only issue that remains is battery life...
It's an OS issue, not a hardware issue, afaik.
@si.ari.06 With integrated graphics and the fact that AMD laptops usually are more power efficient, I believe the battery will be great.
@m.heyatzadeh that's true, and what I'm getting at is that the windows OS has a few optimisations to be made for laptops to be more power efficient.
also, we MIGHT have to eventually shift to ARM64 architecture to get some of the best LAPTOP designs ever.
@@si.ari.06 This uses soldered memory
Yes, and just like Apple, little to none upgradeability and super high costs.
Would be great if you could compare a full powered rtx 4070 mobile to amd's numbers
But the entire point of this being integrated is for low power applications
I'm sure this channel has a video on 4070 laptops at 50W. It's like 20% slower at 50W.
But AMD's Strix Halo in Flow Z13 isn't running full power either. Strix Halo can run up to 120W at full power, so yea, would love to see that in comparison with 4070 in a full sized laptop. My guess is that they both will perform quite similarly with Strix Halo using much less power and space, giving more options for bigger ram, cooling, battery size or be used in a smaller form factor.
Based on Jarrod's previous testing where he compared the laptop RTX 4070 with different power limits, it's 31% faster at 100W than at 50W, and doesn't get notably faster beyond 100W.
That would make a Radeon 8060S roughly 7% slower (though possibly about the same speed, or significantly more than 7% slower, depending on which games are benchmarked) than a full powered RTX 4070 mobile on average.
@@nathangamble125 Considering they are compared within the same form factor, meaning they are probably limited to the same wattage, a full-powered 100W 8060S will probably beat a full-power 4070 by 20-30%. The problem remains which OEM will make such a device, considering this is mostly aimed at lower-powered devices.
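For what it's worth, the arithmetic behind the "roughly 7% slower" estimate above can be sketched; the ~22% lead for the 8060S over the 50W 4070 is an assumed figure back-derived from that estimate, not a published number:

```python
# Back-of-envelope check on the '~7% slower' estimate, using the 50W RTX 4070
# mobile as the 1.0 baseline. The 1.22 lead for the 8060S is an assumed value
# back-derived from that estimate, not an AMD-published average.
perf_4070_50w = 1.00
perf_4070_100w = 1.31   # 31% faster at 100W, per Jarrod's power-limit testing
perf_8060s = 1.22       # assumed average lead over the 50W 4070

deficit = 1 - perf_8060s / perf_4070_100w
print(f"8060S vs full-power 4070 mobile: {deficit:.1%} slower")  # ~6.9%
```

Plugging in AMD's actual per-game deltas instead of the assumed 1.22 average would tighten (or widen) that range accordingly.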
AMD just randomly giving a free 4060 with their CPU is so AMD
"free", that laptop will cost $2000+
Free? Lmao, you must be naive to think these laptops will cost less.
These replies are 100% innoculated against getting jokes.
@InnuendoXP then please explain, I'm waiting
@@TH-camModsAreSnowflake The joke plays on a silly reframing of the popular notion in PC gaming discourse that AMD provides consumers with value for money, and that integrated graphics are a cost-negligible, inconsequential bundled-in add-on in APU packages. It absurdly applies that sentiment to a premium product in a now rapidly changing APU market segment which is neither inconsequential nor value for money, just because it shares the traits of being integrated graphics and an AMD component. It's the "a hot dog is a taco" and "a smoothie is a soup" kind of joke.
Does this satisfy your bizarrely serious and demanding query about why this is a joke?
The *Real* Ryzen 4070
Should have maybe called it 8070S 😂
Now we wait for the benchmark 😂
the AMDX A070
AMD Ryzen Mobile 4070E. Meaning an AMD APU with equivalent of an RTX 4070.
96GB of VRAM. This is the future of Laptop gaming.
Or everyone will buy it to run LLMs - it will be cheaper than a Mac
there's a 50W version of the RTX 4070? lmfao, imagine being scammed into buying it
Very few mobile 40-series cards got decent power limits this gen. I couldn't get any 4060 laptop above 75W here in Australia; most were 45W up to nearly $2k, then $2-3k for a 75W one that's still gimped. Crazy, so I just tapped out of the laptop market, upgraded my desktop, and got a Ryzen laptop instead.
😂😂😂
Jarrod, you have the data for the RTX 4070 at its full-fat TGP; it would've been interesting to also include it in your chart...
No need to say "likely" if you have a better estimate to draw some early conclusions from.
Unless you were sponsored to make this video, of course.
He can't put numbers from his own tests up against numbers from a completely different test though... he doesn't know what they were doing in those games to make it actually a fair comparison, so it would be useless information to include.
@@Wolfer1OOO It would have been a perfectly good comparison; he would just have needed to add that the testing parameters could have been different, but both systems ran high settings at 1080p in said games. Sure, it's not the exact same scenes or ambient conditions, but it would have been at least 95% accurate.
Technically it is unknown what TDP the Strix Halo chip is running at either. It could be 55W total like in other tests, which is great, or 55W GPU-only... which is not so great.
Asus' own Time Spy graphs are kinda disappointing, but The Verge had Helldivers FPS figures and those were very efficient.
@@AndrewBrowner While I agree those are the numbers I'd like to see, I don't think it would be a good idea to include them. There are too many variables in the numbers; the data would be only marginally more useful than him just saying what he said.
Don't forget, every hour these creators spend on one video is an hour they can't spend on another. So he probably just felt the opportunity cost was too great for the quality of the comparison.
Besides, you can go look at reviews for that 4070 laptop and compare to reviews for one that's not power limited. The info is out there.
Better to wait for him to get a sample and I guarantee we will get those numbers, but properly done in a scientifically rigorous and useful way.
what a stupid comment
I hate seeing AI in everything even GPUs
Considering that AI doesn't even exist; it's algorithms and machine learning, nothing new. But companies know that idiots will say "what, no AI? No thanks then." So they slap that AI nonsense on every product.
@IamTheGreatCornholioo So True
@@IamTheGreatCornholioo Yes exactly this, people look at me like I'm insane when I tell them all this bs
@@IamTheGreatCornholioo It's not really about consumers demanding AI in everything. That's more about investors and product managers. I have yet to meet a consumer who's dying to have AI added to their toaster and any other product they own.
Whether or not it's AI, that point doesn't really make much sense. If we do achieve whatever concept you have of AI, how else would we do it other than algorithms and machine learning? Coating an Intel Pentium in slime mold and hoping something happens?
Ah yes, more performance bad cuz "AI" bad. You have some problems if 2 letters piss you off so much.
For context, a 50W 4070 is 30% slower than at 105W. The Ryzen 9 AI Max+ 395 itself can consume up to 120W. I wonder who wins in performance per watt here.
You would also have to factor in the CPU and idle power draw
Literally mentions same TDP in the graph, an APU at the TDP of a GPU outpacing the GPU while still doing CPU things.
@@shk0014 Would be interesting to see how it scales up.
4070 wins by around 36% in reality probably
120W includes the CPU, the 13900H would be using some power too :) they say similar TDP so it's hopefully not too different.
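Taking the figures in this thread at face value, a rough perf-per-watt sketch for the same 4070 mobile at its two limits (this says nothing definitive about the APU, whose 120W covers CPU + GPU):

```python
# Performance per watt for the same RTX 4070 mobile at its two power limits,
# normalising the 105W configuration's performance to 1.0.
perf = {"50W": 0.70, "105W": 1.00}   # the 50W limit is ~30% slower
for limit, p in perf.items():
    watts = int(limit.rstrip("W"))
    print(f"{limit}: {p / watts:.4f} perf/W")

ratio = (0.70 / 50) / (1.00 / 105)
print(f"The 50W limit is {ratio:.2f}x as efficient")
```

The lower limit is roughly 1.5x as efficient, which is exactly why a fixed-TDP comparison at 50W flatters whichever part is designed around that envelope.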
AMD knows that Nvidia wants their console contracts, and now they're working for them even harder.
The next console chip is going to have to match a 4090 desktop.
@@rotm4447 There is no way it will, just like the PS5 never matched the RTX 2080 Ti or even the GTX 1080 Ti.
@ "ps5 doesn't match the 1080ti" citation needed.
@@MadViking82 Sarcasm?
@@MadViking82 Yeah the ps5 didn't match it, because it was faster.
In my opinion iGPU beating discrete GPU's while still being more expensive is kinda pointless
This will definitely be great, especially for mini PCs. Imagine a Mac mini with this CPU and the upgradability of a mini PC. iGPUs have come a long way, even more so with Arc.
literally a dream, hope it's not that far off
AMD is killing it in the APU space lately. I’m super happy with my Asus Vivobook S14 with an 8845HS and a 780M already. This is just on a whole new level though. I’ll never buy another laptop with a dGPU. If I can get a thin and light laptop with graphics like this, then it’s all I need for light gaming. All my heavy gaming will be done on my 3080 12GB PC (soon to be a 5080).
It's not really hard to understand how an iGPU can outperform a dedicated GPU. It all comes down to memory speed, bandwidth, and capacity, and all of that can be buffered within the APU with the aid of a very large cache, but the APU will be monstrous in size.
Would love a mini PC with one of these with 48GB+ ram
I don't want it in an ultraportable... I want a stupid fast, unleashed version in a desktop-replacement-style laptop, lol
Exactly what I want. I don't care if it is big and strong and heavy. But it needs to be full power.
There's already a 4090 laptop GPU for that. The whole idea is trade-offs. Even if this only has the performance of a laptop 4060, it would be enough for some people, and if they release it at pricing similar to no-GPU laptops (sub-$500) it would make sense. But yeah, it should be a lot cheaper. And why only 50W? It's unlikely someone is gonna pair this CPU with a GPU anyway, so as long as it gets enough cooling they could push it higher than 50W, more like 100+W.
@@kockoklosharcheto there is no way this is cheap enough to do that. dGPU-less laptops are cheaper because they don’t have any dGPU at all; this still has what is basically a dGPU, it’s just built into the CPU.
I’m certain that 40 CUs is more than they had in any of their RX 7600-derived laptop dGPUs, so that’s the _minimum_ price comparison.
You don't need to statically assign VRAM in the BIOS. The driver can dynamically allocate more memory and release it again once it's no longer needed. There are only a few quite specific cases (i.e. as a workaround to bypass VRAM capacity checks in badly written software) where it makes sense to do so.
Can't wait to see Lenovo's take on this chip.
AMD making apus :🗿
AMD making desktop gpu :🙃
AMD making laptop gpu : ..... (nonexistent)
There are laptops with RX 6600m
@@ItsAdamHere yes, only low-end laptop GPUs seem to be in stock. I've never seen a 7900S/7800M in a laptop.
@paincake2595 Because the 7900M is just bad, its idle power draw is a joke
@paincake2595 Only mini PCs have the 7900M and 7800M
@@Fluffball555 That sucks; the most powerful laptop AMD GPU I've seen is the 7600S in the Framework 16.
I'm still perfectly happy with my R7 4750G, but man, seeing this just shows how far integrated graphics have come
I am PRAYING they bring this to a desktop socket. I don’t care if it’s on AM5 now or on AM6 next gen, I wanna make a travel PC the size of a Pop-Tart box with this thing so bad
I'm sure it will make it into mini PCs
Mini pcs are better.
Then stop praying for AMD to do something stupid.
Strix Halo needs a 256-bit memory bus with LPDDR5X-8000 in order to get enough bandwidth to work properly. Dual-channel DDR RAM is only 128-bit, and most DDR5 has a much lower transfer rate than 8000 MT/s. Putting this GPU on AM5 would cripple it, so it would be a waste of money and silicon. Maybe we'll see some heavily cut-down model with about 16-20 CUs on AM5, but that obviously wouldn't be able to perform anywhere near the 40 CU version.
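The bandwidth argument checks out with quick arithmetic. These are peak theoretical numbers only; real-world bandwidth is lower:

```python
# Peak theoretical bandwidth = (bus width in bytes) x (transfer rate in MT/s).
def bandwidth_gbs(bus_bits: int, mts: int) -> float:
    """Peak memory bandwidth in GB/s for a given bus width and transfer rate."""
    return bus_bits / 8 * mts / 1000

strix_halo = bandwidth_gbs(256, 8000)  # 256-bit LPDDR5X-8000 -> 256.0 GB/s
am5_dual = bandwidth_gbs(128, 5600)    # dual-channel DDR5-5600 -> 89.6 GB/s

print(f"Strix Halo: {strix_halo} GB/s vs AM5 dual-channel: {am5_dual} GB/s")
```

Roughly a 2.9x gap, which is why a 40 CU GPU on a standard desktop memory bus would be starved for bandwidth.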
Anyway, if you have a desktop motherboard, you can get a graphics card.
@@Pezy_4584 They won't
Question is: _Is this another 5070 beats 4090 comparison?_
Probably yes
It literally says native frames on their graph.. so not at all, unless they are straight up lying
@PersonalBurrito It's not that, it is the fact that the graph is deliberately confusing.
The 4070 shown on the graph runs at up to 65W with Dynamic Boost, way behind the full-powered 4070 at 140W.
Jarrod's testing indicates that the 65W 4070 in the Omen Transcend 14 performed worse than a full-powered 4060
@@Fluffball555 I mean, it would still be crazy if we got full-powered 4060 laptop performance in integrated graphics though
It's a like for like power budget comparison. Which in thin and light makes perfect sense.
The price is $2,199 for the AI Max+ 395 with 32GB of RAM.
The price is $1,999 for the AI Max+ 390 with 32GB of RAM.
Sounds worth it
considering this is competing with Apple's M4 Max machines, it ain't that bad?
@TheUltraMinebox If you configure an Apple device with 32GB RAM and a 2TB SSD...
I wonder about prices in Europe...
This can be huge. Imagine the cooling options when only one processor needs to be cooled.
You can unlock the TDP in G-Helper. The provided charger is rated for 140W; in my testing it definitely sustains around 130W. Also, using a 160W GaN charger (which can actually sustain a 140W load for at least an hour) with G-Helper, which also unlocks the power draw from 3rd-party chargers, the ACRNM can easily draw 140W with the GPU and CPU combined, but you have to cool it externally unless you want to fry something. It works for benchmarks though.
The best use case for this CPU is small computers like thin and light laptops and mini PCs; for sure this CPU alone has a premium tax attached to it because it is a niche chip. If someone made a slim 14"/16" laptop with 64GB of RAM under $1600, that would sell really, really well, though with every $100 they add to that laptop, interested customers drop off a cliff.
The name of this processor is idiotic; AMD and marketing just don't mix. But on paper, this is a game changer for Windows laptops. And I don't mean games, but the whole slim laptop market.
I don't think we will need a GPU in the future. Bruh it's better than my console
I'm waiting for something like this in handheld PCs/consoles in the future ❤
They don't actually specify that the laptop used for the 395 is the Z13
It's not total TDP, and they're only comparing the max FPS achieved, not the average.
At this point, Companies be slapping the word *_"AI"_* on anything & everything now. 💀
I WAS interested in this until CES, when it was revealed it's only going in 2 devices.
oh rip
What's the second?
@@GIAN1702 the Pro version.
The progress in laptop technology is astounding. However, I am still very happy with my 2019 Acer Nitro 5 (i5-8300H, 16GB RAM, GTX 1050 3GB).
They need to get these things out here before tariffs start.
I would like to see mini pcs with that chip. Sharing the same memory between cpu and gpu provides much better utilization since gpu memory will not be sitting idle when you are not using the gpu.
They released a new eGPU with TB5 and didn't give the new flow a TB5 port to take advantage of it.
I'd love to see someone doing a bottleneck test between TB4 and TB5 (could probably be done with a 4080 and 4070 Blade 18) to see how much difference it makes.
TB is for Intel CPUs only
@lowpoul5552 not really, you can use a dedicated controller.
Many Asus ProArt laptops have TB4 and Ryzen CPUs; many AMD motherboards have TB4 as well.
Curious if these tests included "Turbo mode" with the official charger for the Flow Z13 ACRNM.
I want a handheld pc with this kind of processor. BADLY
Asus are cowards and should have offered detachable controllers.
AMD needs to add hardware encoding/decoding to match Intel Quick Sync and/or Nvidia's 50 series. That would allow video editors/content creators to move to full AMD platforms. Especially in laptops, it would be golden.
This unified memory can be had with up to 128GB, which will be incredible for AI workloads. Apparently it can run faster than a desktop 4090 when the model is too large for the 4090's VRAM of only 24GB. Many models now are larger than 24GB!
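A rough sketch of why capacity can matter more than raw speed once a model spills out of VRAM. The 70B / 4-bit figures below are illustrative assumptions, not numbers from the comment:

```python
# Very rough LLM memory footprint: parameters x bytes per weight.
# Ignores KV cache and activation overhead, which add several GB more.
def model_size_gb(params_billions: float, bytes_per_weight: float) -> float:
    """Approximate weight storage in GB (1B params x 1 byte ~= 1 GB)."""
    return params_billions * bytes_per_weight

q4_70b = model_size_gb(70, 0.5)  # 70B model, 4-bit quantized ~= 35 GB

# 35 GB overflows a 4090's 24GB VRAM (forcing slow CPU offload),
# but fits comfortably in 96GB of unified memory.
print(f"70B @ 4-bit: ~{q4_70b} GB")
```

Once any layer has to be offloaded to system RAM over PCIe, the dGPU's bandwidth advantage largely evaporates, which is the scenario where a big unified pool can win.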
I'm eagerly waiting for more testing from Jarrod and other independent reviewers
Does this mean future laptops might not even have a dedicated graphics card, because they simply wouldn't need one, and so could be built smaller?
If they used Turbo mode for testing on both tablets, then it's gonna be beyond impressive.
Looking forward to your testing on this 🎉
How long until we get a desktop version ?
Hm, now I want a thin and light laptop to travel and light game on trips, with Thunderbolt 5 + a 9070 XT for the home setup.
I have a Zephyrus G16 that has the AMD Ryzen AI HX 370 (Radeon 890M integrated graphics), and was already very surprised with how well it can game with the integrated graphics, but gaming with the Nvidia RTX 4070 is still way better (about 50-60 more fps compared to integrated graphics on FFXIV with high settings).
I wish some handheld PC maker puts the 395 in a handheld instead of the 370.
Technically doable, even in a power gimped form.
This is what Valve was waiting for I think for the Steamdeck 2
Maybe. But they've gone on record saying they'd like to maintain the battery life of the original Steam Deck when designing its successor, and I'm not too sure a 45-120W CPU would give them that.
While shoving a bigger battery into it is an option, we're discussing a "gaming handheld"; there's only so much weight that can be added.
This is way too expensive and power demanding for a Steam Deck 2
120 watts on a handheld would last for like 40 minutes 😂
AMD compared that iGPU to a 50W dGPU.
The Steam Deck's combined TDP is 15W.
16 cores just for a handheld is a waste too.
The best option would be if Valve could work with ARM and use the Snapdragon Elite, which would be cheaper and could run at 5-10W.
Mobile chips are improving way faster now
ThePhawx showed this chip working at a "surprisingly low" tdp while still being very impressive so we'll see.
I hope to see these APUs for a budget desktop segment.
So a handheld PC is more powerful these days 😅
the future of gaming laptops: no dedicated GPU needed, and less weight
so we're just abandoning ray tracing? Isn't that the biggest deal about upcoming titles and something that's getting a ton of attention?
@@AndrewBrowner tbh I don't use ray tracing, I usually turn it off for more real FPS
@@ShojiVR depends on what games you're playing, I suppose. I'm on Alan Wake 2 and it's by far the best graphics I've seen in a game to date; a 4080 laptop can just squeak out 60-80fps with everything on high and max ray tracing with the help of DLSS frame gen
if you're only playing battle royales and multiplayer shooters then for sure you just want a steady 120+fps and the graphics barely get noticed
I am curious to see how this will do with local LLMs
I see this as the primary use case for this chip. Gaming performance for me is just a bonus 🤌
@@zachariah380Fr lol
At this point it feels like a windows version of macbook pro
The sceptic in me feels like integrated graphics are being pushed because upgrading requires purchasing an entirely new SoC.
I can't wait for review units to get out so we can see how the new gpus are performing
This is what I've been hyped for. A tablet platform, thinness, touch screen with power. I want this lol
i just bought an RTX 4070 laptop and you published this video ;(
Probably would be twice the price
@@Isaiahthegoat What RTX 4070?
@@Isaiahthegoat what?
@@ISSHINASHINA7 he meant that a laptop with this CPU would have been more expensive when it's released than what you purchased; this chip isn't even on the market yet. And as far as I know it doesn't have any ray tracing cores, something you'll be glad you have with every new game that launches now; some developers are even requiring RT cores for upcoming games to launch
A dedicated GPU is better, bro. Just enjoy what you bought. These laptops are expensive; you're getting bang for your buck for sure.
the real issue with this is the 120W power limit. This 16-core CPU would really want a bigger power budget when paired with such a big iGPU.
The future for handheld gaming is looking bright.
I have a suggestion: when you test the laptop, instead of showing a boring graph and blabbering, just show us real footage with an FPS meter.
I've been watching your videos for a long time. Every time, I have to search for another good YouTuber actually showing game footage so I can just watch the video and enjoy it.
How does it perform against a 175W RTX 3080 Ti laptop?
Do you suggest the Lenovo LOQ, with its motherboard-related issues? 8845HS and 4060; should I go for it at 900 bucks?
Can’t wait for AMD to give Nvidia a real shock in the next few years like they gave Intel a few years back
Waiting for desktop version of it to drop with higher clock.
The performance in this power envelope is not surprising to me. RDNA actually delivers a big chunk of its performance at incredibly low power usage; it's just that squeezing out that last 20% of performance sends efficiency off a cliff.
I've undervolted my RX 6800 to 930mV plus an underclock from 2200MHz to 1800MHz, which retains 80%-ish of the performance, and the power consumption is literally sub-100W. It's insane.
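That efficiency cliff follows from the usual dynamic-power approximation P ∝ C·V²·f. A minimal sketch; the ~1.025V stock voltage below is a hypothetical baseline for illustration, not a measured value:

```python
# Dynamic power scales roughly with capacitance x voltage^2 x frequency.
# Capacitance cancels out when comparing the same chip against itself.
def relative_power(v_new: float, f_new: float,
                   v_old: float = 1.025, f_old: float = 2200) -> float:
    """Power of an undervolt/underclock relative to the stock operating point."""
    return (v_new / v_old) ** 2 * (f_new / f_old)

# 930mV @ 1800MHz vs the assumed stock ~1.025V @ 2200MHz:
ratio = relative_power(0.93, 1800)
print(f"~{ratio:.0%} of stock power for ~80% of the performance")
```

Because voltage enters squared (and lower clocks permit lower voltage), modest underclocks buy outsized power savings, which is exactly the region a 50W-class APU operates in.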
My 4070 laptop usually runs anywhere from 90-125 watts.
The 4080 in my Lenovo typically rides around 110-130W, with the boost being an even 175W. I don't understand the boost watts though; it doesn't seem to use them when they're needed most, rather it just uses the boost immediately and then never again. It would be nice if it saved it for when the frame rate was dipping. I've been playing Alan Wake 2 lately and there have been a couple of times where the fps dipped from 70 to 45 for several minutes in demanding scenes; it would have been nice to see the GPU spike up to 175W to try to maintain 60+fps
I love integrated graphics, but at the same time it's sad that the soldered RAM issues we have in modern laptops are making it more difficult to enjoy. Even if you just want a laptop with 32GB RAM, most manufacturers already try to rob you.
So that's the end of the 780M, 890M naming...
I could see myself installing Bazzite/SteamOS on this and just having a straight up gaming laptop.
Bazzite is AMAZING.
Hey Jarrod, would you consider Gray Zone Warfare in benchmarks?
Looks like AMD is still stuck on 4nm. I expected them to move to 2nm by now. I have a Ryzen 9 7940HS that maxes out at 5.2GHz. This thing is a beast. The new AMD CPUs are not a significant upgrade other than the extra cores.
Besides the performance claims, which are still to be independently confirmed, I find the flexible assignment of RAM an incredibly valuable feature on its own. At least 12GB should be a minimum VRAM these days (from extensive studies by Hardware Unboxed and others.) So any GPUs limited to 8GB imho are likely soon to become a no-go.
When is the embargo over so you could test this out?
Not to mention that iGPUs don't suffer from insufficient VRAM. (Unless you bought a 16GB model...)
Imagine if AMD made this APU with 8 cores / 16 threads instead of 16/32, so the thermal headroom for the GPU would be higher and the cost cheaper; they could sell mid-range laptops that crush every entry-level gaming laptop on price. Why do they always put the big GPU on their top-end APUs, leaving the graphics less headroom? It seems market segmentation is too important to them, and they don't want to capitalize on the efficiency advantage a GPU-oriented APU could offer.
Ryzen AI MAX 385.
Annoyingly though, the GPU is cut down to 32 CUs. It will still be pretty good with 32 CUs compared to 40, but it would be cool if there were an 8-core SKU with the full 40-CU 8060S.
@nathangamble1
Curious to see how the 385 would stack up against a PS5 Pro performance-wise (yes, I know the CPU in it is incomparably better than the Pro's, but still)
They should make a Ryzen Max without the AI and then the cost would be 50% lower
When the AI Max 395 does release, could you do some AI tests? I would like to know how the 96 GB of RAM benefits local LLM performance. If you haven't done local AI tests before, I'd be happy to help you get started with something basic.
I honestly hope the reviews seal the deal for me. This is my dream system that might move me away from traditional desktop. (Currently using a older laptop with monitor but I've built three desktops in the past).
I’m wondering if I can use a 65W AC adapter with this laptop to get okay performance on a long plane ride
I don't think so, you'd get bad performance because the cpu tdp would be like 10W
@@Fluffball555 I don't have real-world testing, but assuming the laptop allows itself to pull full TDPs on both, it will have the power on hand because it can draw from the battery during peak performance. The battery will only deplete slowly because it's not powering the base functions of the laptop, just the performance peaks, and the AC adapter can top off the battery when the power draw dips. But 65W is really low, since the motherboard, RAM, screen, fans etc. all need power too
@@Fluffball555 power is allocated dynamically within 2ms on a per-need basis
This was the vision many had during the pandemic, when GPUs were scarce and hoarded for mining
Yeah Jarrod! Love ya videos mate
I have a Lenovo Legion Pro 5, the 2023 version with an Intel i7-13700HX, 32 gigs of DDR5 RAM, and a 4070 with 8GB of VRAM. You're telling me this is better than that? That's crazy if it's true.
If we could get handhelds that ran like my laptop, I would buy one, so that's promising for the future, because I'd prefer a handheld but I didn't get one since I just wasn't sure the technology was there yet. But seeing this is pretty cool, man, I'm definitely going AMD this next go-around. I bought this before the big shenanigans with Intel.
Yep, a gaming laptop with this type of processor is gonna be very good, but I hope for a laptop with a port you can connect an eGPU to
What I'm hoping for is these integrated GPUs get really efficient and powerful so that we can play games while on battery. I don't like the Steam Deck mostly because it is meant for playing games only. I would like a 2-in-1 laptop that I could game on (on battery) and also watch movies and do some web browsing
It will be installed in very expensive laptops, so you'd better just buy a 4070 for half the price.
If it's too good to be true, then it probably is.
😂😂😂😂. Really hope they shove a desktop version of this chip, at full power, into a beefy, well-cooled laptop.
That might work for other subjects, but we already know how good AMD’s APUs are. The 780M already matches a desktop GTX 1650 with only 12 RDNA 3 compute units, and the new Strix Halo has up to 40 RDNA 3.5 CUs. Then take into account that the iGPU can use 96GB of RAM instead of 4GB, and it’s easy to see how good it will be. Also, this is being compared to a cut-down mobile 4070 with a low TDP; it’s nowhere near a desktop 4070.
Those numbers for Cyberpunk on the 4070M look really weird unless they’re doing all maxed out with no RT, because I just ran a 1080p High benchmark without upscaling and got better than that on a 4060
Yep, but like he mentioned, that specific 4070M is only running at 50W. Pretty low, as most laptops running them are set to draw a max of 90-115W. It beats a severely gimped 4070, not one running at its normal configuration.
The one in my laptop usually draws 80-90W which is considerably more.
The extra X3D cache on this chip is GPU-only!
That is crazy, I never imagined an iGPU would beat a dedicated RTX card in anything
Wait, I thought this was going to be a hands-on look? When is that video coming out?
you guys were laughing when he said "Ryzen 4070"; now who's the laughing stock? He was well ahead of his time
hey, I hope you see this. You are the "Project Farm" of laptop YouTubers and your content is amazing