What do you guys think of the 5090?
Against a 4090: +33% CUDA cores, +27% effective fps, +30% power, +30% price... it isn't worth the price unless you work with an immense number of models in 3D rendering programs and need to speed up your workload, or you work on AI creations. Only a dumb/very rich person would sell a 4090 and buy a 5090 just for games.
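(Just to put rough numbers on the value argument above: a minimal sketch, where the +27% fps figure comes from this thread and the ~$1,600 / ~$2,000 list prices are assumptions, since street prices vary.)

```python
# Rough fps-per-dollar check for the "isn't worth the price" argument.
# Assumed list prices: RTX 4090 ~$1,600, RTX 5090 ~$2,000.
fps_gain = 1.27            # ~27% more effective fps (figure from the comment above)
price_ratio = 2000 / 1600  # ~1.25x the price at MSRP, closer to 1.30x at street prices

value_ratio = fps_gain / price_ratio
print(f"fps per dollar vs. a 4090: {value_ratio:.2f}x")
# Roughly 0.98-1.02x: you pay about proportionally more for the extra frames,
# so the upgrade only makes sense if you need the absolute top performance.
```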
@@francescosorgi6633 As a 4090 owner, I agree. Frame Generation is a huge letdown because it causes lag. I thought about an RTX 5090 upgrade because I want an FPS advantage in shooters, but finding out that frame gen causes lag makes it not worth it.
@ Ah no, if you play FPS games you can't use MFG, haha. You'd be giving yourself a massive handicap, even if you were running 400 fps.
@ EXACTLY. The best part of the 5090 is Multi Frame Gen, and finding out it isn't useful for FPS shooters... huge thumbs down.
Disappointing that they increased the price.
$2K? I'm definitely not in that league, but then I don't do the kind of work that justifies a card like that.
Ya, it's not for most people, that's for sure.
The iPhone Pro Max costs $1,200 and it's the most popular iPhone. Of course $2,000 can be justified for lots of people.
Good luck finding this one at $2K; all the custom non-Founders models start at $2.4K.
This card isn't for ordinary gamers, it's for creators.
I saw you're the first reviewer using a Core Ultra 285K instead of an AMD processor. Since I'm more interested in 3D rendering and video editing, I'm asking: do you regret buying the Intel instead of the AMD? I also play games, and the Intel didn't have any great benchmarks. What do you think?
For context: I'm still using my 9-year-old PC with an i7 6700K and a GTX 1080. It works fine, but it struggles with DaVinci Resolve effects, grading, etc.
Gamers nowadays only worry about FPS instead of enjoying the game itself, and the companies are filling that void. Fantastic performance, though.
I don't think that's true. They definitely want to play amazing games, but these games are getting more demanding on PC, and GPUs are having a tough time keeping up with some of them.
@@MatthewMoniz They keep getting more demanding for little benefit. Black Myth: Wukong runs very poorly, and for what? Looking good? I'd rather they make more games like Balatro or Astro Bot: very fun, stylistic games that run perfectly without needing powerful hardware.
I think I see the trend: we have to brute-force our way to high fps now. The games are demanding, but for the wrong reasons.
@@d24 I disagree. I think Balatro and Astro Bot are boring (that doesn't mean I think the games are bad, they're just not for me) and I prefer games like Cyberpunk and Wukong.
Why did Apple's M4 Max do so well and outperform the RTX 5090 and Core Ultra 9 285K? Any explanation, please? Could it be said that for photo editing an M4 Max is faster/better than even that expensive RTX 5090 + 285K combo? How come? Thanks!
Photoshop doesn't use the GPU much. Most of it comes down to memory bandwidth and single-core performance, and an Apple M4-series performance core is the most powerful CPU core ever made, and by a good 25%.
@@_shreyash_anand They are not the most powerful; iSheep like you always exaggerate Apple products.
@ They absolutely are. Okay. Find me a CPU Core that is faster in either IPC or absolute terms than an Apple M4 P-Core.
It needs to score higher than 175-180 in Cinebench 2024 or 4000 in Geekbench 6.
And it needs to do it at 4.4GHz or lower.
Photoshop isn't well optimized for modern Intel/Nvidia hardware.
It uses the ancient Apple single core.
@@_shreyash_anand The single-core era is dead.
Photoshop is optimized for Apple's single core, the ancient way of computing!
As your title says: fake frames = fake performance. The actual raw performance of the 5090 isn't really that different. It's not even noticeable in most scenarios.
NVIDIA's main selling points are a 30% performance increase and fake frames that you can't even use in FPS shooters because they cause lag.
My 2080 Ti is still holding up 5 years later! Gonna go for a laptop with the new 5080 card for my studies instead of upgrading my desktop: an Asus ROG Zephyrus G16.
Hey @matthewmoniz, I think there's a math error in the benchmark numbers. For example, the Black Myth: Wukong benchmark at 4:50 says 49 fps is 27% more than 37 fps, but 49 fps is actually a 32.4% increase over 37 fps (37 * 1.324 ≈ 49). There were similar math errors throughout the benchmarks.
The improvement is relative to the original number: a 27% increase over 37 fps would be about 47 fps, not 49 fps. Or, to make it simpler, 15 is a 50% increase over 10.
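(For anyone following along, here's the percentage-change arithmetic the comment above is describing, using the same Wukong numbers; just a minimal sketch.)

```python
# Percentage increase is always measured relative to the original (baseline) value.
baseline_fps = 37   # RTX 4090 result in the Black Myth: Wukong benchmark
new_fps = 49        # RTX 5090 result

increase_pct = (new_fps - baseline_fps) / baseline_fps * 100
print(f"{increase_pct:.1f}% increase")   # ~32.4%, not 27%

# Going the other way: what a true 27% increase over 37 fps would look like.
print(f"{baseline_fps * 1.27:.0f} fps")  # ~47 fps, not 49
```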
I wonder if I should camp outside Best Buy again.
The card is so fake that the fake frames replaced real frames in this review to make it look positive.
I don't buy it, literally.
Great review, easy to understand
Appreciate the productivity part. I think this is the card for people doing professional work. $2,000+ for hobby gaming is a bit much IMO; it's better to just get a 70- or 80-class card and upgrade again next gen. The RTX 6080 will probably be cheaper and faster than this (and of course it will come with 8x MFG 😂).
Thanks, I appreciate it! But ya, $2,000 is a big ask for most gamers lol
Thank you very much, Matthew. The information you provided regarding the M4 Max was invaluable.
In light of your expertise, I would appreciate your recommendation on whether to purchase an RTX 5090 laptop or an M4 Max for professional 3D Blender work. Your guidance would be greatly appreciated.
Thank you in advance for your assistance.
For Blender, definitely the 5090.
How much VRAM do you think you'll need? For now definitely get the 5090, but also keep an eye out for the APU Nvidia is rumored to be working on. If it has unified memory, it would let you allocate most of the system memory to the GPU, a huge advantage over Nvidia's current VRAM-starved offerings.
So the 5090 is a decent upgrade over the 4090 even without all the frame gen BS.
MSRP is not what it will actually sell for. The first run will be limited supply, so most likely out of stock. Then the AIB board partners such as ASUS, MSI, or Gigabyte will crank up the prices on their models with their added costs.
Have you watched reviews on the power draw of the 5090? With the exact same settings as an RTX 4090, your overall power draw to play the same games is much higher. That means your electric bill will be higher while playing the same games... all because you have a new graphics card. In the long run that's a bad investment for something that isn't much better, especially for 1080p and 1440p gaming.
@ Yes, now I have. It's basically a beefed-up 4090, but that's at stock; it performs almost the same even with a 10-20% lower TGP. So there are some improvements, but they come with a much higher price tag, and anyone buying this would want the absolute best performance possible, so limiting the TGP is counterintuitive.
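(To put the electric-bill point above in perspective, here's a back-of-the-envelope estimate; the wattage gap, hours of play, and electricity price are all assumptions, not measured figures from the review.)

```python
# Rough yearly cost of the extra power draw while gaming.
extra_watts = 575 - 450     # assumed ~575 W (5090) vs ~450 W (4090) under load
hours_per_day = 3           # assumed gaming time
price_per_kwh = 0.15        # assumed electricity price in $/kWh

extra_cost_per_year = extra_watts / 1000 * hours_per_day * 365 * price_per_kwh
print(f"~${extra_cost_per_year:.0f} extra per year")  # roughly $20/year with these numbers
```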
50% at its best
I agree with the folks saying AI and software magic plus hardware improvements are the best option for computer graphics moving forward. If we rely purely on hardware upgrades, we're going to continue to be bottlenecked and see even smaller gains out of future chips. I'd like to see continued improvements to DLSS, frame gen, and ray tracing on top of hardware improvements, rather than just trying to brute-force frames through rasterized rendering.
I'm hoping that in future generations Nvidia is able to tackle the latency that frame generation adds, as well as improve image quality.
Yeahh... I'm sticking to my 4080
4080 is still a great card.
I'm wondering how long I can keep my 3080. I am still satisfied with gaming performance but the extra VRAM for deep learning could be really great
Kena, 1440p: 3060 ~50s, 3070 ~80s
F1, 1440p: 3060 ~60s, 3070 ~100
New World, 1440p: 3060 ~70s, 3070 ~100
Flight Sim, 1440p: 3060 ~40s, 3070 ~60s
Forza, 4K: 3070 ~100, 3080 ~130 on average
Hitman, 4K: 3070 ~60s, 3080 ~80s
Ghost of Tsushima, 4K, no DLSS: 4060 25 fps, 4070 ~40s
Elden Ring, 1440p, no DLSS: 4060 40 fps, 4070 ~70
Forza, 4K: 4080 110 fps, 4090 ~140
Avatar, 4K: 4080 ~50 fps, 4090 ~70
Hogwarts, 4K: 4080 ~45, 4090 60
As you can see, I compared each model directly with the class above it, since that's the only way the higher class is worth it: the jump is bigger. That's why it makes no sense to compare a 4090 to a 5090 only one generation apart. It only makes sense to compare the 4080 to the 5090 (skipping the 4080 Super); that way it shows a good jump. The same goes for moving from a card like mine, a 4070, to the 5080, or from a 4060 to a 5070.
Thanks for the video. I'm a bit dizzy, so did I miss the part about power consumption other than the TDP? Temps? If I didn't, I'm guessing that part is still under non-disclosure.
I think it looks good. Thanks again.
And there's apparently talk that Multi Frame Gen may even be coming to earlier generations, which I hope is true.
I think there's a real possibility the upcoming M4 Ultra/Extreme will have more raw power in some cases, maybe even in games if there's no RT...
IMHO, x86 really is reaching its limits.
Maybe in some applications like video editing, but in 3D the M4 Ultra will still be significantly behind an NVIDIA GPU.
@@MatthewMoniz Run tests in the Redshift renderer, plus simulations with particles and Pyro in Cinema 4D. That would show us the real difference vs the 4090.
@@MatthewMoniz Yeah, video editing, because of the shared RAM.
I don't care if frames are AI-generated. If it feels and plays better than without it, that's awesome. I think upscaling technologies are great, and in the future we'll be playing fully on-the-fly generated games.
You're not seeing the bigger picture... this is very bad. Upscaling the image is one thing, but generating frames is actually terrible!
@@unkown34x33 Why?
@@lukabosnjak3829 For example, it introduces latency. Think about the extreme scenario: 2 fps padded with fake frames vs a native 8 fps. At 2 fps, each real frame takes half a second, and the generated frames have to wait on it, whereas at a native 8 fps the frames are rendered to your screen "in real time" from the game. So you can show 8 fps in both cases, but the game will be in the past relative to your screen and input. It's a bit like the sunlight you see being about 8 minutes in the past...
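(A small sketch of the frame-timing arithmetic behind that point, using the same illustrative 2 fps vs 8 fps numbers; real frame generation behaves more subtly, so treat this as a toy model.)

```python
# Frame time in milliseconds for a given frame rate.
def frame_time_ms(fps):
    return 1000 / fps

rendered_fps = 2    # real frames the GPU actually renders
displayed_fps = 8   # what the screen shows after inserting generated frames

print(frame_time_ms(rendered_fps))    # 500 ms between *real* frames
print(frame_time_ms(displayed_fps))   # 125 ms between *displayed* frames

# The screen looks like 8 fps either way, but with generated frames your input
# is still only reflected every ~500 ms (plus the time spent holding the latest
# real frame for interpolation), versus ~125 ms at a native 8 fps.
```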
Why is Photoshop so optimized for the M4 Max?
Why should you not enable DLSS in first-person shooter games?
I explain it in the video: latency.
Absolutely abysmal compared to the 4090. I wasn't expecting a generational leap like the 30-to-40-series jump, but I was expecting more than a 10-15 fps increase in games with traditional raster performance. On top of that, I was expecting WAY better RT performance, and that's not what we got either. They took away the $1,600 GPU so we could pay $2,000 for a 4090 Ti. If the 5080 can't beat the 7900 XTX, Nvidia is cooked. Especially if the rumors about the 9070 XT are true, Nvidia is done as far as "gaming" goes. Intel has the low to mid range covered pretty well, and AMD, if they play their cards right, will have the mid to high end covered. If Nvidia is going to stop focusing on gaming and push out more "workplace" kinds of GPUs, that's fine though. Doesn't hurt.
Good option for us, since Quadro cards are way too expensive for 3D modelling and VFX... I mean, I'm a hobbyist btw 😅😅
Ya, Quadro cards are just through the roof in pricing!
I'm going to be trying to get a 5090. I'm upgrading from a 1080Ti
Hello brother, can I use this GPU as an external GPU with my Galaxy Book 5 Pro? Please reply 🙏🙏
Sure, if you want, but you won't get full performance using an eGPU.
I don't think you can...
Thunderbolt 4
Real Ships there 😮
The 5090 is more like a 4090 Ti with a software feature that adds more fake frames...
The 5090 is being bottlenecked by the 9800X3D.
Run it at 8K and you'll see a ~50% average difference.
Rip apple silicon
Maybe fake frames are the future of gaming.
Of course. People use wireless controllers and don't realize they add latency! If frame generation comes without artifacts, it's an amazing technology.
@@blondegirl7240 Yeah, I've never noticed latency when running games; the only time I have is when using Lossless Scaling at like 4x.
Then we should pay Nvidia with fake money too!! $500 in real money and the rest in 4x AI fake money.
@@blondegirl7240 Wireless is getting better to the point where your average person won't notice the latency.
$2K 😂 resellers won't agree with that.
Great video as usual
Thanks again!
Talks about the m4 max and then doesn’t show it at all. GJ
I think I agree with Dave2D when he said that companies need to rely on AI and software for FPS boosts, because hardware progression is slowing down now that we're already on a 4nm process, and because of power and thermal limitations in laptops, software is the only way to boost FPS beyond what the hardware can do.
Hardware and AI in conjunction will give a bigger performance upgrade as subsequent GPU models launch, whereas hardware improvement alone won't give as much of an uplift by itself.
I'd rather just stay out of this AI crap...
@unkown34x33 I agree, it's annoying how every tech company keeps pushing AI, it's exhausting, but I can somewhat understand AI's use in GPUs.
You really believe the spittle coming out of your mouth; the more you say it to yourself, the better it sounds. AI is BS and you're just adding your own BS.
@@unkown34x33 Then use primitive hardware and don't cry here.
All this upscaling and fake-frames crap is disgusting. Those were good times with the GTX 1070, playing at native 4K with good FPS.
I don't think anyone was gaming at 4K with a 1070. I'm sure the GPU industry would love not to use AI upscaling, but there seems to be a power-constraint issue. Games are getting more graphically intensive, and GPUs can't keep up unless they push more power, unless games took the Nintendo approach, cared less about graphics, and focused more on content.
@@MatthewMoniz well said..
What should I get guys?
Lenovo Legion Slim 5 R7000P 2024 Ryzen 7 8845H RAM 16GB SSD 1TB RTX 4060 16" 2.5K 165Hz
Lenovo Legion 5 Y7000P 2024 i7 14650HX RAM 16GB SSD 1TB RTX 4060 16" 2.5K 165Hz
I mainly use it for coding and sometimes playing games like Valorant and FO4.
Fake frames, of course; there's no other way around it.
60 fps is enough. Every time I try to go to 144 or 120 fps the game breaks... and now fake frames?? Nah, nah, no AI! F you, Nvidia.
Can I have one for free? I'm broke.