If you are a gamer, it has been confirmed by third party testers, that the 5090 can play Solitaire at the highest settings. That alone justifies the $2K asking price.
But can it play Angry Birds?
@@animat8 Sorry, you'll have to wait for the 8090 to release in a few years.
Hi, when will you release a video with a full review of the Intel Arc B580?
Yeah, I'm interested in this too. I was disappointed that Battlemage didn't get H.266 decoding like Lunar Lake despite being released afterwards, but hopefully there is at least some performance improvement.
Asap :)
I will not buy another Intel card after they handicapped the Arc A770 I own with several updates.
@@automan1223 That card is great for cheap AI machines. It's popular in Asia, because we can't afford Nvidia or Quadro cards here but need cheap VRAM.
Also interested
NVIDIA wants to launch their own CPU; that is the ultimate plan.
We need another Oppenheimer build for creators, consisting of 4-way 5090s all water-cooled, with an AMD Threadripper 7995WX also water-cooled, plzz.
There may be a Zen 5 Threadripper.
2.5kW just for the GPUs. That's a toasty build.
Perfect for Scandinavian winter
@Alex-wg1mb Bro, you need what you need. Btw, I work in 3D and AI; one hour of my work pays the monthly electricity bill.
Thank you for not mentioning gaming.
I know you're focusing on the RTX 5090 in the title here, but I would like to make it clear for everyone; the new decoders come to every RTX 5000 Series GPU. So even if you get the hypothetical RTX 5050 Laptop in 6 months time, this conversation applies to you. It will decode the same codecs the 5090 will. That is why this is great
great to know, thanks for this
No, not quite. The 80 and 90 get more of the new decoders/encoders.
The PCIe slot is only rated to carry "UP TO" 75 watts, not 150.
Not if it gets additional power from the motherboard; it might be rated at 75 watts but not power-limited (it will not burn out).
@@uwhat1 Yes, but those are few and far between. Had that been what he meant, he should have been clear that without the extra power it's only rated to 75 watts and that accounts for 99% of consumer boards.
@@uwhat1 No - not unless you have a special motherboard. Limit is 75 watts normally.
Hey pops I need a new graphics card to edit my school video project and also our home videos. Just 2000 USD. That's all.
No problem. You can do that with a 2060 Super. Here is $300
@@laartwork "You're not thinking 4th dimensionally." We must take future use cases into account.
Please benchmark Lightroom Classic AI Denoise with the 50-series cards compared to the 4070 Ti, 4080, and 4090! It'd be great to see how CUDA cores and the new AI upgrades impact performance in the same PC setup - SUPER USEFUL info for photographers!
Bro, when are we getting the review of the B580?
I'm selling my 4080 and upgrading to the 5080. For a small extra cost, I'll get the added decode option, which is fantastic!
Quicksync is not $2000 😮
No video about the Arc B580 yet?
He doesn't review motherboards
@@nathsabari97 bruh lol
@@daniil3815 I know, that's how irrelevant Intel GPUs are.
How is it irrelevant? Unless Nvidia or AMD comes out with a GPU in that price range that beats it, it won't be irrelevant.
@IsaacSchultz-lz8jc Intel will lay off their GPU division any day now. Don't come crying for driver support.
Really good overview with not a PC game mentioned or in sight.
Hey man, have watched your channel for a while. Great content. What's your name?
5:00 Does the RTX 5000 series have H.266 (VVC) hardware encoding and decoding support too? H.266 (VVC) is 50% more efficient than H.265 (HEVC).
Doesn't look like it. Only AV1.
I still haven't seen it confirmed that they are supporting H.265 4:2:2. The last generations also supported 4:2:2 H.264, so I hope they aren't just talking about that in the promo. Either way we MUST have this card ASAP!
It does do HEVC 4:2:2 10-Bit. The 40 Series didn't because the 40 Series had the same NVDEC version as the 30 Series. They only upgraded NVENC to do AV1. So this is finally an exciting update
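For anyone who wants to verify this themselves once cards and drivers are out, a minimal sketch: force ffmpeg's NVDEC decoder on a 4:2:2 10-bit clip and see whether it decodes in hardware or errors out. This assumes an ffmpeg build with the hevc_cuvid decoder enabled; the file name is just a placeholder for a real camera clip.

```python
# Minimal sketch: check whether NVDEC (via ffmpeg's hevc_cuvid decoder) can handle
# an HEVC 4:2:2 10-bit clip. Assumes ffmpeg is on PATH and built with CUVID support.
import subprocess

CLIP = "clip_422_10bit.mov"  # hypothetical test file straight out of the camera

result = subprocess.run(
    ["ffmpeg", "-v", "error", "-c:v", "hevc_cuvid", "-i", CLIP, "-f", "null", "-"],
    capture_output=True, text=True,
)

if result.returncode == 0:
    print("NVDEC decoded the clip in hardware.")
else:
    print("Hardware decode failed (chroma format likely unsupported):")
    print(result.stderr.strip())
```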
You can only get 75 watts through the PCI Slot - not 150! Great video as always.
A 6-pin PCIe connector gives 75W but an 8-pin provides 150W, so I think they'd be able to get an extra 150W with these GPUs.
@@spoorthyv I think you're describing the 6-pin and 8-pin connectors from the PSU. The PCIe slot (the slot the GPU mounts into) provides power also, maximum 75W as @chritroy8047 says. The 5090 has a 12V-2x6 connector which they're saying is capable of 600W. But it can derive its power from 2x 8-pins, which until now had a limit of 150W each, so I see issues for some people ahead.
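To put rough numbers on that concern, here is the back-of-the-envelope arithmetic using the figures in this thread; the 2x 8-pin adapter case is the worst-case scenario the commenter describes, not necessarily what Nvidia will ship.

```python
# Back-of-the-envelope GPU power budget, all values in watts.
PCIE_SLOT = 75       # maximum the motherboard slot itself supplies
EIGHT_PIN = 150      # traditional per-connector limit of an 8-pin PCIe cable
TWELVE_V_2X6 = 600   # rated capacity of the 12V-2x6 connector

# Native 12V-2x6 cable plus the slot:
print("12V-2x6 + slot:", TWELVE_V_2X6 + PCIE_SLOT, "W")     # 675 W

# Worst case described above: only two 8-pin cables feeding an adapter, plus the slot:
print("2x 8-pin + slot:", 2 * EIGHT_PIN + PCIE_SLOT, "W")   # 375 W, short of a 575 W card
```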
Why are we bothering with H.265 when we've got AV1? Would love to be using AV1 10-bit 4:2:2.
Cameras don't record AV1, so you want your HW to decode the raw material that's coming directly out of the camera :)
@@theTechNotice Ahh, that makes sense, but wouldn't it be advantageous to convince camera manufacturers to use AV1 instead to reduce overall costs (for licensing)?
@@joshhardin666 AV1 is very hardware-intensive to work with. Playing multiple AV1 streams on a timeline is very demanding. AV1 is really an end-user codec, not meant for recording.
Many even prefer to record in H.264 over H.265, for the same reason.
It's also why we have professional codecs like ProRes. It takes a lot of space, but is very easy to work with on a timeline.
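That trade-off is exactly why many editors transcode camera files to ProRes proxies before cutting. A minimal sketch of that step with ffmpeg's prores_ks encoder, assuming ffmpeg is on PATH; folder names and the profile choice are placeholders, not a recommendation.

```python
# Minimal sketch: batch-transcode camera clips into ProRes 422 proxies so the
# timeline decodes easily, trading disk space for editing smoothness.
import pathlib
import subprocess

SRC = pathlib.Path("camera_originals")   # hypothetical folder of H.265 camera files
DST = pathlib.Path("prores_proxies")
DST.mkdir(exist_ok=True)

for clip in SRC.glob("*.mov"):
    out = DST / f"{clip.stem}_proxy.mov"
    subprocess.run(
        [
            "ffmpeg", "-y", "-i", str(clip),
            "-c:v", "prores_ks", "-profile:v", "2",  # profile 2 = ProRes 422
            "-c:a", "copy",                          # keep the original audio untouched
            str(out),
        ],
        check=True,
    )
```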
It's overrated, Lauri, and you know it. Nvidia wants everyone to buy the 5090 because the 5080 is so underpowered with only 16GB VRAM.
The 5080 is also half the cost though
@@WatchVirtrinous I know, but 24GB VRAM would have been better. Nvidia are greedy.
“Underpowered” Most of you guys just game… what are you complaining about? Hardly any game is pushing the limits, stop it.
What do you need more power for?
@@ItsAryax It's not power, it's more VRAM for the CUDA cores for video editing.
Yeah, I was so hyped when I saw that: 3x encoders, 2x decoders. I'm using a 13900K though, but I'm pretty happy the 5090 has that feature. I think I'm gonna buy it.
We need an absolute beast build with a Threadripper and dual RTX 5090s, with RGB of course.
I am more interested in 5080 and 5070. I hope the videos are coming soon!
editing has gotten really, really good, man. you're crushing it, and a huge inspiration for a video editor looking to branch out to his own channel.
Thanks for the video! What do you think of the PA401 case? Do you think the airflow would be as good as the PA402?
Will the 5080 also have the H.264 encoding feature that the 5090 has?
Ooo, I didn't know about the codec thing :O That's very nice!
The 5090 is the ultimate card of course, but I think at the $999 price point the 5080 isn't actually bad either. The 5080 Super might be the best value later on though, assuming it gets that VRAM bump. For gamers, it looks to me like the 5070 Ti is the best value card at the moment. But that's all guesswork, we'll see soon enough. I'll probably go for a 5090 or maybe a 5080 FE, if those are actually available at MSRP.
I'm wondering about the power draw when it's idle.
will the 5090 use the same power cable as the 4090? or do you have to swap that out too?
Yes, the same 12V-2x6 connector can do a max of 600W.
@IsaacSchultz-lz8jc Then why is Nvidia telling me (via tech support) that I would need to use the cable it comes with (the 5090) and not the one that came with my 4090?
Really? The previous 4090 cable is rated at exactly 600 watts max, while the card draws 575 watts max. I think they are saying it to be on the safe side; at the release of the 40 series that cable did have some issues, so it will probably work. It's just to cover their butts if you happen to have one of the faulty cables.
Thanks for covering those aspects for us creators! I'm probably going to switch from my 3090 to the 50 series for video work. I'd probably be happy with the 5080, but I don't want to downgrade from 24GB of VRAM... on the other hand, I can't justify the money for the 5090. Damn.
I would go AMD but I need Thunderbolt! And I don't think AMD's USB4 will run correctly with my audio interface. Unless there's an AMD motherboard I could get with Thunderbolt? For the 9950X3D.
The RTX 5090 + Ryzen 9950X3D will be the game changer for creators. I'm excited to get two of them!
If you upgrade to the 5090... I'll take your old card 😅 I'm using a 3060 12GB.
Is there a 5950X3D?
The competition isn't dead, maybe on life support, but definitely not dead. Intel ARC B580 is a huge success for budget gamers. Nvidia dominates for Windows creators.
As always, this will be almost impossible to buy early this year for most of us!
Do you think the RTX 5090 will perform better than your 2x 4080 Super ProArt build? Should I wait for the 5090 or buy 2x 4080 Super for Redshift rendering?
What about the AV1 codec? Nobody mentions that anymore??
AV1? Well, it's supported by the 5090, and the world is gradually moving towards it. AV1 is still less widely supported than HEVC. Both Twitch and Netflix are moving to this codec, but they need the majority of devices to support decoding first.
@Sedokun Thanks for the answer!
I don't see an RTX 5090 ProArt from ASUS... 🤔
Ok so:
1. You can’t use DLSS in DCC software like Blender for example
2. Yes the 5080 has the same amount of VRAM but it is much faster as it is GDDR7
3. People who bought the new Intel CPUs and AMD GPUs 👁👄👁
Why would you want DLSS in Blender?
Exactly what I was hoping for! So happy I went with the X870E ProArt mobo. Waiting for the 9950X3D and I'll get the 5090. I game, 3D model, do architecture work and video edit. Finally, you can actually have it all! Been using Intel all my life and after my 12700K... I'll never go Intel ever again. I've had more issues with it than with my 8700K. Can't wait to get my last 2 components for this build!
So you like the ProArt? I'm caught between that and the Carbon WiFi. I'm a little afraid that if I get a lemon I'll have a nightmare on my hands.
@wyatt5167 Lmao. Dude, I was literally sweating bullets about it. It was the same choice for me, ProArt or the Carbon! The only thing that swayed me towards the ProArt is that I really wanted a no-RGB black and gold theme. Going all custom loop, black tubing, gold fittings, went with the T-Create RAM modules etc. If it weren't for the aesthetic I would actually have gone with the Carbon. All my builds are MSI mobos and they never miss a beat. So... we'll see what happens?!
Can't wait for the reviews to see if these products are actually worth it
@@Ro7770 Another Nvidia scam for people with money to burn.
Definitely gettin’ a 5090…my only concern is scalpers. 😤
The RTX 5090 boasts impressive computational power, but its $1999 price tag places it firmly in the ultra-enthusiast category. This raises questions about its accessibility for the average consumer and whether its performance gains justify the significant cost.
Thanks, good review and insights as always :) I am wondering if the power cable will be different from the 40 series!
Fantastic! What processor do you think would work great with this monster video card? Workstation for 3D (Maya, Marmoset, ZBrush) and Unreal Engine 5. For example, would a Ryzen 9 9950X or Core Ultra 9 285K be great, ESPECIALLY for work (not for games)? Thanks guys!
Compare the prices of the Super cards and the 50 series.
It is not only speed; how about quality?
In many tests Quicksync still got slightly better image quality? 🤔🤔
I wonder if the logic behind it is similar to the 7900 XTX, where GPU temperature is actually pretty good even compared to my RTX 4090, but it draws up to 470 watts, compared to the 4090 which will draw less at the same GPU loads.
I hope for the 5090's sake the logic is the same, or the GPU will cook at idle, let alone during performance moments.
Feel like I've been boxed into getting that Ultra 200 until this moment.
I really would love to see you test the 5070; that's what most people are going to buy, with its 12GB of VRAM.
Since I decided on AMD for my 1st PC build, I've been waiting for the 2025 new GPU launch. I thought I was ready for the 9070 XT. But with this latest RTX 5090 making such massive claims, how can I now be sure I'm making the right choice at £1000 vs £2000? 😢😮
Will a 5090 PC (or 4090 PC) be faster for video editing than something like an M4 Max with mid-to-advanced effects/motion graphics? (And which would be cheaper?) (In Resolve Studio)
The M4 stuff is probably more comparable to the AMD Threadrippers; the Macs are pretty much all CPU.
First things First. I am an AMD Fanboi. My last Intel setup was my 7700k. My current setup is a 5950x and I am purposely holding off upgrading because I am not happy with AMD any longer. There are days when I do in fact miss the co-processing of Quicksync, NVENC and CPU all together, as I had in the past. I for one, WILL be adopting a 100% homogeneous Intel System for my next daily driver, both CPU with NPU, Quicksync included, as well as Intel B Series GPU. I am done with chasing gaming frames. I want up to date functionality. I would NEVER buy an AMD Graphics Card, as it does NOTHING very well.... just enough to get by (Ray Tracing, Encoding etc). I often thought that if AMD could release a high end Laptop/Mobile CPU for Desktop use, similar to every Intel Desktop CPU feature set, then I would stay Team Red. Sadly, this won't be the case. I have also purchased my last Nvidia product, because I am opting to vote with my wallet. It's not that I need a much faster system as an upgrade, I just want a full rich current feature set and feel that Intel can get me to my goal for very little money, again, even if it is only a fraction faster than what I am currently using in terms of Raw performance, it will be an up to date platform.
Get an A310/A380 and voilà, now you have previous-gen Intel Quicksync.
I have an A310 slapped into the 7950X rig just for that reason 😂
Superman building a pc!
I do give them some credit for making the card small, I guess..
Already decided to order a pair of the 5070Ti as soon as it's available. We'll see how it does.
Until then all we can do is keep watching as the new gear gets released.
Thanks for another great video Lauri.
I feel like this channel was put up by NVIDIA themselves
Where is the B580 review? 😭
I'm not going to fly to Mars yet, so the 4060 and 4060 Ti are enough for me. I sympathize and congratulate you.
So I should upgrade from my 3090 to a 5090? Cool.
I do like how small the 50 series will be, unless there will be some beefy ones, but I like the one you showed. I use a 4090 Trio at the moment; I'm good for a few years on my ViewSonic Elite 1440p 240Hz monitor, but the 50 series looks nice 🙂
I'd love to see a comparison in performance between Quicksync and the new RTX 50 when decoding 4:2:2 10-bit, both in H.264 and H.265, to understand how big of a gap there may be between the two. My current laptop has an Intel 185H and an RTX 4070, so I'm wondering if an RTX 50 via USB4/Thunderbolt would make a significant difference.
If you cut out the AI Frame Gen (which you can't use in VR), then the 5090 will outperform the 4090 by at most 20-25%.
For me the 5090 will be spot on and will speed up my render times a lot. Great to see the Founders card is a 2-slot one.
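A rough way to run exactly that comparison yourself is to time a full decode of the same clip through each hardware path in ffmpeg. This is only a sketch: it assumes an ffmpeg build with both QSV and CUDA hwaccels, the clip name is a placeholder, and ffmpeg can silently fall back to software decoding if the hardware can't handle the format, so check the logs before trusting the numbers.

```python
# Rough decode-speed comparison: Quick Sync (qsv) vs NVDEC (cuda) on the same clip.
import subprocess
import time

CLIP = "sample_422_10bit.mp4"  # hypothetical 4:2:2 10-bit test clip

def time_decode(hwaccel: str) -> float:
    """Decode the whole clip with the given hwaccel, discard the output, return seconds."""
    start = time.perf_counter()
    subprocess.run(
        ["ffmpeg", "-v", "error", "-hwaccel", hwaccel, "-i", CLIP, "-f", "null", "-"],
        check=True,
    )
    return time.perf_counter() - start

for accel in ("qsv", "cuda"):
    print(f"{accel}: {time_decode(accel):.2f} s to decode the clip")
```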
I want that P16 wallpaper so bad.
How about the 5070 Ti? Seems like a better middle ground for a slightly cheaper price compared to the 5080: still a 256-bit bus, 16GB of RAM, and 2 encoders (though 1 decoder). Seems okay but not great for the price.
To be fair, it is cheaper than the previous gen, so at least Nvidia has that going. We can't really tell the value yet until they are fully tested by 3rd parties.
Hmmm, need to replace an older 3070... I might be tempted to go with the 5080 just for the codec... I just got a B580, by the way :/
Well damn, I just finished my 14800K/4080 Super build YESTERDAY... It does really kick ass in DaVinci Resolve and Photoshop, so maybe it's fine. Lol
The Avatar has returned!!
Isn't DLSS useful only in games?
And also, developing games based upon DLSS would mean less and less optimised games, designed with the upscaling tech in mind.
All the AI and frame gen stuff is fine, but what about raw performance increases?
It does work in some 3D development and rendering applications too. Unreal Engine 5 is an example. D5 Render is another one. D5 in fact also supports XeSS
@@_shreyash_anand That's the first time I'm hearing about that renderer. I assume XeSS (and FSR, if they want to) can work because they don't have to be trained, since both have non-ML versions. However, as I've seen those in games, they suck; I wouldn't want this in my viewport. And DLSS still can't be implemented because it must be trained, which goes against the creation process where you create things from scratch. Same goes for Unreal, really: I can't imagine myself gathering assets and tweaking lighting and hoping it looks great. I want to know it looks great.
I'm ready 💪
For pro users, very good news.
For gamers, with the AI FPS and all the upscaling, if we strip that out it's only ~30% gains, half the jump we got between the RTX 3090 and 4090. Pure marketing; Nvidia really screwed us over.
Best price? The 5090 is $2000, meaning in Europe it's going to be 2500 euros. Yeah, amazing price.
It's confirmed: €2456 for the 5090 FE in Norway. AIBs will probably push €3000.
@@lars789 Let's say an average of €2700 for the GPU alone. With this amount you can build a killer PC if you are a freelancer or mid-level professional.
I want to know what the actual performance in 3D is without all the faking-it enabled. It's not like DLSS, Frame Generation and Reflex are that useful when you're modelling in, let's say, Maya.
Disagree. The Intel iGPU and Intel Arc dGPU combination will give a huge boost to performance as both GPUs talk to each other. Additionally, Nvidia is power hungry; 600W is too much even for high-level creative workloads.
Why are you talking about X3D processors? Those are literally only meant for gaming. If you are a creator and you want a workstation, go for the AMD Threadripper 7000 series. Skip the AMD Ryzen 9000 series.
The frame addition is the least useful part of DLSS 4. Adding 3 frames rather than one increases the ratio of AI-interpolated frames to real ones, so there's more chance we get artifacts; unless the quality of the AI (guessed) frames is amazing, there's no benefit to generating a higher percentage of possibly poorer frames.
A lot of the info was smoke and mirrors. The main point for me is the increase in RAM and processing power. I will be replacing my 4090s with 5090s, but not because of the misinformation. The card's physical size reduction is also beneficial for me, for fitting more cards in the rendering units. Comparing the 5070 to a 4090 was utter nonsense, ignoring the reasons people buy a 4090, which is rarely to play games.
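For what it's worth, the ratio being described works out like this; it's plain arithmetic on the advertised frame-generation factors, nothing measured.

```python
# Share of displayed frames that are actually rendered, for 2x vs 4x frame generation.
def rendered_share(ai_frames_per_real: int) -> float:
    return 1 / (1 + ai_frames_per_real)

print(f"1 AI frame per real frame (2x):  {rendered_share(1):.0%} of output is rendered")  # 50%
print(f"3 AI frames per real frame (4x): {rendered_share(3):.0%} of output is rendered")  # 25%
```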
Intel is not in trouble, I'm not sure why such a niche use case is supposed to "threaten" Intel.
Not only is this a creator-only problem (no gamers, no general users that probably will just buy what's cheap and works for light gaming, general use), but this is a niche use case only for those that care to use that specific codec.
- If the creator wants that codec specifically on an NVIDIA card, they do not need a 5090 to get it and a 5090 is not a good deal at all if the encoders are the main talking point.
- If the encoders are a goal and you are not a gamer, then unless you really need CUDA and NVIDIA's AI features, content production on Arc or Radeon is not worse at all. I don't understand why you think it's a sacrifice to try those cards (Arc could have issues with some video editing software; it is still a relatively new player in discrete after all, so I won't claim Arc will be perfect).
- If the goal is video production and not streaming, even if we have hardware encoders, software encoders still exist and, while drastically slower, using software encoding for the same codec on an older NVIDIA card would probably be a negligible concern compared to the price of the upgrade considering you can just leave it to churn overnight.
The 50 series does have some nice features, don't get me wrong, but I think you're definitely overestimating (whether for the clickbait or not) what will actually threaten competition for Intel and AMD, who are not targeting upper range cards at all this gen.
Around a 33% increase in performance. Meh... the 5000 series is more of a facelift than next gen.
I’m looking forward to buying the BROWN 💩 TURD PSU just because of a fan maker
The 5060 is a joke. And the 5090 is insanely expensive.
The problem with the 5060, 5060 Ti etc. is indeed only 16GB of RAM. But we aren't talking about gaming here, right... uh?
Which one am I most excited about? None. I built a new PC in May with a 4080 Super, but I will definitely check the comparisons.
Except Quick Sync comes built in all the way back to Intel's second generation, but the 5090 goes for $2000. Massive shilling for something 99% of us don't need.
Except the video is geared toward the 1% that seeks to squeeze performance over price. Why even bother commenting if you're not a part of that crowd, especially as it deliberately focuses on creators within the bounds of Nvidia?
@Unordinary-lg4yt Because the people who are salty about GPU prices can't help themselves. Even when the price difference can be explained by inflation alone, these people cannot accept it. They cannot afford the highest-end models, they believe they are entitled to these models, and ten years from now they will still be whining.
Except the new NVDEC is part of every RTX 5000 GPU, so even an $800 laptop with an RTX 5050 will support it. Massive shilling for something even small-time hobbyist creators will be able to take full advantage of.
Honestly, what I'm most interested in is reducing my render times in DaVinci Resolve and Blender. It would be great to have some real performance tests.
I won't buy the RTX5090 because the RTX6090 will be twice as fast.
Usually it takes 2 or 3 generations to be twice as fast.
@@ismafp Personally, I’m waiting for the RTX9090 with 2TB VRAM
@@user-vr2rq5hl6l no less than 32TB thanks
nice try diddy...
Classic upselling with the buzzword AI
Glazer detected
I wonder how much Nvidia has paid you for this fully positive, imaginative review without any benchmarks :D
Hahahahahaha, I was just wondering that too :P