RDNA 5 - The Nvidia KILLER?! RTX 50 & RDNA 5 Specs

  • Published on 13 Jun 2024
  • Whokey Christmas Sale 25% off code:RGT
    Windows 10 Pro $17,2: biitt.ly/9f0ie
    Windows 11 Pro $22,8: biitt.ly/581eD
    Windows 10 Home $13: biitt.ly/XYm9o
    Office 2016 $23: biitt.ly/cy7Vb
    Office 2019 $48: biitt.ly/YInvw
    www.whokeys.com/
    RDNA 5 - The Nvidia KILLER?! RTX 50 & RDNA 5 Specs
    In today's EXCLUSIVE, we discuss the latest updated RDNA 5 and RTX 50 specs. With RDNA 4 looking to target the mid range, AMD's hopes for defeating Nvidia are all pinned on RDNA 5 GPUs. Join us as we examine the latest hardware specifications for both RTX 50 and RDNA 5, as well as the expected increases in gaming performance and ray tracing performance. With RTX 50 Blackwell shaping up to be VERY promising, will RDNA 5 specs and performance be enough to DEFEAT Nvidia?
    The next few years in PC gaming tech are going to be very interesting indeed, especially with RDNA 4 bowing out of the high end graphics card market. But will RDNA 5 have enough grunt to defeat the RTX 50 in specs and performance? And of course - ray tracing improvements? With many gamers disappointed in RDNA 3's RT, RDNA 5 could be AMD's chance to shine.
    Subscribe - th-cam.com/users/RedGamin...
    Join our Discord - / discord
    Tip on PayPal - paypal.me/RedGamingTech
    AMAZON AFFILIATE LINKS
    US - amzn.to/2CMrqG6
    Canada - amzn.to/2roF4XO
    UK - amzn.to/2Qi6LMS
    GreenManGaming Affiliate link -
    www.greenmangaming.com/?tap_a=...
    SOCIAL MEDIA
    www.redgamingtech.com for more gaming news, reviews & tech
    / redgamingtech - if you want to help us out!
    / redgamingtech - Follow us on Facebook!
    / rgtcrimsonrayne - Paul's Twitter
    / redgamingtech - Official Twitter
    0:00 Start
    0:02 Today's topic
    4:00 RTX 50 Vs RDNA 5
    Royalty-Free Music - www.audiomicro.com/royalty-free-music
  • Games

Comments • 482

  • @RedGamingTech
    @RedGamingTech  5 หลายเดือนก่อน +7

    Whokey Christmas Sale 25% off code:RGT
    Windows 10 Pro $17,2: biitt.ly/9f0ie
    Windows 11 Pro $22,8: biitt.ly/581eD
    Windows 10 Home $13: biitt.ly/XYm9o
    Office 2016 $23: biitt.ly/cy7Vb
    Office 2019 $48: biitt.ly/YInvw
    www.whokeys.com/

    • @Elam51
      @Elam51 5 หลายเดือนก่อน

      AMD Dies without console business.

    • @Brent_P
      @Brent_P 5 หลายเดือนก่อน +2

      @@Elam51 AMD is actually making good headway in the server/enterprise market. Consoles are not the main priority.

    • @Elam51
      @Elam51 5 หลายเดือนก่อน

      @@Brent_P AMD On Top 🤣🤣🤣🤣🤣

    • @memcmeepants2392
      @memcmeepants2392 5 หลายเดือนก่อน

      ummmm windows is free, why would anyone pay for it ?

  • @SaccoBelmonte
    @SaccoBelmonte 5 หลายเดือนก่อน +311

    If I had a dollar for every "Nvidia killer" title I've seen.

    • @visitante-pc5zc
      @visitante-pc5zc 5 หลายเดือนก่อน +7

      Soyjack

    • @skywalker1991
      @skywalker1991 5 หลายเดือนก่อน +3

      Another Nvidia killer , lol

    • @TheGeneReyva
      @TheGeneReyva 5 หลายเดือนก่อน +25

      Still wouldn't be able to buy a 40-series card worth a damn.

    • @cajampa
      @cajampa 5 หลายเดือนก่อน +18

      If we had a dollar for every time, we could buy a Ngreedia-killer GPU.

    • @joemamasofat420
      @joemamasofat420 5 หลายเดือนก่อน +2

      What is a soyjack

  • @benjaminlynch9958
    @benjaminlynch9958 5 หลายเดือนก่อน +72

    We’ll see if AMD gets there. We’ve heard bold claims from them before regarding their competitive position with nVidia only to be let down once products are actually released. At this point we shouldn’t trust anything regarding performance figures simply because those numbers are all theoretical - they haven’t even taped out the initial design, and real world performance is never 100% exactly what it was modeled to be.
    As for MCM, I’m kind of surprised AMD (and nVidia) don’t have that working yet. Apple has had it working (for graphics!) for almost 2 years now going back to the M1 Ultra chip. The Ultra is 2x M1 Max chips fused together with a high bandwidth bridge that allows software - OS as well as userspace applications - to address the chip as one monolithic chip. No reprogramming necessary as was required back in the day for SLI. But the key thing here is that the bridge that Apple is using was actually developed by TSMC and is available to all of their clients - including AMD, nVidia, and even Intel. Why we haven’t seen anyone else use that bridge is a bit puzzling (maybe TSMC gave Apple an exclusive on it for a period of time???), but from a technology and engineering standpoint, fusing two GPU’s together and have them act, behave, and perform like a single GPU has been a huge success for Apple. Only a matter of time before AMD and nVidia start to use it too.

    • @kcoolj1
      @kcoolj1 5 หลายเดือนก่อน +11

      Maybe because they only have to get it working with the workflows that run on their OS, and even then, with gaming, a lot of the time if you watch benchmarks the Ultra variants don't scale well at all, even with Mac ARM-compatible games. Just my guess as to why the GPU chiplet hurdle hasn't been crossed for gaming consumer GPUs.

    • @kaystephan2610
      @kaystephan2610 5 หลายเดือนก่อน +13

      AMD has often delivered cards that offered competitive performance. All the way back in 2013 AMD had the R9 290, which offered the same performance per watt: it was 10% faster than its rival - the GTX 780 - and drew 10% more power. The RX 6950 XT offered RTX 3090 performance (sometimes it was even a bit faster) and it was actually slightly more efficient than the 3090. It was also the sole reason why NVIDIA released the 3090 Ti. And RDNA 3 isn't really that bad either. First of all, the RX 7900XTX is selling pretty well because it's not really that much weaker. The 4090 is like 25% faster on average but it costs like 90% more AT LEAST. I get that the whole "NVIDIA KILLER!!!" thing is kinda annoying but that's just the standard way to get some more engagement with viewers. But AMD offering competition to NVIDIA, THAT isn't exactly far fetched either.

    • @necrotic256
      @necrotic256 5 หลายเดือนก่อน +4

      "We heard from them" We have heard from leakers overhyping products so many times. Like Meteor Lake or DLVR that was supposed to deliver up to 20% of performance/w to Raptor Lake refresh from this year alone

    • @UKKNGaming
      @UKKNGaming 5 หลายเดือนก่อน +2

      @@kaystephan2610 The 5700XT was going against the 2060 Super; now it's as fast as the 2080. The 6950XT is faster than the 3090 Ti as of right now. The 7900XTX, the most infamous 1000-dollar card, closes in on the 4090 and surpasses it with AFMF. Nvidia users never update themselves on what the current performance of AMD cards is, only ever launch performance. Their cards always get slower over the years.

    • @kaystephan2610
      @kaystephan2610 5 หลายเดือนก่อน

      @@UKKNGaming I mean I have no way of verifying the numbers right now but it can very well be true. Overall AMD isn't really a bad buy. I guess NVIDIA has the overall more appealing software package maybe? And also better marketing, resulting in more mindshare. But overall AMD is a very solid option. My next GPU will be an RX 8000 GPU. Out goes my 2070.
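
A side note on the multi-die point raised at the top of this thread: the practical difference between old SLI/CrossFire-style multi-GPU and a fused design like the M1 Ultra is who does the work splitting. The sketch below is purely illustrative Python; the FakeGPU class and its render call are made up for this example and are not any real GPU API.

```python
# Purely illustrative: FakeGPU and its render() method are invented for this sketch.
class FakeGPU:
    def render(self, tiles):
        return [f"rendered:{t}" for t in tiles]  # stand-in for real per-tile GPU work

def render_explicit_multi_gpu(frame_tiles, gpus):
    # SLI/CrossFire style: the application must partition work and merge results itself.
    chunk = len(frame_tiles) // len(gpus)
    results = []
    for i, gpu in enumerate(gpus):
        results.extend(gpu.render(frame_tiles[i * chunk:(i + 1) * chunk]))
    return results

def render_fused(frame_tiles, gpu):
    # M1 Ultra style: the bridge makes two dies look like one device, so app code is unchanged.
    return gpu.render(frame_tiles)

tiles = list(range(8))
print(render_explicit_multi_gpu(tiles, [FakeGPU(), FakeGPU()]))
print(render_fused(tiles, FakeGPU()))
```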

  • @Davitron_87
    @Davitron_87 5 หลายเดือนก่อน +17

    “Nvidia killer.” I guess we’re really doing this again.
    If I had a dollar for every time I’ve seen that headline, I could afford a 4090.

    • @Dave-dh7rt
      @Dave-dh7rt 4 หลายเดือนก่อน +1

      The 6900XT definitely competed very well against the 3090. IMO it won: cheaper, just as fast, more efficient.

    • @rickmorty4921
      @rickmorty4921 หลายเดือนก่อน

      They are building this narrative because the 5000 series will appear first and they want people to wait for their card. The card will suck compared to nvidia as always. And these people who talk like this simply get paid for such bullshit.

  • @Dark-qx8rk
    @Dark-qx8rk 5 หลายเดือนก่อน +46

    AMD needs to at least triple RT performance for RDNA5 if they want to catch up to Nvidia. Just being 30% faster than RDNA3 is not even going to catch up to the 4000 series.

    • @nimbulan2020
      @nimbulan2020 5 หลายเดือนก่อน +5

      They're going to need more than 50% faster RT just to match the performance scaling of the 30 series (looking at pathtracing workloads to judge raw RT horsepower) much less anything newer. AMD has a long way to go still.

    • @Vujkan
      @Vujkan 5 หลายเดือนก่อน +1

      That's never gonna happen! Nvidia is far ahead in everything!

    • @Accuaro
      @Accuaro 5 หลายเดือนก่อน +7

      I mean it isn't "difficult" to do; the reason why Nvidia is so good at it is because they have portioned off a part of their silicon that is dedicated to RT, whilst on the other hand AMD runs their RT pretty much off their shaders (downplaying the complexity), but yeah. That's the reason why Nvidia is better at RT.

    • @Vujkan
      @Vujkan 5 หลายเดือนก่อน +1

      It's not only about ray tracing xD, DLSS quality is also way better, support way better, features on Nvidia way better. The point is AMD is never gonna catch Nvidia; also they don't have plans to make next-gen AMD high-end cards, that's official! @@Accuaro

    • @muresanandrei7565
      @muresanandrei7565 5 หลายเดือนก่อน +1

      My dude, the only reason Nvidia is so far ahead is DLSS. In raw power AMD's hardware is not that bad; they suck at the software level.
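
To put the thread's "triple the RT performance" claim in concrete terms: the required uplift is just the ratio of the target frame rate to the current one. The FPS figures below are assumptions chosen for illustration (roughly the shape of heavy path-tracing results), not measured benchmarks.

```python
# Illustrative arithmetic only; both FPS figures are assumed, not benchmarked.
rdna3_pathtracing_fps = 20.0   # assumed 7900 XTX-class result in a heavy path-traced scene
rtx40_pathtracing_fps = 60.0   # assumed 4090-class result in the same scene

required_uplift = rtx40_pathtracing_fps / rdna3_pathtracing_fps
print(f"Uplift needed just for parity: {required_uplift:.1f}x")           # 3.0x
print(f"A +30% gain only reaches {rdna3_pathtracing_fps * 1.3:.0f} FPS")  # 26 FPS vs 60
```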

  • @ElysaraCh
    @ElysaraCh 5 หลายเดือนก่อน +11

    holy cow the sponsor spot is 2:30 long. more than 10% of this 22 minute video is the sponsor spot...

    • @Workaholic42
      @Workaholic42 5 หลายเดือนก่อน

      You can skip, just double tap with two fingers in the right half of the video

    • @ElysaraCh
      @ElysaraCh 5 หลายเดือนก่อน +1

      @@Workaholic42 I know I can skip, that's still a crazy long sponsor spot

  • @matheuswerly5320
    @matheuswerly5320 5 หลายเดือนก่อน +13

    Aaaaaaaaaaaa there's always a Nvidia killer but Nvidia is out there very alive

    • @visitante-pc5zc
      @visitante-pc5zc 5 หลายเดือนก่อน +2

      Soyjack

    • @magnomliman8114
      @magnomliman8114 5 หลายเดือนก่อน +7

      @@visitante-pc5zc the hell is your problem?

  • @Hexenkind1
    @Hexenkind1 5 หลายเดือนก่อน +37

    I believe it when I see it.
    God knows we need the competition to drive prices down.

    • @rangersmith4652
      @rangersmith4652 5 หลายเดือนก่อน +4

      That only works if we, as GPU buyers, stop buying them. Look at the 4080, for example. It's never been OOS and has experienced mediocre sales, but its price doesn't fall because Nvidia can afford to endure a sales slump in the 4080 range while it rakes in the profits for 4090s. Weak sales of the 4080 are offset by 4090 sales. As for price competition, neither company seems willing to consistently play the "value leader" role. A $3000 Nvidia 5090 will remain at $3000 if AMD launches an equal-performing RDNA5 card at $3,000 as long as both sell well and they stare each other down. One company or the other has to blink before prices will fall significantly.

    • @TheBlackIdentety
      @TheBlackIdentety 5 หลายเดือนก่อน +5

      Hilarious how you think prices will ever go down again. They'll be going further up. Leading edge nodes are getting more and more expensive.

    • @mnomadvfx
      @mnomadvfx 5 หลายเดือนก่อน

      @@rangersmith4652
      "A $3000 Nvidia 5090 will remain at $3000 if AMD launches an equal-performing RDNA5 card at $3,00o"
      Worse than that.
      nVidia prices will remain even if AMD can field something as good at $1,500 as long as there is stock deficiency.
      Add in AMD's persistent low Windows driver performance (Linux perf is more than fine OTOH) for DX9-11 and OGL and AMD will never truly overtake nVidia.
      On the other hand, the market possibilities for compact mini consoles are huge if the rumored CU count of 40 is true for Strix Halo/Sarlak.
      Combine that with Steam OS for a permanently wall powered Steam console, and you have something nVidia simply just can't match with its IP and product stack.
      If Sarlak is real and becomes a permanent market segment then AMD could be on to a winner for people in between casual and hardcore gaming that just want a compact gaming PC.

    • @ianlarge9016
      @ianlarge9016 5 หลายเดือนก่อน +1

      @@rangersmith4652 I agree with you about people not buying Nvidia GPUs due to the pricing and I hope AMD and Intel offer competitive products. I also suspect that new entrants from China may enter the market and spoil things for Nvidia. Its already happened with mobile phones, so I don't see any reason why it can't be done with GPUs.

    • @christophermullins7163
      @christophermullins7163 5 หลายเดือนก่อน

      @@TheBlackIdentety and these companies will blame the node prices for higher prices. Personally I am usually using previous-gen GPUs, which sell used for about 2-3x the price-to-performance vs new, so my prices will certainly go down.

  • @necrotic256
    @necrotic256 5 หลายเดือนก่อน +10

    I really dislike these kinds of clickbait titles

  • @scottgardiner7418
    @scottgardiner7418 5 หลายเดือนก่อน +24

    NVIDIA has been serving up great opportunities for amd for the last three generations. Yet, AMD keeps sh!tt!ng the bed. I don't expect this to change anytime soon.

    • @GeneralS1mba
      @GeneralS1mba 5 หลายเดือนก่อน

      I expect RDNA 5 to do well, if his leaks are accurate. See how they are thinking of a 600W TDP? Nvidia will be beaten until they release a 5090 Ti or something, which will be the true competition lol

    • @scottgardiner7418
      @scottgardiner7418 5 หลายเดือนก่อน +2

      @@GeneralS1mba True competition would be welcome if the end result is improved price/performance from both sides.

    • @GeneralS1mba
      @GeneralS1mba 5 หลายเดือนก่อน

      @@scottgardiner7418 yes, i wish for normal prices and actual gen on gen price/performance increases

    • @sharktooh76
      @sharktooh76 5 หลายเดือนก่อน +3

      RDNA 2 (Navi 20) was faster than RTX3000 (Ampere) in the only metric that matters. Rasterization.
      As I said in a reply above, unless full path tracing is rendered at the same speed as rasterization, no one wants nor needs PT/RT.

    • @GeneralS1mba
      @GeneralS1mba 5 หลายเดือนก่อน

      @@sharktooh76 people would definitely sacrifice performance for eye candy if the eye candy is good enough and performance loss isn't too great

  • @clintyoung3541
    @clintyoung3541 5 หลายเดือนก่อน +14

    Why do we keep referring to cards as a killer of another brand and then have to downplay expectations when they get released? Why not talk about potential parity, or hope they perform well, rather than fueling fanboy hopes?

    • @magnomliman8114
      @magnomliman8114 5 หลายเดือนก่อน +11

      Cus it generates more views. Tech youtubers in a nutshell.

    • @mnomadvfx
      @mnomadvfx 5 หลายเดือนก่อน

      "We" being a clickbait video title to generate views which increase the likelihood of visitors that will click on ads?

    • @maynardburger
      @maynardburger 5 หลายเดือนก่อน

      Because people like RGT and MLID base their entire channels on being pretend 'insiders' who tell people about all this magical tech that will come out in the future to get people hyped(AMD fans especially prone to gobbling this bullshit up). If they were to only report when there was legitimate/credible information, they'd be completely useless/superfluous and would get no attention and be exposed as just some boring nobodies doing TH-cam videos in their basement because they have no other life skills.

  • @richardoh3666
    @richardoh3666 5 หลายเดือนก่อน +4

    I'm not holding my breath on RDNA 5. I don't think they can compete with Nvidia.

  • @Slejur
    @Slejur 5 หลายเดือนก่อน +40

    RDNA X - The Nvidia Killer?! The standard title for every new iteration of RDNA. At this rate I can soon say: if I got $1 for every time I heard or saw "RDNA X - The Nvidia Killer", I could buy every version of graphics card Nvidia has created, past, present and future.

    • @garyhall3919
      @garyhall3919 5 หลายเดือนก่อน +4

      yup its getting real tiresome

    • @gertjanvandermeij4265
      @gertjanvandermeij4265 5 หลายเดือนก่อน

      Nvidia will be DONE after 3-2 Nm ! Only Radeon will survive after that !

    • @mnomadvfx
      @mnomadvfx 5 หลายเดือนก่อน

      @@gertjanvandermeij4265
      Nah.
      Both will go chiplet.
      It's just a question of who will do it efficiently first, and how long they can keep that advantage, if at all.

    • @Animize
      @Animize 5 หลายเดือนก่อน

      I mean, technically speaking, as you probably know, this is only to get the viewer's attention. Wouldn't it be actually kind of boring if the titles were something like: "Next RDNA will most likely beat RTX 50 in raster performance per watt (again), but lose in a lot of other categories"... There is no fire in the truth. Also, with a new architecture dropping like every other year... I am not sure if we just take the main iterations of RDNA, so 5 dollars - or even in the best case 60 or whatever dollars if we take all the GPU architectures ever released by AMD / ATI combined - will get you far in Nvidia's catalog, if we want to take words for their value 🙂.

    • @garyhall3919
      @garyhall3919 5 หลายเดือนก่อน

      @@Animize with a title like that I wouldn't need to watch the video lol

  • @TheCgOrion
    @TheCgOrion 5 หลายเดือนก่อน +12

    Merry Christmas Paul. Thank you for your dedication to the enthusiast hardware market.

  • @fffhunter7765
    @fffhunter7765 5 หลายเดือนก่อน +5

    Every gen we wait for the Nvidia killer and every gen AMD doesn't know how to deliver - maybe RDNA 10

    • @mnomadvfx
      @mnomadvfx 5 หลายเดือนก่อน

      RT aside RDNA2 was plenty competitive.
      The whole reason nVidia is pushing RT to the hilt is that they know they can't make money on it for much longer.
      8K is a waste of money and 4K raster can only tax a card so much.
      nVidia literally moved the goal posts in 2018 to stay profitable longer at the expense of the game devs that have to code in RT as well as raster too.

    • @magnomliman8114
      @magnomliman8114 5 หลายเดือนก่อน +1

      @@mnomadvfx ah yes, you're still trying hard out here. No wonder you acted stupid on my reply. Be happy with your card and don't try to convince people that AMD is better than Nvidia just because the price is higher. I need to see AMD pushing for new tech and delivering the performance too. Until then I will stick with Nvidia. AMD can't even deliver a normal working driver right now without patching it 5 times... just recently CS GO 2 banned people that used AMD "anti lag", that's how bad it is.

  • @jonzie24
    @jonzie24 5 หลายเดือนก่อน +11

    The last Nvidia killer from AMD was a disaster, if I remember correctly 🤣
    Btw. Love those slides where it says that the 4080 price starts at $899. Just checked it out of curiosity, and it still sits happily at £1,579.00 here in the UK.

    • @Dave-dh7rt
      @Dave-dh7rt 4 หลายเดือนก่อน

      6900XT definitely won against the 3090

  • @GeforceFender
    @GeforceFender 5 หลายเดือนก่อน +2

    Doesn't matter because both brands are not cheap anymore

  • @nathans3022
    @nathans3022 5 หลายเดือนก่อน +8

    Honestly, I am still on a Vega64 XTX. I would spend $1,000+ for a GPU but I want HBM2e/HBM3 and the smaller form factor that will bring. Cards these days have just gotten ridiculously massive...

    • @spankeyfish
      @spankeyfish 5 หลายเดือนก่อน +3

      Most of the size is cooling; 4090 PCB is about 60% of the length of the whole card. Watercooled cards only need to be the size of the PCB and about 1 slot thick.

    • @danielvipin7163
      @danielvipin7163 5 หลายเดือนก่อน +1

      Just get any Nvidia FE card and watercool it. Their PCBs are always half the length and respond greatly to watercooling.

    • @ShizzleGaming14
      @ShizzleGaming14 5 หลายเดือนก่อน

      Even when HBM becomes a consumer thing again they won't get that small

    • @nathans3022
      @nathans3022 5 หลายเดือนก่อน

      @@ShizzleGaming14 guess you haven't seen a card like the RX Vega64 Nano. Asrock did the 56 but a 64 preproduction unit was made also. They can be VERY small if needed.

    • @absolutelysobeast
      @absolutelysobeast 5 หลายเดือนก่อน

      You know how hot HBM gets? No small cards with HBM for years my dude

  • @JJ-is7mn
    @JJ-is7mn 5 หลายเดือนก่อน +3

    All the things said about RDNA3 that turned out dead wrong.

  • @TheGeneReyva
    @TheGeneReyva 5 หลายเดือนก่อน +1

    What's the release gap gonna be between RDNA5 and RTX 50 and 60, such that RDNA5 is getting put up against RTX 50?

  • @SAFFY7411
    @SAFFY7411 5 หลายเดือนก่อน

    Interesting info as always, Paul. Hope you have a great Christmas too!

  • @Hexenkind1
    @Hexenkind1 5 หลายเดือนก่อน +1

    Merry Christmas to you as well.
    I will play a little bit of Avatar on Unobtanium settings with my 4090.
    Because what else did I buy this card for, right?^^

  • @georgelugenalt200
    @georgelugenalt200 5 หลายเดือนก่อน +6

    LOL. How many times have we heard this before. I own an 79xtx, a 68xt, a 6750xt, and a 57xt. AMD is getting better, no doubt. But no way will they ever beat Ngreedia. I think they have a treaty between the 2 of them, that AMD will be second fastest, and second most expensive, and Ngreedia agrees to drive up prices so AMD can also profit...

    • @hristobotev9726
      @hristobotev9726 5 หลายเดือนก่อน +1

      Amd beat ngreedia 2 times - 1950xtx, 7970. Almost beat them with 6900

    • @magnomliman8114
      @magnomliman8114 5 หลายเดือนก่อน +1

      @@hristobotev9726 you're trying hard in the comments to make Nvidia look bad. You are the type of fanboy the video title was made for.

    • @DepthsOfHell98
      @DepthsOfHell98 5 หลายเดือนก่อน +1

      @@hristobotev9726 3 times. You forgot their GOAT, the 9700 Pro

    • @mnomadvfx
      @mnomadvfx 5 หลายเดือนก่อน +1

      They don't have a treaty.
      AMD just has a far more divided focus.
      nVidia doesn't develop a CPU anymore, and even when they did it was clearly not their focus.
      AMD meanwhile has to develop a CPU, the next overhauled CPU 2 gens ahead, the next RDNA, the next RDNA after that, the next CDNA and the next CDNA after that - and more recently they have expanded further into Xilinx IP and Pensando DPU stuff or whatever they are called.
      Then you add on their semi custom (Sony/MS) contracts.
      AMD are spread extremely thin with far less of a war chest to spend on it all than nVidia or Intel.
      To say it's amazing that they can compete at all is a huge understatement.
      It's a god damn frickin miracle.

    • @mnomadvfx
      @mnomadvfx 5 หลายเดือนก่อน +1

      @@magnomliman8114
      "you try Harding in the comments to make Nvidia looks bad"
      I notice you couldn't actually refute @hristobotev9726's point and resorted to insults instead.....
      What was that about fanboyism again? 😏

  • @magnomliman8114
    @magnomliman8114 5 หลายเดือนก่อน +3

    So we have leaks about the 50 series and we have nothing about RDNA 5, and we're still going for the Nvidia killer title? Man, youtubers sure disrespect the intelligence of their audience tbh...

  • @proklet4694
    @proklet4694 5 หลายเดือนก่อน +3

    Sure

  • @zadintuvas1
    @zadintuvas1 5 หลายเดือนก่อน +1

    Remember when Vega was supposed to be an Nvidia killer.

  • @FunFunFun8888
    @FunFunFun8888 5 หลายเดือนก่อน +1

    Happy Christmas! WE LOVE YOU!

  • @FunFunFun8888
    @FunFunFun8888 5 หลายเดือนก่อน +1

    Will it be able to play 4K 150fps without fake frames in Avatar: Frontiers of Pandora with Unobtainium setting on?

    • @kutark
      @kutark 5 หลายเดือนก่อน

      Yep, just load up Dota 2, Fortnite, TF2, CS2, etc!

    • @FunFunFun8888
      @FunFunFun8888 5 หลายเดือนก่อน

      @@kutark 😛🤣

  • @zntei2374
    @zntei2374 5 หลายเดือนก่อน +13

    I'm glad AMD is focusing on the budget segment for RDNA 4. Wealthier gamers these days are too spoiled - who exactly needs 4K 240Hz?

    • @magnomliman8114
      @magnomliman8114 5 หลายเดือนก่อน +7

      @zntei2374
      Funny, it's always the people that can't reach that tech who act humble like this. If you can't beat it, act like you don't need it :D pathetic..

    • @zntei2374
      @zntei2374 5 หลายเดือนก่อน

      @@magnomliman8114 I could buy an RTX 4090 right now, yet I choose to stay with an RX 6600

    • @mnomadvfx
      @mnomadvfx 5 หลายเดือนก่อน

      @@magnomliman8114
      Funny how blah blah blah I've got money to burn when plenty of people are literally starving and can't afford rent in first world countries like US and UK.
      The only pathetic words I read were yours mate.
      I didn't read humble from the OP's comment - they sounded bitter for sure, but in large not wrong.
      240 Hz is basically snake oil - and not achievable without upscaling and frame generation anyway, regardless of the vendor or SKU, in RT games.
      nVidia jumped the gun on RT by at least a decade, perhaps more.
      If you need a denoising shader (at minimum) to get a usable image quality then your hardware isn't ready yet.
      Guess what?
      There isn't a single RT game that doesn't use denoising that b0rks HQ texture quality - that's the real reason for DLSS to exist so that it can fake texture detail after the denoising pass has erased it.
      That is why nVidia have been running hell for leather pumping money into academia to fund research into real time RT software techniques like ReSTIR - because they know their hardware can't cut it.
      The irony is that every step forward the RT research they paid for makes at the software level only benefits AMD and Intel too 😅

    • @thegamehud8214
      @thegamehud8214 5 หลายเดือนก่อน +5

      @@magnomliman8114 I literally just said the same thing in another post, it's completely true. I play at 4k 120 and I can't live without it, it's a completely different experience.

    • @nipa5961
      @nipa5961 5 หลายเดือนก่อน

      I hope Nvidia will focus on gaming again. AMD needs competition again.

  • @R6ex
    @R6ex 5 หลายเดือนก่อน +4

    I'll believe it when I see it. Nvidia has been holding the performance crown for a long time already!

    • @hatty101
      @hatty101 5 หลายเดือนก่อน +3

      I use your profile picture as a toilet paper

    • @magnomliman8114
      @magnomliman8114 5 หลายเดือนก่อน +1

      @@hatty101 you add 0 value to the conversation. what a waste of account space in the servers...

    • @hatty101
      @hatty101 5 หลายเดือนก่อน

      What conversation? There is no conversation. I started one though, if you want to add something to it feel free to. Anyway I like this Ukraine flag as toilet paper, it's cheap. @@magnomliman8114

  • @pasullica
    @pasullica 5 หลายเดือนก่อน +3

    In other words, next gen red and green team cards will be a joke - power-hungry enough to boil some eggs

  • @FunFunFun8888
    @FunFunFun8888 5 หลายเดือนก่อน +1

    I see the used market is suddenly flooded with second-hand RTX40 series cards. I feel sorry for anyone who buys them as they obviously don't know about the replacement with the RTX40 Super series in 2 weeks.

  • @nathanacreman632
    @nathanacreman632 5 หลายเดือนก่อน +2

    Remember when the 7900XTX was called the "4090 killer"?

    • @arenzricodexd4409
      @arenzricodexd4409 4 หลายเดือนก่อน

      2.5 to 3 times faster than 6900XT. 4ghz stock clock.

    • @nathanacreman632
      @nathanacreman632 4 หลายเดือนก่อน

      Still gets pissed on by a 4090, what's your point? @@arenzricodexd4409

  • @Lightsaglowllc
    @Lightsaglowllc 5 หลายเดือนก่อน +2

    Nvidia needs competition…but it’s not coming from AMD.

    • @shayeladshayelad2416
      @shayeladshayelad2416 5 หลายเดือนก่อน

      @lightsaglowllc You know what makes Nvidia number one? The suckers that pay premium dollar for Nvidia shit. It's not the hardware, it's the suckers.

  • @halistinejenkins5289
    @halistinejenkins5289 5 หลายเดือนก่อน +1

    Merry Christmas sir!

  • @Hombremaniac
    @Hombremaniac 5 หลายเดือนก่อน +3

    From the point of view of 90% of gamers the RTX 4090 might not even exist! So if AMD can come up with GPUs that are competitive with their Nvidia midrange counterparts, that is good enough. The same could have happened this gen. Imagine AMD had named the 7900XTX the 7900XT instead, while keeping the price of the 7900XT the same. Basically keep the pricing, but up the performance of each GPU tier, so that the 7800XT would have the performance of the 7900XT and the 7700XT that of the 7800XT, etc. Now that would be a GOOD time for gamers, right? So here is me hoping this is going to happen next gen.
    Let Nvidia have the crown for the fastest Titan-class GPU. Sure, it gives them bragging rights and is great for productivity and AI tasks. But for the vast majority of gamers it is simply out of reach completely. I can also appreciate a Ferrari when I see it, but am more than happy to drive a Ford Mondeo.

  • @swanstudios2018
    @swanstudios2018 5 หลายเดือนก่อน +1

    Paul, you're OG! How about adding some cool screensavers on your PC? You've got those huge screens in the background - it's time to tech them out a bit, bro. Thanks for the videos, I've been a fan for years! Also, I'm thinking AMD might really shake things up this year. All the gaming consoles are using AMD technology, both now and in the future. With NVIDIA's focus shifting towards AI, this could be AMD's big moment, though we've said that before. Jensen might just flex his financial muscle and lavish attention on the 5000 series, just because he can. It's like every major company is snapping up their AI chips - they're selling like hotcakes in a tribal village in the rain-forest for the first time!

  • @jagadiesel1
    @jagadiesel1 5 หลายเดือนก่อน +1

    What’s an SED and an AID? How many dies are we talking about?

  • @chriskeach1908
    @chriskeach1908 5 หลายเดือนก่อน

    RTX 50 sounds like an AI researcher's dream GPU that incidentally happens to play games as well...

  • @christophermullins7163
    @christophermullins7163 5 หลายเดือนก่อน

    I cannot wait for a 3-4x upgrade to my GTX 1080. Should be some decent options in the next few months.

  • @mythos000000025
    @mythos000000025 4 หลายเดือนก่อน

    A better analogy would have been climbing Everest vs diving the Mariana Trench. Both are hard beyond what most people comprehend, but the latter isn't just hard: one slip will kill you faster than you can imagine... even with all the prep in the world and the best gear. Seems easy, till you throw that steel bearing down the hole and see how much it compresses at the bottom.

  • @Eternalduoae
    @Eternalduoae 5 หลายเดือนก่อน

    What in the hell is an AID or a MALL? Paul, sometimes it would be good to provide a quick explainer (unless i missed it in another part of the video? I honestly didn't hear a single description).

    • @Eternalduoae
      @Eternalduoae 5 หลายเดือนก่อน

      Okay Google finally helped, it's an interposer die. Just say that instead of the acronym.
      To be honest, I'm skeptical about that... heat / power / voltage is a big problem in that context.

  • @TPQ1980
    @TPQ1980 5 หลายเดือนก่อน +1

    If you invert the graphics card, you have a platform for converting images into electrical signals and into code. Are we funding AI's visual cortex?

  • @johnvandeven2188
    @johnvandeven2188 5 หลายเดือนก่อน +3

    The American price for the 4090 is $US1600 and the same card in Australia is $AU3200. So, imagine the new 5090, which is expected to be priced at $US2000 in the USA; in Australia that figure will be at least $AU4000. Who in their right mind would buy such an expensive product, considering an entire good quality gaming rig will cost $AU3000? So this one component is worth more than the combined cost of all the other technologies inside that gaming rig. I cry foul at this, and it is very clear that Nvidia sees itself as equal to Apple, or possibly even higher, much higher up that totem pole than any other contender. AMD has positioned itself well since the Zen architecture, which was released only 6 years ago. I see AMD as much more innovative and prepared to experiment with radical design approaches, while Nvidia keeps it all monolithic and frustratingly expensive. The profit margin for Nvidia is 75% as per their last financial review, while AMD is nearer to just 50% - so much for all those naysayers who are constantly demanding AMD lower their prices even more while Nvidia keeps winning that turkey shoot with stupidly high prices.

    • @magnomliman8114
      @magnomliman8114 5 หลายเดือนก่อน +1

      You can blame the shipping industry for that. Third-world countries have it 10 times harder than you.

    • @johnvandeven2188
      @johnvandeven2188 5 หลายเดือนก่อน

      @@magnomliman8114 Exactly. Third World countries do have it the hardest.
      The cost of shipping is negligible at somewhere between $1 to $5 per video card. The rest is greed from the importer and the resellers.

    • @magnomliman8114
      @magnomliman8114 5 หลายเดือนก่อน

      @@johnvandeven2188 sadly true...

    • @maynardburger
      @maynardburger 5 หลายเดือนก่อน

      I have zero issue with the existence of super premium flagship parts at very high prices. What annoys me is that they aren't releasing a more cut down AD102 part at a more affordable price point like they usually do. Something that gives you high end performance that's not miles off the flagship part, but with a more sane price tag. And we can tell by the 4080's pricing they have no intention of doing this whatsoever. What's frustrating is that this has worked, as many people say the 4090 is the only 40 series part that is worth it, even though they've been manipulated into thinking this, even though we've always agreed that these top end flagship parts were poor value.
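
For reference on the Australian pricing comment that opened this thread: the gap is bigger than currency conversion plus sales tax alone would explain. A rough sketch, where the exchange rate is an assumption chosen for illustration and GST is Australia's 10% goods and services tax.

```python
# Rough conversion sketch; the exchange rate is assumed, GST in Australia is 10%.
usd_price = 1600.0      # US price for the 4090 cited in the comment
usd_per_aud = 0.65      # assumed exchange rate
gst = 0.10

aud_converted = usd_price / usd_per_aud
aud_with_gst = aud_converted * (1 + gst)
print(f"Straight conversion: AU${aud_converted:,.0f}")   # ~AU$2,462
print(f"With 10% GST:        AU${aud_with_gst:,.0f}")    # ~AU$2,708
# The rest of the gap to the quoted AU$3,200 is importer/retailer margin and local costs.
```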

  • @Animize
    @Animize 5 หลายเดือนก่อน +2

    I don't care about the PS5 Pro, but - I also don't care about the GPU-market anymore. Got a 6700XT a few months ago (replaced my RX580) and I am happy with that. Nvidia, as well as AMD can keep their 1000 Euro GPUs. It's not even that I can't afford them, but I don't want to… For those 3-6 hours per week, that I actually have the time to play some games? It doesn't make sense to spend so much money on, what is essentially a little visual update, over what I have now. I usually upgrade my components around every 4 years into the next "current-gen" value-king. But this time, I even got a "last-gen" card, as the new generation just wasn't worth the extra. Honestly, for a business trip to the US, I got a Laptop with an RX6700s, which essentially is a Desktop RX6600... Didn't know I would get that, otherwise would have probably even stalled the upgrades for my main-rig. The performance is good enough for me, especially in 1440p.

  • @FlyingPhilUK
    @FlyingPhilUK 5 หลายเดือนก่อน +1

    It used to be that everyone was like 'Don't worry that nVidia has crushed AMD in this generation, just wait until the *Next* generation, that's gonna be great!!'
    - now it's become 'Don't worry that nVidia has crushed AMD in this generation, just wait until the *Next* *Next* generation, that's gonna be great!!'
    - I think we can all see which way this is going!
    🤣

  • @ztuber7763
    @ztuber7763 5 หลายเดือนก่อน +2

    I'm holding off on buying the 7900xtx until I know if there's a 2024 sub-$1000 card that will outperform it

    • @kutark
      @kutark 5 หลายเดือนก่อน

      Rumors are that (whatever they call it) the 8800xt will land between 7900xt and xtx and should be well under 1000, prob close to 600-700.

  • @Aleksdraven
    @Aleksdraven 5 หลายเดือนก่อน

    What does AID mean in the RDNA 5 description leak?

  • @Khepramancer
    @Khepramancer 5 หลายเดือนก่อน +2

    Doesn't AMD always say that?...

  • @fs9553
    @fs9553 5 หลายเดือนก่อน

    You talk about RDNA5. Man 4 still needs to hit the market.

  • @johndoh5182
    @johndoh5182 5 หลายเดือนก่อน +3

    Blackwell is going to be on N3, an improved N3 that allows them higher speed. They don't have to make a MASSIVE die to get about a 25% higher transistor count over the 4090, and they don't really need more than about 25 - 30% transistors, mostly for RT processing.
    Nvidia ONLY needs to add about 25% to raster performance over the 4090. They probably only need to boost AI performance about 10% because DLSS is already excellent on the 4090. They need the biggest boost in ray/path tracing. In fact if they get 20% greater raster, 35% greater RT and 10% in AI with the 5090, they could release another halo product for around $1600.
    Nvidia will have two choices after this and it's dependent on TSMC. If it will take TSMC a couple years to be able to get Nvidia on N2, then the 5090 will probably be a x02 die and after 8 - 12 months they can put out a x01 die for a 5090 Ti and make it about 20 - 25% more performant than the 5090, and they could also put out Super models going down the stack just like they have to right now since they can't release Blackwell for about a year because of the production issue with TSMC N3 (delaying Blackwell).
    Nvidia WILL get Blackwell on a custom N3, and then they'll get their next line on a custom N2, and that N2 product line is going to be scary. It will be at the point where CPUs won't be fast enough for top end 6000 series running 4K HDR with RT. But part of this falls on the game engines because as has been said many times, game engines need to get better at parallel processing so more cores can benefit the performance and not rely solely on IPC and clock speed improvements. Those are going to be harder to come by after Zen 6/Arrow Lake. SO the big push will HAVE to be on the engineers that code the game engines.
    I'm struggling to see why a 5090 would need more memory than a 4090. They don't need a wider bus, they only need to move to GDDR7.
    Next topic. Yes MCM in graphics compute is almost impossible unless you could separate out a type of core, but because you have to put everything together to make an image at blinding speeds I don't know if it's ever going to be a thing short of the way AMD has taken it on, and it's giving AMD a LOT of issues with very lackluster performance at the top end. I don't think it will ever be possible to break up graphics into sectors for MCM because of the edge case. You have objects that are moving between sectors and data that would be processed in one sector will be needed in other sectors. We'll know when it's feasible, Nvidia will release it.
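
To make the transistor-budget arithmetic in the comment above concrete: AD102 (the RTX 4090 die) is publicly stated at roughly 76.3 billion transistors, and the uplift percentages below are the comment's own assumptions, not confirmed specs.

```python
# Back-of-the-envelope sketch of the comment's reasoning; all uplift figures are assumed.
ad102_transistors = 76.3e9           # AD102 (RTX 4090 die), Nvidia's stated figure
budget_uplift = 0.25                 # the ~25% extra transistor budget argued for above

print(f"Implied next-gen budget: ~{ad102_transistors * (1 + budget_uplift) / 1e9:.0f}B transistors")

# The comment's per-workload targets, expressed relative to a 4090 (assumed, not leaked):
targets = {"raster": 1.20, "ray/path tracing": 1.35, "AI/DLSS": 1.10}
for workload, factor in targets.items():
    print(f"{workload:>16}: {factor:.2f}x the 4090")
```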

  • @chillinsince96
    @chillinsince96 5 หลายเดือนก่อน

    Watching from my 7900M m18. Hoping more laptops come out with the 7900M for all the gamers; it's an underrated beast!

  • @mitchjames9350
    @mitchjames9350 5 หลายเดือนก่อน

    Hopefully AMD puts in Tensor Cores for upscaling.

  • @HuGiv5
    @HuGiv5 5 หลายเดือนก่อน +14

    I've heard the term 'Nvidia Killer' has been thrown around since the mid 2000s.
    Nvidia is unstoppable, no one is touching them until 2030 at best.

    • @__-fi6xg
      @__-fi6xg 5 หลายเดือนก่อน +10

      Plot twist: Nvidia is going to be the Nvidia Killer, specifically the marketing team.

    • @iequalsnoob
      @iequalsnoob 5 หลายเดือนก่อน

      @@__-fi6xg lmao not even. people who are adults and have jobs can afford the best, nvidia. all you plebs who refuse to work and live with your parents will keep amd afloat

    • @mnomadvfx
      @mnomadvfx 5 หลายเดือนก่อน +1

      "Nvidia is unstoppable, no one is touching them until 2030 at best"
      Overall yes.
      In gaming?
      Not so much.
      The thing is that not everyone cares about RT, and without that they don't have nearly the advantage you think they do.
      Add in the fact that nVidia's advantage in RT doesn't even matter because it's all based on crutches like biased rendering and texture detail stripping RT denoising anyways and I think that they seriously jumped the gun on the entire thing just to get a short term PR win when raster was starting to run dry for them in GPU sales.

    • @thegamehud8214
      @thegamehud8214 5 หลายเดือนก่อน +1

      @mnomadvfx you realize RT is the future right? Whether anyone likes it or not, so get used to it. I myself look forward to the advancement of the technology to see where it goes. All upcoming games in the future will have its iteration, there's no escaping it. Only broke people with low-end cards will be left behind with only basic graphics to experience.
      It's always the people that can't afford to play with RT that complain that it's useless and it doesn't look that different to RT off.

    • @Elam51
      @Elam51 5 หลายเดือนก่อน

      There’s a reason why Media are AMD shills.

  • @duladrop4252
    @duladrop4252 5 หลายเดือนก่อน

    130+ WGPs is the equivalent of 260 CUs, what the hell, that's already close to their MI300 system. Oh dear, ray tracing on this generation will be awesome and playable now... This is a monster. Imagine, the XTX only has 96 CUs yet its performance is already convincing; how much more with 260 CUs. Bring it on...
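
For context on the numbers above: in RDNA terms one WGP (work-group processor) contains two CUs, and each CU has 64 stream processors, so the conversion is plain arithmetic. The 130 WGP figure itself is the unconfirmed rumour being discussed, not a spec.

```python
# Simple unit conversion; the WGP count is an unconfirmed rumour.
wgp_count = 130
cus_per_wgp = 2        # one RDNA work-group processor = 2 compute units
sps_per_cu = 64        # stream processors per CU in current RDNA parts

cu_count = wgp_count * cus_per_wgp          # 260 CUs
shader_count = cu_count * sps_per_cu        # 16,640 stream processors
print(f"{wgp_count} WGPs -> {cu_count} CUs -> {shader_count} stream processors")
```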

  • @dante19890
    @dante19890 5 หลายเดือนก่อน +1

    Realistically speaking it's impossible to take the performance crown from Nvidia.

  • @sawyerbass4661
    @sawyerbass4661 5 หลายเดือนก่อน

    AMD is absolutely going to refresh Navi 3x. They must have a lot of extra MCDs.
    They can get some low hanging fruit by porting Navi32 to 4nm. A bit more by incorporating whatever tweaks there are in RDNA 3.5. And then more with memory speeds as a last resort, or maybe even how much VRAM from a marketing perspective.

  • @bryantallen703
    @bryantallen703 5 หลายเดือนก่อน

    How many CC's can nVIDIA fit in a 754 mm² die on 4 or 3nm. We know they made the 2080ti that big.

  • @Gindi4711
    @Gindi4711 5 หลายเดือนก่อน

    I don't believe any multi-GCD rumours until I see it on shelves.

  • @DepthsOfHell98
    @DepthsOfHell98 5 หลายเดือนก่อน +4

    sure greasy paul

  • @HansImWald
    @HansImWald 5 หลายเดือนก่อน

    any idea if Nvidia will produce some kind of AI chip for AI-driven NPCs?

  • @nimbulan2020
    @nimbulan2020 5 หลายเดือนก่อน +11

    I'll believe it when I see it. Every generation there's rumors of AMD finally beating nVidia again, but if there's one thing AMD never fails to do, it's disappoint.

    • @ryanlemieux3323
      @ryanlemieux3323 5 หลายเดือนก่อน +2

      To be fair, the 7800 XT slaughters the 4070. AMD has released some epic gems over the years for those with more sane GPU budgets.

  • @mraltoid19
    @mraltoid19 5 หลายเดือนก่อน

    11:45 - Wait, CASH or CACHE ?

  • @ehenningsen
    @ehenningsen 4 หลายเดือนก่อน

    I run AI locally, I love raytracing, i upscale live streams and videos, I use the dual AV1 encoders for VR..
    NVIDIA is the only option for now

  • @elivegba8186
    @elivegba8186 5 หลายเดือนก่อน +1

    Mirror mirror on the wall
    Is this the Nvidia killer?

  • @Cpmnk
    @Cpmnk 5 หลายเดือนก่อน

    Is that audacity in the background? I f*cking love audacity

  • @jodylufc4481
    @jodylufc4481 5 หลายเดือนก่อน

    As Kylo Ren once said, "More, more, more, more, more, more ,more, more, more"

  • @darthpaulx
    @darthpaulx 5 หลายเดือนก่อน

    Keeping the price competitive is more important.
    Seeing Unreal Engine 5 games, they need to at least double the performance from all manufacturers.
    RDNA 3 was messed up.
    If only AMD had names and prices like the 6000 series, Nvidia would have been beaten already.
    But they had to add the XTX and GRE to the names and kept them as the 7900 series.
    If they had the 7900XT for 1000 as their best and the rest going down to a 7800XT (the current 7900XT) for 700, then Nvidia would have been screaming.

  • @Logical
    @Logical 5 หลายเดือนก่อน +1

    RDNA # THE NVIDIA KILLER?!?!
    Performance doesn't matter, it's the marketing that you have to beat.

  • @Neopulse00
    @Neopulse00 5 หลายเดือนก่อน

    I don't see them churning these out until TSMC finishes the factory in 2025.

  • @moneyman4591
    @moneyman4591 5 หลายเดือนก่อน +2

    We want amd to do well, makes it better for pc gamers overall

    • @undefwun2135
      @undefwun2135 5 หลายเดือนก่อน

      Not just gamers. They need to address the cuda problem. Us smaller creators who also game will always have to go nvidia. And the big boy studios, RnD facilities etc. A lot of money has gone into development on the proprietary CUDA platform

  • @fleabaglane
    @fleabaglane 5 หลายเดือนก่อน

    i9 10900K and RTX 3090 are a great match at 1440p max settings on a 27in monitor. I have 80+ games from 2010-2023; games from 2017-2023 still do 130-165 FPS at 1440p max settings.
    However, about 8 to 10 games run in the 90s-130s FPS; I feel it's the game engine, because it's still super smooth.
    i9 10900K, 10 cores @ 5.0 GHz
    32GB DDR4-3000
    RTX 3090 Gigabyte Gaming OC, 2100 core @ 9875 memory
    Samsung EVO Plus 2x 1TB NVMe M.2
    1000W PSU
    Asus ROG Swift PG279Q, 1440p 165Hz IPS G-Sync WQHD 27in
    ASUS STRIX Z490-E Gaming

  • @rossgarner58
    @rossgarner58 5 หลายเดือนก่อน

    Merry Christmas dude.

  • @jeevan1198
    @jeevan1198 5 หลายเดือนก่อน +5

    Hopefully, Battlemage ends up being a massive performance increase over Alchemist, with good drivers and reasonable prices. If Intel can manage to pull all of that off, they will definitely earn respect from many gamers, and hopefully, that will be enough to pressurise Nvidia and AMD to do better in the gaming market. Battlemage won't compete against GPUs like the 4090, but that doesn't mean Intel won't be able to shine in the gaming market. Far from it. AMD is expected to return to the high end and enthusiast class segment with RDNA 5, which is rumoured to be released around Q4 2025. Celestial shouldn't be too far away from launching then as well, and they reckon Intel will compete in the enthusiast segment with Celestial, according to reliable leakers such as Kopite7Kimi and RedGamingTech. So hopefully, there will be strong competition within a few years. If that ends up being the case, then Celestial and RDNA 5 could bring the heat to Nvidia. I reckon Celestial will outperform RDNA 5, at least when it comes to ray tracing. Intel is already catching up to AMD in RT performance. That's the only way I can see the gaming market getting better, and even then, it might not.

    • @TheGeneReyva
      @TheGeneReyva 5 หลายเดือนก่อน +2

      th-cam.com/video/64Dsgy5b2kk/w-d-xo.html
      ~4080 to just over, all the way to 4080 compute and 4070 gaming (oof) at 9:35 in that video. The massive improvement over the ~3060? performance of the A770.
      Considering the 40 Super cards will be out at the start of the year, vs Q2? for Battlemage... Intel might have to go for the same approach as Alchemist. Scrappy cheap underdog.
      Maybe celestial will have a card that competes at the '80 level, and Druid will bring the beast that chests up to the '90 card of the generation.
      I wonder what E will be. Elementalist?

    • @technicallycorrect838
      @technicallycorrect838 5 หลายเดือนก่อน

      @@TheGeneReyva C, D and E will not be anything if Battlemage does not deliver and shareholders cut off the branch that is the GPU division.

  • @icanmakeeverythingilovedie9861
    @icanmakeeverythingilovedie9861 5 หลายเดือนก่อน +1

    Nvidia won't be killed. Not by AMD, at least. They own the productivity space, from rendering to editing to AI. The market share has and will continue to say as much as long as AMD continues to think that gamers are their sole audience.

  • @heilong79
    @heilong79 5 หลายเดือนก่อน +1

    They need CUDA-type cores for all the new AI software that will be coming out in the next year.

  • @Wooskii1
    @Wooskii1 5 หลายเดือนก่อน

    I refuse to play Doom, Cyberpunk, Returnal, etc. with a controller. At the same time, I have a Series X controller on my desk right next to my mouse. Consoles don't even have exclusives that I'm interested in (maybe Gran Turismo) these days, so the price is the ONLY advantage but without standard M&KB support it's a non starter for me personally.
    If Sony or Microsoft worked with game devs to figure out standard mouse and keyboard on their consoles, I think more people would consider going back to console. I mean, there are PLENTY of games, maybe a majority, that are available on console and PC, some with cross platform play (M&KB support could also help there).
    Also, they could open up to mild productivity with some Word type app, better web browsing, Discord, etc. Just typing anything on console is another qol issue holding it back.

  • @Licklobster
    @Licklobster 5 หลายเดือนก่อน

    whichever team comes out with the first hbm3 consumer gpu has my money

    • @arenzricodexd4409
      @arenzricodexd4409 4 หลายเดือนก่อน

      HBM4 is already in development. And recent news talks about how the price of HBM has gone up to 5 times more expensive due to demand from the professional space (mostly AI).

  • @OneCrazyDanish
    @OneCrazyDanish 5 หลายเดือนก่อน

    @8 minutes: Or an Alien landing with a gifted GPU 6000 generations ahead of whatever they pulled out of the crash at Roswell :)

  • @Brent_P
    @Brent_P 5 หลายเดือนก่อน

    Blackwell is *NOT* a GeForce part. It is the successor to Hopper.

  • @Kitten_Stomper
    @Kitten_Stomper 5 หลายเดือนก่อน +1

    Isn't RDNA 5 going to be competing with RTX 60?

  • @bodieh1261
    @bodieh1261 5 หลายเดือนก่อน

    Yea... as a long time amd fan and a current owner of a 7900xtx, I'll believe it when I see it. Either way by the time RDNA 5 and RTX 50 series hit the shelves, AMD is no longer gonna be able to get away with the whole "but our card competes with Nvidia in raster performance" argument anymore. This is probably the last generation where that argument is gonna fly, for me anyway. It only took 3 generations but RT is definitely here to stay and will very much be a staple going forward. I for one will not be willing to accept compromised RT performance when it comes time for my next upgrade, whenever that'll be.

  • @anantpai5092
    @anantpai5092 5 หลายเดือนก่อน

    Seems like this guy knows more about Nvidia and Amd products than Jensen and Lisa su put together

  • @romangregor4552
    @romangregor4552 5 หลายเดือนก่อน

    I see the rumours of the specs at other sites and I think it's really sick, but I'm not interested in ultra high end for my own PC; I'm waiting for the mid-tier RX 9000 specs. I hope a "9700" gets the same power as a 4080 or a 4090 :D
    This would be fun: an RX 9900XTX with 4.1x the power of the 4090, an RX 7800XT with 3.1x the power of the 4090, an "RX 9700XT" at 2x or the same speed as the 4090, an "RX 9600XT" with the power of a 4080, an "RX 9500XT" with the power of a 4070 Ti, an "RX 9400" with the power of a 4070 :D

  • @libertysound8575
    @libertysound8575 5 หลายเดือนก่อน

    Yo, you got sunburnt, I did too lol. Wouldn't it be amazing if something dethroned the 5090?

  • @HanzoReiza
    @HanzoReiza 5 หลายเดือนก่อน

    Man, custom RDNA5 / Zen 5 for MS in 2026 could be tasty so could we see Custom RDNA5 and Zen 6? for Sony in 2027?
    Anyhow, Merry Xmas if it is something you celebrate and happy gaming all

  • @frallorfrallor3410
    @frallorfrallor3410 5 หลายเดือนก่อน

    Personally I only buy the highest-end GPU, no matter if it's Nvidia, AMD or Intel Arc. Everything works, everything has different pricing, but they're still all high end.

  • @mandasantoso
    @mandasantoso 5 หลายเดือนก่อน

    Honestly, is nVidia some kind of immortal or a zombie? Because it has surely been killed tens of times already...

  • @tomtomkowski7653
    @tomtomkowski7653 5 หลายเดือนก่อน +4

    AMD killing Nvidia episode 45552.
    AMD bought ATI in 2008 with 50% of the market share and AMD was a bigger corporation than Nvidia. Since then this has been one constant downfall.
    JustWait™ xD
    Intel (if they keep pushing) will sooner catch AMD than AMD will catch Nvidia.

    • @urktklirk9770
      @urktklirk9770 5 หลายเดือนก่อน +1

      In Germany we say: "Geschichten aus dem Paulaner Garten" (tall tales from the Paulaner beer garden)

    • @nipa5961
      @nipa5961 5 หลายเดือนก่อน +1

      @@urktklirk9770 Mindfactory says AMD had a 61% market share last week.

  • @markus.schiefer
    @markus.schiefer 5 หลายเดือนก่อน

    To be honest, talking about GPU generations that are years away makes little sense to me anymore after the last few releases were rather disappointing. Especially when it comes to AMD.

  • @The_Word_Is_The_Way
    @The_Word_Is_The_Way 5 หลายเดือนก่อน

    "Nvidia killer" should just be dubbed "Nvidia competitor", because that's all we really need anyway.

  • @marcusjackson2874
    @marcusjackson2874 5 หลายเดือนก่อน

    Lol, feels like only yesterday I got the RTX 3070 Ti, but I had to upgrade because apparently 8GB isn't enough anymore. I was gonna get an RTX 4070 Ti or 4080, but looking online they're saying 16GB cards are recommended now, so I said f it and went back to AMD with the 24GB 7900XTX. Should last me longer than just under 2 years lol

  • @BLASTIC0
    @BLASTIC0 5 หลายเดือนก่อน

    Still using Windows 10. Nice. I'll be switching to Linux before I go to 11.

  • @s03t01o02
    @s03t01o02 5 หลายเดือนก่อน

    First of all, the Nvidia info is clearer.
    Yes, Nvidia's GB202 exists and has 192 SMs, because it is set up as 12 GPCs with 16 SMs per GPC; nothing changes in the SM itself.
    Surprisingly, the N3E node pushes clock speeds up to 3.2 GHz.
    It is possible that Nvidia cancelled the 142-SM AD102 SKU; therefore the RTX 5090 is basically GB203 with 142 of 144 SMs (8 GPCs).
    Including the clock bump, this alone would be +40% vs the RTX 4090.
    Later, at the launch of RDNA 5, a flagship chip with a possible 180 CUs at around 4.0 GHz equals close to 110 TF, vs Nvidia's GB202 with 180 SMs at 3.2 GHz = 101 TF.
    On AMD RDNA 5, one chip has 3 GCX with 60 CUs each, a larger L1 cache, and more Infinity Cache on the interposer (chip on chip), and like you said no MCDs anymore; a 384-bit memory interface is safe to say, possibly 512-bit.
    Those plans sound like 3D packaging and an advanced 3D design; we will see if this comes before the A14 node - it would be necessary.
    TSMC's N3 node doesn't include chip-on-chip manufacturing, only cache on top. It is possible, but it limits clock speeds to only 3.0 GHz, so it makes no sense on a GPU. More CUs couldn't fix this.
    Like Nvidia, AMD is driver-bound at its maximum number of shader engines; the only solution is more ALUs per CU, from 64 up to 192, which would fix this, and as far as I can see that design is not possible for RDNA. A groundbreaking new architecture makes no sense.
    RDNA 3 lacks clock speed and power savings for a known reason (too-small SRAM size); the solution is making that bigger and scaling with it. The maximum is 3 GCX, and 3 GCX at 60 CUs per GCX = 180 CUs at maxed clock speed. I believe MCM still exists and gets an active interposer with MCDs in N4X, 8 times 64-bit = 512-bit; each MCD has 64 MB of L3 cache to increase bandwidth, and this is paired with 16 Gbit 28 Gbps GDDR7.
    A possible solution is the N3 node with a basically unchanged GCD split into 3 GCX of 60 CUs each (the same RDNA 3 design), paired with an active interposer and, beneath it, 8 MCDs of 64-bit each on the N4X node with larger SRAM cells as well.
    3.55 GHz is possible with this; with the addition of the N3 node it comes close to 4.08 GHz. The only thing I am worried about is power consumption; if AMD uses the N3 node on all chips that's no problem and the clock speed is guaranteed to reach 4.08 GHz.
    But the design that you describe is not possible on the N3 node; TSMC doesn't support 3D packaging until A14. TSVs are the only solution for this, and that limits clock speeds. The reason is, as most of the time, power delivery: too much voltage is required and can damage the SRAM, which results in lower voltage and lower clock speeds; a maximum of 3.0 GHz would be possible.
    And like I said, more CUs don't fix that, but more ALUs per CU could. But that makes no sense so early; only the RDNA successor around 2030 makes sense to do that.
    RDNA is basically a 32-bit architecture, like Nvidia has done since 2006: each ALU has a fixed datapath to 3 types of ALU, an AI core, and an RT core. The maximum datapath, due to the 32-bit limit, is 4096-bit, which limits the ALUs per CU to 64.
    Nvidia has overcome the 32-bit limit at the ALU level and can scale up to 256 ALUs per SM; the next Nvidia design, Rubin, has 192 ALUs per SM on the N2X node. After this, MCM would be necessary.
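
The TFLOPS figures in the comment above can be sanity-checked with the usual peak-throughput formula: shaders x 2 FLOPs per clock (one fused multiply-add) x clock in GHz. The unit counts and clocks below are the comment's rumoured values, and whether RDNA 3-style dual-issue is counted doubles the result, which is why published TFLOPS numbers for the same part can differ; treat the output as illustrative only.

```python
# Peak FP32 sketch: TFLOPS = units * lanes_per_unit * 2 FLOPs (FMA) * clock_GHz / 1000.
# Unit counts and clocks are the comment's rumoured values, not confirmed specs.
def peak_tflops(units, lanes_per_unit, clock_ghz, dual_issue=False):
    flops_per_lane_per_clock = 2 * (2 if dual_issue else 1)   # FMA, optionally dual-issued
    return units * lanes_per_unit * flops_per_lane_per_clock * clock_ghz / 1000

print(peak_tflops(180, 64, 4.0))                   # 180-CU RDNA part at 4.0 GHz: ~92 TF
print(peak_tflops(180, 64, 4.0, dual_issue=True))  # counting dual-issue: ~184 TF
print(peak_tflops(180, 128, 3.2))                  # 180-SM Nvidia part at 3.2 GHz: ~147 TF
```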

  • @paulkendall6069
    @paulkendall6069 5 หลายเดือนก่อน +1

    I think a better way to look at chiplet GPUs is to compare it to D-Day in WW2. The Allies had to get sufficient troops on the beach to fight their way off the beach: too few and they would be pushed back into the sea; too many and they would be easy targets, and the beach would gridlock, causing supply issues, which is just as bad as too few. A GPU needs to move large amounts of data; too slow and you will get bottlenecks and stalling of tasks. I believe the problem with high-end RDNA 3 & 4 is the delay moving data to where it's required in the shortest time possible, which is required to meet the performance target, and due to glitches and the bandwidth available they didn't meet requirements; as they were clocked higher they had timing and bandwidth issues that caused glitches or stalls or both. However, I've read that faster interconnects with higher bandwidths are now slowly becoming available to use; the cost is also dropping, making it more viable for AMD to use. AMD also have some very clever people who have got chiplets working; I wouldn't bet against them doing it with GPUs, they have a head start. Nvidia will have been looking at and developing a chiplet-type GPU too, as costs of wafers rise and every mm counts, so a design that allows you to use more of the wafer gives better yields. If AMD bring out a monster chiplet RDNA 5 that blows Nvidia away, don't be surprised if you see Jensen holding up a chiplet-based RTX card soon after that's close, if not slightly better.

  • @Mr.Canuck
    @Mr.Canuck 5 หลายเดือนก่อน

    Every new iteration from AMD is always the same old song and dance, they will always be playing catch-up with their drivers and gpus, they will always be the go-to for the budget oriented. Hoping Intel can pull up the slack in the coming years.

  • @ChrisJohnson-cp5ff
    @ChrisJohnson-cp5ff 5 หลายเดือนก่อน

    Man, they've been talking about Nvidia killers since 2012

  • @RTX4090TISLI
    @RTX4090TISLI 5 หลายเดือนก่อน

    Meteor lake reviews please

  • @delatroy
    @delatroy 5 หลายเดือนก่อน

    Maybe AI ASICS killing nvidia general purpose GPUs will bring nvidia back to gamers

  • @chris.dillon
    @chris.dillon 5 หลายเดือนก่อน

    12:40 If there's a bubble (there usually is but a black swan) then there will be a pivot but the work thus far does not disappear. Gamers funded AI. The dotcom boom funded Google and Cisco. Mobile funded TSMC. These are the economic engines that brought on the next thing. The thing with AI is, I think the fossils it will leave is something like (mimicked) intelligence. Definitely hardware and investment, it's already been happening for a while in HPC. Even if it all collapses (like Cisco), the fiber fossils were already buried. That's why I think this isn't like the 80s AI Winter this time. The models are already in use and unlikely to rollback.