New Nvidia Leaks? Enhanced DLSS + What Is Neural Rendering?

  • Published Jan 2, 2025

Comments •

  • @datageek9132
    @datageek9132 2 days ago +335

    Jensen: "Neural Rendering saves so much RAM you can have a 2GB RTX 5060 for £800."

    • @xviii5780
      @xviii5780 2 days ago +10

      It probably uses even more vram so you actually can't use it on 5060 🥴

    • @roklaca3138
      @roklaca3138 2 days ago +6

      Damn, exactly what I wanted to write... with 1GB of VRAM for €1000.

    • @apokalip6
      @apokalip6 2 days ago

      The 9070 is obsolete garbage that can't even handle path tracing, only worth whatever AMD fanboys are willing to pay, hahaha.

    • @gamingfromjohnwayne
      @gamingfromjohnwayne 2 days ago +2

      Shit, don't say that, he'll be on that faster than his wife lmao😅

    • @puregarbage2329
      @puregarbage2329 2 days ago +3

      So funny because it’s true.

  • @elcazador3349
    @elcazador3349 2 days ago +389

    "The more you neural render; the more you save VRAM." - JenseNvidia, probably

    • @christophermullins7163
      @christophermullins7163 2 days ago +63

      But only in specific games that implement it. And it'll only save 10%. And only on the 50 series. It'll be another "4x performance with frame gen", etc. Basically lies.

    • @elcazador3349
      @elcazador3349 2 days ago

      @@christophermullins7163 "Terms and conditions may apply."

    • @apokalip6
      @apokalip6 2 days ago +5

      "VRAM is still everything." Lisa Su, like in the last century.

    • @briananeuraysem3321
      @briananeuraysem3321 2 days ago +5

      And it takes about 3x longer to compute, per the original paper.

    • @lawnmanGman
      @lawnmanGman 2 days ago +6

      The matrix math for temporal is stored in VRAM... ACTUALLY

  • @igorrafael7429
    @igorrafael7429 2 days ago +34

    With neural rendering you no longer need a GPU with lots of VRAM; with frame gen and DLSS upscaling you no longer need 500W feeding 10240+ CUDA cores. This is why we are launching the GeForce PTX 5090 with 256 CUDA cores, 1GB, and a 64-bit bus for $2499. Happy new year, gamers!

    • @beeping2blipping
      @beeping2blipping 3 hours ago

      You forgot that with neural rendering you will need to stay connected for our 15-minute ads and have working internet while playing any game on your computer or mobile device. If you disconnect or skip ads, we will charge you an amount you will never forget and cap your game at 25 FPS, as in the good old days of TV, which is where people who don't follow the Nvidia Neural policy belong.

  • @lukasgroot
    @lukasgroot 2 days ago +26

    I've been neural rendering my whole life already.

  • @jasonhemphill8525
    @jasonhemphill8525 2 days ago +84

    Just a reminder that the 80 class is shaping up to be the most cut-down 80 in Nvidia history: 50% of the total 102-die cores.
    80s used to be built on 100-class dies. Now they are selling us scraps for triple the cost and asking that we be grateful for the insult.

    • @CrashBashL
      @CrashBashL 2 days ago +15

      It's really a 5070 under the 5080 tag.
      They tried to scam us like this with the 4080 when it launched but the people didn't let it fly.
      Don't forget that!
      Don't let Nvidia scam us this time either!

    • @maynardburger
      @maynardburger 2 days ago +1

      x80's varied from either full fledged upper midrange dies or cut down high end dies. 680, 980, and 1080 were all upper midrange dies, for instance. But Nvidia is lowering the '2nd best die' to be way lower spec. If GB202 is the top end, then GB203 is basically what Nvidia would have called GB204 in pre-40 series generations. They've actually raised the naming by TWO levels per tier, essentially. The 4080 is more like what the 3070 was. With 4070Ti in between that is literally TWO tiers of naming and pricing being raised. From $500 to $1200 in one generation. Insanity. And they're just making it even worse now. And most people who complain about it will still buy eventually anyways. smh

    • @jasonhemphill8525
      @jasonhemphill8525 2 days ago +4

      @@maynardburger I'm happy to see I'm not the only one who remembers.
      The 80 Tis used to be 93%+ of the full 102 die.
      Shameful.

    • @mexreax4493
      @mexreax4493 1 day ago +1

      Yes, because a bunch of dumb companies will buy 5090s for AI. They stopped producing the 4090 so they can sell the 5080 at the same price point and make so much more profit. All business. I won't be buying a GPU for about 4 years.

    • @TiagoMorbusSa
      @TiagoMorbusSa 1 day ago

      @@CrashBashL Incorrect: it is a 5060 with a Titan price tag.
      Let me remind you the GTX 760 had half the transistors of the Titan of that generation.
      The GTX 1060 had two thirds of the transistors of the top of the line GTX 1080 at the time of release.
      The RTX 4060 has less than 25% of the RTX 4090.

  • @dotxyn
    @dotxyn 2 days ago +51

    11:00 Nvidia's master plan to push people into the 2yr GPU upgrade cycle instead of 4yrs

    • @garyb7193
      @garyb7193 2 days ago +6

      Yes, Nvidia will likely force the tech into the RT/PT pipeline and/or image scaling and rendering somehow making sure mid-level cards such as the 5070 run circles around 4090 cards in neural rendering. Planned obsolescence at its best.

    • @evilzinabyssranger5695
      @evilzinabyssranger5695 2 days ago

      They realised that gamers are just CRACKHEADS, addicted BEETCHES, who will buy whatever they launch anyway.
      AFTER they launched 12GB cards for miners and 6 to 8GB cards for gamers, they realised that.
      It is what it is. I hope they sell the 5060 with 6GB and a 64-bit bus, because people will still think it's a status symbol to buy Nvidia.

    • @Micromation
      @Micromation 2 days ago +7

      Jokes on them I'm not upgrading 4090 until I can get acceptable performance (120fps

    • @IgnacioGouk
      @IgnacioGouk 2 days ago +5

      Well, jokes on Nvidia, I'm poor as fuck...

    • @richardnguyen1520
      @richardnguyen1520 2 days ago +7

      @@Micromation DLSS is imperfect, but it's hardly garbage. You squeeze so many more frames out while retaining pretty impressive visual fidelity. Now FSR... that's garbage.

  • @Kneel2ThaCrown
    @Kneel2ThaCrown 2 days ago +48

    Daniel Owen had a good video breaking down the article Nvidia had on this

    • @RepublicofODLUM
      @RepublicofODLUM 2 days ago +13

      Love his videos. He does such a great job of breaking tech down for someone who isn't well versed in all the verbiage that some of these other channels use.

    • @LordJapos
      @LordJapos 2 days ago +5

      Except he was most likely wrong, neural compression probably isn't what neural rendering is referring to.

  • @rains2693
    @rains2693 2 days ago +3

    Journalism hates $600 consoles, but never criticizes the price of graphics cards. Digital Foundry should be different.

  • @vitordelima
    @vitordelima 2 days ago +65

    There is more information on NVIDIA's website, such as neural materials and neural texture compression.

    • @Kinslayu
      @Kinslayu 2 days ago +2

      I'm guessing it's this

    • @pctechloonie
      @pctechloonie 2 days ago +6

      The keynotes from Siggraph over the years are also very interesting. Yep the neural rendering thing is not a new term. NVIDIA research has been using it for years.
      Can't wait for the next logical step in game rendering. DLSS was merely a stepping stone. What comes next is going to be absolutely bonkers.

    • @vitordelima
      @vitordelima 2 days ago +3

      @@pctechloonie One of the possible goals for this would be the use of AI picture generators to directly render the final image somehow.

    • @pctechloonie
      @pctechloonie 2 days ago +4

      @@vitordelima the NeRF stuff applying as a filter on top of some rasterized bare bones implementation or even replacing it completely could be very interesting.
      Will be exciting to see where the technology is in 3-5 years.

  • @Alexx_80
    @Alexx_80 2 days ago +98

    I'm disappointed that professionals like you didn't watch the NVIDIA SIGGRAPH 2024 presentation about real-time neural rendering, covering material shaders (hierarchical textures). You should definitely read the article, which is available online.

    • @vitordelima
      @vitordelima 2 days ago +19

      Or anything about AI-based Global Illumination.

    • @pctechloonie
      @pctechloonie 2 days ago +23

      Very disappointed as well.
      All DF has to do is search "NVIDIA neural rendering" online and a ton of research papers come up.

    • @SALTINBANK
      @SALTINBANK 2 days ago +2

      Amen

    • @nosult3220
      @nosult3220 2 days ago +2

      They’re stuck in the past since they don’t talk about AI in the correct format.

    • @JamieWoods00746
      @JamieWoods00746 2 days ago +8

      They're not up to speed on AI

  • @LpG88
    @LpG88 2 days ago +73

    Can’t wait for the comment “Can’t see any difference” when they compare 4090 with 5090 or 4080 with 5080

    • @Mcnooblet
      @Mcnooblet 2 days ago +12

      That's because most never get to see anything with their own eyes. Their whole life experience is based on YouTube videos, which aren't a 1:1 match for what you would actually see. I think people who have owned enough hardware have surely had moments like: wait, why does it look different on YouTube? Why does YouTube hide the over-sharpening effect of the PS5 on AC Valhalla? How can I compare a 4090 to a PS5 for visuals and see something entirely different from what YouTube is showing me? Are there incentives or an agenda to show parity? Are settings being held back to make one look closer? Is there a marketing reason aliasing isn't showing up in a YouTube video? Who knows the full scope of what YouTube video compression does; I know over-sharpening makes still images look very good, and sometimes it is hard to tell in motion unless you're seeing it in person with your own eyes. But then you play it yourself and it looks pretty ugly and noticeable.
      DF has warned about this many times, even with the PS5 vs PS5 Pro: if you see it with your own eyes, it really is significant versus what you will see in a YouTube video.

    • @LpG88
      @LpG88 2 days ago +13

      I have both a PS5 and a PS5 Pro, and I can't see any difference in the compressed YouTube videos, but I see it at home on my OLED TV. Most important of all, 90fps 4K feels great in Stellar Blade but is impossible to show in a YouTube video.

    • @steve9094
      @steve9094 2 days ago +15

      @@Mcnooblet Yeah, it's ridiculous how everyone keeps saying that the PS5 Pro versions of games look no different from the base PS5 versions, when in reality a ~40-50% increase between GPUs (RX 6700 vs RX 6800) is a very noticeable performance uplift.

    • @DragoniteSpam
      @DragoniteSpam 2 days ago +10

      For most people I don't think the problem is that they don't see a difference; it's that they don't see enough of a difference for a four-digit price tag not to be an eye roll.
      If you don't buy the card and don't know anyone IRL who owns the card, of COURSE you're only going to see the YouTube videos, and if the YouTube videos don't make a convincing sales pitch, then you're probably not going to take a gamble on a card with a four-digit price tag.

    • @VampireNoblesse
      @VampireNoblesse 2 days ago +2

      A.I. hardware should focus on NPC behaviour, dialogues, and interactions for more immersion, not just "slightly better grafix"...

  • @DrunkencoderPayk
    @DrunkencoderPayk 2 days ago +71

    You guys do know that Nvidia has released papers and demos of neural rendering as well as neural texture compression?
    (1-2 years ago...)
    The latter is interesting as well: 4x the pixels while allocating the same amount of VRAM as other compression methods, at the cost of slower rendering. But if placed well within the render cycle, it may not increase render time.
    And the neural rendering stuff seems big to me. It appears we get rid of triangles, but Nvidia's demos suggest it might be used as a material.

    • @John4343sh
      @John4343sh 2 days ago +19

      Well you are 100 percent more academic than everyone else in the comment section thus far. Thank you for making such strong points.

    • @faustianblur1798
      @faustianblur1798 2 days ago +12

      God of War Ragnarok uses ML to infer texture detail on PS5.

    • @DrunkencoderPayk
      @DrunkencoderPayk 2 days ago +4

      @@John4343sh thanks :)

    • @aviagrawal5903
      @aviagrawal5903 2 days ago +2

      @@faustianblur1798 Does this mean AMD is doing something similar?

    • @yusmeow
      @yusmeow 2 days ago +8

      YESSS, FINALLY SOMEONE WHO ALSO READ THAT ONE ARTICLE! Yeah, they did a test of it making textures sharper and better while using less memory.
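The neural-texture-compression idea discussed in this thread can be sketched roughly like this. This is a toy NumPy illustration under invented assumptions (random weights, a 64x64 4-channel latent grid, a single linear decoder layer), not NVIDIA's actual scheme: store a small latent grid plus tiny decoder instead of a full RGBA texture, and decode texels on demand.

```python
import numpy as np

# Hedged sketch: a learned latent grid + tiny decoder standing in for a
# conventional RGBA texture. All sizes and weights are invented placeholders.
rng = np.random.default_rng(1)

LATENT_CH = 4
latents = rng.standard_normal((64, 64, LATENT_CH)).astype(np.float32)  # resident in VRAM
W = (rng.standard_normal((LATENT_CH, 3)) * 0.1).astype(np.float32)     # toy decoder weights

def decode_texel(u, v):
    """Sample the latent grid (nearest-neighbour) and decode to RGB."""
    x = latents[int(v * 63), int(u * 63)]
    return 1.0 / (1.0 + np.exp(-(x @ W)))  # sigmoid keeps RGB in [0, 1]

# Storage comparison: a 1024x1024 RGBA8 texture vs this latent representation.
raw_bytes = 1024 * 1024 * 4
neural_bytes = latents.nbytes + W.nbytes
print(raw_bytes // neural_bytes)  # → 63
```

The trade-off the thread mentions shows up here too: VRAM shrinks, but every texel fetch now costs a small network evaluation instead of a plain memory read.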

  • @kathrynck
    @kathrynck 2 days ago +48

    Well, the main purpose of the feature is to try to sabotage backward compatibility for the sake of forcing sales of new product.

    • @justhomas83
      @justhomas83 2 days ago +3

      I'm staying on my X570 platform with an RTX 4090. Something about the RTX 5090 scares me, honestly. Plus it is going to be 2800 cash. Nah, I'm good.

    • @TheRandomGamerMoments
      @TheRandomGamerMoments 2 days ago

      Just means PC ports of console games will be worse than they already are. Games will be designed for whatever the next gen of consoles has to offer, but we'll still be getting games targeting the PS5.

    • @HanSolo__
      @HanSolo__ 2 days ago

      @@justhomas83 Pays $1600+ for GPU and says, "I'm not paying $2800 cash; I'm fine." lol

  • @Alovon
    @Alovon 2 days ago +16

    My take is that even if they aren't talking about Ray Reconstruction and the other DLSS features ATM, most routes to "Neural Rendering" shouldn't be so expensive as to be exclusive to the RTX 5000 series.
    Even with Ray Reconstruction, the Tensor Cores on RTX 3000 cards still struggle to get saturated, especially at the higher end.

    • @Navhkrin
      @Navhkrin 2 days ago

      They need separate models for different tier cards to properly saturate the tensor cores.

    • @SALTINBANK
      @SALTINBANK 2 days ago

      Proof? How do you manage to monitor tensor cores, just asking?

    • @Mcnooblet
      @Mcnooblet 2 days ago

      Aren't tensor cores usually matched 4 to 1 with SMs, and RT cores 1 to 1? I think at one time tensor cores used to be 8 to 1 per SM until they improved them, so more isn't always better. Also, across the generations of tensor cores, I think FP8 support was added for the 40 series. No clue how all of this plays out or why they added certain things per generation of tensor core. Maybe it has nothing to do with gaming; maybe it will eventually? Maybe it is going to be specifically for AI applications. I'm curious to see what they do with all of it.

    • @Alovon
      @Alovon 2 days ago +1

      @SALTINBANK There are some tools you can use to measure % usage of the Tensor Cores, and even when they light up for a specific part of the pipeline, the % used barely cracks 10%, even on some midrange cards at worst.

    • @aviagrawal5903
      @aviagrawal5903 2 days ago

      @@Alovon Yeah, unless Nvidia supports the software side, we're out of luck. Hence why I like AMD more conceptually: they build to be open source, so others can improve on their implementations (e.g. PSSR).

  • @CeceliPS3
    @CeceliPS3 2 days ago +10

    I've seen a video, probably from Daniel Owen, saying that this new neural rendering thing runs concurrently with path tracing (or was it ray tracing? idk) calculations, so it doesn't actually add to processing time. And it was done using a 4090. So, in theory, we should get it, unless NVidia gets even greedier.

    • @blackface-b1v
      @blackface-b1v 2 days ago +3

      It's supposed to help make path tracing run faster.

  • @IvanSchoeman
    @IvanSchoeman 4 hours ago

    My guess is that neural rendering is a hardware version of what Lumen does or something to approximate the space around a single ray or maybe something that helps a tracer find light paths faster.

  • @skyhighnightlight
    @skyhighnightlight 8 hours ago

    Have a Great Year Richard!!! And the guys too. Good Luck this year.

  • @Atrumoris
    @Atrumoris 1 day ago +1

    Okay, so what would all this mean for generative AI? More specifically, I am very interested in AI video generation. I know there are plenty of online platforms like Kling, Luma, Hailuo etc. that offer those services, and that they're pretty good at it, but I am adamantly against any sort of monthly subscription.

  • @soulsmith4787
    @soulsmith4787 2 days ago +1

    Perhaps an AI inferred renderer instead of traditional rasterization? There was an Nvidia paper that I don't recall the name of, but it had an offline scene that was rendered 1,300% faster with an AI technique. A more recent paper was on neural materials although I think the bullet point is referring to the former AI rendering technique.

  • @ataksnajpera
    @ataksnajpera 2 days ago +24

    They will use AI to upscale low resolution mip-maps to save on VRAM!

    • @vitordelima
      @vitordelima 2 days ago +1

      There is already a texture compressor based on AI.

    • @pctechloonie
      @pctechloonie 2 days ago +2

      @@vitordelima yes but have they been used in games? If you know of any game implementations please let me know.

    • @vitordelima
      @vitordelima 2 days ago

      @@pctechloonie Of course it wasn't.

  • @blakewilliams5627
    @blakewilliams5627 2 days ago +3

    Dell should team up w/ Nvidia to finally make the Alienware UFO.

  • @RepaireroftheBreach
    @RepaireroftheBreach 2 days ago +5

    Seems like you guys haven’t seen nvidia’s video on neural rendering from while back. It has to do with a hybrid of ray tracing and path tracing in real time. Check it out. Mystery solved. I’m sure nvidia will reveal more details at CES.

  • @Shiffo
    @Shiffo 1 day ago

    so how much is a 4080 worth on the used market currently?
    I am not finding any 4080's used in my country on the marketplace.

  • @Arcticwhir
    @Arcticwhir 2 days ago +1

    neural might refer to NeRF (neural radiance fields ) and that tech...?

  • @gamingfromjohnwayne
    @gamingfromjohnwayne 2 days ago +1

    I wonder if you can buy those VRAM chips and put them on the 40 series?

  • @jasonfch
    @jasonfch 2 days ago +5

    I would like to see neural AI polygon generation, where polygons are generated in real time based just on the skeleton rig: everything from the human model, clothes/equipment, AI character animation, facial animation, to textures and texture scaling. It would also apply to environment generation, based on a simple spline shape and criteria defined by the game developers. To simplify: there are no 3D models, textures, or animations in the game files, just criteria and instruction sets.

    • @vitordelima
      @vitordelima 2 days ago +2

      It's too slow currently but adaptive tessellation is the closest you can get to it nowadays.

  • @yogiwp_
    @yogiwp_ 2 days ago

    Depends if we're talking about neural rendering as umbrella term, or a specific rendering technique (NeRF).

  • @arthurcuesta6041
    @arthurcuesta6041 2 days ago

    I believe the next step for DLSS would be optimizing geometry through mesh shaders. You could have something like Nanite, as in dynamic progressive LODs based on camera distance, but without the huge overhead Nanite causes.
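The distance-based progressive LOD idea in the comment above can be sketched very simply. This is a toy illustration with invented thresholds (the `LOD_DISTANCES` values are placeholders, not from any engine):

```python
import numpy as np

# Toy sketch: pick a progressively lower-detail LOD as camera distance grows.
LOD_DISTANCES = [10.0, 30.0, 80.0]  # metres; beyond the last -> lowest LOD

def select_lod(camera_pos, object_pos):
    """Return an LOD index (0 = full detail) based on camera distance."""
    dist = float(np.linalg.norm(np.asarray(camera_pos, float) - np.asarray(object_pos, float)))
    for lod, threshold in enumerate(LOD_DISTANCES):
        if dist < threshold:
            return lod
    return len(LOD_DISTANCES)  # lowest detail

print(select_lod([0, 0, 0], [5, 0, 0]))    # close object → 0
print(select_lod([0, 0, 0], [100, 0, 0]))  # distant object → 3
```

Systems like Nanite make this selection continuous and per-cluster rather than per-object, which is where the overhead the commenter mentions comes from.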

  • @jeremymerry7967
    @jeremymerry7967 2 days ago +3

    Get ready to pay huge prices; leaks show the 5090 at $2600, with the 5080 at $1350.

  • @Petch85
    @Petch85 2 days ago +6

    It has been 6 years with ray tracing.
    That's the distance between Quake (96) and GTA Vice City (or Mafia).
    Give it 6 more years and we'll have GTA IV (or Crysis).
    In 20 years no one will argue that ray tracing didn't matter. And there will still be good 2D games.
    Ray tracing has been the holy grail for as long as I can remember. Physics-based lighting can give you realistic results, but at the price of a lot of computation.
    But you will still be able to make games in whatever art style you want. 2D games, 3D rasterization games, and 3D ray-tracing games can all be good- or bad-looking games.

    • @mojojojo6292
      @mojojojo6292 2 days ago +6

      It's already come along massively since then: from early RT reflections only in games like Battlefield, and Quake running with full path tracing, to modern demanding games like Wukong, Cyberpunk, Alan Wake, and Indy having full path tracing options handling lighting, shadows, and reflections at playable framerates. Give it another 6 years and it will be in every AA and AAA game, playable across the entire stack of modern GPUs. RT was never going to happen overnight. It was a pipe dream for the distant future only 8 years ago. Nvidia has made it a reality now.

    • @Petch85
      @Petch85 2 days ago +5

      @mojojojo6292 I do agree.
      But to be fair, there are not even a handful of GPUs, and maybe just a handful of games, that make ray tracing worth it.
      The 5060 will not be able to run games with full RT, and people who buy it this summer will not be able to use RT in a lot of modern games from 2025 onward.
      Thus I don't think everyone will be enjoying high-quality RT even in 6 years.
      But I think a lot of new games will be RT-only from now on, and especially in 6 years, because it is just easier to do only RT and not both RT and raster.
      But if you have a 4060/5060, those games might actually look a little worse than if they were made using raster, simply because those cards do not have enough power and VRAM to make RT look good enough.
      Right now good-quality RT is for the privileged, and I don't see that changing much in the near future. 🤷‍♂

    • @vitordelima
      @vitordelima 2 days ago +4

      The current hardware implementation of ray tracing is shit and almost any other method (based on software or hardware) would be better.

  • @EmblemParade
    @EmblemParade 2 days ago +1

    I also buy/sell every time a new generation comes out, for the same reasons as Rich's friend: you sell before the price of the old generation drops too much. This makes sense for phones and cars, too. And of course it means you're always there with the latest and greatest features.

  • @jaritsu
    @jaritsu 2 days ago +1

    The next big step in nvidia tech is going to be them charging a monthly fee to unlock hardware features. Mark my words.

  • @krat0skrat0s69
    @krat0skrat0s69 2 days ago +28

    i think neural rendering is just going to be a new way of generating 3d models out of images

    • @aviagrawal5903
      @aviagrawal5903 2 days ago +3

      Excellent guess, checking back here after CES 😂

    • @cmdrblahdee
      @cmdrblahdee 2 days ago +5

      This. Nvidia has papers published about it. I'm no expert, but it appears to be mostly a compression technique.
      I am disappointed that DF didn't seem to be aware of the paper and able to guess what this feature is (not saying with certainty, but this clip seems to be wild speculation based on the combination of words).

    • @kanta32100
      @kanta32100 2 days ago +1

      From what I understand, it will simulate ray tracing, 2x faster, and will use less VRAM.

    • @aviagrawal5903
      @aviagrawal5903 2 days ago +1

      @ could you elaborate on simulate? I know it’s just a leak so no worries

    • @EJM07
      @EJM07 2 days ago

      It has nothing to do with 3D models. It's a new way of modelling BRDFs, so supposedly it upgrades the appearance of materials. It will be integrated into the game's rendering pipeline directly. Tbh I'm not too hyped, because from the looks of it it's still very much a work in progress, and it's expensive. Combined with that, it's meant to be used with ray tracing, which alone hasn't been too successful in the real-time rendering domain so far.

  • @kukuricapica
    @kukuricapica 2 days ago

    I think the point is that you can have much higher quality materials while the rendering speed is the same as or faster than standard methods. But I might be wrong, because I haven't read the paper, just saw some rundown videos explaining it.

  • @faustianblur1798
    @faustianblur1798 2 days ago +3

    Presumably using ML to replace shader code. So instead of writing a complex BRDF equation for various materials and lighting, it's instead encoded as a neural network created from photogrammetry or other reference models.

    • @xviii5780
      @xviii5780 2 days ago +1

      This sounds like something that would work really well or not at all

    • @vitordelima
      @vitordelima 2 days ago

      @@xviii5780 It's already on NVIDIA's website.
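The "BRDF encoded as a neural network" idea from this thread can be sketched as a tiny MLP evaluated per shading point. This is a toy NumPy illustration with random placeholder weights (a real version would be trained against photogrammetry or reference renders, as the comment describes):

```python
import numpy as np

# Hedged sketch: a tiny MLP standing in for an analytic BRDF. Weights here
# are random placeholders; a trained network would encode a real material.
rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

# Input features: view direction (3) + light direction (3) + UV (2) = 8.
W1 = rng.standard_normal((8, 32)) * 0.1
b1 = np.zeros(32)
W2 = rng.standard_normal((32, 3)) * 0.1  # output: RGB reflectance
b2 = np.zeros(3)

def neural_brdf(view_dir, light_dir, uv):
    """Evaluate the 'material network' in place of an analytic BRDF equation."""
    x = np.concatenate([view_dir, light_dir, uv])
    h = relu(x @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid keeps RGB in [0, 1]

rgb = neural_brdf(np.array([0.0, 0.0, 1.0]),
                  np.array([0.577, 0.577, 0.577]),
                  np.array([0.25, 0.75]))
print(rgb.shape)  # → (3,)
```

The appeal is that one small network can stand in for an arbitrarily complex layered material; the cost, as noted elsewhere in the comments, is that every shading evaluation becomes a matrix-multiply workload for the tensor cores.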

  • @christopherarocha92
    @christopherarocha92 2 days ago +3

    These cards will be used to push their GeForce Now subscription and cloud gaming. It will probably help make cloud streaming more visually clear.

  • @yusmeow
    @yusmeow 2 days ago +1

    I think there was an article a few years ago about a texture compression technique that requires half the data but produces better, sharper texture output. I don't know what the real name is, but I remember reading that Nvidia was testing it.

    • @pctechloonie
      @pctechloonie 2 days ago

      Neural Texture Compression. Oh, and it's 16x the pixels at the same or lower data footprint. You can find that and every other neural rendering tech by searching for "Neural Rendering NVIDIA"; they have an entire section for it.

  • @CrashBashL
    @CrashBashL 2 days ago +1

    It's something that will put a 2000€ price mark on the RTX5090 but never to be found in gaming....

  • @user-bg4wk6nh3b
    @user-bg4wk6nh3b 2 days ago +1

    It's going to be texture compression of some sort. It alleviates gamer's complaints of low VRAM (even if placebo) while still forcing AI buyers to buy 10x more expensive options to get more ram.

    • @Armendicus
      @Armendicus 2 days ago

      Oh, so it is the texture compression math they were talking about a year or two ago! I wonder if they'll combine it with photogrammetry.

  • @chengong388
    @chengong388 2 days ago +1

    You can render a game with basically flat, untextured models, just color-coded, and then use AI to convert that into a stylized image.

    • @AGuy-vq9qp
      @AGuy-vq9qp 2 days ago +3

      That would look dogshit

  • @singular9
    @singular9 2 days ago +6

    I can give you a few guarantees that the RTX 50 series will be yet another price-gouging flop like the 40 series. I can go buy a 3070 for $279 right now and beat the 4060, which costs more. I can go buy a 6900 XT that will decimate a 4070 Super for half the price. What will the 5060 be? A 5% uplift over a 4060 while costing 50 bucks more?

    • @Dempig
      @Dempig 2 days ago +1

      Used GPU's cost less than new? Shocker

  • @AmigoAmigo-w5p
    @AmigoAmigo-w5p 2 days ago +2

    RTX 5000 may be the last RTX cards.
    The 6000 series might no longer be called RTX. Maybe NTX, or A.I.TX.

  • @oswaldjh
    @oswaldjh 2 days ago +11

    Neural Rendering = The tech available only on RTX 5000 cards.
    Another Nvidia play would be to limit the FG on the latest DLSS to the new cards only.
    They got away with that regarding the RTX 3000 cards not getting FG but RTX 4000 did.

    • @joncarter3761
      @joncarter3761 2 days ago +1

      You're forgetting the 20 series (which I only just upgraded from); we didn't even get Resizable BAR! Nvidia does this EVERY time: if they can add extra hardware to lock out features for older cards, they will.

    • @oswaldjh
      @oswaldjh 2 days ago +1

      @@joncarter3761 I remember that the RTX 3000 didn't have resizable bar for the initial release and had to be updated in firmware.

    • @badzilla1173
      @badzilla1173 2 days ago +4

      It would be really funny if they did that when a year ago they were showing off neural rendering on a 4090, so it's clearly capable of using it.

  • @seanc6754
    @seanc6754 2 days ago +7

    Neural rendering is just a fancy word Nvidia uses when they're fixing to bend over everyone who wants a 50 series card, no Vaseline either. So if you're going to buy a 50 series card, just know that you're the reason Nvidia is charging so much, and you're literally paying Nvidia to bend you over.

    • @CynHicks
      @CynHicks 2 days ago

      Intel has Battlemage now. Really great bang for your buck. Go get banged. 😉

    • @mitsuhh
      @mitsuhh 2 days ago +2

      Life is short and I can afford it. What's wrong?

  • @RobCabreraCh
    @RobCabreraCh 1 day ago

    I agree with your viewer. It also makes more sense to buy an expensive card as close to its release as you can.

  • @SALTINBANK
    @SALTINBANK 2 days ago +21

    NVIDIA : VRAM issues gone

    • @christophermullins7163
      @christophermullins7163 2 days ago +9

      "Vram issues gone"
      But only in certain games so not really.

    • @SALTINBANK
      @SALTINBANK 2 days ago +3

      @@christophermullins7163 ERRATUM: with neural compression it will be, for future games...
      Plus, if you are not a noob, you can tweak the game engine (in all games) to not max out the VRAM buffer (the basics of gaming).
      But people don't bother: too complicated, because they don't want to learn the basics...

    • @SaccoBelmonte
      @SaccoBelmonte 2 days ago +3

      Not sure about that. Uncompressed/high quality textures will still live in VRAM.

    • @TheTripleAGamerTheFrame
      @TheTripleAGamerTheFrame 2 days ago

      @@christophermullins7163 DLSS is in more games than not, so it would probably be a good idea to assume neural rendering will be there too. Furthermore, the people who have VRAM issues are normally the people who didn't purchase a graphics card with sufficient VRAM to run their application. This is the fault of the consumer, not Nvidia; the consumer selected the card with insufficient VRAM to do the job. Most consumers are very cheap and expect a lot more from their graphics card than they actually paid for.

    • @John4343sh
      @John4343sh 2 days ago +5

      @SALTINBANK It is crazy how most people have no idea how the graphics landscape is going to completely change. AI is going to revolutionize graphics and the processing of them. I think I am just not going to engage with these folks until we have neural rendering in the mainstream. They lack foresight and more importantly insight into the current technological paradigms.

  • @Paul-jb6rk
    @Paul-jb6rk 2 days ago +2

    The more you spend the more you save. Only gamers will get that joke.

  • @spencereaston8292
    @spencereaston8292 2 days ago +1

    DLSS takes a few pixels and makes them more; AI rendering takes a few polygons and makes them more. This would be impossible for a traditional pipeline, because it doesn't know what things are; have AI provide context and now it can upscale geometry correctly. The other possibility: the developer creates a super-high-res asset, a toolchain creates an AI model and a minimum-polygon model, and the game engine only works with the minimum-poly model, but its AI model is loaded like a shader on the GPU. It is then rendered as the super-high-polygon asset. Very much like vertex shaders on steroids.
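The commenter's "geometry upscaling" idea can be sketched as: subdivide a low-poly edge and let a small learned function displace the new vertex. This is a toy NumPy illustration; the "network" is a random placeholder, not any shipping technique:

```python
import numpy as np

# Hedged sketch of AI-assisted geometry upscaling: insert a midpoint on a
# low-poly edge, then nudge it with a toy "learned" displacement function.
rng = np.random.default_rng(2)
W = (rng.standard_normal((3, 3)) * 0.05).astype(np.float32)  # placeholder weights

def upscale_edge(v0, v1):
    """Insert a midpoint and displace it with the toy network (offset <= 0.1)."""
    mid = (np.asarray(v0, np.float32) + np.asarray(v1, np.float32)) / 2.0
    return mid + np.tanh(mid @ W) * 0.1  # small bounded learned offset

v0 = [0.0, 0.0, 0.0]
v1 = [1.0, 0.0, 0.0]
new_vertex = upscale_edge(v0, v1)
print(new_vertex.shape)  # → (3,)
```

A trained version would presumably condition the displacement on the high-res reference asset, which is the "AI model loaded like a shader" part of the comment.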

  • @fohhee
    @fohhee 2 วันที่ผ่านมา +3

    With half the CUDA cores of the RTX 5090, the RTX 5080 is just a renamed RTX 5070. The real RTX 5080 got "unlaunched" again; Nvidia learned their lesson.

    • @johnc8327
      @johnc8327 2 วันที่ผ่านมา +1

      There is no list of requirements a gpu has to have to be 80 class. It’s just a name on the box. If 5090 didn’t exist and 5080 was the biggest gpu nvidia sold, would that make it a real 80 class?

    • @lawnmanGman
      @lawnmanGman 2 วันที่ผ่านมา

      @@johnc8327 It's about the distribution of parameters. It should go like this: the mid range is the best value, because the R&D of the high end needs to be recouped FROM HIGH-END SALES. What they're doing instead is loading all of that onto everyone and taking 80% profit from families who just want to play games without issues. Fine, it will just push people to consoles. The last 4 years have brought more and more broken launches, bad performance, and COMP STUTTERS. I saw people calling the PS5 Pro a scam; if that is a scam, this is a death sentence.

    • @pctechloonie
      @pctechloonie 2 วันที่ผ่านมา +2

      Oh, NVIDIA has done this before. Remember the 680? Compare it against the 780 Ti. The similarities are striking.

  • @sharpiemcsharp
    @sharpiemcsharp 2 วันที่ผ่านมา

    There's no replacement for displacement.

  • @TrevorSullivan
    @TrevorSullivan 8 ชั่วโมงที่ผ่านมา

    NVIDIA is playing 4d chess, while all the idiots on reddit are getting hung up on VRAM quantity. "Bigger number better. Smaller number bad."

  • @Elkemper
    @Elkemper 2 วันที่ผ่านมา

    I believe it will be frame gen on steroids: Huang said in 2024 that they plan to render 1 frame and generate 5 from it (kinda). But the real reason is that they want to add new accelerators, or different architectures, so all previous generations become obsolete at once.

  • @Velly2g
    @Velly2g 2 วันที่ผ่านมา +5

    Upgrade your GPU every generation if you want to save money. The longer you hold on to an old card, the less value it has.

    • @sogetsu60
      @sogetsu60 2 วันที่ผ่านมา +2

      This works for the 60 and some 70 series. The 80 and 90 series lose value a lot.

    • @mojojojo6292
      @mojojojo6292 2 วันที่ผ่านมา +2

      @@sogetsu60 Wrong, the xx90 holds value better than anything else right up until launch of the next xx90. If you sell before then you lose little to no value. Look at current used 4090 prices with the 5090 just a month away. You can maintain the flagship every gen for about 200-300 lost tops depending on when you sell and whether there is a price increase.

    • @Groovy-Train
      @Groovy-Train 2 วันที่ผ่านมา +4

      OK Jensen.

    • @Velly2g
      @Velly2g 2 วันที่ผ่านมา +2

      @sogetsu60 It works for every series. If you upgrade your cards ASAP you barely lose any money. If you hold on to a card, you lose the same amount or more as if you had upgraded every 2 years. So why hold on to a weaker card?

    • @Dempig
      @Dempig 2 วันที่ผ่านมา

      @@sogetsu60 A used 4080 Super will easily sell for close to MSRP right now, and a used 4090 will sell for over MSRP. A used AMD card will sell for less than half of MSRP.

  • @simonwest6002
    @simonwest6002 2 วันที่ผ่านมา +4

    Has rasterization really reached its limits, or have the engines and the people using them just become increasingly reliant on shortcuts? Just look at all the long-standing baked-in issues in UE5 that aren't getting fixed. The artistic vision and achievements from before ray tracing (RDR2, Witcher 3, even CP2077 without the Nvidia endorsement) came from people who cared and knew their craft and tools. Now there are crutches (DLSS, frame gen, etc.) to lean on, and those developers (Nvidia, AMD) are looked to for improvements instead of the people making games.

  • @konstantinlozev2272
    @konstantinlozev2272 2 วันที่ผ่านมา

    Ultra realistic AI enhanced images in photo mode?

  • @Kumoiwa
    @Kumoiwa 6 ชั่วโมงที่ผ่านมา

    10:58 That guy might be onto something, tbh.
    Let's say you buy a graphics card for 500, keep it for 4 years, sell it for 200, and buy another one for 500: total spent 800.
    But if you buy for 500, sell in 2 years for 300, buy another for 500, and sell it again in 2 years for 300 to buy another for 500: total spent 900, but you get to use new hardware every gen.
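
The arithmetic above checks out; a quick sketch of the two upgrade cadences over the same 4-year window, using the commenter's prices (all resale values are their assumptions):

```python
def net_cost(buys, sales):
    """Total cash out over the window: purchases minus resale proceeds."""
    return sum(buys) - sum(sales)

# Hold each card 4 years: buy at 500, sell at 200, buy the next at 500.
hold_4_years = net_cost(buys=[500, 500], sales=[200])

# Upgrade every 2 years: buy at 500 and resell at 300, every generation.
upgrade_every_gen = net_cost(buys=[500, 500, 500], sales=[300, 300])

print(hold_4_years, upgrade_every_gen)  # 800 900
```

So the frequent upgrader pays about 100 more over 4 years in this scenario; whether that premium is worth always being on current hardware is the whole debate in this thread.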

  • @lamikal2515
    @lamikal2515 วันที่ผ่านมา

    "What is Neural Rendering ?"
    Simple: the new "black box" that will of course be bundled with the next minor iteration of UE5 (and forced on by default), and will turn even a 4090 into a paperweight within 6 months.

  • @lawnmanGman
    @lawnmanGman 2 วันที่ผ่านมา +1

    HEY GUYS, LOOK WE MADE THE MATRIX CORES BETTER SO WE CAN FILL THAT VRAM UP QUICKER!

  • @ssvis2
    @ssvis2 2 วันที่ผ่านมา

    The likely reason all the neural and AI additions didn't make sense is that gamers are not the target customers. The supposed game enhancements are a cover. Massive parallel processing with trainable AI models integrated into the hardware are most useful for two groups: industry and state actors. I would bet that one of the major funding sources is the NSA for use in Echelon or whatever the follow-on system is called. Also, folks need to look carefully at what is actually in the drivers for these cards.

  • @RARufus
    @RARufus 2 วันที่ผ่านมา

    We will hit the point where AI can generate real-time games on demand, and/or be used to generate different game worlds, characters, etc. given a set of commands/rules. A game could be a unique experience for every player every time.

  • @TheTripleAGamerTheFrame
    @TheTripleAGamerTheFrame 2 วันที่ผ่านมา +4

    Yes, it's probably going to be an exclusive technology for the 5000-series GPUs; they do need a selling point over the 4000 series, because of how powerful the 4000 series is.
    Furthermore, it's an AI future: it's a more efficient way to render graphics than raster performance.
    My point: the Intel Ultra 9 series of CPUs has a piece of silicon called a neural processing unit.
    That neural processing unit in the Intel Ultra 9 CPU, working in conjunction with neural rendering on a 5000-series GPU,
    has the potential to destroy AMD's 9800X3D in neural rendering performance. I really can't believe nobody has said anything about that in the comments, and this better be tested. I hope the author of this comment sees what I'm saying here.

    • @aviagrawal5903
      @aviagrawal5903 2 วันที่ผ่านมา +1

      @@TheTripleAGamerTheFrame That's not how software works unless Nvidia and Intel collab on the software stack, meaning the only reason to test that is if they announce something. Otherwise I don't think it matters.

    • @TheTripleAGamerTheFrame
      @TheTripleAGamerTheFrame 2 วันที่ผ่านมา

      @aviagrawal5903 You're right about how software works, definitely, for sure.
      But guess what: the theory stated here is not software, it's hardware, actual physical silicon. This is not a software rendering solution; DLSS has never been that way.
      DLSS uses physical silicon: tensor cores.
      Frame generation uses an optical flow accelerator, also physical silicon.
      The only way neural processing can be achieved is with a physical silicon chip
      called a neural processing unit.
      So maybe look to the real horizon and do your homework before you try to cut down a logical theory.
      Your comment makes you look unsmart.

    • @aviagrawal5903
      @aviagrawal5903 2 วันที่ผ่านมา

      @@TheTripleAGamerTheFrame …. Windows cannot run on m1+ Apple hardware. Think about why….

    • @aviagrawal5903
      @aviagrawal5903 2 วันที่ผ่านมา

      Have you ever developed an internal API to allow two pieces of hardware to communicate? It’s what is required. Hence the software….

    • @aviagrawal5903
      @aviagrawal5903 2 วันที่ผ่านมา +1

      One last note: do you think the physical placement of tensor cores on a silicon chip is what makes "DLSS" pop up in a game's UI, or maybe it's down to the software? For instance, why doesn't every game have DLSS by default? Because it has to be implemented (software!).

  • @jongentile4933
    @jongentile4933 2 วันที่ผ่านมา

    What are the chances that Switch 2 has new enough tensor cores to utilize Neural Rendering? It would be very interesting to see Nintendo get their hands on a brand new emerging tech before any of the other console makers

  • @christophermullins7163
    @christophermullins7163 2 วันที่ผ่านมา +20

    The 5080 will be trash above $999. Right now I could run max settings with DLSS (no frame gen) in Indiana Jones and get around 50 fps, IF the game weren't dropping to 5 fps because it breaks the 16GB buffer of the 4070 Ti Super. If that GPU is fast enough to be VRAM-limited at reasonable fps, the 5080 will be 100+ fps and dropping to 5 fps from maxing out VRAM. 16GB is not enough in 2025. Guaranteed.

    • @KuntChitface
      @KuntChitface 2 วันที่ผ่านมา +5

      It wasn't enough for me at 4k 2 years ago, so I agree.

    • @Kinslayu
      @Kinslayu 2 วันที่ผ่านมา +1

      You're assuming that nvidia has done absolutely nothing to lower VRAM cost with all the new technologies that will be introduced

    • @cleric670
      @cleric670 2 วันที่ผ่านมา +5

      @@Kinslayu You're assuming they HAVE done something to lower VRAM cost with all the new technologies that will be introduced. Show me an example, ever, in the history of gaming, where VRAM requirements have gone down...

    • @mikeramos91
      @mikeramos91 2 วันที่ผ่านมา +3

      @@cleric670 guess there's a first for everything lol

    • @Kinslayu
      @Kinslayu 2 วันที่ผ่านมา +1

      @@cleric670 They said 16GB is not enough in 2025, guaranteed; they're the one making the assumption. I never said Nvidia is doing anything definitively. I'd also argue that when comparing GPUs, raw hardware stats have become less relevant in recent years, with the 2000/3000/4000 series all introducing new rendering improvements (RTX, DLSS, frame gen). I'm simply saying it's a little too early to claim not enough VRAM "guaranteed". In 6 days we can make a much better guess, after they reveal the new tech.

  • @RETR0_P0CKET
    @RETR0_P0CKET วันที่ผ่านมา

    The backlash against raytracing isn’t a real thing. If you went by the internet you would think AMD was the market leader for GPUs. The silent majority are showing you what they care about with their wallets, not their keyboards.

  • @jayzn1931
    @jayzn1931 2 วันที่ผ่านมา

    Should I order a 4070 12GB for 499€ (the normal cheapest price in Germany right now is 549€) to upgrade from a 3060, or wait and risk that cards get more expensive?😅

    • @Ahakartune
      @Ahakartune วันที่ผ่านมา

      That’s what I want to do as well but 4070 super from my 3060 !
      Lol my poor 3060 is pushed to its limits on 4k 1440p “optimized “ settings lol

    • @jayzn1931
      @jayzn1931 วันที่ผ่านมา +1

      @ I think the Super is not worth the extra money, as it still doesn't have 16GB. What a shame… I thought about it, and since I don't like how badly optimized current games are, and considering that even with a 4070 I can only play path tracing smoothly with stuff like frame gen, the additional money is not really worth it to me.
      If the market is ruined next year, I can still be happy I got a decent system, and in case great GPUs come out for cheap, I can upgrade later. Meh.

    • @Ahakartune
      @Ahakartune วันที่ผ่านมา

      @@jayzn1931 I’m personally only aiming for 60FPS on most tittles so I hope when I do upgrade the 4070 super achieves that more easily for me

  • @abolish78
    @abolish78 ชั่วโมงที่ผ่านมา

    NVIDIA: a masterclass in how to over-promise and under-deliver at the highest premium price possible.

  • @jazon9
    @jazon9 2 วันที่ผ่านมา +1

    We need NPC AI running on the hardware AI accelerators.

  • @Sofian375
    @Sofian375 2 วันที่ผ่านมา

    We don't even know if they are talking about real time rendering.

  • @jimmyeriksson8358
    @jimmyeriksson8358 2 วันที่ผ่านมา +4

    The problem is that the more Nvidia adds with AI, the more shortcuts developers use. DLSS is almost a must now, since game devs don't optimize games without it...

    • @ArchieBunker11
      @ArchieBunker11 วันที่ผ่านมา +2

      Why do people like you make these idiotic comments? You act like 4k gaming has always been super accessible, but now that DLSS is out, it's "bad optimization and no 4k".
      4k ultra settings have been a massive stretch ever since 4k monitors came out. The 980? Struggled. 1080 Ti? Struggled. 2080 Ti? Struggled. So did the 3090/4090.
      If you don't like upscaling, play at 1080p or 1440p native. You'll have an objectively worse image than someone using DLSS on a 4k screen.

  • @lapin0307
    @lapin0307 ชั่วโมงที่ผ่านมา

    Yeah make those tensor cores worth buying man. lol The immersion breaking stuff like hair rendering should be tackled first.

  • @Cptraktorn
    @Cptraktorn 2 วันที่ผ่านมา

    I love the Daniel Owen viewers signaling about how they were actually keeping up to date on this tech.

  • @iCozzh
    @iCozzh 25 นาทีที่ผ่านมา

    The thing with newer graphics cards is that I'll want to upgrade if there's a good game out. But I don't care about literally anything from the last 8 years.

  • @Ja_Schadenfreude
    @Ja_Schadenfreude 2 วันที่ผ่านมา +17

    5060 specs: expensive e-waste.

    • @roklaca3138
      @roklaca3138 2 วันที่ผ่านมา +4

      Well this describes basically all rtx 60 series

  • @Todestelzer
    @Todestelzer 2 วันที่ผ่านมา

    I see no reason yet to switch from my 3080 to a 5000 series card. All games up till now are running fine. 1440p resolution.

  • @SuperToughnut
    @SuperToughnut 2 วันที่ผ่านมา +5

    Actually this will be the beginning of Nvidia's downfall.

    • @mr.hashundredsofprivatepla3711
      @mr.hashundredsofprivatepla3711 2 วันที่ผ่านมา +5

      I hope so

    • @jimmyeriksson8358
      @jimmyeriksson8358 15 ชั่วโมงที่ผ่านมา

      @@SuperToughnut Why would it be? AI cards are where they make their money. These cards are pocket change for Nvidia. This could maybe even be the last generation of gaming cards they bother with. They might as well focus on what makes them money and let Intel and AMD fight over the home gaming market. It sucks though...

  • @Neosin1
    @Neosin1 2 วันที่ผ่านมา +1

    Where's DF's review of FS2024????
    One of the biggest releases this year and no GPU or CPU coverage???

  • @dragonmares59110
    @dragonmares59110 2 วันที่ผ่านมา

    It could be that in Nvidia's thinking, "neural rendering" only means "increasing prices".

  • @John_Croft
    @John_Croft 2 วันที่ผ่านมา

    I regret buying my 4070 Ti, as it wasn't long after that they brought out the 4070 Ti Super with 16GB VRAM. Now I wonder if the same will happen again, but the prices are getting crazy.

  • @n0madc0re
    @n0madc0re 2 วันที่ผ่านมา

    8:28 Only runs on new cards, so an incentive for users to switch to GeForce Now. Seems logical from a business standpoint.

  • @SaccoBelmonte
    @SaccoBelmonte 2 วันที่ผ่านมา +1

    By getting the flagship (I can write it off as a business expense since I'm a VR dev) I can lower my taxable income by a greater amount and also later sell it for good money, which lowers the cost of the next flagship. If I bought a mid-tier card, its price would plummet fast, as there are more offers on the aftermarket, so I'd need to lower the price and even risk not being able to sell it at all.
    But it's also true that if you don't need a powerful card and/or cannot write it off as a business expense, or simply don't have the money, just don't get a flagship. You can even buy last-gen cards. The 30 series is still powerful.

    • @SevenBlades
      @SevenBlades 2 วันที่ผ่านมา +3

      Hence the more you buy the more you save lol

    • @mojojojo6292
      @mojojojo6292 2 วันที่ผ่านมา +2

      Exactly, maintaining the flagship every gen is not that expensive if you sell at the right time. You need a backup card to tide you over until you replace it, though. Look at current 4090 used sale prices: pretty much at or above MSRP. Sell now, buy the 5090 in a month for a few hundred bucks more, and enjoy the best you can get for another 2 years. About €300-400 every 2 years to have the flagship card is pretty damn cheap once you get on the ladder with the first one. Might be a bit much for a lot of gamers, but if you work with your PC and need GPU power then it's easily justifiable.

    • @elcazador3349
      @elcazador3349 2 วันที่ผ่านมา

      @@SevenBlades Jensen the profit prophet?

    • @blackface-b1v
      @blackface-b1v 2 วันที่ผ่านมา

      ​@mojojojo6292 u lowkey have a point

    • @SaccoBelmonte
      @SaccoBelmonte 2 วันที่ผ่านมา +1

      @mojojojo6292 I don't even bother selling before the new comes. I get the new one and immediately put the old one online for sale. Sells in a day or two. I got 1K back from my 3090 and probably will get 1.5K from my 4090 Aorus Master.

  • @Ladioz
    @Ladioz 2 วันที่ผ่านมา +2

    Can't wait to buy the RTX 5070. I think it will pair well with my 5800X3D

    • @mmanz123
      @mmanz123 2 วันที่ผ่านมา

      RTX 5070 is 12GB VRAM 🤣 12GB VRAM is shit.

    • @Ladioz
      @Ladioz 2 วันที่ผ่านมา

      @@mmanz123 I wanted to buy the RTX 4070, but I can't find it in stock anywhere, so I have to get this one now.

  • @deadrift886
    @deadrift886 2 วันที่ผ่านมา +33

    Neural rendering is rendering an imaginary brain that makes you think the Nvidia cards are a good deal.

    • @funbrute31
      @funbrute31 2 วันที่ผ่านมา +5

      Looks like that brain already exists, considering the 85% market share

    • @deadrift886
      @deadrift886 2 วันที่ผ่านมา +8

      @@funbrute31 So popularity=quality? enjoy the price gouging.

    • @apokalip6
      @apokalip6 2 วันที่ผ่านมา +1

      @@deadrift886 You enjoy your VRAM like it's last century, AMD fanboy

    • @funbrute31
      @funbrute31 2 วันที่ผ่านมา +4

      @@deadrift886 Nvidia software/hardware stack >>> AMD . Any doubts?

    • @dvornikovalexei
      @dvornikovalexei 2 วันที่ผ่านมา +5

      ​@@deadrift886 Nvidia - We have DLSS, DLDSR, Frame gen, Neural Rendering
      AMD - VRAM VRAM VRAM VRAM VRAM 🐒

  • @dougquaid570
    @dougquaid570 2 วันที่ผ่านมา +2

    Answer: an extra few hundred quid, Squire!

  • @Exostenzaa
    @Exostenzaa วันที่ผ่านมา +1

    My guess: it's going to be hardware-level, DLSS-style compression/decompression and upscaling for higher-fidelity textures that take up less VRAM. Nvidia is trying to lower games' VRAM requirements in order to justify selling people not enough VRAM for machine-learning tasks on their gaming-oriented GPUs. It's also a way to make up for diminishing returns on shrinking nodes and tapped-out die sizes: traditional rendering pipelines get freed up, so performance can increase without increasing transistor counts the way it traditionally has up until now. The same goes for increasing geometric detail from lower-poly models to, again, free up the traditional rendering pipeline; this should work wonders with Nanite.
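
A toy illustration of the texture-compression guess, using a low-rank factorization as a stand-in for a learned codec (a real neural texture codec would store small network weights rather than factor matrices; all sizes here are made up): only the compact representation lives in memory, and individual texels are reconstructed when sampled.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "texture": a 256x256 single-channel image with smooth structure plus noise.
y, x = np.mgrid[0:256, 0:256] / 256.0
texture = np.sin(8 * np.pi * x) * np.cos(6 * np.pi * y) \
          + 0.1 * rng.standard_normal((256, 256))

# Stand-in for the learned compressor: keep a rank-k factorization
# instead of the full texel grid.
k = 16
U, S, Vt = np.linalg.svd(texture, full_matrices=False)
compressed = (U[:, :k] * S[:k], Vt[:k])  # this is all that would live in VRAM

def sample(row, col):
    """'Decompress on sample': reconstruct one texel from the compact form."""
    left, right = compressed
    return float(left[row] @ right[:, col])

full_bytes = texture.nbytes
comp_bytes = compressed[0].nbytes + compressed[1].nbytes
print(f"full: {full_bytes} B, compact: {comp_bytes} B "
      f"({comp_bytes / full_bytes:.1%} of original)")
```

The trade-off the comment is pointing at shows up even in this crude version: memory drops by a large factor, at the cost of extra compute on every texture sample, which is presumably where the dedicated "neural" hardware would come in.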