Those frametime costs do add to the length of the path tracer itself. Maybe not 100% of it, but it does add a performance cost nonetheless. There is no free lunch.
Is it too much to ask for better power efficiency? I don't want to be spending an extra $200 a month just because my gpu consumes more power than my ex's weight.
Nvidia GPUs are more power efficient than AMD GPUs... Power efficiency was never the issue for Nvidia; as noted above, they already are very efficient for what they are.
@@slopedarmor Power efficiency and total power consumption are closely related concepts. Higher power efficiency means less energy is wasted, which directly reduces total power consumption for a given performance level. Conversely, lower efficiency increases power consumption because more energy is wasted as heat or other losses.
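A tiny sketch of that relationship, with made-up placeholder numbers rather than measurements of any real card:

```python
# Two hypothetical cards hitting the same 100 FPS target; efficiency is frames per joule.
fps_target = 100
efficiency_a = 0.40  # more efficient card
efficiency_b = 0.30  # less efficient card

# watts = (frames/second) / (frames/joule) = joules/second
power_a = fps_target / efficiency_a
power_b = fps_target / efficiency_b
print(f"Card A draws {power_a:.0f} W, card B draws {power_b:.0f} W for the same {fps_target} FPS")
```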
I would guess this neural compression would save massive amounts of memory and memory bandwidth on the GPU, which would be huge in reducing costs while also greatly improving visuals. There’s always trade offs, but better compression is always welcome and can have a big impact.
Time to support Intel, you say? The 4080 may be the last Nvidia card I purchase... Tired of Nvidia. I'll play around with the B580 for a bit on my son's new build. Up yours team Green.
With the current year and the expansion of AI, a realistic pro-consumer memory setup would have been:
5070 - 24GB
5070 Ti - 32GB
5080 - 48GB
5090 - 64GB
Don't fall for the assumption that small increments are acceptable and only ask for an extra 4GB. These GPUs are designed to be outdated very soon. Even if you don't care about AI and just want to game, strong limits on consumer chips will only push for more subscription and cloud services for everything, with less data ownership for the user. The last thing you should want as a user is for developers to implement more features that require online services. Everyone should care about the gap between consumer and professional chips getting bigger. If they get to charge the excessive price anyway, they need to deliver on the product.
The era of facetime with your match to see if she is real / not a catfish is over. The era of wearing VR goggles to meet your match to go on date with 1996 Adrianna Lima has arrived.
I bet the first game to showcase the neural rendering will be Cyberpunk 2077 again. We know CDPR already got their hands on a 5090 through the Witcher 4 trailer.
Question is: Will older RTX GPUs be able to handle neural rendering? The answer is probably yes, but who knows what Nvidia is gonna try to make us believe...
The neural rendering stuff seems neat but I wonder how it would be implemented. I'm guessing it would need to be worked into the game engine somehow and if so it could be many years until we see it
im ok with the 5080 only having 16gb of vram so long as the price reflects the fact we are only getting 16gb of vram lol. that said if the 5060ti has 16, it seems silly not to launch the 5080 with more.
"So long as the price reflects that" It absolutely won't. You'll have inflation, tax, the Nvidia tip, the Nvidia tax, and $200 over MSRP by default with every model.
This looks exactly like nvidia using AI, again, to tell consumers that they do not need more vram because of some fancy crap. It must be coded into the game, so this does not make a 5060 8gb any less of a pos.
Don't worry, next gen Witcher 3 with an HD texture mod only uses 19 gigs of vram.
Don't worry, the Avatar game at max settings 4k only uses 22gb of vram.
Don't worry, some graphics options in Indiana Jones don't even appear if you don't have 12gb+ vram.
Don't worry, you can run full raytracing in Indiana Jones with a 12gb card, but you will have to run medium settings to prevent vram overflow.
Regarding NVIDIA's Neural Rendering technique - a good channel for watching its progress is Two Minute Papers. The specifics of the paper "Real-Time Neural Appearance Models" (yes, that's the original title used for this technique) are discussed in the videos "NVIDIA's AI Learned On 40,000,000,000 Materials! (October 15, 2023)" and "NVIDIA's New AI: Gaming Supercharged! (October 29, 2023)". As for neural texture materials, the earliest discussion on the same channel was around "High-Resolution Neural Texture Synthesis (January 18, 2018)".
Can't wait for AMD to announce their own neural rendering a few days after Nvidia announces this. As with everything else before this (upscaling, ray tracing, frame generation, etc.), AMD always only follows Nvidia's lead. Without Nvidia, the gaming world would still be stuck with just rasterization, with no upscaling, ray tracing, or frame generation. lol
Meanwhile, when we had 'just rasterization' we had games performing adequately at native resolution, and optimization not going to crap because of devs crutching on RTGI because it's faster for THEM to implement, despite being vastly more costly for the consumer and usually not even coming with a major visual uplift. Frame generation is literally pointless. It means NOTHING when the only cards getting sufficient performance for it to not introduce ruinous amounts of latency are the upper-end cards that shouldn't need imaginary frames in the first place. Oh, and we had games that weren't smeared with vaseline and loaded with ghosting. God forbid we talk about how upscaling has been crutched on to push graphics that hardware can't run, leaving us with an OBJECTIVELY blurry mess compared to old-school rendering. TAA is also causing a lot of blurry mess, but god forbid devs build their pipeline around graphics concepts that don't rely on GUESSING WHERE THE PIXELS GO.
If I had to guess what neural compression is doing, then I would say it's probably reconstructing some detail with AI, like how DLSS reconstructs detail for upscaling. The neural rendering also seems to be doing something similar, but this time it's reconstructing shading behaviour, or the way light would interact with the surface of a material type. If I had to put my money on it, the 1st one is for general DLSS upscaling, and the 2nd is probably for DLSS-RR.
The conversation of “neurons” vs quality is a topic called “quantization.” Simply, it allows AI to run with lower accuracy while using lower memory. With GDDR7 and a lower memory cost for AI textures, you likely won’t need all of the memory you’re clamoring for.
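As a minimal, hedged sketch of what quantization means in general (plain int8 weight quantization in NumPy, not Nvidia's implementation or the exact scheme used for neural textures):

```python
import numpy as np

rng = np.random.default_rng(0)
weights = rng.normal(0.0, 0.5, size=10_000).astype(np.float32)  # stand-in for network weights

# Symmetric int8 quantization: one float scale per tensor plus 1 byte per value.
scale = np.abs(weights).max() / 127.0
q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)

# Dequantize when the values are used; accuracy drops a little, memory drops 4x.
dequantized = q.astype(np.float32) * scale
print(f"memory: {weights.nbytes} bytes -> {q.nbytes} bytes")
print(f"max abs error: {np.abs(weights - dequantized).max():.5f}")
```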
If that is on a 4090, I'm guessing the render time will be much higher on lower-tier cards because of their lack of processing power, and those are the ones that actually need such compression because of their small vram buffers. It is interesting to see how they will pull this off.
I am not sure about the rest, but the only reason one should buy a 5090 is if they are into high end PCVR or handling AI workloads. I believe a 4090 or a 5080 can do 4K 144Hz with DLSS. So let us VR users have the 5090, damnit. I can't wait to be fully immersed with my Pimax Crystal or 8KX at some crazy 5K upsampled resolution per eye at 120 FPS.
@@TH-camAccount-i4m actually, I sold a 7900XTX to buy my 4080S, because the 7900XTX just felt a generation behind. Poor upscaling, RT, drivers, etc. Now I can run path tracing and it's incredible. Every GB of RAM over what you use is completely worthless.
5090s and 5080s will sell out in minutes and will go right back up on Newegg and Amazon at inflated prices well above MSRP from scalpers. And... people will still buy them due to FOMO. Hell, I saw a guy on Reddit buy a 4080 Super OC for $1600 off Amazon. Like why???????? Nvidia has a monopoly on the high end market and they are not stupid. They know everyone will still buy them regardless of price due to no competition.
The only alternative is AMD at the moment, which has a dogshit upscaler and no raytracing performance, and Nvidia already has framegen with their DLSS 3. Only now is Intel sorta catching up. But it'll still be 1-2 years before the cards have ample supply and their XeSS and framegen get put into a lot of games. Intel also actually had the forethought to put RT cores in their cards so they can compete with Nvidia's raytracing.
@@ambientlightofdarknesss4245 Meanwhile, people LOVE to ignore that the reason AMD can't compete is that people literally put no effort into supporting the non-Nvidia-biased methods, leaving AMD to have to brute-force RT while devs are literally building their implementations entirely around Nvidia's architecture, while the upscaler gets no real improvement because NO ONE IS WILLING to actually implement it and WORK ON IT. PRECISELY because of attitudes like yours. 'Oh, there's a platform agnostic option? Let's not use it and talk bad about it because a hardware-locked version is better.' Idiots.
My theory is Nvidia is the one buying their own cards and then relisting them for more money. That's why they do not stop it when they could!! It should be illegal honestly.
Neural rendering is amazing, I didn't expect to see it so soon. I'm guessing that, by the way they put it, the RT cores and Tensor cores can't really talk to each other that easily on current gen, so maybe a new architecture feature? My biggest concern is that this will not be transparent; developers will have to implement it, and this means a looong a** time before we actually see it in games. I hope I'm wrong.
@@UmmerFarooq-wx4yo Yes, kinda. From my understanding, neural rendering uses AI but will either completely render a game image or parts of it (like objects, the global illumination, shadows, etc.), giving a realistic and high quality look while at the same time being a lot more performant than rendering locally on the gpu die. It won't be like the AI model that frame generation or dlss use to help the gpu run the game locally, but will rather fully take over and render certain things and give the gpu some more performance headroom.
Finally a decent application for utilizing "AI" neural networks for something fucking useful, non-intrusive and beneficial for the gaming industry. First time I'm excited for anything regarding what we today refer to as "AI" in videogame graphics. The rest is not my cup of tea, meaning DLSS and FG (interpolation). Strix Halo looks legitimately exciting and 40 CUs is actually insane!! Nvidia, knock it off with only giving us 16GB VRAM on a fucking RTX 5080, come on! All in all, I'm excited for next year - it's gonna be so good! So many great developments in tech all at once.
"Neural rendering" sounds like will need a lot of VRAM. If you want some decent "neural rendering" performance, you'll need RTX 5090. Again: NVIDIA creates a problem and then sell the solution.
im still on a 1070ti 8gb waiting for something worth my money so I can enjoy all these new games but so far nothing in the 30, 40 and now seemingly the 50 series cares enough to have decent vram without spending as much as my current system on just a gpu. going to see what AMD offers in regards to RT performance this time around because I know they always have decent vram. We all know the 5080 should have around 20gb, 5070 should have 18-16gb and the 5060 should have 12gb (at least, maybe 16gb at a slower bus).
Is only Nvidia going to have neural rendering for the GeForce RTX 5000 series, or is AMD also going to have neural rendering on the Radeon RX 8000 series?
Chat GPT said this is what the website breaks down to as well as why Nvidia isn't increasing the VRAM possibly. Here's a simplified summary of the NVIDIA Neural Texture Compression (NTC) website:

NVIDIA Neural Texture Compression (NTC)
NTC is a new way to compress textures (images or material data used in 3D graphics). It uses AI to compress textures much more efficiently compared to older methods, while still keeping high visual quality.

Why it's important:
1. Smaller sizes: graphics assets take up less memory and storage space.
2. Improved visuals: maintains better quality, even with high compression.
3. Compatibility: works with modern graphics pipelines.

Key features:
- Optimized for gaming and real-time applications.
- Reduces memory bottlenecks, especially for demanding tasks like VR or high-resolution textures.
- Uses neural networks to predict and render textures with less data.

If you need specific details on how NTC works, feel free to ask!
RTX 5080 16gb is pretty good in my opinion. It can still run many applications and games. But RTX 5060 8gb? 🤡 At the beginning of the release it will sell for $320++ 💩💩 (this is the price of the RTX 4060 when it was first released)
Check out Manta Sleep here tinyurl.com/4fd7ey9c and revolutionize your sleep like I did! #napwithmanta #mantasleep #pronap #proudlypronap
Nvidia:
You guys wanted VRAM so we listened to you, and gave the 5090 32GB of VRAM
🤣🤣
It will only cost you 2 lungs and a kidney you are welcome 🤣🤣🤣🤣
@@soulbreaker1467 The joke didn't need explaining.
monkey's paw
@@DingleBerryschnapps come on stop being a grumpy person
16 gb only for a 5080 is outrageous.
@@NUNYABIZNNAAAZZZ said the brainless consumer
@@magruster angry poor. I'm an enthusiast so I will be purchasing a 5090.
I suspect Nvidia learned from the unlaunching of the 4080. They will release a 5080 that is by all accounts a 4070 Ti, with its size relative to the flagship being close to 50%, and will launch a ''real'' 5080 SUPER closer to the ~73% of the flagship that the 80 series usually gets.
I think it's one of the main reasons why there was literally no difference between the 4080 SUPER and the 4080 except the price. There was nothing to improve upon. So this time around, you are getting the scam first and the real 80 series later down the road :)
@@TheFirstSSJ more like you love getting shafted by nvidia with a smile on your face :)
some people will get it, finally.
the fact that the 3090 had 24gb and the 5080 will have 16 is insane.
D.O.A.
Just saw a reddit post of someone upgrading their 1080ti 11gb to a 4070 12gb after almost 8 years...
Top comment was good luck with your 1 extra gig of vram 😂
@@slopedarmor jaajjajajaja
@@slopedarmor I mean, after 8 goddamn years he should've gotten something better. I'm worried about modding, I like to mod my games.
A 3090 was more expensive than a 5080. VRAM isn't free. And GDDR7 is more expensive to boot. Nvidia isn't Santa Claus, they're all about that profit margin.
Nvidia will tell us to play games with neural texture compression instead of selling us more memory :D
probably better that way.
Got that 2080ti launch vibes when RT came out of nowhere. Prepare for the Ai Hype to hit gaming this gen.
Muh Neural Network.
kinda hilarious. 4k gaming is not 4k XD.. but 2k res upscaled up to 4k and textures also scaled for 4k using neural lol. wtf is "real" nowadays.
This was the plan all along. It's been obvious for YEARS that the end game is to make simplified game development with path/ray tracing possible by upscaling as efficiently as possible without ever having to render a 4k or higher native. I'm not saying I like it, but it's been obvious that this is Nvidia's goal. Less RAM needs and brute force, more processing complexity. I'm less sure than ever this will work out, because we are going in the direction of cell phones. Rather than improving audio quality, they just do more with less bandwidth and the overall experience stagnates or even regresses rather than improves. So we shall see. It doesn't appear that we are going to get a choice on this, as Nvidia is calling ALL the shots as far as steering the industry.
@@VatiWah this is why 1080p gaming is the best.
Should have been a 20gb 5080 and a 24gb 5080 ti. 16 would be good for a 5070 and under $800
They're saying the 5070 Ti is 16gb. I think that might be the better value again, like the 4070 Ti SUPER is over the 4080.
I'm going to invest in the 5070 Ti for the same reason. There may be a refresh like the 40 series, so early adopters don't get the best version. Plus Nvidia can get paid twice in the same generation.
You are asking too much of Nvidia, they do not care about their customers
There will be a 24 GB 5080 eventually. Nvidia is waiting for 3GB GDDR7 modules to be manufactured. It's probably going to come in a Super refresh at some point.
Doesn't work like that. It goes by Bus width and memory module size. Currently GDDR7 modules are capped at 2GB per module, and with the bus width of the 5080 that makes a maximum of 16GB. With 3GB modules that are expected sometime during 2025, it would make for 24GB. So expect the 5080 Ti/Super to be 24GB. Also the 5060 if it comes out before 3GB modules are available will be 8GB, or if it comes after with 3GB modules it will be 12GB.
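For anyone who wants to sanity-check that, here is the arithmetic as a small sketch (the bus widths and module sizes below are the rumored figures discussed in this thread, not confirmed specs):

```python
def vram_capacity_gb(bus_width_bits: int, module_size_gb: int) -> int:
    # Each GDDR module sits on its own 32-bit channel, so the module count
    # is bus width / 32, and total capacity is module count * module size.
    modules = bus_width_bits // 32
    return modules * module_size_gb

print(vram_capacity_gb(256, 2))  # rumored 5080: 256-bit bus, 2GB GDDR7 modules -> 16 GB
print(vram_capacity_gb(256, 3))  # same bus with future 3GB modules -> 24 GB
print(vram_capacity_gb(128, 2))  # a hypothetical 128-bit 5060 with 2GB modules -> 8 GB
print(vram_capacity_gb(128, 3))  # 128-bit bus with 3GB modules -> 12 GB
```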
Yeah but let them tell you GDDR7 is super expensive
literally "16 times the detail!"
Todd vindicated.
This is for anyone that upgrades: please don't do the stupid thing of handing money over for a 16GB card if you already have a 16GB card, like going from a 4080 to a 5080. That shit does not make sense.
What do I do, sitting on a 3090 😢😢😢😢. I want to upgrade too without losing vram, and the 4090 costs an arm and a leg.
If somebody has a 4080 why would they need to upgrade?
3080 User on a 1440p Ultrawide OLED display.
I need VRAM so desperately. 16GB isn’t enough for me to really get excited about a long term GPU replacement. I’ll wait for super/TIs with more VRAM I think at this stage. Fingers crossed the prices aren’t ludicrous.
@@nightlabinteractive8744 3090 still good. Don't upgrade till it dies.
@@nightlabinteractive8744 ti or super 5000 series will probably have good vram
As an RTX 2060 S owner, I have only been able to truly experience ray tracing in one game: Lego - Builder's Journey! LOL
So forgive me, but I am VERY sceptical about the first iteration of a new Nvidia technology that is garden walled to the 5000 series and will no doubt be used to push up prices.
I think I will wait and see what the 8800XT delivers, as at least AMD is known to support new advances on their older hardware.
Upscaling = imagined pixels
Framegen = imagined fps
Neural rendering = imagined games
That's the future. More content at breakneck speeds to keep money flowing, the human aspects of it all be damned.
Last one does not even make sense lol
In the future game developers would all lose their jobs as consumers can just use GPUs to generate games they want to play.
And people wonder why Asto Bot got GOTY....
Tell me you didn't watch the video without telling me....
not nearly enough exclamation marks in the title
Neural Rendering = the new BS jargon to create FOMO and justify high prices. Does anyone remember Simultaneous Multi-Projection?
the new Raytracing
DLSS is real, RT is real, SVR is real, hating on Nvidia and ignoring everything great they made is BS itself
i just want asynchronous reprojection to exist outside of vr software... please god just let me decouple mouse input from frame rate.
My thoughts exactly.
And now all AAA (maybe even AA) games will have mandatory RT.
5060 and 5070 dead on arrival 😂
source: trust me bro
Here we go boys, another "groundbreaking" rendering technology in its infancy that will be used to justify insane prices
and devs using it as a crutch so they can be lazy.
Only spec that matters is frametime.
@@johndank2209 Im not so sure these people are actually devs anymore.
I'm willing to bet they do minor coding and then send it off to nvidia to complete the rest.
Keep in mind i have no idea what i am talking about, but i do remember nvidia stating that they were involved with inserting dlss back during the 20 series launch.
@@Pwnag3Inc Gaming Companies are just relying on their interns lol
@@Pwnag3Inc yeah... you've got no clue what you're talking about LOL
I feel like Nvidia rolls out a new gimmick every five years or so; PhysX, Game Works, Ray Tracing, a handful of games end up using it, a few games use it well, then when AMD catches up Nvidia rolls out a new gimmick. If somebody bought the RTX 2080 Ti for the ray tracing performance, did they really get their money's worth?
Nvidia bought PhysX from Ageia (which had come from NovodeX), which had dedicated PPU cards, and turned it into CUDA.
@@UmmerFarooq-wx4yo Cool! Do most of Nvidia's users get anything out of the CUDA features? I thought that was mostly for productivity.
@@stalkholm5227 lol NVIDIA make more money from productivity users than gaming. Theres a reason they have 85% market share.
@@chy.0190 Fair. I guess I'm just not looking forward to seeing a whole subsection of "Neural rendering" benchmarks or whatever. For me, as a common consumer, they're gimmicks.
@@stalkholm5227 in my personal consumer opinion, they're gimmicks that get refined with time, never at launch, expect next gen of gpu to handle it """decently"""
Does neural rendering use more vram?
Yes, a lot more, to keep neural textures loaded/ready and to output them.
Neural compression uses a lot less for better quality output images than traditional compression algorithms.
Less
Seeing the TensorRT requirements and its neural capabilities, it takes a lot of vram just to render the task. This is me coming from various generative AI tools like Stable Diffusion; vram is a big limitation in this area. Even an RTX 4090 was having hiccups when rendering high res images with TensorRT. But all of that is done under PyTorch, and the new neural texture tech this video is talking about is far ahead and requires less vram. Maybe Nvidia finally got around the 8 gb vram issue by improving texture size and quality as a whole, since we are hearing rumours of the RTX 5060 having 8 gb vram. I hope this tech comes to previous GPUs like the 3000 & 4000 series as well, but that may be far-fetched since it may be limited to 5000 series GPUs for the whole FOMO tactics.
@@therealDL6769 there is zero indication it would use more vram. If you look into the white papers on neural compression, the assets compressed with neural compression are significantly smaller than with traditional compression algorithms, while having much better quality. It would lead to much smaller assets with similar quality to what we currently have, or similar sized assets to what we have now with much better quality.
5070 should have 16GB, 5080 should have 24GB and 5090 32GB!
wait for the Super! (I assume)
My 4080 has 16gb. Playing Space Marine 2 with the new texture pack uses more than 16GB. I have texture pop in and random hitches. For reference my 7900xtx with 24GB had 0 issue.
lol why would you downgrade VRAM?
@@GTduzIt Got a question: is that 4k or 2k?
My 7900XT 20Gb handles it with ease.
That’s crazy that it would even take up that much
Really? That's alarming. What resolution are you playing at?
I could bet my life on the fact that the RTX 5080 Ti Super will have 24 gb VRAM
16gb of vram on whats likely a 1200 dollar card. shameless.
The 2060 super had 8gb of ram, the 5060 having 8gb is ridiculous. How many jackets does Jensen need ffs?
Wait till you learn the RTX 2060 had 12gb........
Neural Rendering sounds like a new term that Nvidia just came up with that means rendering with AI enhanced features (like upscaling, frame-gen etc). They can do that now because devs have basically given up on working on anti-aliasing in their games and just rely on DLAA to clean things up using its AI. So games technically do look better with Nvidia's "neural rendering", but it is just due to devs no longer optimising anti-aliasing and leaving it up to the AI magic of DLAA to clean it up (which, to me, does look better than if you turn it off, but it doesn't look as good as old AA techniques used to, which managed to reduce aliasing but still gave us a sharp and detailed look).
It will be more than that. Eventually AI will be able to filter an entire scene with any aesthetic and graphic type, even hyper realistic textures etc, also think about the implications it can have for LOD and texture streaming in general. There are millions of ways to utilize AI in video games. It is in a rough state now but the rate of progression is outstanding, just look how far diffusion transformer models have come in such a short space of time. The future is very bright for video games and AI utilization.
You can still use old techniques by forcing them through the driver. If you want the best antialiasing, use supersampling: render your game at 8K and downsample it to 4K. The problem is that this only works for older games that are not demanding. For MSAA, forget it; it doesn't work with modern rendering techniques where light sampling and texture sampling are not at the same resolution or are done in a decoupled way with compute shaders, which is why it was replaced with TAA. Other techniques are usually worse so not worth it, like checkerboarding. DLSS is a solution for modern games that minimizes performance cost but retains most of the quality - an upgrade to TAA basically. What you are asking for (the best antialiasing techniques) will work with current games, yes, but only in a couple of hardware generations, like on a 7000 series, and it will cost pretty much what raytracing costs today. Not worth it most likely, and you will still be using DLAA because it is "good enough" and leaves enough budget for framerate and image quality.
Oh and don't fall into the "AI" thing. There isn't much AI in DLSS, it is just plain old machine learning (ML) that is just a subset of the artificial intelligence study. You can just take it as an adaptive algorithm, it doesn't really think, it uses it just for deducing where a pixel has likely moved or what would be the best color to fill in a missing pixel with, and with enough data it does so usually better than any hardcoded algorithm that a human can write. This is also why FSR 4 will go the machine learning route. At some level of complexity and time constraints humans are not able to come up with better algorithms.
Shit, I WISH that most games allowed DLAA. It's superior to TAA in every way when implemented well. Ghost of Tsushima with DLAA on? Man, literally no blurring or jagged edges at all, just a perfect image. Baldur's Gate 3 also practically requires DLAA to look good. Then you have games like The Last of Us Part 1 where the DLAA is TRASH and ghosting is everywhere. It's a shame that it varies so much.
Neural rendering isn't a new topic.
I expected the 5080 to have 16GB but I don't like it unless it's $800-900. Highly doubt that... I'll hold out for the Super/Ti and hope it has 20-24GB and isn't over $1200. If none of this comes true, I just won't upgrade.
💯💯👌
Same here, but man, that's just not gonna happen. This is the same Nvidia that priced the 3080 at $700 and then just one gen later priced the 4080 at $1200. I 100% expect the 5080 to start at $1500 minimum. Especially now that AMD isn't competing in the high-end _and_ a lot of people shelled out for the 9800X3D which is very bottlenecked by even a 4090. If the 5080 is a big enough improvement and lets the 9800X3D shine then people will be paying literally any price for it.
Don't buy the 16 GB model. This launch is coming before the availability of the 3 GB GDDR7 chips, and once those are available Nvidia will be able to sell 24 GB of VRAM on a 256-bit memory bus. 5080 Super in a year, if you can hold out with whatever card you've got right now.
Man, Im so hyped for 5090. Will be getting one day one. Hope not having a kidney is no problem.
Yeah like, I could care less what the 5090 news is.
Edit: lol be happy with your purchase. MOST people arent buying halo products. Get mad? Relax.
@@DM-dk7jsWhy comment then? No one cares that you don’t care.
Good luck on availability. Also launch models are never the best. I’m gonna wait a couple months and get a white Strix RTX 5090.
@@ZackSNetwork I care that he doesn't care.
@@ZackSNetwork I care than he doesn't care. Makes me feel better for also not caring.
Nvidia as a car company:
5060: subcompact with 120 hp, $25,000
5070: compact with 180 hp, $35,000
5080: midsize with 250 hp, $45,000
5090: luxury car with 1,000 hp, $250,000
You’re either getting an econobox or a super car, choose wisely!
It just seems like they are ignoring a HUGE “mid-range” segment of the market but maybe this is just how monopolists act?
madmike just leaked the 50 series prices 😯
5080 needs to be priced a bit closer to the 5090. Thats how they get you
They are not ignoring anything, they are purposefully baiting you into buying the flagship instead of getting the 80.
I mean, look at it. It is a 5070 Ti at best, relative in size to the flagship. You are getting shafted on purpose so that your thought process is to think you are better off forking over the extra money to get the 90.
Yes, it's called capitalism and free markets. It's not how monopolies act in particular; this kind of segmenting and price tier strategy you can find in fast food brands as well. Don't like it? Don't buy it. Complaining is like asking for political power in Russia or to legally start a religion in North Korea, it just reveals you don't understand what world you live in, your own naivety. If people stop buying their flagships enough to make it less profitable, they will adjust. If people are buying Nvidia GPUs like fresh bread, it means there is lots of demand and the trend continues. All businesses are created for profit, not charity, and corporations have lots of resources to maximize profit. The market is always right in neoliberal capitalism. Don't like it? Don't buy it, or move to a place where they have Nvidia-grade flagship GPUs at $100. Oh, that place doesn't exist... welcome to reality.
The problem with getting an 80-class vs. a 90-class RTX GPU nowadays is that it's a very hard upsell to the more expensive 90-class card. From everything I've seen of the leaked specs, the RTX 5080 will have half the hardware of the 5090 (21760 vs. 10752 CUDA cores, 512-bit vs. 256-bit memory bus, and 32GB vs. 16GB VRAM). While that won't translate to half the performance, using only half as much material hardware won't really translate to half the price of the more expensive 5090 either. They'll be priced close enough to each other that people looking to spend that much money on a 5080 will have to consider just saving up or sinking more money into a 5090. It's just Nvidia being the same scummy corporation we all know it is.
I think im going to just enjoy AD102 for another XMAS and watch you guys try to beat the scalper bots for 5090 and 5080
The 5000 series could make me coffee for all I care. Not a chance in hell I'm spending a premium for a single PC part.
After 8 years of not upgrading i think i will spend €3000 on a new gpu. Nvidia has made me used to these extreme prices sadly.
I am a single 33 year old who's never owned a car, living in a rental place, and my entire net worth is barely 30k............ 😨😰😭 goodbye 10% of my entire net worth
I've come to realize that Nvidia is just ahead of everyone complaining in here.
I used to be one of them.
- Everyone complained about DLSS. However, it effectively ended SLI and Crossfire, since it simulates higher graphics at a disproportionately small cost.
- People complained about FrameGen. Then, once it became accessible thanks to AMD, you realized the complaining was one big exaggeration... once again.
- Now it is DLSS for VRAM, and the complainers will complain again.
You guys never learn. Nvidia is like the Nikocado of the GPU industry. Two steps ahead.
5080 super 20GB vram, 5070 Ti super 20/18GB, 5070 super 16/18GB. 5090 Ti Super Saiyan Ti Super Saiyan 9999GB.
Let's get a bit real on NVIDIA. We've seen the playbook, time and time again. Technologies are barely useful when first released. Ray tracing. Remember that? The most amazing technology used to sell the 20 series. Was it useful? No. It was nonsense. And even by the time the 40 series came around, most of the lineup didn't even have enough VRAM to utilise it, and when it did, it massively tanked performance to the point that most gamers just didn't use it. And DLSS. How good was that in the first iteration? It wasn't. It was embarrassing. Frame generation - woeful. Huge number of artefacts. These are all technologies in response to poor ray tracing performance in games, or gimped performance specs constantly released and hyped by NVIDIA but MASSIVELY underperforming. New shiny marketing gimmicks. Now we hear the 50 series will have 8GB of RAM, until you get to the 5070 Ti. Are you joking???? And the prices again will be through the roof, delivering sweet FA value to consumers. Don't get sucked in by the hype train for what will be another stream of overpriced, underdelivering products. There is no fear of missing out.
lulz
Can't fear missing out when it gets scalped
5060 TI is 16gb and 4070 is 12.
GPUs are not being built just for gamers. Sure, the 20 series was nonsense for gamers, but for semi-pros that work with RT it was a major improvement: a single GPU was able to do real-time ray tracing at what gamers would call "playable FPS", whereas past GPUs could only do it at single-digit FPS. And just look at the gaming market. Last quarter AMD only made $462 million, and that already includes sales from console chips. Nvidia, in comparison, made $3.3 billion. Nvidia is able to get that kind of revenue for its gaming division because many of the buyers of those GeForce cards are people/companies that use the GPU for their work. AMD decided to go UDNA because of things like this. Gamers no longer drive the GPU market.
Who are you talking to and why? This is literal gaslighting 101. You realize how ridiculous it is? It's like trying to convince a gay guy that he should not date other guys, and that he should instead try to date a girl because... Why do you care what other people do with their money and freedom? You don't like Nvidia? Cool. Invest in AMD stock, short Nvidia, or start your own GPU pipeline. Good luck! If people decide to throw money at a product it's their choice; it's a free market in a free world, nobody is forcing anybody to consume x or y.
Bruh 5080 with 16gb is sucks 😞
Is sucks
Dang the 5090 is looking like an absolute Beast of a card if these specs are true. If thats the case I want it but I know my wallet will be crying lol. Tnx for the hardware news man.
woah, with graphics cards this crazy i cant wait to get 60 fps with DLSS on at 1080p in an Unreal Engine 5 game next year, and it'll only cost me both of my kidneys and my wife
Maybe upgrade your CPU if that's your experience. I easily get over 60 FPS in UE5 games @ 4K.
@@DBTHEPLUG DLSS performance isn't 4K
@kenshinhimura6280 I could turn DLSS off and achieve that. Depends on the game and graphics settings used.
You must be not-so-smart to not be utilizing DLSS tho. The difference between DLSS and native res is negligible.
By that logic, you might as well turn on Ray Tracing if you're willing to lose out on 50+% of your FPS because of a minor visual difference.
I hope they make a 5080 Ti or Super with 20/24gb
Well, tbh the gap between the 5090 and the 5080 is huge enough for a shiny 5080 Ti or a 5080 Super to fill in, or both, but you're looking at 2026 for that.
@ I really hate Nvidia tbh, they're overpriced and greedy with the VRAM. Nvidia vs AMD is just Apple vs Samsung.
Once again here to request a look at the history of generational GPU price:performance improvement over the years 🙏
"Do you guys put neural in front of every words?" -Scott lang probably
So this is what the Witcher 4 cutscene ran on.
It wasn't a cutscene, silly. It was a pre-rendered cinematic.
@@JynxedKoma Yes, but the assets were from the game itself. The developer also said they aim to make the game look like that in real time. It might be possible to run it in real time on a 5090, but with all of the post FX it wasn't feasible to render in real time, which is why it was pre-rendered. Anyway, by the time The Witcher 4 releases we will have a 6090 or 7090 out.
@@John4343sh lol aiming to run it on a 5090 or even 6090
Worst part is that it might be the case given that they are working with UE5
Advances in technology mean Ciri is ugly in 8k and nonsensically a Witcher.
@Moonmonkian Weak males worry about the beauty of video game characters.
You wouldn't complain if you had a nice-looking female living with you.
B580 sold out everywhere.... Might want to move up the release of the B570. There is a demand, intel could fill it.
Intel is gonna be bankrupt before 2026, so I wouldn't buy a card that will have literally 0 support in 9 months.
Government bailout coming. It's all good.
@@SeasoningTheObese bro can travel in time to tell that
Adding the Nvidia research sources was so helpful
Nap gang!!
The Neural Texture path is not cumulative in terms of frame time. The figure shown is the time it takes to execute/decompress/reconstruct, but that work can run in parallel with other tasks. So in practice, unless your entire GPU frame time is under about 1 ms, you won't see a deficit.
(The novelty, the way I read and understood it, is that instead of block-compressing each mipmap you compress the whole texture as one unit, mips included, and instead of the standard BC algorithm you use a small neural network specific to each texture, effectively a unique algorithm per texture type.)
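To make that concrete, here is a rough sketch of what such a per-texture neural decoder could look like (a toy in Python/NumPy; the latent grid size, layer widths, and function names are my own assumptions, not Nvidia's actual NTC implementation): the whole texture and its mip chain are stored as a small latent grid plus a few kilobytes of MLP weights, and texels are reconstructed on demand.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-texture decoder: a compact latent grid plus a tiny MLP
# stands in for the block-compressed texture and its whole mip chain.
# Grid size and layer widths here are illustrative assumptions only.
latent_grid = rng.standard_normal((64, 64, 8)).astype(np.float32)  # "learned" latents
W1 = rng.standard_normal((8 + 3, 32)).astype(np.float32)           # "learned" weights
W2 = rng.standard_normal((32, 4)).astype(np.float32)               # RGBA output head

def sample_latents(u: float, v: float) -> np.ndarray:
    """Nearest-neighbour fetch from the latent grid (a real system would filter)."""
    x = min(int(u * latent_grid.shape[1]), latent_grid.shape[1] - 1)
    y = min(int(v * latent_grid.shape[0]), latent_grid.shape[0] - 1)
    return latent_grid[y, x]

def decode_texel(u: float, v: float, mip: float) -> np.ndarray:
    """Reconstruct one RGBA texel at (u, v) for a given mip level, on demand."""
    features = np.concatenate([sample_latents(u, v), [u, v, mip]])
    hidden = np.maximum(features @ W1, 0.0)   # one small ReLU layer
    return hidden @ W2                        # untrained weights, so this prints noise

print(decode_texel(0.25, 0.75, mip=2.0))
```

The point is only the data layout: per-texture storage is the latent grid plus a few kilobytes of weights, amortised over the whole mip chain, and the decode work can be scheduled to overlap other GPU tasks rather than stacking onto the frame time.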
24:40 Those memory speeds refer to the per-pin data rate; you have to multiply by the memory bus width to get the card's actual memory bandwidth. So the 5090 still has almost twice as much memory bandwidth, because its memory bus is twice as wide.
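As a worked example of that multiplication (the data rates and bus widths below are the leaked/rumored figures, so treat them as assumptions):

```python
def memory_bandwidth_gb_s(data_rate_gbps_per_pin: float, bus_width_bits: int) -> float:
    """Effective bandwidth in GB/s = per-pin data rate (Gbps) * bus width (bits) / 8 bits per byte."""
    return data_rate_gbps_per_pin * bus_width_bits / 8

# Rumored figures, purely for illustration:
print(memory_bandwidth_gb_s(28, 512))  # "5090": 1792.0 GB/s
print(memory_bandwidth_gb_s(30, 256))  # "5080":  960.0 GB/s
```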
5080: half the CUDA cores, half the VRAM, NOT half the price (I bet). I would say the launch 5080 will have 16GB, but I would put money on the refresh (i.e. the Super or Ti) having more.
Neural Rendering: Locked to 50 series!
That 4090? Worthless, suckers!
More proprietary, locked-down features that take game rendering further away from the standardization of DirectX and lock it behind a vendor's paywall. Upscaling was OK when it wasn't required, but this is going to make games actually look better only on Nvidia cards (maybe only on 50 series cards). That shouldn't be acceptable or allowed, since it's extremely anti-competitive.
It's Frame Gen again
@@ryanspencer6778 God, people never stop complaining about advancements. Nvidia leads the way, others follow. That's just the way it is now. It's no different from years ago, when a new DirectX version would introduce features that didn't work on older cards; at one stage that happened nearly every generation.
Don't think it will be locked to 50 series.. the demo here is running on a 4090.
@mojojojo6292 The difference is that DirectX is a standard that everyone can choose to support. This is only for the cards Nvidia chooses to allow it on, and then it's up to AMD and Intel to develop their own versions and get games to adopt them. It should be part of DirectX, but Microsoft has been happy to let Nvidia grow its monopoly for years now.
From what I understand of neural rendering, in as close to layman's terms as I can put it: the goal is to take all the traditionally separate serial (or even parallel) rendering layers or passes, such as the G-buffer (color, normal maps, mipmaps, anisotropic filtering, roughness), vertex work (geometry, position, geometry shaders, tessellation, z-buffer), the D-buffer (decals), lighting (alpha, reflections, and light), and possibly post-processing, pass all of those as inputs to a neural network, and have that neural "material" be the only rendering pass the path tracer interacts with.
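A toy version of that idea (my own simplification in Python/NumPy, not Nvidia's pipeline): the per-hit inputs that those passes would normally produce are packed into one feature vector, and a small, here untrained, MLP returns the shaded result, so the path tracer only ever queries one learned "material".

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "neural material": pack the per-hit inputs that separate passes would
# normally produce into one feature vector and let a small (untrained here)
# MLP return outgoing radiance. Layer widths are arbitrary assumptions.
W1 = rng.standard_normal((13, 64)).astype(np.float32)
W2 = rng.standard_normal((64, 3)).astype(np.float32)   # RGB radiance

def neural_material(albedo, normal, roughness, view_dir, light_dir):
    """One learned shading query per path-tracer hit, returning RGB."""
    features = np.concatenate([albedo, normal, [roughness], view_dir, light_dir])
    hidden = np.maximum(features @ W1, 0.0)
    return hidden @ W2

rgb = neural_material(
    albedo=np.array([0.8, 0.2, 0.2], np.float32),
    normal=np.array([0.0, 1.0, 0.0], np.float32),
    roughness=0.4,
    view_dir=np.array([0.0, 0.7, 0.7], np.float32),
    light_dir=np.array([0.5, 0.5, 0.0], np.float32),
)
print(rgb)
```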
Sixteen times the texels! - Nvodd Hovidia
The biggest question is whether neural rendering will be supported on the Nvidia RTX 4000 or 3000 series, because not everyone will have a shiny new GPU.
For the 3000 series I don't think so; the 4000 series, probably.
I don't really get who the 5080 is for. They themselves said the xx70/ti series is geared towards gamers, and the xx90 is for the highest end production/AI loads. Genuine question, is there some kind of task that would benefit from the cores and the bus, while not really being limited by the 16gb vram? It feels like they're building it for something, but given it's probably $1200, I don't know who that is.
Yeah, the 60- and 80-class cards have basically been getting messed up since the 3000 series.
80 Series has just been there to upsell the 90s for two gens now.
'only' 16GB? Are people expecting 24GB from their last flagship release? Good joke. 16GB is more than enough for most people if you're just dealing with games.
Some games at high settings and high resolutions use 16 or more gigabytes of VRAM.
Not all, but the fact that you'd be paying for, let's face it, an entirely overpriced card and still getting gimped performance because of a lack of VRAM that other, cheaper cards have more of is an absurdity.
Also, yes, people should expect 24GB of VRAM from Nvidia, because their competitor had/has that amount for cheaper.
Daang, these leaks are sweet. You have really filled in the void ever since AdoredTV went on hiatus.
Those frame-time costs do add to the length of the path-tracing pass itself. Maybe not 100% of it, but it adds a performance cost nonetheless. There is no free lunch.
Looks like I'm keeping my 2060 into the 6000 series! 16GB on an 80-class card in 2025 is silly.
At that point upgrade to AMD, bro. I'm pretty sure even the Intel B580 performs better.
Nvidia really sold me on buying a console. I don't feel like paying $2000 for awful ports.
Is it too much to ask for better power efficiency? I don't want to be spending an extra $200 a month just because my GPU consumes more power than my ex's weight.
Undervolt your gpu. Or buy a less powerful one. NVIDIA gpus are already pretty efficient as they are.
Nvidia GPUs are more power efficient than AMD GPUs... Power efficiency was never the issue for Nvidia; as noted above, they're already very efficient for what they are.
Power efficiency isn't really related to total power consumption.
@@slopedarmor Power efficiency and total power consumption are closely related concepts. Higher power efficiency means less energy is wasted, which directly reduces total power consumption for a given performance level. Conversely, lower efficiency increases power consumption because more energy is wasted as heat or other losses.
@aflyingcowboy31 If they improved power efficiency by 2x, the 5090 would still consume 600 watts or whatever it's going to be. It would just perform 2x.
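A quick way to see both points at once (the numbers are made up for illustration): efficiency here is performance per watt, so a 2x efficiency gain can be spent on 2x performance at the same power target or on the same performance at half the power; it doesn't by itself lower the power target the vendor picks.

```python
def performance(fps_per_watt: float, power_watts: float) -> float:
    """Toy model: performance = efficiency * power budget."""
    return fps_per_watt * power_watts

print(performance(0.2, 600))   # baseline: 120 "fps" at 600 W
print(performance(0.4, 600))   # 2x efficiency, same 600 W power target: 240 "fps"
print(120 / 0.4)               # 2x efficiency, same 120 "fps": only 300 W needed
```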
protest the 5080!
boycott Nvidia
As always, awesome content! Thanks Dan!
As someone with a 3080, the 5080 should have more VRAM than 16gb
I would guess this neural compression would save massive amounts of memory and memory bandwidth on the GPU, which would be huge for reducing costs while also greatly improving visuals. There are always trade-offs, but better compression is always welcome and can have a big impact.
Time to support Intel, you say? The 4080 may be the last Nvidia card I purchase... Tired of Nvidia. I'll play around with the B580 for a bit on my son's new build. Up yours team Green.
Given the current year and the expansion of AI, a realistic pro-consumer memory setup would have been:
5070 - 24GB
5070 Ti - 32GB
5080 - 48GB
5090 - 64GB
Don't fall for the assumption that small increments are acceptable and only ask for an extra 4GB. These GPUs are designed to be outdated very soon. Even if you don't care about AI and just want to game, strong limits on consumer chips will only push for more subscription and cloud services for everything with less data ownership for the user. The last thing you should want as a user is for developers to implement more features that require online services. Everyone should care about the gap between consumer and professional chips getting bigger. If they get to charge the excessive price anyway, they need to deliver on the product.
The era of facetime with your match to see if she is real / not a catfish is over. The era of wearing VR goggles to meet your match to go on date with 1996 Adrianna Lima has arrived.
I wonder if this is related to the neural radiance cache that NVIDIA researchers put out papers on a few years ago.
I bet the first game to showcase the neural rendering will be Cyberpunk 2077 again. We know CDPR already got their hands on a 5090 through the Witcher 4 trailer.
CDPR announced that they won't be showcasing this tech in Cyberpunk 2077. That's officially a last-gen game now.
Question is: Will older RTX GPUs be able to handle neural rendering?
The answer is probably yes, but who knows what Nvidia is gonna try to make us believe...
If they have Tensor cores, it should be possible. Up to Nvidia to support or not though.
A 20GB 5080 minimum would've been awesome, but oh well. I'm jumping ship to AMD next gen. I genuinely can't put my money behind Nvidia anymore.
AMD will never compete with Nvidia because both companies are owned by the same family.
@ That doesn’t really matter to me. I’d rather get more VRAM for my buck.
20GB doesn't make sense if you know anything about how current VRAM chips and bus widths are manufactured.
Even though the 5090 is overkill and my 4090 is doing just fine, I'm still upgrading. YOLO.
The CUDA core count difference between the 5080 and 5090 is ludicrous, and the VRAM too.
The neural rendering stuff seems neat, but I wonder how it would be implemented. I'm guessing it would need to be worked into the game engine somehow, and if so it could be many years until we see it.
I'm OK with the 5080 only having 16GB of VRAM so long as the price reflects the fact that we're only getting 16GB of VRAM, lol. That said, if the 5060 Ti has 16GB, it seems silly not to launch the 5080 with more.
"So long as the price reflects that" It absolutely won't. You'll have inflation, tax, the Nvidia tip, the Nvidia tax, and $200 over MSRP by default with every model.
@SeasoningTheObese Oh, and at least a 10% extra import tax if you're in the USA.
This looks exactly like Nvidia using AI, again, to tell consumers they don't need more VRAM because of some fancy trick. It has to be coded into the game, so this does not make an 8GB 5060 any less of a POS.
Don't worry, next-gen Witcher 3 with an HD texture mod only uses 19GB of VRAM.
Don't worry, the Avatar game at max settings in 4K only uses 22GB of VRAM.
Don't worry, some graphics options in Indiana Jones don't even appear if you don't have 12GB+ of VRAM.
Don't worry, you can run full ray tracing in Indiana Jones with a 12GB card, but you'll have to run medium settings to prevent VRAM overflow.
It'll also be proprietary and games that lean on it for performance will be utterly unplayable with it turned off. Nvidia is a pox on the industry.
I remember when people laughed at people saying Nvidia would incrementally increase the price of the 80/90 products. Not so funny now lmfao.
Regarding NVIDIA's neural rendering technique, a good channel for following its progress is Two Minute Papers. The specifics of the paper "Real-Time Neural Appearance Models" (yes, that's the original title for this technique) are discussed in the videos "NVIDIA's AI Learned On 40,000,000,000 Materials!" (October 15, 2023) and "NVIDIA's New AI: Gaming Supercharged!" (October 29, 2023).
As for neural texture materials, the earliest discussion on the same channel was around "High-Resolution Neural Texture Synthesis" (January 18, 2018).
Can't wait for AMD to announce their own neural rendering a few days after Nvidia announces this. Same as with everything else before it: upscaling, ray tracing, frame generation, etc. AMD only ever follows Nvidia's lead. Without Nvidia, the gaming world would still be stuck with just rasterization, with no upscaling, ray tracing, or frame generation. lol
Meanwhile, when we had 'just rasterization' we had games performing adequately at native resolution, and optimization wasn't going to crap because devs were crutching on RTGI because it's faster for THEM to implement, despite being vastly more costly for the consumer and usually not even bringing a major visual uplift. Frame generation is practically pointless: it means NOTHING when the only cards fast enough for it not to introduce ruinous amounts of latency are the upper-end cards that shouldn't need imaginary frames in the first place.
Oh, and we had games that weren't smeared with vaseline and loaded with ghosting.
God forbid we talk about how upscaling has been crutched on to push graphics that hardware can't run, leaving us with an OBJECTIVELY blurry mess compared to old-school rendering. TAA also causes a lot of blur, but god forbid devs build their pipeline around graphics techniques that don't rely on GUESSING WHERE THE PIXELS GO.
Hey hey hey, relax dude. I paid $800 for the ai motion blur, so I'm gonna use the whole ai motion blur.
If I had to guess what neural compression is doing, I'd say it's probably reconstructing detail with AI, like how DLSS reconstructs detail for upscaling. The neural rendering also seems to be doing something similar, except it's reconstructing shading behaviour, i.e. the way light interacts with the surface of a material type.
If I had to put my money on it, the first is for general DLSS upscaling, and the second is probably for DLSS Ray Reconstruction.
The conversation about "neurons" vs quality touches on a topic called "quantization." Simply put, it lets an AI model run at lower numerical precision while using less memory. With GDDR7 and a lower memory cost for AI textures, you likely won't need all of the memory you're clamoring for.
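A minimal sketch of the memory side of that (symmetric int8 quantization, one common scheme; the array size is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
weights_fp32 = rng.standard_normal(1_000_000).astype(np.float32)   # ~4 MB of weights

# Symmetric int8 quantization: store one scale plus 1 byte per weight (~1 MB).
scale = np.abs(weights_fp32).max() / 127.0
weights_int8 = np.round(weights_fp32 / scale).astype(np.int8)
dequantized = weights_int8.astype(np.float32) * scale               # lossy reconstruction

print(weights_fp32.nbytes, weights_int8.nbytes)          # 4000000 vs 1000000 bytes
print(np.abs(weights_fp32 - dequantized).max())          # small per-weight error
```

Same weights, a quarter of the memory, at the cost of a bounded per-weight error; that trade-off is the "neurons vs quality" dial.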
How do you clean those masks when they get dirty/smelly? Do they come apart in a way that lets you wash the fabric parts?
Same here. A 16GB 5080? So I guess I'm getting a 5070 Ti with 16GB. Or a 5070 with 16GB.
If that's on a 4090, I'm guessing the render time will be much higher on lower-tier cards because they lack processing power, and those are exactly the cards that need this compression because of their small VRAM buffers. It will be interesting to see how they pull this off.
Oh boy, maybe slow down FPS so you have to spend more for DLSS, like ray tracing did. Plain raster at native is always king.
When asked about the neural rendering, the Nvidia rep responded, "My CPU is a neural net processor; a learning computer."
I'm not sure about the rest, but the only reason to buy a 5090 is if you're into high-end PCVR or handling AI workloads. I believe a 4090 or a 5080 can do 4K 144Hz with DLSS. So let us VR users have the 5090, damnit. I can't wait to be fully immersed with my Pimax Crystal or 8KX at some crazy 5K upsampled resolution per eye at 120 FPS.
Power naps the life saver.
I'm using a 4080S with 16GB. The people getting upset at 16GB are being completely unreasonable.
Hah, you can't even run The Witcher 3 with the HD texture pack mod.
You must be clueless then.
Laughs in 20GB 7900xt
@@TH-camAccount-i4m actually, I sold a 7900XTX to buy my 4080S, because the 7900XTX just felt a generation behind. Poor upscaling, RT, drivers, etc. Now I can run path tracing and it's incredible.
Every GB of RAM over what you use is completely worthless.
@@slopedarmor How are you doing with path tracing, which is actually noticeable in gameplay?
So cutscene-level graphics, but in real-time gameplay??
5090s and 5080s will sell out in minutes and go right back up on Newegg and Amazon at inflated scalper prices well above MSRP. And... people will still buy them due to FOMO. Hell, I saw a guy on Reddit buy a 4080 Super OC for $1600 off Amazon. Like, why? Nvidia has a monopoly on the high-end market, and they're not stupid. They know everyone will still buy them regardless of price due to no competition.
The only alternative at the moment is AMD, which has a dogshit upscaler and no ray tracing performance, while Nvidia already has frame gen with DLSS 3.
Only now is Intel sort of catching up. But it'll still be 1-2 years before the cards have ample supply and XeSS and their frame gen get put into a lot of games. Intel also actually had the forethought to put RT cores in their cards so they can compete with Nvidia's ray tracing.
@@ambientlightofdarknesss4245 Meanwhile, people LOVE to ignore that the reason AMD can't compete is that nobody puts any effort into supporting the non-Nvidia-biased methods, leaving AMD to brute-force RT while devs build their implementations entirely around Nvidia's architecture, and the upscaler gets no real improvement because NO ONE IS WILLING to actually implement it and WORK ON IT. PRECISELY because of attitudes like yours.
'Oh, there's a platform-agnostic option? Let's not use it, and talk it down, because a hardware-locked version is "better".' Idiots.
My theory is that Nvidia is the one buying their own cards and relisting them for more money. That's why they don't stop it when they could! It should honestly be illegal.
Neural rendering is amazing; I didn't expect to see it so soon. I'm guessing, from the way they put it, that the RT cores and Tensor cores can't really talk to each other that easily on the current gen, so maybe that's a new architecture feature? My biggest concern is that this won't be transparent: developers will have to implement it, which means a looong a** time before we actually see it in games. I hope I'm wrong.
Is it like AI baking on the go?
@@UmmerFarooq-wx4yo Yes, kind of. From my understanding, neural rendering uses AI to either completely render the game image or parts of it, like objects, global illumination, shadows, etc., giving a realistic, high-quality look while being a lot more performant than rendering those things conventionally on the GPU. It won't be like the AI models that frame generation or DLSS use to help the GPU run the game; instead it fully takes over rendering certain things and gives the GPU back some performance.
Gwent 2 is back? Let’s go!
"16x the texels!" ~ Ted Howa-... err Jensen Huang
Finally a decent application of "AI" neural networks for something actually useful, non-intrusive, and beneficial for the gaming industry. It's the first time I'm excited about anything regarding what we today call "AI" in video game graphics. The rest is not my cup of tea, meaning DLSS and FG (interpolation).
Strix Halo looks legitimately exciting, and 40 CUs is actually insane!!
Nvidia, knock it off with only giving us 16GB VRAM on a fucking RTX 5080, come on!
All in all, I'm excited for next year - it's gonna be so good! So many great developments in tech all at once.
"Neural rendering" sounds like will need a lot of VRAM. If you want some decent "neural rendering" performance, you'll need RTX 5090. Again: NVIDIA creates a problem and then sell the solution.
Looks awesome but I’d prefer neural improved physics or destruction
Neural Rendering is early, and very cool.
A product heavily reliant on AI tools like DLSS is like a car without an engine.
I'm still on a 1070 Ti 8GB, waiting for something worth my money so I can enjoy all these new games, but so far nothing in the 30, 40, and now seemingly the 50 series cares enough to offer decent VRAM without spending as much as my current system on just a GPU. I'm going to see what AMD offers for RT performance this time around, because I know they always have decent VRAM.
We all know the 5080 should have around 20GB, the 5070 should have 16-18GB, and the 5060 should have at least 12GB (maybe 16GB on a narrower bus).
The 4070 Ti Super was on a good sale for $720-740 this Black Friday; it was a good pick for getting 16GB of VRAM and being able to use DLSS 3.
So the neural rendering is a thing only for content creators? right? RIGHT?
Is only Nvidia going to have neural rendering for the GeForce RTX 5000 series, or is AMD also going to have neural rendering on the Radeon RX 8000 series?
ChatGPT says this is what the website breaks down to, as well as why Nvidia possibly isn't increasing the VRAM. Here's its simplified summary of the NVIDIA Neural Texture Compression (NTC) page:
NVIDIA Neural Texture Compression (NTC)
NTC is a new way to compress textures (the images or material data used in 3D graphics). It uses AI to compress textures much more efficiently than older methods while keeping high visual quality.
Why it's important:
1. Smaller sizes: graphics assets take up less memory and storage space.
2. Improved visuals: quality holds up even at high compression ratios.
3. Compatibility: works with modern graphics pipelines.
Key features:
- Optimized for gaming and real-time applications.
- Reduces memory bottlenecks, especially for demanding workloads like VR or high-resolution textures.
- Uses neural networks to predict and reconstruct textures from less data.
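To put the "smaller sizes" point in rough numbers (BC7 really is 8 bits per texel; the "neural" rate below is just a placeholder assumption, not a published NTC figure):

```python
def texture_memory_mb(width: int, height: int, bits_per_texel: float,
                      mip_factor: float = 4 / 3) -> float:
    """Approximate size of a texture plus its full mip chain (mips add ~1/3)."""
    return width * height * bits_per_texel * mip_factor / 8 / 1e6

print(texture_memory_mb(4096, 4096, 8.0))   # ~22.4 MB for a 4K BC7 texture + mips
print(texture_memory_mb(4096, 4096, 1.0))   # ~2.8 MB at a hypothetical 1 bit/texel
```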
An RTX 5080 with 16GB is pretty good in my opinion; it can still run many applications and games.
But an RTX 5060 with 8GB? 🤡
And launching at $320+ 💩💩 (the price of the RTX 4060 when it first released).
We already have AI designing suspension arms for minimal weight and maximum strength.
We can expect minimal meshes for maximum detail.
Probably.
Just picked up a Hellhound 7800 XT 16GB for $499. I got it before the 2025 trade wars.
If neural texture compression becomes a thing, Nvidia could save the 8GB RTX 2000/3000/4000 cards.