Best gaming channel by far! No talking heads blabbering about stuff they don’t understand or haven’t actually played. You get an honest educated opinion from people passionate about video games and associated technology.
i agree. nobody shouting and screaming at you as soon as you click on the link, the youtuber is literally spitting on there camera just because there too loud and obnoxious. digital foundry really know there numbers and tech, and best of all there calm while explaining things. a bit like a wise college tutor lol
There really isn't a tech channel I enjoy watching more. They don't blast you about subscribers, they don't irritate us with ads or try to sell us anything. It's literally the exact reason I did subscribe and make sure to like all videos and turn on my notifications for them exclusively. No other channel will I do that for. They don't have to sell me anything or ask me throughout the entire video. It's at the end, it's in good taste, and I'll be damned sure to continually follow/support them.
@@hldemi agreed, I'm simply happy that a $500 card (3070) provides similar or better performance than last years $1200 (2080ti).A 66% drop in price . Now thats something !
Yeah, it's really cool, but it's not really news.. Microsoft said they would bring this to PC with DirectStorage. This is just Nvidia's implementation of that.
@@Sythern0 Yeah, Nvidia is working alongside with Microsoft's DirectStorage. I'm just happy this is being address now and not farther down in the future.
@@Sythern0 direct storage is just the API that trys to go straight to the GPU, rtx io is hardware decompression on the GPU. They pretty much need each other
@@doctorsilva1345 I'll be waiting to get a PS5 for some good exclusives to launch. If I upgrade my PC I get all the games I can think of at 4k 60fps Raytraced Ultra Settings. If I get a PS5 I get a couple multiplat games I could play on PC at much higher settings with better performance. Not worth it.
@@johnames6430 God of War 2. New Uncharted. A real Spiderman sequel that's not just a standalone DLC. Or any new games. Can't know what you're looking for when it doesn't exist, can you?
I just noticed they used ThePruld's video for the "content creation" category 3:41 Our goofy darksouls boi made it *Editing to add: Adam Sessler at 36:29 makes me the big happy
I'm so jealous that you guys get to play with these cards already! Couldn't think of anybody better to get one early though. Keep up the great work guys
Any recommendations for a new PC user? Best components? I'm planning to use RTX 3080 GC since thats new best in the market rn people sayin. So what should I get no matter the price? And what monitor + Mouse and keyboard plays best
to be honest, i swear i got super hyped for rtx 3080, still getting my PS5 this year BUT i might get an rtx first and probably wait for next year for my console
True. Playstation first-party games are awesome, no matter how good the RTX 3080 is. I only have the money for the GPU unfortunately so i'll be getting the 3080.
My exact thoughts!! First time ever I will miss a launch, but I think it will be more beneficial in the long run. As I'm not the "paid for beta experience" guy once again...
I'm really happy about RTX IO coming to Turing. I've been wondering that since the presentation. Once it starts getting implemented it's gonna be huge in my opinion. Thanks to RTX IO and DLSS Turing cards could stay relevant a lot longer than they otherwise would have
The IO stuff will only improve loading times and data streaming and as far as i know game need to actually support that. Improvement here really depends on application, also on PC case this will have transition period while consoles with that technology can fully utilize that from get go (if game is not in last generation).
@@ShadowriverUB I said "once it starts getting implemented", didn't I? ;) As for a transition period period I don't think that's such a huge deal. Just like higher resolution textures in current console games that aren't necessarily used the new PC games could have the new technology only on supported hardware. There's no reason for it to not simply be enabled by a switch in graphics settings.
@Clau007 DirectStorage does not need PCIe 4. Xbox Series X uses DirectStorage, and it doesn't even have PCIe 4. Turing cards support RTX IO/DirectStorage and none of those have PCIe 4.
Let’s hope they can manufacture enough. At that price point they’ll will sellout within seconds of launch and you’ll end up paying $1000 on eBay or waiting till next year to get one.
These are scared prices... Nvidia is a greedy corporation, why all of the sudden are they keeping prices the same as last gen. They are anticipating AMD prices so these prices keep Nvidia competitive. Think about it, if NVidia knows it has AMD beat, why not skim more off the top of its customer? They did it last few generations.
It’s always going to be in a good position because it’s modular. The *only* handicap PC has is I/O at the moment, and it will be sorted relatively quickly too. Nvidia is already trying to address it, others will follow suit because performance = $$$. I’ve been PC gaming for decades and I haven’t been this excited about new hardware in a long time. And I’m such a cynic!!
Mine, too. Can't wait to see what AMD has to show, but it's going to be a tough sell. Despite all the leak hype, I wonder if Big Navi will just end up another mid range card.
@@Sh-hg8kf the rx 580 is a great card! Specially for a first build, you won't be disapointed, just make sure you have a nice CPU to acompany it, ok? Try and get the 8gb if you can aswell, but the 4gb model is fine if that's what you have in store.
Yes but not the newest games. They might release the games after 2-4 years after they have launched on ps5. And they will still keep their most precious exclusive on playstation
@@plasmic4727 true but not that big of a deal to me if I have never played the game before. I still had a great time playing GTA V for example on PC like 2 years after the PS3/X360 version came out.
"Where does nvidia go from there".. well in my mind we still need a ti card because the 3080 only has 10gb vram. For many high end enthusiasts or developers, the 1080ti was a great card because of the extra vram, even 1gb more than the 3080! So the extra vram on the 3090 is very attractive.
Yeah I'm having a pretty hard time deciding between 3080/3090. 3080 + upgrading my (garbage 24") secondary monitor is the smarter option, but I'm developing purely path traced stuff (also VR path traced stuff) and I have the need for speed. Losing 1GB likely won't matter, but after a 1080 ti & 2080 ti it feels a bit weird. Extra memory bandwidth on the 3090 also very attractive.
For developers sure, that makes sense, but for gamers, even high end gamers... I can't recall a single game out there that uses 10GB of VRAM at 4K, let alone more than 10GB.
You guys are great! I love Digital Foundry. The back and forth talking between colleagues who you can tell are also friends is soothing, and there is so much in these conversations that I find to be not only informative but also entertaining. But you guys are not "entertainers". You are just genuine people who end up entertaining others just by doing what you love to do. I have followed Digital Foundry for a number of years now, and I am also very excited about your future videos! Keep up the great work!
DEVILTAZ35 DLSS 2.0 is accelerated by the Tensor cores in Nvidia’s graphics cards. AMD has no true equivalent just yet. AMD and Xbox have configured their shader cores to perform machine learning calculations, but based on the numbers given by Xbox, this solution seems to have a fraction of the efficiency of Nvidia’s solution. It’ll probably take years for AMD to match them since Nvidia has been investing huge amount of money for a long time into deep learning and AI to produce this. Direct X 12 Ultimate (which I’m assuming is what you were referring to) is just an expanded feature-set for the DirectX12 graphics API to support SFS, VRS, mesh shading, and ray tracing.
Microsoft has AI out the wazoo. That's the thing though, RDNA is all collaborative with Sony and Microsoft, AMD gets to pick the cream of the crop console features and put them on PC.
Me too. I plan for the RTX 3060 to be my next GPU after a GTX 650ti Boost 2GB. That old card was a lot of bang for the buck but it's showing its age and there are a number of games I want to play that it can't handle now.
Well if they drop $100 off like they did, $300 would be damn good for 2080 performance. Also this will Boom VR as part of the largest problem is GPU performance a 2060 barely cuts it for a 1440p Oculus product and leads to in some cases a bad experience from dropped frames.
@@nocturnal101ravenous6 Ya. I'm wondering if it will be $350 but yeah, $300 would be incredibly competitive and AMD might really struggle in the GPU market while contending with a product like that.
@@paulemillevasseur7622 I don't think they are competing with AMD here, I think they are competing with the consoles particularly I think the 50 and 60 series are going to be made specifically to be better to attract console games to the PC market. the 70/80/90 was for PC high end gamers and content creators that were already PC crowd.
@Digital Foundry The demo with the makeup, is from Luminous Productions and is using an upgraded version of the Luminous Engine (the one from Final Fantasy 15).
Holy hell. What an amazing presentation. I have a feeling that these RTX 3000 series graphics cards are gonna be like the most revolutionary thing to happen to PC gaming since the launch of 3DFX's Voodoo cards in the mid-1990's and the release of Crysis in 2007. These cards gonna be like the modern equivalents of the Voodoo 3 graphics cards from the late 1990s. I'm probably gonna opt for the cheapest option, the RTX 3070. Of course, it's gonna be the cheapest raytracing card and far from the best, but I think it would be good enough for 1440p or even 1080p, while not being monstrously expensive like the RTX 3090. I don't think the world is ready yet for 8K gaming.
no one is ready for 8K (no TVs, no monitors). I can't even hardly see the difference between 1440p and 2160p on my 4k TV at couch distance with my 2080 (well it's slightly crisper but not really that life changing in motion, not like the jump from 1080p to 1440p imho) so with a 3070 you're gonna be able to game 2K/4K @60 easy depending on the games and different settings you prefer to use. Also don't forget more and more games are implementing resolution scaling bars in settings (my GF is currently playing Dragon Quest XI at 1600p : 2160p Ultra was a bit choppy at times and lowering settings was a bit of a step down graphically in this particular case, but @85% of native 4K, it runs 60fps without a hitch) so you will be able to tune in every games as YOU want it (nvidia panel can force it too). That's the great thing with PC gaming, you can favor FPS count by lowering some settings or prefer to play full on 4K @30fps if you wish to, you decide, and with a 3070, the choice will definitely be yours
Frame rate upscaling could be (almost) latency free, if the AI had access to high framerate geometry, and it learned how to interpolate the rendered shading correctly. Let me predict: this will be the most important tech in a couple of years
@@paint4326 For what?? To see how many people play a graphically outdated game? A cheap version of Call of Duty? A buggy, broken game made on a dinosaur engine.... 😒 fun....
I really love the digital foundry content, you do such a great job at everything you do and is really interesting to hear you talking. I am waiting for those ampere reviews
@ 6:20 I'd guess the main reason Epic probably implemented it was just to practice implementing RT in a major shipped title. The raytracing and DLSS will probably end up being more of a futureproofing feature just because the game will foreseeably be popular for at least a few more years. It's also a feature that probably will cross over to the new consoles.
Microsoft *is* developing AI upscaling through DirectML, and I'm sure DLSS is just an implementation of that like RTX was, it's just that Nvidia is way quicker to deploy these compared to the competition.
@@InsuredFrames It's up to Nvidia and AMD to implement such software features, or at least I'm sure that Microsoft is working closely with both in terms of DirectX feature sets. What I mean is that clearly Tensor cores and DirectML are very much related, with one being hardware to accelerate ML workloads and the other being an API that helps with ML acceleration. Now don't expect DLSS games to work on AMD, but rather that AMD will have its own implementation of AI upscaling (they do) and that in the future games will be made to upscale through DirectML and not through DLSS. Maybe DLSS in the future will mean that the training was done on Nvidia servers or something and will probably run better on Nvidia hardware by virtue of having the dedicated hardware to run it.
@PardonMySanity Ugh, I'm saying the entire time that Nvidia will probably support DirectML and quite obviously meant that DLSS will in the future might just be branding for some sort of training service, like what DLSS has now, but for DirectML instead. Especially if the dedicated hardware gives them a big advantage over AMD anyway, which I assume it will because since Turing, Nvidia GPUs are (almost) completely asynchronous as well so having a dedicated block for this specific type of mixed precision acceleration will give you a big advantage in this area. The vector units in a Radeon GPU have a big advantage at true double precision stuff (which GeForce doesn't do natively), but that doesn't help here. I feel like you're knowingly being thick so you you can try to take some sort of intellectual high ground, but in the end you're adding nothing to the discussion. Nice to throw some ad hominem in there as well. What I'm talking about has nothing to do with (nothing more than just rumoured) DLSS 3.0 either, and you'd know that if you really knew what you were talking about. DirectML is nothing but an API that allows developers a more easy and universal way to implement any form of acceleration for specific tasks (like inferencing). You still need to train a model for now, unless they can compress the data somehow like Google did with their voice recognition on the Pixel 4, but for video we're talking about an order of magnitude more data. I'd be surprised if that DLSS rumor was true at all and it doesn't seem like it since nothing was announced so far, and that would seem like a pretty big feature to talk about. Maybe that's what the 24 GB on the 3090 is for, you can't really stream all that data from an SSD all the time if you're using that bandwidth for assets as well.
The RTX 2080, 2080TI were popular with the AI/Deep learning crowd as a value for money card. In addition to 8K gaming, the RTX 3090 I think is also target at the AI crowd, hence the huge VRAM for larger AI models.
Most interesting is IO. For years poor programming has been a thing, many parties involved sometimes all at the same time. Id love to see a plug and play, on or off switch globally.
Random idea: what would a Corridor Crew x Digital Foundry collab look like? Experts in across similiar fields teaching each other about the different ways films and games use rending technology
Two questions: 1. How much is the bottleneck of using Z490 Asus Maximus XII Extreme MOBO with 3090/3080 GPUs ? 2. Any news When they will announce/release 3080Ti ?
Switch price details were announced 50 days prior to release and it was extremely hard to get for the first year after release. Look at how that turned out. Sony will be fine
I felt a great disturbance in the early adopters department, as if millions of 2080 ti owners suddenly cried out in terror and were suddenly silenced. "jen-hsun huang kenobi"
Nope, too busy enjoying it for the past X months. :-) I'll probably upgrade to the 3080 Ti when it comes out in Spring 2021@ $999 (or maybe splurge for the 3090) and move the 2080 Ti to one of my spare gaming rigs. Resale value for the 2080 Ti is definitely going to suck for those trying to offload it though. :-/
20:38 The intersection accelerator in the TMU can perform 4 BVH box intersections or 1 ray-triangle intersection per cycle as per the patent "TEXTURE PROCESSOR BASED RAY TRACING ACCELERATION METHOD AND SYSTEM". Microsofts deep-dive slide on hot chips says that they have 380G BVH intersection/s or 380/4=95G triangle intersections/s. This adds up with what they've shown. The math is 1.825Ghz * 52CU*4 ray ops per cycle= 379,6G BVH intersections/s . The short answer is "BVH and triangle intersections are both accelerated".
@7:20 I think the inadvertently sold me on 144Hz. Simply because the difference between that and 360Hz is a Less Stuttery image in Slowmo, chances are In real time the differences should be minimal especially since player skill changes the Dynamics. So I'm confident in myself that if I want to invest on anything beyond 60Hz in order to have bare minimum to negate an advantage outside of Player Skill, 144Hz is the way to and the rest falls on me and my Mouse.
Don't confuse rendered frames per second with monitor refresh rate, people often confuse the two and DF is making that mistake here. I've seen multiple input lag tests that show a sizable input lag reduction comparing locked 60 fps on a 60hz screen to locked 60fps on a 144hz screen in OW and Apex. The major benefits of a 240hz or 360hz display can be felt even with a much lower amount of rendered frames because each frame will be displayed closer to when it is rendered compared to a slower refresh display. Obviously higher fps will lower the input latency further.
@@xannari3961 Ugh. It isn't a confusion. I'm aiming for 144FPS the FPS Limit on a 144Hz Display without getting Screen Tearing. Input lag will come down to Display type. I have seen many Non TN panels promote 1MS Response time on 144Hz and up Monitors, but consider a Monitor and I sit in front of it, losing color accuracy isn't a concern. The issue you are referring sound like a GPU and CPU that aren't capable of a locked 144FPS deceasing The Frame Timing and Increasing input lag. . . . If you still think I'm confused I wouldn't mind a correction if you still think it's needed.
@@cMARVEL360 I am just talking about pure input latency. Lets say I have a 240hz monitor, I am playing Apex, and set a 144 fps max because that is my .1 fps low and I want a consistent experience no matter what I am doing in game. If I set my monitor refresh rate to 144hz I will have overall higher input latency than if I set my monitor to 240hz (with the 144 fps lock). I've seen several sets of tests showing this. If I misunderstood what you were saying originally then sorry for the confusion, it just sounded like you were set on a 144hz monitor rather than a 240hz or 360hz purely because you cannot obtain 240fps or 360fps in your game of choice. In the video both John and Rich seem to say that you need to get 360 fps in game for a 360hz monitor to matter, this is factually incorrect and a very common misconception. I just wanted to point out that getting a 240hz monitor will give input latency advantage over a 144hz monitor even if you are locked to 144fps in game.
@@xannari3961 I see. What I was saying is that 144Hz/144FPS and FPS is my maximum investment when going beyond 60Hz/FPS gaming. The thinking behind that being that in the Slow Motion feed, the character was over the reticle for a similar amount of time but a slightly more Stuttery experience. But again only in slow motion, and I feel that is a good enough experience for me to remain competitive but not overly depended on my display for Skill but still relying on my skills and reaction time. The issue you are talking about.... Couldn't it be fixed with Free sync, G-Sync and Variable refresh rate on HDMI 2.0?
@@cMARVEL360 A higher refresh monitor allows a rendered frame to be displayed sooner since the monitor is refreshing more often. Even with g-sync/vrr a 144hz monitor has technical limits on time between refreshes that might delay a rendered frame from being displayed. I think it was battlenonsense, I recall seeing a few tests showing a 10ms drop in input latency between a 60fps locked 60hz display and a 60fps locked 144hz display. I've also heard several pro fps players comment on this topic when talking about Apex and Warzone (two games that max out at 140ish .1% lows on a 2080ti) saying a 240hz monitor makes a noticeable difference with input lag. Like I said, I just see this topic come up a lot and everyone seems think fps is the only factor for input latency but a higher refresh monitor will deliver lower input latency which is ultimately what makes a game feel better (imo).
I would think frame rate upscaling would be difficult to do with out introducing latency. It seems the most accurate approach would require buffering up two key frames and interpolating in between. That would add at least one frame worth of delay since you would have to wait until you get the most recent key frame before you can display the previous key frame, and then you can interpolate up to the most recent key frame. Alternatively you could do predictive interpolation by following a trend of objects acceleration vectors, but jerk would be near impossible to predict and would cause overshoots followed by an abrupt change when displaying the next key frame. A compromise would be to wait half way in between the current key frame and the next key frame to display the current key frame. Then you could predict one frame and interpolate between that frame and the next frame. This approach would triple the frame rate and minimize the jerk from the overshoot while only introducing the latency by half of the key frame time.
@@Janbore got a 2080 last September for the price of 3070 so I did not make a bad deal but yeah... 3070 sounds like a damn sweetspot of price/performance value.
@@kidi84 here in finland 2080 still cost anywhere from 800 to 1000 euros and 2080ti will cost anywhere to 1100-2000 euros so that 500 for 3070 is a nice very nice deal!!!
I’m a touring musician, have been for a decade, on the road 250+ shows a year so never home.. until quarantine. Now I am and this reveal REALLY makes me want to build a tower PC again. Only problem is by the time I put one together I might be out again so couldn’t use it!! But WANT. Badly
Hey the pixel count thing should be possible to achieve! It's a simple regression problem and can even be a classifiaction problem if you limit the number of potential resolutions! I could prob train that with the proper dataset right now :) All you need for the dataset is native resolution footage/images with their attached resolution.
The Broadcast background blurring/removal thing without a green screen has been solved for quite a while, I use XSplit VCam for my work video chats. Works exactly like NVidia showed, except it doesn't do the AI panning. I guess Zoom also does it, but XSplit VCam acts as a virtual camera so works with any software that expects a webcam. Moving it to GPU will help the rest of the system I guess, but if you've got a 3000 series card, chances are you aren't worrying about the CPU power needed for this.
Unless you're in Brazil just about anyone should be able to afford a 3070 (not saying it's "cheap" but resourceful people can usually hit a figure like that without too much trouble). Even if you're only making $3/hr as would be the case in many developing nations, take a second job at 4hrs/day and you could buy a 3070 when it launches in October. Instead of wasting your time dreaming about what could be, try to make it happen. Then after you do that with a 3070, you've got a model for good work ethic that should lead to success in many areas of your life.
@@budthecyborg4575 I don't have a job, bruh. My family is kinda poor, and my parents are paying a lot of taxes. My PC is partially built with second-hand components
Thanks for acknowledging the lack of DLSS 2.0 vs true native resolution (rather than TAA) testing, I can't wait for what Richard may find in his explorations. 👍
AI will 'fill in' a lot of games now for developers, so humans will design the game and give a rough blueprint as to what they are looking for then get AI to do the massively time consuming trees, ground, buildings, etc.
@@linusmlgtips2123 No they just keep spending money on things that are not playable game material, instead he's turned star citizen into a digital version of real estate. Where the rich can buy custom tricked out features that the regular consumer will never be able to access from an early acess point. Its a massive fundraising black hole that only appeals to the filthy mega rich
@@laerslexikondermusica4480 They say the want a hundred star systems by launch and they only have around a handful, most not fully completed. That just means a lot more work needs to be done. No, they barely added anything related to your claims.
The leap in performance over Turing was amazing reveal but what caught me off guard was that RTX IO thing, i expected PC to at least take a year to catch up to what PS5 was doing with their IO complex. Now i really want to see load times and asset streaming tests between PC and PS5 in multiplats games. The only thing PS5 has advantage now is installbase, since everyone will have that fast SSD, in PC it will take time until most users are using NVME PCie4 SSD plus an RTX card.
I wonder how much the DLSS improvements are setting up the 40xx series as VR killers paired with next-gen HMDs. The main complaints I hear about HMDs is resolution and framerates, so DLSS seems to be an answer there. Maybe a DF video visiting the topic of DLSS as it pertains to VR?
This event took some fire from the consoles for me, but just thinking about the new God of War or whatever Naughty Gods are making next makes this pale in comparison. It's about the games in the end.
Hi DF, I was just reading about the PS5 HD camera which states it alows the ps5 to remove backgrounds from the player when streaming etc with the create button... does that suggest machine learning or is there another way to do this?
Still getting Series X, it's still gonna be very capable if you look at everything else, including the techniques Microsoft will be using. Honestly, maybe this GPU will be a good comparison for the Series X, and see how they both stack. I could care less about whether PC is better or not, there's so much you need to even make it decent, and Series X is gonna be a steal. Hey correct me if I'm wrong, I have my preference, you have yours. Xbox and PC are basically the same ecosystem, especially how they're both mostly Microsoft.
@AnEn That may be true, but it's more than the GPU alone, more so how everything else works together. This new piece just came out, so how would it be inside the new systems? At the end of the day, we're all great. I'm glad these systems have ditched the old Jaguar cores, and actually have more modern components.
@AnEn I'm pc enthusiast sold my 1080 getting a rtx 3080 and I play xbone one with bro all the time I'm simmer too and I totally agree you enjoy your hobby / entertainment preference without being an obnoxious twat like instance ie salt maker. I think the consoles will be trenendous value and great, it's massive upgrade from jaguar cpu cores especially
Video to 3D, wtf!? Insane. Physics is going to be so beautiful, finally. Will love to see this physics on something like gta5, where all the cars to trucks weights are different.
best thing is DLSS is the biggest thing for graphics for Nvidia and what ever they call it for AMD. Machine learning AI graphics prove to be better than native 4K graphics so you can get RTX with high frame rate.
AMD has no competitor to DLSS. They don't have tensor cores. You need them for DLSS, unless you think DLSS 1.0 didn't suck. But you would be alone there.
@@laerslexikondermusica4480 I know it is not DLSS but there is Machine Learning tech for the Xbox Series X which has RDNA 2 we just need to learn more about that feature
@@Wylie288 I know DLSS is only for Nvidia the reason why I said that is the Xbox Series X has some feature for machine learning tech which we need to know more about that is it and what it can do but in no way is it comparable to the 3070 and up
23:33-Alex literally popped into my head when I was watching the reveal event live. 150+GB for CP-2077 is still much more palatable than for Warzone (excluding updates)
Finally a card that is fast enough to render Richard's hands.
Wichard and yt car mechanic Scotty Kilmer should have a hand gesture competition.
😂
Dead lol
Yes 8k 60fps🤣🤣🤣
🤣
Richard "there were rays, lots of them and they were being traced"
My man, 👏🏼😆
Time?
@@FlatEarthDenier 00:37
There were rays
being traced,
but I never saw them flinging.
No, I never saw them at all,
since there was denoising.
Lil Traced Richard x
"100 copies of crysis"
That sounds funny and all, until you try to run 100 instances of crysis at the same time.
That single CPU core is going to kill itself.
And burn the house down
Flight sim is the new standard
@@WarbossRB Wrong, you would ofcourse use a dual socket 64 core, which would lead to 128 cores and run each instance on it's own core
@@WarbossRB ROFL
2:18 So the 3080 was really hiding behind those spatulas the whole time!
god, i never thought to look there during the announcement yesterday and when he pulled it out from behind there i felt so dumb.
😆
you'll never guess what was baking in the oven...
Edit: and not because the oven was on 👀
They haven't announced it's real time cloaking features yet!
I also read the post on pcmasterace
Best gaming channel by far! No talking heads blabbering about stuff they don’t understand or haven’t actually played. You get an honest educated opinion from people passionate about video games and associated technology.
You mean like IGN, Game Spot...etc?
i agree. nobody shouting and screaming at you as soon as you click on the link, the youtuber is literally spitting on there camera just because there too loud and obnoxious. digital foundry really know there numbers and tech, and best of all there calm while explaining things. a bit like a wise college tutor lol
This is actually a "tech channel for gaming." That's why the contents look professional.
There really isn't a tech channel I enjoy watching more. They don't blast you about subscribers, they don't irritate us with ads or try to sell us anything. It's literally the exact reason I did subscribe and make sure to like all videos and turn on my notifications for them exclusively. No other channel will I do that for. They don't have to sell me anything or ask me throughout the entire video. It's at the end, it's in good taste, and I'll be damned sure to continually follow/support them.
Kyle Ogle they really know what there talking about an such a calm and intellectual manner. no lies or bull
The RTX 3090 is as powerful as the NEC Earth Simulator, the fastest supercomputer in the world from 2002 to 2004.
so 16 years to have it in my room. Not that impressive to be honest.
@@hldemi NEC Earth Simulator cost over $550,000,000 to build. I'd say it's a pretty rapid improvement.
@@Smith-sz9nt Gotta agree. Still 16 years of hardware development is pretty long time too.
@@hldemi agreed, I'm simply happy that a $500 card (3070) provides similar or better performance than last years $1200 (2080ti).A 66% drop in price . Now thats something !
@@Smith-sz9nt 3090 is about 0.00045% price of the supercomputer. Impressive in its own league !
Nvidia IO got me excited the most. The fact that someone is finally addressing the traditional i/o decompression bottlenecks makes me a happy boy.
Time to upgrade to PCI-E 4.0
Yeah, it's really cool, but it's not really news.. Microsoft said they would bring this to PC with DirectStorage.
This is just Nvidia's implementation of that.
@@Sythern0 Yeah, Nvidia is working alongside with Microsoft's DirectStorage.
I'm just happy this is being address now and not farther down in the future.
@@Sythern0 direct storage is just the API that trys to go straight to the GPU, rtx io is hardware decompression on the GPU. They pretty much need each other
@rek yeah especially for those who may nt have a large enough office space for a green screen.
The RTX IO DirectStorage acceleration is the secret weapon here. It's PC's answer to the PS5 SSD architecture, which is a huge deal.
This is aging well
Held off upgrading my 1080 for the last year telling myself the 30 series would be a large upgrade. Sure enough. Upgrading to a 3080 on release week.
Wise man
I have a 1080 also but I will actually upgrade next year tho, I want to buy PS5 and will be already spending lots of money on it.
@@doctorsilva1345 I'll be waiting to get a PS5 for some good exclusives to launch. If I upgrade my PC I get all the games I can think of at 4k 60fps Raytraced Ultra Settings. If I get a PS5 I get a couple multiplat games I could play on PC at much higher settings with better performance. Not worth it.
@@brandonjohnson4121 what are the exclusives you are looking for on PS5?
@@johnames6430 God of War 2. New Uncharted. A real Spiderman sequel that's not just a standalone DLC. Or any new games. Can't know what you're looking for when it doesn't exist, can you?
Ngl The Nvidia keynote was more exciting than any next gen console showcase we saw this year
idk the ps5 unreal 5 engine demo still looks pretty good bro
Ga Pa Pa La unless you know it is running on 1440p @ 30fps
I agree on this👍
Yup. I'm not getting the new Xbox now and upgrading my GPU to a 3080 💪
@@un80rns the marbles demo was running 1440p 30 fps
The spatula tag says, "From your partners at Unity" in case anyone else couldn't find an answer. 28:13
Liar!
It clearly says "MAGA 2020 SUCKERZ ~TRUMP"
I love that the whole edgy-executive-with-cool-clothes of the 00s is returning. J Allard was the king.
The whole presentation also almost smelled like Steve Jobs. Very well made, even down to the looks.
Neo Lix and the “one more thing” at the end
J Allard and Kutaragi-San
@@Superdelphinus Yes! Perfectly structured. I enjoyed it very much. :D
Seeing this presentation I can't help but feel that we already live in the future. And I'm only 32 years old.
Suck it, I'm 25.
@@runner9528 that's not the proper way to talk to your elders.
I’m about to be 26 in a few days if that matters
@@runner9528 I'm the youngest here
Turned 33 in mid july, never imagined real time could look like this so soon!
I just noticed they used ThePruld's video for the "content creation" category 3:41
Our goofy darksouls boi made it
*Editing to add: Adam Sessler at 36:29 makes me the big happy
Praise the sun 'till you go hollow
I'm so jealous that you guys get to play with these cards already! Couldn't think of anybody better to get one early though. Keep up the great work guys
Any recommendations for a new PC user? Best components? I'm planning to use RTX 3080 GC since thats new best in the market rn people sayin.
So what should I get no matter the price? And what monitor + Mouse and keyboard plays best
something better- the ps5
@@ayrtonmichaque7214 its good but not everyone wants a console. also how is it better when its not out
@@ayrtonmichaque7214 garbage
@@UAGoWSuplexer 1440p 144hz gsync monitor would be great for this card.
3:23 "BIIIIG NAVVIIII" hahahha
Need announcement now!!!
It was a great event and looking forward to their release.
I don't even have a pc and I found the event more interesting than sony and and xbox
@@sean7332 :D
@@XnxxD it kinda puzzles me what I wanna get now
@@sean7332 Join the master race.
@@HeloisGevit I've had an xbox pretty much all my life plus a ps but not for this gen, I may get a ps5 and pc
The most important question is, if Jensen Huang was running in real-time with RTX on or if he was pre-rendered?
They actually run 4 of him at the same time, but they were showing us only one of them.
only the jacket is pre-baked!
I just want to know how Alex managed to get HDR and bloom to run at the same time
Yes
Big Navi sounds like the name of a Final Fight boss that never was
That big navi better be fucking big because they were hyping it up for like 10 years
Turns out the best big Navi is Rtx 3070~ level for the same price. Like the last 10 years.
yashya shino nope
Neo Lix They were the Nali. Residents of the planet Na Pali.
It’s the little light thing from Zelda after she failed weightwatchers
to be honest, i swear i got super hyped for rtx 3080, still getting my PS5 this year BUT i might get an rtx first and probably wait for next year for my console
Good choice! Will be getting a 3090 though. I wanna play Cyberpunk in ALLLL its glory!
True. Playstation first-party games are awesome, no matter how good the RTX 3080 is. I only have the money for the GPU unfortunately so i'll be getting the 3080.
My exact thoughts!! First time ever I will miss a launch, but I think it will be more beneficial in the long run. As I'm not the "paid for beta experience" guy once again...
@@qwertmom i'm not poor but that doesn't mean i'm getting both this year, that's a lot of money to be fair lol, but yeah
I would do the opposite, power is zero without optimized games to play.
31:26 is where talk of the 3070 starts and what it means for next gen consoles PS5 and Series X especially
I'm really happy about RTX IO coming to Turing. I've been wondering that since the presentation. Once it starts getting implemented it's gonna be huge in my opinion.
Thanks to RTX IO and DLSS Turing cards could stay relevant a lot longer than they otherwise would have
The IO stuff will only improve loading times and data streaming and as far as i know game need to actually support that. Improvement here really depends on application, also on PC case this will have transition period while consoles with that technology can fully utilize that from get go (if game is not in last generation).
Yeah PS5 SSD technolgy is being beaten by old cards.
@@ShadowriverUB I said "once it starts getting implemented", didn't I? ;)
As for a transition period period I don't think that's such a huge deal. Just like higher resolution textures in current console games that aren't necessarily used the new PC games could have the new technology only on supported hardware. There's no reason for it to not simply be enabled by a switch in graphics settings.
@Clau007 DirectStorage does not need PCIe 4. Xbox Series X uses DirectStorage, and it doesn't even have PCIe 4. Turing cards support RTX IO/DirectStorage and none of those have PCIe 4.
@@ShadowriverUB surely any games that support direct storage on xbsx will support it on pc.
Let’s hope they can manufacture enough. At that price point they’ll will sellout within seconds of launch and you’ll end up paying $1000 on eBay or waiting till next year to get one.
These are scared prices...
Nvidia is a greedy corporation, why all of the sudden are they keeping prices the same as last gen. They are anticipating AMD prices so these prices keep Nvidia competitive. Think about it, if NVidia knows it has AMD beat, why not skim more off the top of its customer? They did it last few generations.
@@KieferNguyen I think it may also be to persuade some to put their PS5 money towards a new GPU instead.
Hopefully amd is competitive
@Zoran at least nvidia actually improved their products meaningfully, unlike intel...
@@dopeoxide amd doesn't have dlss
39:00 "The PC is looking to be in a very good position here."
the video came out 15 minutes ago, how’d you get here dude
nat bro watching this vid at 2x speed
@@catears3053 10x
It’s always going to be in a good position because it’s modular. The *only* handicap PC has is I/O at the moment, and it will be sorted relatively quickly too. Nvidia is already trying to address it, others will follow suit because performance = $$$. I’ve been PC gaming for decades and I haven’t been this excited about new hardware in a long time. And I’m such a cynic!!
Very good. The 3070 is gonna embarrass the next generation consoles.
Well, my RX 580 is having an existencial crisis at this very moment... I can hear the weaping... Screaching...
"What is my purpose?"
"You run FarCry 5 reasonably well, but that's all."
"Oh my God."
My Rx 480 too..
But I'll wait for Big Navi, I wanna see both sides of the battle before I upgrade
Mine, too. Can't wait to see what AMD has to show, but it's going to be a tough sell. Despite all the leak hype, I wonder if Big Navi will just end up another mid range card.
Lol, I'm buying a rx 570/580 myself in a while for my first built. My Intel HD 3000 is having an existential crisis lmao
@@Sh-hg8kf the rx 580 is a great card! Specially for a first build, you won't be disapointed, just make sure you have a nice CPU to acompany it, ok? Try and get the 8gb if you can aswell, but the 4gb model is fine if that's what you have in store.
31:55 actually Sony just announced that they are planning to release more of their first party exclusives on PC as well.
Yeah but not until years later
Yes but not the newest games. They might release the games after 2-4 years after they have launched on ps5. And they will still keep their most precious exclusive on playstation
@@plasmic4727 true but not that big of a deal to me if I have never played the game before. I still had a great time playing GTA V for example on PC like 2 years after the PS3/X360 version came out.
@@megapro125 lol I had an average laptop but got a gaming one back in 2016 so I never knew gta 5 was released late for pc
I'd love to play older PS4 exclusives on PC, I skipped PS4 this generation.
"Where does nvidia go from there".. well in my mind we still need a ti card because the 3080 only has 10gb vram. For many high end enthusiasts or developers, the 1080ti was a great card because of the extra vram, even 1gb more than the 3080! So the extra vram on the 3090 is very attractive.
Yeah I'm having a pretty hard time deciding between 3080/3090. 3080 + upgrading my (garbage 24") secondary monitor is the smarter option, but I'm developing purely path traced stuff (also VR path traced stuff) and I have the need for speed. Losing 1GB likely won't matter, but after a 1080 ti & 2080 ti it feels a bit weird. Extra memory bandwidth on the 3090 also very attractive.
For developers sure, that makes sense, but for gamers, even high end gamers... I can't recall a single game out there that uses 10GB of VRAM at 4K, let alone more than 10GB.
3070 ti 16 gb already had unofficial launch thx to lenovo
@@KillahMate What's going to happen with bigger worlds thanks to ssds?
@resColts Bigger worlds will still be streamed into VRAM just like before.
You guys are great! I love Digital Foundry. The back and forth talking between colleagues who you can tell are also friends is soothing, and there is so much in these conversations that I find to be not only informative but also entertaining. But you guys are not "entertainers". You are just genuine people who end up entertaining others just by doing what you love to do. I have followed Digital Foundry for a number of years now, and I am also very excited about your future videos! Keep up the great work!
AMD really needs to invest more into AI soon. DLSS 2.0 is huge.
Isn't DLSS part of the latest Direct X 12 in the Series X and PC anyway? .
DEVILTAZ35 DLSS 2.0 is accelerated by the Tensor cores in Nvidia’s graphics cards. AMD has no true equivalent just yet. AMD and Xbox have configured their shader cores to perform machine learning calculations, but based on the numbers given by Xbox, this solution seems to have a fraction of the efficiency of Nvidia’s solution. It’ll probably take years for AMD to match them since Nvidia has been investing huge amount of money for a long time into deep learning and AI to produce this. Direct X 12 Ultimate (which I’m assuming is what you were referring to) is just an expanded feature-set for the DirectX12 graphics API to support SFS, VRS, mesh shading, and ray tracing.
its too late already, Machine learnings currency is time and Nvidia is already millions of hours ahead.
Microsoft has AI out the wazoo. That's the thing though, RDNA is all collaborative with Sony and Microsoft, AMD gets to pick the cream of the crop console features and put them on PC.
AMD is going to have a half assed rip off version of everything shown here eventually.
Now i'm hyped for the 3060.
Me too. I plan for the RTX 3060 to be my next GPU after a GTX 650ti Boost 2GB. That old card was a lot of bang for the buck but it's showing its age and there are a number of games I want to play that it can't handle now.
Well if they drop $100 off like they did, $300 would be damn good for 2080 performance. Also this will Boom VR as part of the largest problem is GPU performance a 2060 barely cuts it for a 1440p Oculus product and leads to in some cases a bad experience from dropped frames.
@@nocturnal101ravenous6
Ya. I'm wondering if it will be $350 but yeah, $300 would be incredibly competitive and AMD might really struggle in the GPU market while contending with a product like that.
@@paulemillevasseur7622 I don't think they are competing with AMD here, I think they are competing with the consoles particularly I think the 50 and 60 series are going to be made specifically to be better to attract console games to the PC market.
the 70/80/90 was for PC high end gamers and content creators that were already PC crowd.
@@nocturnal101ravenous6 I resemble that remark. I am totally a console turned PC gamer and I am totally interested in the 60 series.
So excited for the 3080. I'm building my first PC right now and this is going to be awesome in it!
Welcome to the PC Master Race
Good stuff
Welcome to the PC Master Race!
Well done brother and welcome to the master race
You think this is good? Just wait until you see the AMD 6700 XT!
@Digital Foundry
The demo with the makeup, is from Luminous Productions and is using an upgraded version of the Luminous Engine (the one from Final Fantasy 15).
Always wait to see DF's take on everything tech... Love these videos!
great chat, thanks to all involved...
Holy hell. What an amazing presentation.
I have a feeling that these RTX 3000 series graphics cards are gonna be like the most revolutionary thing to happen to PC gaming since the launch of 3DFX's Voodoo cards in the mid-1990's and the release of Crysis in 2007. These cards gonna be like the modern equivalents of the Voodoo 3 graphics cards from the late 1990s.
I'm probably gonna opt for the cheapest option, the RTX 3070. Of course, it's gonna be the cheapest raytracing card and far from the best, but I think it would be good enough for 1440p or even 1080p, while not being monstrously expensive like the RTX 3090. I don't think the world is ready yet for 8K gaming.
Untill the 4000 series at least!
no one is ready for 8K (no TVs, no monitors).
I can't even hardly see the difference between 1440p and 2160p on my 4k TV at couch distance with my 2080 (well it's slightly crisper but not really that life changing in motion, not like the jump from 1080p to 1440p imho) so with a 3070 you're gonna be able to game 2K/4K @60 easy depending on the games and different settings you prefer to use.
Also don't forget more and more games are implementing resolution scaling bars in settings (my GF is currently playing Dragon Quest XI at 1600p : 2160p Ultra was a bit choppy at times and lowering settings was a bit of a step down graphically in this particular case, but @85% of native 4K, it runs 60fps without a hitch) so you will be able to tune in every games as YOU want it (nvidia panel can force it too).
That's the great thing with PC gaming, you can favor FPS count by lowering some settings or prefer to play full on 4K @30fps if you wish to, you decide, and with a 3070, the choice will definitely be yours
@@kidi84 If you can't see the difference between 1440p and 4k on a full screen tv, you need a new damn tv.
@@RichiPete or brand new eyes
But the CEO said, 3070 is (slightly) better than 2080 ti. So u can expect 4k gaming from that card
Cant wait for my RTX3080 to ready in the oven.
Dude go for it!
Nvidia killed it!
Can't wait for my rtx 3090 to tip over my whole rig because it's so beafy.
The tag that you speak of at 2:47 reads "From your partners at Unity".
Frame rate upscaling could be (almost) latency free, if the AI had access to high framerate geometry, and it learned how to interpolate the rendered shading correctly.
Let me predict: this will be the most important tech in a couple of years
Underrated comment, agree.
Well that turned out to be pretty spot on prediction!
Nvidia! I want all your money from DLSS3!
6:37 CS:GO an abuse of an 2080 TI, thats why i play it with two
Do people still play that?
@@DEVILTAZ35 check steam charts mate
@@paint4326 For what?? To see how many people play a graphically outdated game? A cheap version of Call of Duty? A buggy, broken game made on a dinosaur engine.... 😒 fun....
Scrotum Monster If you call CS:GO buggy and broken, then you really dont know the game, its one of the most, if not the, most bug-free, polished games
@@Twitch_Moderator bruh
0:21 John's BFGPU joke falls a bit flat, so Rich takes a shot at it.... AND KILLED IT! :P
Woah, Alex take two steps back my friend!
I really love the digital foundry content, you do such a great job at everything you do and is really interesting to hear you talking.
I am waiting for those ampere reviews
denoised images compress terribly through youtube
Imagine if Naughty Dog made a game designed only to be played on a 3090 no compromises.
It would be a civ clone
Joel's brain matter would be splattered all over us
It would be one of the best looking garbage game ever made like the last of us part 2?
Wow, that would be some beautiful average gameplay.
Three most passionate men in the world as far as computer gaming goes. Thank you guys, I love your opinions and genuine excitement!
90% of the video
Everyone : YEAH YEAH YEAH..
Yeah, right?
DF not having at least a million subs is injustice of the highest order.
Well many don't understand the technical aspects of everything they talk about.
They're too informative. That's the issue, people love trash tech channels. Look who are all at the top.
@@fionnmaccumhaill1023 yeah from those crappy clickbait news outlets.
@ 6:20
I'd guess the main reason Epic probably implemented it was just to practice implementing RT in a major shipped title.
The raytracing and DLSS will probably end up being more of a futureproofing feature just because the game will foreseeably be popular for at least a few more years. It's also a feature that probably will cross over to the new consoles.
Great news yesterday considering Sony announced at an investor meeting only like 2 weeks ago that more exclusives are coming to PC.
Damn, I was wondering what was happening to my speakers when the "BIG NAVI"* part came out at 3:25
*edited
Navi* there is no military power here
Yeah but totally need AMD's announcement right about now!! I hope they give us something amazing for 399! I won't splurge more on a GPU lol
@@kelb0y9o20
Edited, definitely not the same thing indeed
@@Nonx47 or is it? lol
@@FinneousPJ1 if big Navi goes to $399, then Rx 5700xt is going to be obsolete...
Microsoft *is* developing AI upscaling through DirectML, and I'm sure DLSS is just an implementation of that like RTX was, it's just that Nvidia is way quicker to deploy these compared to the competition.
DirectML is not using any specific dated core
No Microsofts AI is a implementation of DLSS
@@InsuredFrames It's up to Nvidia and AMD to implement such software features, or at least I'm sure that Microsoft is working closely with both in terms of DirectX feature sets. What I mean is that clearly Tensor cores and DirectML are very much related, with one being hardware to accelerate ML workloads and the other being an API that helps with ML acceleration. Now don't expect DLSS games to work on AMD, but rather that AMD will have its own implementation of AI upscaling (they do) and that in the future games will be made to upscale through DirectML and not through DLSS. Maybe DLSS in the future will mean that the training was done on Nvidia servers or something and will probably run better on Nvidia hardware by virtue of having the dedicated hardware to run it.
@PardonMySanity Ugh, I'm saying the entire time that Nvidia will probably support DirectML and quite obviously meant that DLSS will in the future might just be branding for some sort of training service, like what DLSS has now, but for DirectML instead. Especially if the dedicated hardware gives them a big advantage over AMD anyway, which I assume it will because since Turing, Nvidia GPUs are (almost) completely asynchronous as well so having a dedicated block for this specific type of mixed precision acceleration will give you a big advantage in this area. The vector units in a Radeon GPU have a big advantage at true double precision stuff (which GeForce doesn't do natively), but that doesn't help here.
I feel like you're knowingly being thick so you you can try to take some sort of intellectual high ground, but in the end you're adding nothing to the discussion. Nice to throw some ad hominem in there as well.
What I'm talking about has nothing to do with (nothing more than just rumoured) DLSS 3.0 either, and you'd know that if you really knew what you were talking about. DirectML is nothing but an API that allows developers a more easy and universal way to implement any form of acceleration for specific tasks (like inferencing). You still need to train a model for now, unless they can compress the data somehow like Google did with their voice recognition on the Pixel 4, but for video we're talking about an order of magnitude more data. I'd be surprised if that DLSS rumor was true at all and it doesn't seem like it since nothing was announced so far, and that would seem like a pretty big feature to talk about. Maybe that's what the 24 GB on the 3090 is for, you can't really stream all that data from an SSD all the time if you're using that bandwidth for assets as well.
3:41 love that they used thepruld's dark souls video for the "Content Creators" example image.
The RTX 2080, 2080TI were popular with the AI/Deep learning crowd as a value for money card. In addition to 8K gaming, the RTX 3090 I think is also target at the AI crowd, hence the huge VRAM for larger AI models.
30 series in general will be a huge increase for AI performance, even more than for gaming.
3080 vs 3090 had like a 10% performance boost was the worst, 4080 vs 4090 was about a 20-25% boost
Genuinely impressed.
that "grinds my gears, argh!" lmao
Most interesting is IO.
For years poor programming has been a thing, many parties involved sometimes all at the same time. Id love to see a plug and play, on or off switch globally.
3080 is good enough for any game right now
That stove is going to need serious cleaning after every meal.
I can't imagine trying to clean grease from that backsplash, omg.
Sure it comes off easy with a little bit of elbow grease and a light scrub with $100 bills
The amount of information on this video is so deep I almost drawn at the end :)
Guys....please start selling merch. Would love to sport Rich’s DF baby blue font t-shirt. I can’t be the only one.
Nope I wouldn't buy crappy merch that's overpriced, too many TH-camrs selling shit items at inflated prices
Beni Meglei They do. Link in the description
Random idea: what would a Corridor Crew x Digital Foundry collab look like?
Experts in across similiar fields teaching each other about the different ways films and games use rending technology
30:30 States that i9 cpu was used. Pcie 4.0 not really a boost then? Wouldn’t a zen 2 chip make more sense for pcie 4.0 compatibility?
Didn't want to acknowledge their direct competitor probably
@@mickeyu2893 yea .
@@mickeyu2893 AMD is not a direct competitor tbh. Only Radeon Technologies Group.
Mifthahul Fikri could have been paid by intel too hard to tell
Mifthahul Fikri I would be very impressed if intel could match even the 1080tis performance much less what we just saw
Cant wait to buy this so I can finally play Roblox at a stable 60fps!
Two questions:
1. How much is the bottleneck of using Z490 Asus Maximus XII Extreme MOBO with 3090/3080 GPUs ?
2. Any news When they will announce/release 3080Ti ?
Switch price details were announced 50 days prior to release and it was extremely hard to get for the first year after release. Look at how that turned out. Sony will be fine
I live for that random Doctor Who reference Rich pulled 😂
The depth of field and per pixel motion blur on Richard's hand movements are phenomenal.
I felt a great disturbance in the early adopters department, as if millions of 2080 ti owners suddenly cried out in terror and were suddenly silenced. "jen-hsun huang kenobi"
Nope, too busy enjoying it for the past X months. :-)
I'll probably upgrade to the 3080 Ti when it comes out in Spring 2021@ $999 (or maybe splurge for the 3090) and move the 2080 Ti to one of my spare gaming rigs.
Resale value for the 2080 Ti is definitely going to suck for those trying to offload it though. :-/
From a 970 to a 3080. Gonna feel like Christmas morning. Good things come to those who wait!!
@Salt Maker Your sense of humour is about as advanced as your choice of gaming platform.
That demo was showcasing a next gen Marble Madness game. Get HOIPED
I'd buy it, especially if its a spindizzy clone. My first-person Paradroid remake could be a thing now as well.
20:38 The intersection accelerator in the TMU can perform 4 BVH box intersections or 1 ray-triangle intersection per cycle as per the patent "TEXTURE PROCESSOR BASED RAY TRACING ACCELERATION METHOD AND SYSTEM". Microsofts deep-dive slide on hot chips says that they have 380G BVH intersection/s or 380/4=95G triangle intersections/s. This adds up with what they've shown.
The math is 1.825Ghz * 52CU*4 ray ops per cycle= 379,6G BVH intersections/s . The short answer is "BVH and triangle intersections are both accelerated".
@7:20 I think the inadvertently sold me on 144Hz. Simply because the difference between that and 360Hz is a Less Stuttery image in Slowmo, chances are In real time the differences should be minimal especially since player skill changes the Dynamics.
So I'm confident in myself that if I want to invest on anything beyond 60Hz in order to have bare minimum to negate an advantage outside of Player Skill, 144Hz is the way to and the rest falls on me and my Mouse.
Don't confuse rendered frames per second with monitor refresh rate, people often confuse the two and DF is making that mistake here. I've seen multiple input lag tests that show a sizable input lag reduction comparing locked 60 fps on a 60hz screen to locked 60fps on a 144hz screen in OW and Apex. The major benefits of a 240hz or 360hz display can be felt even with a much lower amount of rendered frames because each frame will be displayed closer to when it is rendered compared to a slower refresh display. Obviously higher fps will lower the input latency further.
@@xannari3961
Ugh. It isn't a confusion. I'm aiming for 144FPS the FPS Limit on a 144Hz Display without getting Screen Tearing. Input lag will come down to Display type. I have seen many Non TN panels promote 1MS Response time on 144Hz and up Monitors, but consider a Monitor and I sit in front of it, losing color accuracy isn't a concern.
The issue you are referring sound like a GPU and CPU that aren't capable of a locked 144FPS deceasing The Frame Timing and Increasing input lag. . . .
If you still think I'm confused I wouldn't mind a correction if you still think it's needed.
@@cMARVEL360 I am just talking about pure input latency. Lets say I have a 240hz monitor, I am playing Apex, and set a 144 fps max because that is my .1 fps low and I want a consistent experience no matter what I am doing in game. If I set my monitor refresh rate to 144hz I will have overall higher input latency than if I set my monitor to 240hz (with the 144 fps lock). I've seen several sets of tests showing this. If I misunderstood what you were saying originally then sorry for the confusion, it just sounded like you were set on a 144hz monitor rather than a 240hz or 360hz purely because you cannot obtain 240fps or 360fps in your game of choice. In the video both John and Rich seem to say that you need to get 360 fps in game for a 360hz monitor to matter, this is factually incorrect and a very common misconception. I just wanted to point out that getting a 240hz monitor will give input latency advantage over a 144hz monitor even if you are locked to 144fps in game.
@@xannari3961
I see. What I was saying is that 144Hz/144FPS and FPS is my maximum investment when going beyond 60Hz/FPS gaming.
The thinking behind that being that in the Slow Motion feed, the character was over the reticle for a similar amount of time but a slightly more Stuttery experience. But again only in slow motion, and I feel that is a good enough experience for me to remain competitive but not overly depended on my display for Skill but still relying on my skills and reaction time.
The issue you are talking about.... Couldn't it be fixed with Free sync, G-Sync and Variable refresh rate on HDMI 2.0?
@@cMARVEL360 A higher refresh monitor allows a rendered frame to be displayed sooner since the monitor is refreshing more often. Even with g-sync/vrr a 144hz monitor has technical limits on time between refreshes that might delay a rendered frame from being displayed. I think it was battlenonsense, I recall seeing a few tests showing a 10ms drop in input latency between a 60fps locked 60hz display and a 60fps locked 144hz display. I've also heard several pro fps players comment on this topic when talking about Apex and Warzone (two games that max out at 140ish .1% lows on a 2080ti) saying a 240hz monitor makes a noticeable difference with input lag. Like I said, I just see this topic come up a lot and everyone seems think fps is the only factor for input latency but a higher refresh monitor will deliver lower input latency which is ultimately what makes a game feel better (imo).
Thank all of you who participated in the RTX 20 series Beta test.
I love how you guys nearly blew a wad when that marble madness-like RT demo popped up, lol. I totally get it.
I had to skip that section. Listening to guys watching pron is not my kink.
I would think frame rate upscaling would be difficult to do with out introducing latency.
It seems the most accurate approach would require buffering up two key frames and interpolating in between. That would add at least one frame worth of delay since you would have to wait until you get the most recent key frame before you can display the previous key frame, and then you can interpolate up to the most recent key frame.
Alternatively you could do predictive interpolation by following a trend of objects acceleration vectors, but jerk would be near impossible to predict and would cause overshoots followed by an abrupt change when displaying the next key frame.
A compromise would be to wait half way in between the current key frame and the next key frame to display the current key frame. Then you could predict one frame and interpolate between that frame and the next frame. This approach would triple the frame rate and minimize the jerk from the overshoot while only introducing the latency by half of the key frame time.
I'm so glad I didn't buy any of the 20 series. More just cause I'm broke and haven't had the money to spare. But 3070 is calling my name
Same here!!
@@Janbore got a 2080 last September for the price of 3070 so I did not make a bad deal but yeah... 3070 sounds like a damn sweetspot of price/performance value.
@@kidi84 here in finland 2080 still cost anywhere from 800 to 1000 euros and 2080ti will cost anywhere to 1100-2000 euros so that 500 for 3070 is a nice very nice deal!!!
I’m a touring musician, have been for a decade, on the road 250+ shows a year so never home.. until quarantine. Now I am and this reveal REALLY makes me want to build a tower PC again. Only problem is by the time I put one together I might be out again so couldn’t use it!! But WANT. Badly
Ah, John, Rich and Alex the holy trio
Hey the pixel count thing should be possible to achieve! It's a simple regression problem and can even be a classifiaction problem if you limit the number of potential resolutions! I could prob train that with the proper dataset right now :) All you need for the dataset is native resolution footage/images with their attached resolution.
Crazy how only Digital Foundry got the 3080 in advance
The Broadcast background blurring/removal thing without a green screen has been solved for quite a while, I use XSplit VCam for my work video chats. Works exactly like NVidia showed, except it doesn't do the AI panning. I guess Zoom also does it, but XSplit VCam acts as a virtual camera so works with any software that expects a webcam. Moving it to GPU will help the rest of the system I guess, but if you've got a 3000 series card, chances are you aren't worrying about the CPU power needed for this.
I was expecting him to say "and this entire presentation of my face, body and environment was rendered, realtime, leveraging the power of NVIDIA AI."
He does look a bit robotic now you point it out, like a badly done mocap
That graphics card really WAS there the whole time. Gosh darn it to heck!
You kiss your mother with that mouth?
Every day, when I watch videos about these GPUs, I feel more and more poor lmao
Just buy the 3070
@@laerslexikondermusica4480 can't afford it
Unless you're in Brazil just about anyone should be able to afford a 3070 (not saying it's "cheap" but resourceful people can usually hit a figure like that without too much trouble).
Even if you're only making $3/hr as would be the case in many developing nations, take a second job at 4hrs/day and you could buy a 3070 when it launches in October.
Instead of wasting your time dreaming about what could be, try to make it happen.
Then after you do that with a 3070, you've got a model for good work ethic that should lead to success in many areas of your life.
@@budthecyborg4575 I don't have a job, bruh. My family is kinda poor, and my parents are paying a lot of taxes. My PC is partially built with second-hand components
@@mortenera4423 You just reinforced my point.
Thanks for acknowledging the lack of DLSS 2.0 vs true native resolution (rather than TAA) testing, I can't wait for what Richard may find in his explorations. 👍
AI will 'fill in' a lot of games now for developers, so humans will design the game and give a rough blueprint as to what they are looking for then get AI to do the massively time consuming trees, ground, buildings, etc.
Depends on the game, without proper design with an idea in mind, some things should never be done by AI if there's a goal in mind.
I don't think that's the kind of AI they are talking about. It's only about AI-enhanced image reconstruction afaik
This is exactly what star citizen needs. That's why I've always said the game is way too far ahead of it's time.
@@linusmlgtips2123 No they just keep spending money on things that are not playable game material, instead he's turned star citizen into a digital version of real estate. Where the rich can buy custom tricked out features that the regular consumer will never be able to access from an early acess point. Its a massive fundraising black hole that only appeals to the filthy mega rich
@@laerslexikondermusica4480 They say the want a hundred star systems by launch and they only have around a handful, most not fully completed. That just means a lot more work needs to be done.
No, they barely added anything related to your claims.
The leap in performance over Turing was amazing reveal but what caught me off guard was that RTX IO thing, i expected PC to at least take a year to catch up to what PS5 was doing with their IO complex. Now i really want to see load times and asset streaming tests between PC and PS5 in multiplats games. The only thing PS5 has advantage now is installbase, since everyone will have that fast SSD, in PC it will take time until most users are using NVME PCie4 SSD plus an RTX card.
NVIDIA knocked it out of the park last night. No splitting hairs about it.
Straight up
Great that you guys were given the opportunity to hands on with the 3080. a sign of how relevant DF is in the techverse. keep up the good work.
John, the compression on your mic is too high!
35:05 'You need an actual crane to make it work', hahaha priceless
Checked my oven again, still no GPU =(
I wonder how much the DLSS improvements are setting up the 40xx series as VR killers paired with next-gen HMDs. The main complaints I hear about HMDs is resolution and framerates, so DLSS seems to be an answer there. Maybe a DF video visiting the topic of DLSS as it pertains to VR?
This event took some fire from the consoles for me, but just thinking about the new God of War or whatever Naughty Gods are making next makes this pale in comparison. It's about the games in the end.
Hi DF, I was just reading about the PS5 HD camera which states it alows the ps5 to remove backgrounds from the player when streaming etc with the create button... does that suggest machine learning or is there another way to do this?
Some laptop cameras use depth to filter out things behind the gamer. The PS5 Camera has 2 cameras built in so it should be able to calculate depth.
@@lazarushernandez5827 thanks for the reply
Still getting Series X, it's still gonna be very capable if you look at everything else, including the techniques Microsoft will be using. Honestly, maybe this GPU will be a good comparison for the Series X, and see how they both stack. I could care less about whether PC is better or not, there's so much you need to even make it decent, and Series X is gonna be a steal. Hey correct me if I'm wrong, I have my preference, you have yours. Xbox and PC are basically the same ecosystem, especially how they're both mostly Microsoft.
@AnEn That may be true, but it's more than the GPU alone, more so how everything else works together. This new piece just came out, so how would it be inside the new systems? At the end of the day, we're all great. I'm glad these systems have ditched the old Jaguar cores, and actually have more modern components.
@AnEn I'm pc enthusiast sold my 1080 getting a rtx 3080 and I play xbone one with bro all the time I'm simmer too and I totally agree you enjoy your hobby / entertainment preference without being an obnoxious twat like instance ie salt maker. I think the consoles will be trenendous value and great, it's massive upgrade from jaguar cpu cores especially
Video to 3D, wtf!? Insane. Physics is going to be so beautiful, finally. Will love to see this physics on something like gta5, where all the cars to trucks weights are different.
best thing is DLSS is the biggest thing for graphics for Nvidia and what ever they call it for AMD. Machine learning AI graphics prove to be better than native 4K graphics so you can get RTX with high frame rate.
AMD has no competitor to DLSS. They don't have tensor cores. You need them for DLSS, unless you think DLSS 1.0 didn't suck. But you would be alone there.
AMD doesnt have one yet
@@laerslexikondermusica4480 I know it is not DLSS but there is Machine Learning tech for the Xbox Series X which has RDNA 2 we just need to learn more about that feature
@@eaglesfan7636 There is. But not tensor cores. Its all software based. DLSS requires real time machine learning.
@@Wylie288 I know DLSS is only for Nvidia the reason why I said that is the Xbox Series X has some feature for machine learning tech which we need to know more about that is it and what it can do but in no way is it comparable to the 3070 and up
23:33-Alex literally popped into my head when I was watching the reveal event live.
150+GB for CP-2077 is still much more palatable than for Warzone (excluding updates)