@Zane White Have you even tried them? They perform far better. Also, if you think it's ugly, it's possible to mod the game and change the shaders to make it look nice (though looks are subjective).
I'm going to wait until RTX anything is more than "we went from seeing the game to turning your monitor off" as a graphics improvement. The jungle lights scene was terrible, but you should've started a forest fire to see what the drop would be from all those trees/leaves/etc. on fire becoming new light sources. I guess RTX would need to stop making it look like the world is lit up by laser beams. I don't know about anyone else, but if I open a window of my house at noon and turn off all the interior lights, the room isn't pitch black except one spot by the window.
I upvoted all the comments talking about mods already. Please make a comparison with ray-traced/ultra shaders. They work in the real Minecraft and don't require dedicated GPUs.
The player's hand is lit up by the environment, but they failed to project light from objects in the player's hand? Like torches and glowstone? The first thing every shader ever implemented, because it's so useful?
This is a showpiece, not a functionality thing. It's to get PR and have a reason to say "see, RTX wasn't a _total_ failure." It's why GN had to disable RTX to even explore the map in some places.
@@bananya6020 I know, that's why I'm so salty about it, because the PR bullshit works :/ People are just praising "RTX" when they only mean ray tracing or path tracing. People know Nvidia "invented" RTX, so it's gonna be in people's minds that they invented ray tracing...
@@SoKette It's a fucking *beta*. If it was 100% finished it wouldn't be called a *beta*, now would it? Also, Nvidia is in fact still working on pretty much everything graphics-related for this, from optimizations to getting the denoiser to deliver results faster...
Yoooo what the HECK is that timebar feature? Like it's all segmented and tells me information about what each segment is about. That's AWESOME! How come I've never seen it before?
Custom map with ray tracing and DLSS: I'm getting a pretty consistent 57fps at 4K with an ASUS 2080 Ti Strix, an old 5960X CPU, and 32GB RAM. Not too bad, but they need to get up to speed with the overall look. Optifine realistic shaders in Java Edition still destroy it and look way better, in my opinion.
I've noticed that, at least on the Java Editions, resolution seems to matter more than chunk render distance when using path-traced global illumination shaders. Can anyone confirm a similar phenomenon on Bedrock Edition?
Any specific shader mods? We might be able to look into it, but not sure when yet. Upvoting for visibility so other commenters can maybe recommend path traced community solutions for testing.
The Java version only has one RTX shader, and that one is still in development from Sonic Ether. You CAN'T use it for benchmarks right now because it's not finished and therefore not comparable atm.
@@GamersNexus SEUS PTGI E9/E11, all fancy settings in the vanilla options. Tested on a GTX 980 Ti (EVGA SC ACX 2.0) and an R7 3700X at 1080p/900p/720p, using the Default Improved PBR resource pack. On a standard world, in this case an island surrounded by just about every vegetative biome there is, plus mountains, I found that I could comfortably go up to 32 chunks and still have a playable framerate; reducing the render distance to 16 seemed to barely affect results. 64 is where it really felt sluggish. Tested on a superflat world with the Overworld/Tunnelers' Dream/Standard presets: basically the same framerate as on a regular world, but with seemingly improved frametimes. I don't recall VRAM being an issue here. It was only when I dropped the fullscreen resolution down to 900p or 720p that I was able to hit that magical 50/60 FPS mark a lot more consistently; it was generally much smoother and more responsive. Again, I tried changing the chunk render distance, but it would often make single-digit differences at most until you got to 64. I've got a Vega 56 and a Vega FE lying around, so I may still do some testing on those cards to see if it makes any difference here. I've also found that turning the shader settings up did not make a substantial difference in performance or the overall feel of the gameplay.
@@oOWaschBaerOo RTX shader? RTX is Nvidia's brand and refers to hardware accelerated ray tracing. What does this have to do with an unofficial ray tracing shader pack?
Abraham I agree, the RTX 3080 Ti will not be fast enough for this game! Waiting for the 5000 series, maybe. Then it could be 80% faster than the 2080 Ti... that may be enough for 1080p gaming with ray tracing on... Woohoo...
My two NVLinked RTX 2080 Supers have served me well so far. But given that the 2080 Super will be the new baseline with the next-gen consoles, I will upgrade to the RTX 30 series.
Why? I use shader mods that run better (FPS-wise) and look quite realistic. Minecraft's official take using RTX does not look like a shader killer yet.
@@4lc441 I'm not an expert, but I play mods just fine with the shaders I use. They also work on online servers such as Wynncraft. I'm pretty confident most mods and shaders are compatible.
Saying Minecraft might have sold more than Tetris is an understatement. Minecraft is *the* best-selling game of all time: 180 million copies versus Tetris' 35 million. The only thing that comes close is GTA V at 120 million, which has profited more as a result of being sold at a AAA price point as well as the shark-card economy. In fact, GTA V is the single most profitable entertainment property of all time, and that includes movies, books, games, etc.
Steve, I don't mean to question your testing methodology, but there's no way an RTX 2080 Super couldn't go above 60fps with 14+ chunks loaded (non-RTX). I know it's anecdotal, but my RX 480 from years ago gets over 70fps on MODDED Minecraft at a 15-chunk draw distance, and vanilla Minecraft with ray-traced shaders (SEUS PTGI) averages 60fps at a 12-chunk draw distance. And that's on Java Minecraft, which is known to be significantly slower than the C++-based Bedrock edition, which is the edition that got RTX support. Just wondering what could have possibly stunted your performance so much in this case.
Everybody would be praising NV too if there had been something to do with it at launch. But there was nothing. And in the first few titles that got RTX support, you really had to look for it to get value out of it. I think it would've been a massive success if they had worked with Minecraft to make that the first supported title AT LAUNCH.
@@robertstan298 Everybody is on the Nvidia hate train rn. If AMD released something like this, everybody would be praising them for introducing futuristic tech. Meanwhile, when Nvidia does it, it gets hated on because it doesn't work perfectly. AMD still releases broken cards like the 5700 XT and nobody bats an eye.
@@gbner9991 Ah, so you're just a butthurt fanboy, got it. I bet you can't even afford a RTX 2060, but you sure like to shill your gamer soul out for corporations that don't give a flying fuck about your existence, or video gaming. Good grief you people are so sad.
I have a suspicion Minecraft will be used by Nvidia to demonstrate how AmAzInG their new cards are if they add significantly more RT cores. If that is the case, mark my words, they will show a bar graph between a 2080 Super at 56fps and a 3080 at 100fps.
The one thing you missed (or maybe I missed it): more blocks loaded per chunk will greatly affect FPS. A map like the dystopian city you showed would kill FPS so badly it wouldn't be a really playable area. Great information though, because I am in the market for a new GPU.
I've actually tried Control since it got the DLSS 2.0 update, and it runs and looks great. This is a definite improvement compared to the state the game came out in initially; it's perfectly playable on a 2060 Super now with everything maxed out and DLSS turned on.
22:29 An interesting part of Minecraft is that it changes things that are drawn far away (e.g. it stops rendering fancy leaves when you are far away). That distance increases with your resolution, mainly the width (I've done some testing :P), so that's probably what's going on here.
I still don't understand DLSS. It's meant to be able to run the game at a lower res while looking like native or higher res, correct? So a sharp, high-quality image while performance suffers less. So when you run 1080p with DLSS on... does that mean you are running 1080p with 4K (or so) image quality, or does that mean you are actually running a lower resolution but with 1080p image quality?
Lower resolution; it's basically a form of upscaling. At 4K with DLSS on, you are actually running the game at 1080p and then upscaling that image to 4K, which is why the framerates increase and the more intensive ray tracing settings also run better.
Like Omar says, if a game is set to 1080p and the DLSS setting is set to Performance, the game engine is only rendering 1 of every 4 pixels displayed. Meaning the other 3 pixels are guesses of what to display, reconstructed by taking multiple samples of previous frames.
Would have been helpful to clarify explicitly what turning DLSS on actually changes the render resolution to. I think at 1080p the game renders 720p + DLSS upscaling, and 4K becomes 1440p + DLSS upscaling? That would make sense, then, as to why the lights show up at 22:20 with 4K/DLSS but not at native 1080p: the base image used for DLSS was 1440p.
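For anyone who wants to sanity-check those numbers, here's a minimal sketch of the internal-resolution arithmetic. The per-axis scale factors are assumptions based on commonly reported DLSS 2.0 mode ratios, not official constants:

```java
// Toy calculator for DLSS internal render resolution.
// Assumed per-axis ratios: Quality ~0.667x, Balanced ~0.58x, Performance 0.5x.
public class DlssResolution {
    static int[] internalRes(int outW, int outH, double scale) {
        return new int[] { (int) Math.round(outW * scale), (int) Math.round(outH * scale) };
    }

    public static void main(String[] args) {
        double performance = 0.50; // Performance mode: 1 of every 4 output pixels is rendered
        int[] at1080p = internalRes(1920, 1080, performance);
        int[] at4k    = internalRes(3840, 2160, performance);
        System.out.printf("1080p output -> %dx%d internal%n", at1080p[0], at1080p[1]);
        System.out.printf("4K output    -> %dx%d internal%n", at4k[0], at4k[1]);
        // 4K/Performance renders internally at 1920x1080, a higher base
        // resolution than 1080p/Performance (960x540), which is one reason
        // detail like the distant lights can survive at 4K+DLSS but not 1080p+DLSS.
    }
}
```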
So you’re telling me that a 2080 Ti can’t maintain 60 FPS at all times (99% of the time) at 1080p with DLSS helping out? All right. Lemme real quick go buy a Titan RTX.
If, a few years ago, a friend had said "I need a new GPU to play Minecraft", I would have offered a few bucks and a pat on the shoulder, followed by the words "If you need anything, just ask. That's what friends do." Now my reaction would rather be "Yes, me too. And it won't be cheap." lol
You can hold torches and things like shields in your left hand in Minecraft now, which would help with navigating dark forests and caves early on without turning off RTX.
I understand that real-time ray tracing is still in its infancy and thus probably a poor purchase, but I'll be damned if it isn't absolutely stunning. Too bad Great Depression 2.0 is about to hit, because this shit will be excellent in a year or so.
Just curious: can you run the ray tracing on non-RTX hardware, or does it require RT cores? What kind of performance hit would you get without RT cores, if it's even possible?
I hope they do. I just built my wife a 3950X rig for working from home. She's an artist, and I want to show her some videos of ray tracing to see if having impressive lighting in video games will help inspire her digital art.
Area of a radius-160 circle: ~80,424 square units. Radius of a 20-chunk circle: 400 units; it's a bit high, so call it 500 units' worth of geometry rendered with RTX + DLSS, vs. 80k units normal, or 160 times the render effort? (Edit: this assumes the first was GPU-capped, which may not be true.)
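Comparing an area to a radius mixes units, though; render cost scales with the square of the view radius, so area-to-area is probably the fairer comparison. A quick sketch of that math, assuming a circular render region (real chunk loading is square, so this is only a rough model):

```java
// The rendered region grows with the square of the view radius, so doubling
// render distance roughly quadruples the geometry (and ray-tracing work).
public class RenderArea {
    static double circleArea(double radiusBlocks) {
        return Math.PI * radiusBlocks * radiusBlocks;
    }

    public static void main(String[] args) {
        double r1 = 160;   // e.g. ~10 chunks * 16 blocks
        double r2 = 500;   // the "500 units' worth" figure from the comment above
        System.out.printf("area at r=%.0f: %.0f blocks^2%n", r1, circleArea(r1)); // ~80,425
        System.out.printf("area at r=%.0f: %.0f blocks^2%n", r2, circleArea(r2));
        System.out.printf("area ratio: %.1fx%n", circleArea(r2) / circleArea(r1)); // (500/160)^2 ~ 9.8x
    }
}
```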
I'm guessing this video feels like someone from NASA explaining trajectory and getting overexcited to a kid just because they mentioned playing a space game once... on a phone.
Thanks for watching! We said it a few times in the video, but lots of people skip down to the comments. To really reiterate a point: Minecraft is a game that is made by its players, and so performance is mostly a representation of an expected hierarchy of cards and of an expected delta between cards. The absolute FPS numbers are largely meaningless, as they will change from map-to-map and creation-to-creation. We show that here, so you can check out the video if you want to know more. Consider watching our Arctic Liquid Freezer II review if you haven't! th-cam.com/video/KPaSEGe6ML0/w-d-xo.html
I'm glad I'm of an age group smart and experienced enough to see what NVIDIA is and has always done: create issues, then create new tech to "fix" the issues or get over the hurdles they've created, just to sell more of their software. It's really ridiculous.
Of course Minecraft has been used to promote RTX... its childish graphics make it easy!!! Duuuhhhhh
Abysmal, considering these are the results of Minecraft at 1080p... lol... so what do you think top games' performance will be like?
Minute 3:00
I have never seen you so happy😁.
Happy for you
Minecraft on dual titan rtx 😈
Steve: I haven't played minecraft since alpha/beta.
Also Steve: we took a few days for testing.
Keep telling yourself you were testing.
Also "that's not saying anything negative about minecraft; we've played a lot of Minecraft around here" seems to imply they have played Minecraft since beta.
FeelsBadMan that it wasn't the Java version
@@KayJay01 Present perfect tense. We don't currently, but we have played it. The testing was a fun excuse to play again for a little bit.
@@GamersNexus I understand now. I think my confusion stemmed from when Steve mentions that he specifically hadn't played since beta, and then it's later mentioned that "we've played a lot".
TL;DR: I misinterpreted "we" as "I" because of prior experiences where the two were used (incorrectly) interchangeably by other channels.
Long explanation ahead:
Actually, reading that sentence again makes me understand that it's referring to GN as a whole (we), as in it's "people at GN" (not Steve as a person) who have played Minecraft recently. That's a mistake on my part, not on yours.
I understand now what that sentence meant looking back at it, although I think it could have been worded slightly differently to clear up confusion, personally. As Steve has become the "face" of GN, it's easy to assume "we" refers to "Steve", although I understand that technically and grammatically that's wrong. Obviously "we" is plural, grammatically, so it doesn't make sense to assume "we" refers to "Steve".
Many other channels use "we" in the first person as a replacement for "I" (for some reason) when referring to themselves, so I guess I naturally assumed that was the case here as well, which clearly contradicted the earlier statement. Channels with only one person referring to themselves as "we" have conditioned me to read "we" as slang or common misuse in place of "I". Either way, that's not a fault of the statement; that's a fault of my understanding of the statement, based on prior experiences where "we" was used informally and incorrectly in exchange for "I".
I also think my not speaking English as a first language plays a part, as I might not be as solid on the specific grammar of English as you (presumably) are, speaking English as a first language.
@@KayJay01 English isn't your first language? You probably have better English than 90% of native English speakers, judging by the way you analyzed that.
Key question that remains unanswered: is Steve shrinking, or is his hair growing?
It will eventually consume me.
@@GamersNexus I think the applicable technical term would be "Going out in style"
@Ell me too lol
Cousin Itt 2.0 incoming!
@@GamersNexus So you're gonna become Cousin Itt from The Addams Family?
Who knew Minecraft was gonna be the benchmark everyone uses for benchmarking GPUs from now on.
You mean benchmarking raytracing performance. This is similar to lowering resolution in CPU benchmarks to make the CPU the limiting factor. Because Minecraft has simple geometry and texturing. It's not useful for benchmarking GPUs altogether, just the raytracing portion.
@@deranger It's also a very dynamic game where a lot of baked lighting would not work
Yea all those 10 people who bought a high end rtx card can finally benchmark their cards.
To be fair, even though ray tracing isn't a huge deal yet, within just a few years it could be an important factor in comparing GPUs in general (at least mid to high end). Minecraft is so far the best implementation of RTX, or at least the most significant. I've played BF5 and Control on a friend's PC, which were great, but the trade-off in performance for ray tracing was definitely not worth it for details you only notice when you're looking for them, especially in a fast-paced game like Battlefield 5.
I wouldn't be surprised if games eventually learned to do partial ray tracing using excess CPU and GPU combined.
Zen 3 CPUs are massively good at scaling up workloads on the CPU...
Then: But can it run Crysis?
Now: But can it run Minecraft with RTX on?
it's gonna be "But can it run Crysis Remastered" soon
@@junathanhaoward If Minecraft stresses PCs more than Crysis Remastered, it will be sad times.
@@liaminwales Which one will be more optimised?
But, can it run Crysis Remastered with RTX on ?
@@timyt13 I hope to god Crysis. Still my favorite game engine; I just wish more games used it.
Loved Prey, and that ran super well.
DLSS 2 has seen a very solid implementation in Control and Wolfenstein: Youngblood; please revisit those titles.
Never looked at them, so it'd be a first visit. Planning to.
@@GamersNexus roblox Ray tracing
@@squidward9626 Burn in hell.
Burn hotter than that 2080 after hours of testing.
@@bluspectre2042 Hahah nah, my ass is kept hot enough by a Pentium 4.
DLSS 2 has basically made RTX a viable option for games
I still can't get over the fact that Mojang calls their RTX raytracing engine "Render Dragon"
That was the first thing I noticed upon firing up Minecraft.
Am I the only one who’s most excited for mirrors to simply work in games going forward?
Reflections in general you mean? 😂
The GTA V mirrors in houses and apartments work perfectly.
Now that you say that, I'm really curious as to how Valve got mirrors working in Portal back then!
Even GTA San Andreas has working mirrors.
0:35 "The benchmark helps set the bar for what level of performance is needed to play… Minecraft"
LMFAO
They could have Ray traced Daggerfall but okay...
@@dra6o0n vengeance
And Bedrock Edition no less, the bar is very low lol
Steve's faces in the thumbnail are the reason I'm subbed.
😅😂😂
Steve on Steve
Juicy
This year can’t get any worse right?
April 2020: 2080Ti is required to run minecraft
you already need like an i7 anyways
Devs don't optimize the game anymore; the community got too big and they don't have the time.
@@bananya6020 Powerful processor, not so much. Lotta RAM? Definitely. With 1.15, I can't get good performance with anything less than 4GB dedicated. With 1.9 I could use around 2GB, and for everything before 1.8 you didn't even need 1GB. Since the combat update (1.9 for you non-Crafter heathens), RAM requirements certainly jumped.
@@bluspectre2042 For me, Minecraft runs better on Linux (the Java version; I don't know if you can run the Windows 10 one in Wine), especially on machines that have 4GB of RAM. Advice for anyone trying to get more performance out of Minecraft Java.
@@bananya6020 Hahahah dude, why are you spilling bullshit? I have a Pentium G4560 with 4GB RAM and a GT 240 (a 10-year-old GPU), and it runs extra smooth. Maybe you have too much crap running on your PC, or you have a laptop?
@@bluspectre2042 now try robust modpacks lol
Referencing 28:50 - Minecraft is actually the most sold video game, period, at 180 million copies. There are several *franchises* that have sold more, Tetris included, but those represent distinctly different games made by different studios and people over many decades, whereas Minecraft is just, well, Minecraft. It was the most sold PC-specific game already in 2014.
Might have been way outside of the scope, and would introduce a major variable in having to run tests on the Java version of Minecraft, but it would have been interesting to see how performance compared to the Optifine version of Minecraft, which adds a whole bunch of lighting and other features through the use of shader packs. Something I noticed RTX lacked in the video: a torch you were holding in the forest didn't cast light on your surroundings, something the Optifine version of Minecraft succeeds at, all without ray tracing.
Java Edition does not have RTX and most likely never will (officially). Bedrock Edition is completely reworked, it functions differently. Basically two completely different games.
@@davidmartinek5257 - It's about comparing the experience people receive from playing it, not eliminating variables.
This is a beta. By the time it's finalized it will have many little things worked out. For instance, the player isn't visible in reflections, only the hand. So they still need to implement correct player presence in world.
I doubt the PC edition (aka Java) will ever have ray tracing officially, because Bedrock makes more money with in-app purchases and such, and it also runs better because the Mojang devs haven't fucked up the code by making EVERYTHING an object yet.
@@davidmartinek5257 Yes, I know that. And the Win 10 version will never have Optifine mods. I should have said "run tests on the Java version *as* *well* " for clarity. Hence the extra variable.
GOD, FINALLY! Someone in my feed mentions that they omitted light emission COMPLETELY in some places of the non-RTX version. It was mind-boggling how some YouTubers were "showcasing RTX" using the scene at 7:00, as if the non-RTX version has any lighting at all! Nvidia/Microsoft should be ridiculed for this more often, and I appreciate your lengthy emphasis on it.
I agree they should, plus I think the difference is enough on its own without needing to do that.
But on the flip side, I'm glad someone finally mentioned that the big performance hit is not just the ray tracing; it's also all the other texture elements and features that had to be added to the game to make it work. If you added all the other standard tech that Minecraft doesn't currently use, minus the ray tracing, it would still perform badly on any card, AMD/Nvidia/RTX/GTX. Many YouTubers are framing it as hate-bait against Nvidia/RTX.
Time to give Crysis a run for its money!
0:41 “That’s where we are now...we’re trying to play Minecraft...” PMSL at the intonation of your voice when you say that! When you’ve got 5 and 3 year old daughters, just try playing anything else for more than 5 minutes!
Get them Pixelmon Generations on the Technic launcher; they will love it. It has fully modelled Pokemon in Minecraft.
Mostwanted Hmm, well the oldest one hates Pokemon but is obsessed by minecraft...perhaps this will be an immovable object meeting an irresistible force!
You gotta lean in to it. The more they complain about you playing things other than Minecraft, the more you go into old man gamer mode until finally you're just playing VR poker in Tabletop Simulator complete with musty cigars and cheap whiskey.
It'll go one of two ways: Either they'll be completely disgusted by your descent into neanderthalism and you'll wind up fostering a future decades-long career in deadpan dad jokes, Or they'll follow suit and you'll have a couple card sharps by the time they're in their teens and you get to release a pair of socially asymmetric hormonal terrors upon the world come their 18th.
Win-Win, as I see it.
Geno Merci I’m creased up with laughter at your reply - I’m veering towards your description (minus the cigars - ex-smoker) already, what with the poker in RDR2! I try to get them watching survival mode in minecraft but they crap themselves every time a creeper blows me up when I’m not looking.
@@georgemorley1029 Don't see it as pokemon, see it as animal slave labour.
One good reason why real-time ray tracing can significantly improve visual quality in Minecraft is that the game is highly flexible and dynamic in its lighting and geometry, which makes traditional rendering techniques less effective here. Plus, because Minecraft RTX uses path tracing, some effects are really difficult, if not impossible, for rasterization (e.g. true refractions).
I want a comparison of SEUS PTGI shaders plus the Patrix resource pack vs Minecraft RTX.
I just started using SEUS renewed and imho it looks ten times better than this already, can't wait for PTGI...
@@Syrus84 SEUS has the ability to run on all discrete GPUs from both Nvidia and AMD; Minecraft RTX does not. That alone should provide enough of a reason not to support Minecraft RTX. Why support the corporate option if it only supports half of the market?
Not to mention the fact that SEUS looks far better than what Nvidia & Microsoft have slapped together.
I pity the fools who think, or possibly even downright believe, that Minecraft RTX is the best version of Minecraft you can get in regards to visuals, and visuals + performance.
@@LND3947 +1
@@LND3947 SEUS only looks better when you dial up the settings, making it unplayable. You get 20-30 fps on SEUS vs 60 fps with RTX on the same hardware.
@Machinet Those tests are going to show AMD's horrible (or not-so-good) OpenGL drivers.
Me: "honey, can i get 2 x RTX 2080 supers for Minecraft ???"
wife: "i want a divorce"
I can't wait to drop $700 on a GPU so I can run Minecraft at 1080p/60FPS with a 12-chunk draw distance! Thank god for tech advancements! Wow.
@Nick S Of course.
@@YeCannyDaeThat Star Citizen answer: but it's beeetaaaa!
That's how they get you, lol. They release an experimental feature, promise it'll be mainstream in about a year's time, and put a premium price on the feature. Just like with everything else, once ray tracing becomes more mainstream you'll see the price of owning the privilege drop significantly.
Something to consider about Minecraft: Each block is said to be 1 meter. Each chunk is 16x16x16 blocks. So for each added chunk of view distance, you are adding 16 meters. Also, view distance usually doesn't restrict up/down distance as much, just horizontal. That's a lot of processing, even if it is pretty basic.
Minecraft chunks are 2d, so 16x16x256
@@arthur5 I'm pretty sure a chunk is 3D, because chunks you can't see are rendered. It is 16x16x16.
@@slimstrait780 A chunk is 16x16x256, not 16x16x16. The chunk grid is 2d. There are some fan projects to mod in 3d cubic/vertical chunks, though.
With the RTX pack on, I can get a stable render distance of 12 chunks. With RTX/the RTX pack off, I can get over 64. Chunks are not 2D; if I'm correct, you have x, y, and z, at the most basic level of understanding.
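To put rough numbers on the thread above, here's a back-of-the-envelope sketch. It assumes 16x16 chunk columns, the 256-block build height of the versions discussed here, and a square loaded region of (2 x distance + 1)^2 columns:

```java
// Rough block-volume estimate per render distance, for the chunk-size debate above.
public class ChunkMath {
    public static void main(String[] args) {
        int chunkW = 16, chunkH = 256;
        for (int renderDistance : new int[] { 12, 20, 64 }) {
            long side = (2L * renderDistance + 1) * chunkW;   // blocks per side of the loaded square
            long blocks = side * side * chunkH;               // total block volume in loaded columns
            System.out.printf("render distance %2d -> %,d blocks in loaded columns%n",
                    renderDistance, blocks);
        }
        // Going from 12 to 64 chunks is roughly 27x the loaded volume, which is
        // why the RTX pack caps out so much earlier than the rasterized renderer.
    }
}
```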
For context for those who know Minecraft: three of the biggest Spanish YouTubers did an RTX Minecraft stream a few days ago where they were testing the RTX feature, and they had the exact same performance as on their heavily modded private server that uses the SEUS shader! (The server has stuff like Astral Sorcery, Ice and Fire, MCA, Zoo and Wild Life, TekTopia, etc...)
seus > rtx :P
You're confusing the Java version of the game and this Windows 10 C++-based version in some of your commentary.
That's what I guess too. Surely RTX won't be a feature for the original Java. Also high quality textures and graphic mods are available so comparing only to vanilla seems a bit unfair.
@@PainterVierax I'm sure they said something about searching for a way to implement RTX in Java, but they cannot promise it will actually work.
@@PainterVierax They can implement RTX into OG Java; the RenderDragon engine is capable of working with the Java codebase, they just have to implement it.
@@Babybarschalarm no, C++ and Java are two completely different programming languages
@@CharlesHydronium > RTX is hardware; it does not affect what program you can add ray tracing to. There are already Minecraft Java mods that add ray tracing through RT cores, or just by using a normal GPU. The big difference is the use of DLSS, which is its own program and not hardware.
SSAO (HBAO) and bloom are post-process effects in screen space, so their calculations are affected by the framebuffer resolution. The resolution influences the amount of this SSAO "shading" and bloom before DLSS.
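Here's a sketch of the ordering that implies. The pass list and the DLSS scale factor are illustrative assumptions, not the game's actual pipeline:

```java
// Screen-space passes run at the internal render resolution; DLSS upscales afterwards.
public class PostProcessOrder {
    public static void main(String[] args) {
        int outW = 3840, outH = 2160;
        double dlssScale = 0.5;                        // assumed Performance-mode ratio
        int w = (int) (outW * dlssScale), h = (int) (outH * dlssScale);

        System.out.printf("geometry + path tracing at %dx%d%n", w, h);
        System.out.printf("SSAO/HBAO pass at          %dx%d%n", w, h);
        // Bloom typically works on a chain of downsampled buffers, so its
        // footprint also shrinks with the internal resolution:
        for (int mw = w / 2, mh = h / 2; mw >= 240; mw /= 2, mh /= 2) {
            System.out.printf("bloom mip at               %dx%d%n", mw, mh);
        }
        System.out.printf("DLSS upscale to            %dx%d%n", outW, outH);
    }
}
```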
Ah man, I hope they release RTX support for my 2-core i3 iGPU soon. Expecting a blue screen at Minecraft launch doe.
Maybe LowSpecGamer can destroy the game for you to make it run with ray tracing.
you can't even play normal minecraft with a dual core i3
@@pcislocked good point
@@pcislocked Are you sure ? From what I know, up until rather recently (1.14?) Minecraft was fairly single threaded.
@@Winnetou17 Yes, but it's an i3 anyway. Maybe you can play with a low-end GPU like a GT 210 or something, but OpenGL is not just GPU, unfortunately.
2019: When we made jokes about RTX lowering your Frame rate
2020: Minecraft RTX taking your framerate in the arse like no tomorrow
The same thing happened with basic SEUS shaders on Minecraft Java: the same huge performance hit. Nothing new.
@@Paulicek1 The difference is that Nvidia RTX has access to hardware specifically included to help increase performance of the graphical workload when you use RTGI. I can at least use SEUS on my AMD GPU with no limitations in that regard, and the performance seems to be better, at least on paper, going from the performance numbers we are seeing in this GN video.
Minecraft RTX feels very much like another cash-grab attempt by Microsoft and a PR stunt by Nvidia.
If that is the case, it would hardly be surprising: Microsoft have already shown that they are a cash-grab giant over the past two decades, and Nvidia have shown that they love PR stunts that attempt to justify their disgraceful business practices and anti-consumer actions.
@@LND3947 Nothing wrong with doing a cash grab if folks want to pay. If they don't pay then there's no incentive. btw AMD are just as guilty as Nvidia with their business practices and anti-consumer actions. ALL the companies screw the public over one way or another, it's their _modi operandi_ .
@@Paulicek1 You can't use Optifine on Java Minecraft 1.15 or 1.16 sooo.
@@clansome The difference is that AMD doesn't drive other companies into bankruptcy.
People gloat and back Microsoft but forget that Windows is a monopoly.
The denoiser artifacts like those seen at 24:44 would drive me nuts. There is not a lot of talk about that, but I see it in other RTX titles as well, and I think it heavily impacts immersion. It is very distracting.
That's why I think DXR right now works better for reflections than for lighting + shadows.
Something Nvidia is continuously working on improving. Since we're dealing with low ray counts for ray tracing, denoising needs to be accurate for every effect; in practice this means that for path-traced Minecraft there are 3 (IIRC) denoisers at work before blending into the final image. They're very much aware of most bugs and are doing whatever they can to improve the experience.
Thinking about it, this could be their dedicated playground for getting a blueprint ready for what to actually implement for certain RTX effects, which they can then hand over to developers so the experience is at least consistent everywhere.
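For anyone curious what "denoising" means mechanically, here's a toy temporal-accumulation sketch. Real RTX denoisers are far more sophisticated (spatial filtering, per-effect passes, disocclusion handling); this only shows the basic blend behind the smearing/ghosting trade-off:

```java
import java.util.Random;

// Toy temporal "denoiser": blend each noisy frame into a running history
// with an exponential moving average. Not Nvidia's actual algorithm.
public class TemporalAccumulation {
    public static void main(String[] args) {
        Random rng = new Random(42);
        double truth = 0.5;     // the converged pixel value we want
        double history = 0.0;   // accumulated pixel value
        double alpha = 0.1;     // blend weight given to the newest frame

        for (int frame = 1; frame <= 30; frame++) {
            double noisy = truth + rng.nextGaussian() * 0.2;    // 1-sample estimate
            history = (1 - alpha) * history + alpha * noisy;    // EMA blend
            if (frame % 5 == 0)
                System.out.printf("frame %2d: history=%.3f (truth %.3f)%n", frame, history, truth);
        }
        // Low alpha = smooth but laggy (the smearing seen at 24:44);
        // high alpha = responsive but noisy. Denoiser tuning is this trade-off.
    }
}
```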
Well, it's beta software on early-adopter hardware...
@@MLWJ1993 People didn't spend $1,200 for a beta.
@@igorrafael7429 It's noticeable in reflections too
Can't wait for the RTX 30 series, where you'll be able to get high-fps ray tracing. I wasn't too excited about this until now; there might actually be a reason to upgrade my 1080 Ti sooner rather than later.
Liam McLeod If the 3080 Ti gives 50% more speed... that is not very impressive here. We need much more speed than a 3080 Ti can offer... Maybe a 4080 Ti or 5080 Ti will be fast enough...
@@haukionkannel It isn't the speed of the card itself; it's the effectiveness of the tensor cores. The 3080 Ti will probably contain denser tensor cores, with a gain in tensor core speed higher than 50%. On top of that, it'll probably contain more of them.
I think it's very smart of you to have held on to your 1080 Ti, as it's still an exceptional GPU. Ray tracing is still in its infancy as far as mainstream gaming is concerned, and thus the privilege of using it will be expensive, which it is, performance-wise and money-wise. With the release of the new consoles, ray tracing should become more mainstream; I'd say probably late 2021 or even 2022 is when it'll be worth investing in a ray-tracing-capable GPU, in my opinion.
@@12pandemon I would argue the price is more important than anything. If the 3080 Ti does ray tracing 100% better but costs $2,500, it's not actually an improvement; it's just a higher-priced option.
We were scammed on the 2080 Tis. They can't even run Minecraft well with RTX.
I'm willing to bet it'll run better with updates; it's still only a beta.
I'm wondering if that DX12 Ultimate update next month will make a difference. Even if it doesn't, it runs fine for me, so I'm not complaining.
gr8 b8, my 2070 Super runs it just fine. Granted, at 1080p; if you're running at 4K, cheers, I'm sure that may cause performance dips.
Keep your fake projected shadows if you want; once you see RTX shadows, it makes those fake projected shadows look like a joke, because they are. Normally you can only see shadows from leaves from 10 feet away or so, and shadows draw just a few feet out; RTX shadows draw at 4-5x the distance, with all shadows having proper distance shadowing from sharp to very soft, and with multiple light sources, not just one like DXR. Self-shadowing is also superior.
@@WrathInteractive All of that may be true, but Minecraft still runs at about 45 fps with RTX on. If these cards can't run Minecraft, there is not much else they can run. The tech was not ready yet.
@@donk8961 With a 2080 Ti, most people expect to run more than 1080p@60fps. My comment was not bait at all; watch Steve's video.
Thanks for the labeled timestamps on the video time bar. That's a really cool thing to see. You just keep raising the bar.
Would be interesting to see an image comparison between RTX and SEUS PTGI.
RTX goes much further with ray tracing, whereas SEUS PTGI limits the number of bounces to 1 or 2 in order not to drop into single-digit performance numbers at 720p or less; it's also confined to screen space in some cases.
@@MLWJ1993 That's actually a setting I would like to see with future RTX Minecraft updates. In order to achieve more chunk distance with RTX, perhaps the player could choose to limit light bounces and/or be able to toggle options such as bloom.
@@MLWJ1993 Minecraft RTX is path tracing, not just ray tracing; it's the second game to use it, after Q2RTX.
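A rough illustration of what capping bounces costs: for a diffuse surface with albedo a, each extra bounce adds another factor of a of indirect energy, so total indirect light approaches a/(1-a). This is an idealized closed-room model, not either renderer's actual math:

```java
// How much indirect light a bounce cap captures, as a share of the full
// geometric-series limit a/(1-a). Capping at 1-2 bounces (as the comment
// above says SEUS PTGI does) trades away the tail for performance.
public class BounceEnergy {
    public static void main(String[] args) {
        double albedo = 0.7;                      // assumed average surface reflectivity
        double full = albedo / (1 - albedo);      // limit of the geometric series
        double sum = 0;
        for (int bounces = 1; bounces <= 8; bounces++) {
            sum += Math.pow(albedo, bounces);     // energy added by this bounce
            System.out.printf("%d bounce(s): %.0f%% of full indirect energy%n",
                    bounces, 100 * sum / full);   // 1 bounce ~30%, 2 bounces ~51%
        }
    }
}
```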
I’d love to see a whole video that’s just about what various graphics settings are in games. Most games I’ve played give no definition for what changes and sometimes it’s hard to understand without video before after comparisons.
Digital Foundry covers a lot of that, although I don't think they have a single video on it. They do them for select PC releases, but titles vary in what settings mean even when they're named the same, so it can't be generalised across all games.
I love these in depth reviews of the application of new technology!
Personally I think that this might be the future for real time ray tracing. Games with simple graphics that allow the player to be creative. I can imagine that people who spend a lot of time creating something awesome, are going to be thrilled to see their work being transformed by RTX. Even if they themselves don't own an RTX card, other people can share screenshots and videos.
This is the first review of this BETA that actually mentioned that RTX off does not include any light... THANK YOU !!!!
"Minecraft benchmark".
Oh, yes.
An 8-bit game is the official test for bleeding-edge technology now.
Glorious.
I love this. Pushing for higher-resolution textures stopped adding a ton of visual fidelity in 2013. We need to push for HDR, better lights, and especially better shadows; that is the next step. We should stop pushing higher textures, optimize what we have, and implement better light and shadow like ray tracing.
ok that underwater structure looks dope af with rtx on
Steve playing as Steve, watching how the diamonds shine under the new light.
I'd rather have RTX on Beetle Adventure Racing... imagine the possibilities, like 8K 120fps on that IP as well?
Minecraft Bedrock is actually surprisingly well optimized. Most of the criticism about Minecraft's optimization comes from the Java version of the game, which (while better in every other way) has rather poor optimization. The laggiest part (in terms of server side, not client side) used to be lighting updates, which were incredibly laggy (to the point of players like nessie, mirgrim, and phipro making mods that replaced the entire lighting engine) and have been fixed in newer versions of Java. The laggiest bits remaining are certain hoppers interacting with full inventories, and redstone dust making absurd amounts of block updates (which is why people use rails in contraptions where lag is a limiting factor).
Also, they rewrote a lot of the game internals in 1.13, which destroyed performance; for me, fps dropped from ~120 to ~24. Some people checked the code and realized they converted what were internally ints (such as IDs of items and enchantments) to strings, and also made everything an object, which makes the garbage collector work far too hard because of the now-inefficient memory use. Some people undid these changes, and when I used their mods, my fps jumped back to ~60.
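For anyone wondering why int-to-string/object conversions would hurt, here's a toy Java sketch of the difference (this is not actual Minecraft code; the key names and loop counts are made up purely for illustration): primitive int lookups allocate nothing, while string-keyed lookups hash, compare, and, in patterns like the one below, allocate fresh garbage on every call.

```java
// Toy illustration of the change described above: looking up IDs as
// primitive ints vs. as String keys. Not actual Minecraft code -- just
// why "everything became an object" makes the garbage collector work hard.
import java.util.HashMap;
import java.util.Map;

public class IdLookupDemo {
    public static void main(String[] args) {
        int n = 10_000_000;

        // Old style: primitive int IDs, plain array index, zero garbage.
        int[] intIds = new int[256];
        for (int i = 0; i < 256; i++) intIds[i] = i;
        long sum = 0;
        long t0 = System.nanoTime();
        for (int i = 0; i < n; i++) {
            sum += intIds[i & 255];
        }
        long intTime = System.nanoTime() - t0;

        // New style: String keys in a map; hashing, equals() calls, and
        // (in patterns like this) a new String allocated on every lookup.
        Map<String, Integer> stringIds = new HashMap<>();
        for (int i = 0; i < 256; i++) stringIds.put("minecraft:item_" + i, i);
        t0 = System.nanoTime();
        for (int i = 0; i < n; i++) {
            sum += stringIds.get("minecraft:item_" + (i & 255)); // allocates each time
        }
        long stringTime = System.nanoTime() - t0;

        System.out.printf("int array: %d ms, string map: %d ms (sum=%d)%n",
                intTime / 1_000_000, stringTime / 1_000_000, sum);
    }
}
```

The absolute times will vary by machine, but the string path also leaves millions of short-lived objects behind, which is the "garbage collector working far too hard" part.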
Will you revisit this testing with the 30 series, which is rumoured to have significantly improved RT hardware?
I wouldn't be surprised if they'll keep doing smaller tests with each major update and do more testing if results are noteworthy.
Obviously he will.
That ray-traced GN logo shown while putting up your name really got me! :D
5:05 When your $700 GPU can play Minecraft at 14fps.
What an improvement :D
at least they are trying something new
I love the way you broke up the video into titled segments on the seek bar
That's youtube
So people bought a 2060 for "future-proof RTX features" and can't even get 60 FPS in Minecraft with it on?
RTX on a 2060 was always shit and certainly not future-proof.
um.....it's a beta..... It'll get better once optimized.
You must be new here
That it’s Minecraft is not relevant; that it’s doing full path tracing is. At the moment this is the most intense RTX game out there. The 2060 will do fine in titles like Control or Metro Exodus.
Who exactly said these 1st-generation RTX cards would last long? These are just expensive test cards. You should skip them.
Great vid. I was actually a bit interested in this. Would be quite interesting to see for the 40 and 30-series as well. For me personally I would be after a 28 chunk render distance at 1440p UW, in which case I think a 4070Ti or maybe even more would be needed to get a framerate comfortably above 60fps... IMO a minimum of 28, all the way up to about 40-50 chunks is a sweet-spot in survival mode for me personally, as 28 doesn't feel limiting, while very long render distances around and above 50 in survival-mode doesn't go well in terms of immersion when it comes to world generation and the size of biomes. The fact I'd probably need just about a 4080 or even a 4090 to be within that range with RTX On at 1440p 100Hz is kinda crazy.
5:25 Just imagine Batman: Arkham Knight with RTX, man, it would be so cool. Damn.
That is the thing. The Batman games use a lot of rasterizer cheats to look that cool. If you want RTX, you would first have to throw away all that work. Then you turn on RTX and see how it looks. Then you try new tricks that work with RTX to get back to the image quality you once had. So it's a lot of work for a small improvement: light is more dynamic.
@@Ferdinand208 AND you're not even considering that you'd probably have to sacrifice resolution or poly count to maintain reasonable performance. That's one of the reasons they keep showing off Minecraft: it's already light on the GPU to start with.
@20:57 "Not mistake the forest for the trees" while showing a jungle map. ohhhh Steve :'D
No dynamic lighting... The torch, when you have it in your hand, doesn't light up the area. On Minecraft Java Edition with OptiFine installed you can do that.
Lol you think that's the only thing that constitutes dynamic lighting?
Minecraft RTX is entirely path traced; this means it's all done via ray tracing, rather than being a rasterized game with RTX features on top.
Everything is entirely dynamic, it's still in Beta, there are bugs and missing features.
@@AlfaHazard Sonic Ether's Unbelievable Shaders (SEUS) are also path traced, and were before Minecraft RTX was announced.
@@Isaacminer1 "and have been before minecraft rtx had been announced."
and have been a thing before even RTX was announced.
@Zane White Have you even tried them? They perform far better. Also, if you think it's ugly, it's possible to mod the game to change the shaders and make it look nicer (though looks are subjective).
@Zane White They don't use RT cores or DLSS, so I guess so; they're running on pure brute force, so of course they're going to run worse.
I'd love to see another video on Seus PTGI too
I'm going to wait until RTX anything is more than "we went from seeing the game to turning your monitor off to improve graphics". The jungle lights scene was terrible, but you should've started a forest fire to see what the framerate drop would be from all those trees/leaves becoming new light sources. I guess RTX would need to make it so it doesn't look like the world is lit by laser beams. I don't know about anyone else, but if I open a window of my house at noon and turn off all the interior lights, the room isn't pitch black except one spot by the window.
Good Video guys🔥🔥🔥 you are on fire today
I upvoted all the comments talking about mods already; please make a comparison with ray-traced / ultra shaders.
They work in the real Minecraft and don't require dedicated GPUs.
The player's hand is lit up by the environment, but they failed to project light from objects in the player's hand? Like torches and glowstone? The first thing every shader ever implemented, because it's so useful?
this is a showpiece, not a functionality thing
it's to get pr and have a reason to say "see, rtx wasn't a _total_ failure."
it's why gn had to disable rtx to even explore the map in some places.
@@bananya6020 I know, that's why I'm so salty about it, because the PR bullshit works :/ People are just praising "RTX" when they only mean raytracing or pathtracing. People know Nvidia "invented" RTX so it's gonna be in people's mind that they invented raytracing...
@@SoKette It's a fucking *beta*; if it were 100% finished it wouldn't be called a *beta* now, would it? Also, Nvidia is in fact still working on pretty much everything graphics-related for this, from optimizations to getting the denoiser to deliver results faster...
"Minecraft Tech Jesus isn't real. He won't hurt you."
*this video's thumbnail*
Yoooo what the HECK is that timebar feature? Like it's all segmented and tells me information about what each segment is about. That's AWESOME! How come I've never seen it before?
Has anyone compared this with the fully path-traced shader pack?
There's no 'fully path-traced' shader pack. SEUS does ray sampling like basically every "ray tracing" game implementation.
2020 and Minecraft is being used to benchmark GPUs. What a time to be alive.
17:04 the x in 2080 super is small
2007: but can it run crysis
2020: but can it run minecraft
Custom map with ray tracing and DLSS: I'm getting a pretty consistent 57fps at 4K with an ASUS RTX 2080 Ti Strix, an old 5960X CPU, and 32GB of RAM.
Not too bad, but they need to get the overall look up to speed; OptiFine realistic shaders in Java Edition still destroy it and look way better in my opinion.
The first time ever I've clicked on a Gamers Nexus video purely because of the thumbnail.
Testing, or even just mentioning shaders was a missed opportunity
AFAIK there are no shaders for the non-Java Minecraft Windows 10 edition, which is what you need for RTX,
so you'd basically be comparing two different games.
Java and Bedrock work completely differently. Two completely different games.
You can still compare the experience people get from playing either shaders on Java or RTX on Bedrock. Framerates and visuals still apply.
2080 super: struggle to run minecraft rtx
my 2060: *chuckle* Im in danger
I've noticed that, at least on the Java Edition, resolution seems to matter more than chunk render distance when using path-traced global illumination shaders. Can anyone confirm a similar phenomenon on Bedrock Edition? (See the ray-budget sketch after this thread for one possible explanation.)
Any specific shader mods? We might be able to look into it, but not sure when yet. Upvoting for visibility so other commenters can maybe recommend path traced community solutions for testing.
The Java version only has one "RTX" shader, and that one is still in development from Sonic Ether; you CAN'T use it for benchmarks right now because it's not finished and therefore not comparable atm.
@@GamersNexus SEUS PTGI E9/E11, all fancy settings in the vanilla options. Tested on a GTX 980 Ti (EVGA SC ACX 2.0) and an R7 3700X at 1080p/900p/720p. Was using the Default Improved PBR resource pack.
On a standard world, in this case on an island surrounded by just about every vegetative biome there is, plus mountains, I found that I could comfortably go up to 32 chunks and still have a playable framerate; reducing the render distance to 16 seemed to barely affect results. 64 is only where it really felt sluggish.
Tested on a superflat world with the Overworld/Tunnelers' Dream/Standard presets: basically the same framerate as on a regular world, but with seemingly improved frametimes. I don't recall VRAM being an issue here.
It was only when I dropped the fullscreen resolution to 900p or 720p that I was able to hit that magical 50/60 FPS mark a lot more consistently; it was generally much smoother and more responsive. Again, I tried changing the chunk render distance, but it would often make differences in the single digits at most until you got to 64.
I've got a Vega 56 and a Vega FE lying around, so I may still do some testing on those cards to see if anything differs here. I've also found that turning the shader settings up did not make a substantial difference in performance or the overall feel of the gameplay.
@@oOWaschBaerOo And what of Continuum RT? I think I've observed something similar using it as well.
@@oOWaschBaerOo RTX shader? RTX is Nvidia's brand and refers to hardware accelerated ray tracing. What does this have to do with an unofficial ray tracing shader pack?
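One plausible reason for the resolution-over-render-distance observation in this thread, sketched as a back-of-the-envelope calculation (illustrative Java, not how SEUS PTGI or Bedrock RTX actually budget their work): a path tracer casts on the order of one primary ray per pixel per sample, so ray count scales with resolution, while render distance mostly grows the amount of geometry those rays can hit.

```java
// Back-of-the-envelope: primary rays scale with pixel count, while
// render distance grows loaded geometry. Illustrative only.
public class RayBudget {
    public static void main(String[] args) {
        int[][] resolutions = { {1280, 720}, {1600, 900}, {1920, 1080} };
        for (int[] res : resolutions) {
            long primaryRays = (long) res[0] * res[1]; // ~one ray per pixel per sample
            System.out.printf("%dx%d -> ~%,d primary rays per frame%n",
                    res[0], res[1], primaryRays);
        }
        for (int d : new int[] {12, 16, 32, 64}) {
            long loadedChunks = (2L * d + 1) * (2L * d + 1); // square of chunks around the player
            System.out.printf("render distance %d -> ~%,d loaded chunks%n", d, loadedChunks);
        }
    }
}
```

1080p casts 2.25x the primary rays of 720p, while going from 16 to 32 chunks roughly quadruples loaded chunks without adding any rays, which would match FPS tracking resolution more tightly than chunk distance (until memory or acceleration-structure size becomes the bottleneck).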
I would love to see some tests, even if not systematic, of what explosions/fire do to the framerate. Seems quite interesting.
I’ll stick to my 1080 Ti SLI system until the 5000 series at this rate; still not impressed, and Ampere won't be much better. Just 40 FPS more at most.
Abraham I agree, an RTX 3080 Ti will not be fast enough for this game!
Waiting for the 5000 series, maybe. Then it could be 80% faster than a 2080 Ti... that may be enough for 1080p gaming with ray tracing on... Wuhuu...
My 2 NVLinked RTX 2080 SUPERs have served me well so far. But given that the 2080 SUPER will be the new baseline with next-gen consoles, I will upgrade to the RTX 30 series.
2020: MineCraft looks amazing
2018: Okay, what the hell is going on over there??
Steve, you could have simply pressed F3 to find out avg. fps and other data.
they were on shitcraft aka bedrock
Love the time regions on the YT video!
AMD is gonna have to respond to this. Cuz Minecraft is reason enough for some people to switch from their RX cards to GeForce RTX cards
Why? I use shader mods that run better (fps-wise) and look quite realistic. Minecraft's professional take using RTX does not look like a shader killer yet.
@@joshuacolt2630 Except shaders are incompatible with many mods right? This could actually have decent mod support down the road.
@@4lc441 I'm not an expert, but I play mods just fine with the shaders I use. They also work on online servers such as Wynncraft. I'm pretty confident most mods and shaders are compatible.
@@4lc441 shaders have no real reason to be incompatible with mods.
yeah, all the 7 year old kids will beg their parents for a new 2080ti after having just bought a 5700xt.
Saying Minecraft might have sold more than Tetris is an understatement... Minecraft is *the* best-selling game of all time. 180 million copies versus Tetris' 35 million. The only thing that comes close is GTA V at 120 million, which has profited more as a result of being sold at an AAA price point as well as the shark card economy. In fact GTA V is the single most profitable entertainment property of all time, and that includes movies, books, games, etc.
RTX is overrated
You may as well say "can it run RTX" instead of "can it run Crysis".
But yeah, it's like Nvidia's useless PhysX, but for shading instead.
Steve, I don't mean to question your testing methodology, but there's no way an RTX 2080 Super couldn't go above 60fps with 14+ chunks loaded (non-RTX). I know it's anecdotal, but my RX 480 from years ago on MODDED Minecraft gets over 70fps at a 15-chunk draw distance, and vanilla Minecraft with ray-traced shaders (SEUS PTGI) averages 60fps at a 12-chunk draw distance.
And that's on Java Minecraft, which is known to be significantly slower than the C++ based Bedrock edition, which is the edition that got RTX support.
Just wondering what could have possibly stunted your performance so much in this case.
if AMD did this first everybody would be praising them
Do what? Turn a game that runs on potatoes into a glorified destroyer of $1200 GPUs?
Fuck that. And fuck "RTX".
Everybody would be praising NV too if there was something to do with it at launch. But there was nothing. And in the first few titles that got RTX support, you really had to look for it to get value out of it. I think it would've been a massive success if they worked with Minecraft to make that the first supported title AT LAUNCH.
@@robertstan298 Everybody is on the Nvidia hate train right now. If AMD released something like this, everybody would be praising them for introducing futuristic tech. Meanwhile, when Nvidia does it, it gets hated on because it doesn't work perfectly. AMD still releases broken cards like the 5700 XT and nobody bats an eye.
@@Slimmeyy At launch of what... Minecraft released in 2009...
@@gbner9991 Ah, so you're just a butthurt fanboy, got it. I bet you can't even afford a RTX 2060, but you sure like to shill your gamer soul out for corporations that don't give a flying fuck about your existence, or video gaming.
Good grief you people are so sad.
I have a suspicion Minecraft will be used by Nvidia to demonstrate how AmAzInG their new cards are if they add significantly more RT cores. If that is the case, Mark my words they will show a bar graph between a 2080 super at 56fps and a 3080 at 100fps.
They will do it. I'm pretty sure as well.
The one thing you missed, or maybe I missed: the more blocks loaded in chunks, the greater the effect on FPS. The dystopian city you showed would kill FPS so badly it wouldn't be a really playable area. But great information, because I am in the market for a new GPU.
I am very impressed how the seek bar is split into topics
I've actually tried Control since it got a DLSS 2.0 update, and it runs and looks great. This is a definite improvement compared to the state in which the game came out initially, like it's perfectly playable on a 2060 Super now with everything maxed out and DLSS turned on
I've watched enough Minecraft RTX videos, but I clicked just because of the thumbnail...holy shit.
22:29 An interesting part of Minecraft is that it changes things that are drawn far away (e.g. it stops rendering fancy leaves when you are far away). That distance increases with your resolution, mainly width (I've done some testing :P), so that's probably what's going on here.
I still don't understand DLSS. It's meant to run the game at a lower res while looking like native or higher res, correct?
So a sharp, high-quality image while performance suffers less.
So when you run 1080p with DLSS on... does that mean you are running 1080p with 4K (or so) image quality?
Or does that mean you are actually running a lower resolution but with 1080p image quality?
If you're running at 1080p with DLSS, you're rendering at lower than 1080p; I think it's around 540p reconstructed to look like 1080p.
Lower resolution; it's basically a form of upscaling. At 4K with DLSS on you are actually running the game at 1080p and then upscaling that image to 4K, which is why the framerates increase and the more intensive ray tracing settings also run better.
You're upscaling using AI to the resolution you select; in this case 1080p DLSS is 720p scaled up to 1080p.
Like Omar says, if a game is set to 1080p and the DLSS setting is set to Performance, the game engine is only rendering 1 of every 4 pixels displayed, meaning the other 3 pixels are reconstructed from samples of previous frames.
It would have been helpful to clarify explicitly what turning DLSS on actually changes the render resolution to. I think at 1080p the game renders at 720p + DLSS upscaling, and 4K becomes 1440p + DLSS upscaling? That would make sense then for why the lights show up at 22:20 with 4K/DLSS but not at native 1080p: the base image used for DLSS was 1440p.
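The different numbers in this thread (540p, 720p, 1080p, 1440p) are all consistent with different DLSS quality modes rather than anyone being wrong. Here's a minimal sketch of the resolution math, assuming the commonly cited DLSS 2.x per-axis scale factors; which mode the Minecraft RTX beta actually uses isn't confirmed here.

```java
// Minimal sketch of DLSS render-resolution math. The per-mode scale
// factors below are the commonly cited DLSS 2.x values; this is an
// assumption, not a confirmed detail of the Minecraft RTX beta.
public class DlssResolution {
    static void show(String mode, int outW, int outH, double scale) {
        int inW = (int) Math.round(outW * scale);
        int inH = (int) Math.round(outH * scale);
        System.out.printf("%-24s renders %4dx%-4d -> upscales to %dx%d%n",
                mode, inW, inH, outW, outH);
    }

    public static void main(String[] args) {
        show("1080p Quality (~2/3)",    1920, 1080, 2.0 / 3.0); // ~1280x720
        show("1080p Performance (1/2)", 1920, 1080, 0.5);       // 960x540
        show("4K Performance (1/2)",    3840, 2160, 0.5);       // 1920x1080
        show("4K Quality (~2/3)",       3840, 2160, 2.0 / 3.0); // ~2560x1440
    }
}
```

So 540p-to-1080p and 720p-to-1080p are just Performance vs. Quality mode at a 1080p output, and 1440p-to-4K matches Quality mode at a 4K output.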
Thank you Steve. Amazing review. I learnt a lot.
the progress bars for your charts are a genius move.
Sectioning the video like this is VERY helpful, thanks Gamers Nexus!
I love seeing Ray-Tracing improving since it was first added to games! Would love to see the day AMD gets decent ray-tracing as well.
So you’re telling me that a 2080 Ti can’t maintain 60 FPS at all times (99% of the time) at 1080p with DLSS helping out?
All right. Lemme real quick go buy a Titan RTX.
??? It just did, in a beta even that has already been confirmed to not be (fully) optimised and still a work in progress...
If, a few years ago, a friend had said "I need a new GPU to play Minecraft", I would have offered a few bucks and a pat on the shoulder, followed by the words "If you need anything, just ask. That's what friends do."
Now my reaction would rather be "Yes, me too. And it won't be cheap." lol
You can now hold torches and items like shields in your left hand in Minecraft, which would help with navigating dark forests and caves early on without turning off RTX.
I understand that realtime raytracing is still in its infancy and thus probably a poor purchase but I'll be damned if it isn't absolutely stunning. Too bad great depression 2.0 is about to hit because this shit will be excellent in a year or so.
I've never played Minecraft, but I've watched hours of Minecraft RTX videos in the last couple of days.
Just curious: can you run the ray tracing on non-RTX hardware, or does it require RT cores? What kind of performance hit would you get without RT cores, if it's even possible?
Can we see a video of these tests, no talking just the view?
Thanks :)
Angela Scheeler It might be side-channel content, but I’d be up for a looping stream of an automated tour of them all.
I hope they do, I just built my wife a 3950x rig for work from home. She's an artist and I want to show her some videos of Raytracing to see if having impressive lighting in video games will help inspire her digital art.
Area of a radius-160 circle: ~80,424 square units. A 20-chunk-radius circle is far smaller, so comparing the two areas consistently (worked below), RTX + DLSS is rendering roughly 1/64th of the area, not the 1/160th I first got by mixing an area with a radius. (edit: this assumes that the first was GPU capped, which may not be true)
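Working that area comparison consistently (the 160 and 20 radii are the commenter's figures; whether either case was actually GPU-bound is unknown):

$$\frac{\pi \cdot 160^2}{\pi \cdot 20^2} = \left(\frac{160}{20}\right)^2 = 64$$

The 160x figure came from comparing $\pi \cdot 160^2 \approx 80{,}424$ against $20^2 = 400$ with the $\pi$ dropped on one side; with $\pi$ on both sides the ratio is 64x.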
How does the night vision potion work with ray tracing? Does it request more rays to be traced towards the player or does it just add a filter?
Love the new time bar / timestamp bar
Steve playing Minecraft. What a time to be alive!
I'm guessing this video feels like someone from NASA explaining trajectory and getting overexcited to a kid just because they mentioned playing a space game once... on a phone.
Quick question: why would you make a GN logo and keep putting it sideways in your videos like an AZ logo?