What you said about Bedrock RTX shaders I completely agree with. Indoors looks absolutely fantastic, especially paired with a good texture pack, but outdoors usually looks just decent to sometimes plain bad for a shader
It wouldn't be so bad if it weren't so glitchy and you could disable the atmospheric blurring crap. It's technically the most advanced in terms of true RT, but it looks surprisingly mid in all but the darkest of scenarios.
1:30 The weird thing on my system is that Minecraft and even the Task Manager say it uses 90% GPU, but the GPU itself says it's only at 30% and only pulls about 25% of its max wattage. If you wonder whether this has something to do with the ray tracing cores: you can push the GPU WAY higher with traditional rasterized graphics, without any ray tracing or upscaling. The 4090 has over 16k regular shader cores but only 512 tensor cores for upscaling and 128 ray tracing cores, and that's pretty representative of its power draw as well. Minecraft Java shader packs don't actually use RT, so I would expect that 90% to actually draw like 400W. Meanwhile the VRAM usage without extreme texture packs is like 3 GB or so. And yes, that is after allocating 6 GB of RAM (of which it uses 3-4 GB in practice) and without maxing out the CPU.
Ya, like when I first bought Minecraft I was so dumb, but I watched a video of a player going to the Nether, and I tried the tutorial on how to make it, and when I got to the Nether and saw it I was like "bro I just found hell in Minecraft"😅😅😅
2:28 " What you're seeing, now this is my normal state" 2:36 " This is a super Saiyan" 2:46 " And if I push it beyond that you get super Saiyan 2" 3:18 "And This... is to go... EVEN FURTHER BEYOND!"
Nah, I would go with Iris rather than OptiFine, because Iris has a better shader compiler and costs less performance, and it has nicer shaders today. With other Fabric performance tools you can triple your default fps, and you get nicer shaders and options too.
Please tell me something I can't find anywhere: when you pay on Patreon for Patrix, do you pay for one month and get to use the texture pack forever, or only for as long as you keep paying eeeevery month?
For those who are curious: SEUS PTGI is INSANELY fast for a full path traced shader and the visuals are absolutely stunning. It runs at a (marginally) playable 20 FPS on my GTX 1660 ti, at 1080p and when using the free 32 px version of Patrix it runs about 15 fps. So if you have a 2080 ti (which if I recall correctly was a MASSIVE improvement over the earlier cards) then that should run at a pleasantly playable frame rate.
i really love Rethinking Voxels, it's one of the prettiest packs I've ever used, being fully path traced, and it gets better performance than PTGI while also having colored light.
3:30 The simple explanation for this is that Minecraft was only using a single CPU core at the time. The GPU wasn't pinned to 100% or even near it, so you were CPU-bound by quite a bit, even though Minecraft isn't able to use the whole CPU at once.
For the 4090 Ti test (whenever that one's coming out), I can recommend the Nvidium mod (along with Sodium), which fully utilizes the power of the GPU, and the Bobby mod for absurd render distances, which, unlike DH (also amazing), are fully rendered. Right now, on my 3090, I'm playing with these mods at 128 chunks render distance and it rarely drops below 100 fps. The downside is that Nvidium isn't compatible with any shaders, but even without them you can still reach the full potential of the GPU.
seeing this makes me realize that whenever a new flagship gpu is released there is that 1 degen sitting under his sink thinking : oh boy its time to make another minecraft texture pack
You should do a Minecraft conversion for a war mod, including planes, tanks, boats, machinery, or basically whatever you want to include. I bet you it will be epic. Also, if you make it a modpack, I'm pretty sure it will get popular that way too.
i tried this with my pc. It's actually extremely beautiful, especially when I'm in a foggy biome. It feels like I'm really there. Because of the smoke coming out of my PC it's really immersive.
LMAO
bro I'm dying after reading this 😂
Okay you got me lol
ikr it even got the smell and shit
Lmfaooo omfg
Minecraft Java is the worst-optimized game ever, and for some reason not many people are complaining about it. This is something Mojang should work on instead of adding new features that are only going to make the game even laggier.
That's true
When they did that in 1.15, people called it a trash update
Main reason why I play 1.12.2 and use mods to add significantly more features than Mojang ever hoped to add, in much higher quality too.
RIIR (Rewrite it in Rust)
True
Complementary Shaders absolutely slap. The visuals are stunning and the performance impact isn't anything too crazy when paired with Iris. I've tried many others, but I keep coming back to Complementary, and sometimes BSL.
Complementary goes so hard. I used to have a similar one with black cinematic bars at the top, that's all I miss
Is that the website you use to get this REALISTIC Minecraft?
@@GONTE_YT CurseForge or Modrinth. Iris and OptiFine can run shaders, but can't you just search up how to install them?
Try Rethinking Voxels. It's a Complementary-based shader with ray tracing. If you can't run that, then Complementary Reimagined is also really good
@@Rooftopaccessorizer I will check it out, thanks!
What you said about Minecraft utilizing more of the GPU was technically correct, but you must understand that Minecraft still isn't actually using the 4090 to its full ability: RT cores are not being used. All of the graphics are being bottlenecked through the "generic" parts of the GPU, because neither OptiFine nor Iris supports RT cores in their current state. Iris is working on it, however, and Continuum Graphics is making Focal Engine, a mod which will allow Minecraft to use the Vulkan rendering API, which will be able to take advantage of RT cores as well.
CUDA remains the strongest part of any GPU, however.
Standalone RT cores might hardly break the halfway mark in performance compared to the CUDA cores.
I didn't even consider that the game wasn't utilizing the RT cores. I'm glad to hear that Iris and Continuum are working on it, and hopefully we can see a video eventually where it's a reality
@@igameidoresearchtoo6511 I'm not sure you're quite grasping just how much of an impact RT cores have on ray tracing. CUDA cores work fine for a rasterized rendering pipeline, but are horribly unoptimized for ray tracing. When it comes to shaders like SEUS PTGI, KappaPT, and NostalgiaVX, the main barrier to performance (and overall appearance) is that they have to squeeze path-traced rendering into the CUDA cores along with all the other rendering tasks. To fit, corners have to be cut and fps has to be sacrificed. RT cores make a big difference for the rendering method that they're *literally optimized for.*
@@draco6349 True, but in many games they work together with CUDA to get an even higher frame rate. For Minecraft, though, I really don't think standalone RT cores would even break the halfway mark vs CUDA cores.
Minecraft hardly uses CUDA to its full potential, if even to its designed potential; I really doubt it would use the much more complex RT cores correctly.
@@igameidoresearchtoo6511 well sure, Minecraft itself wouldn't because Mojang doesn't care enough to optimize it, but that's why there are modders working to integrate support properly.
If you turn off animated textures, portal effects (or whatever it's called) and some more texture effects, and set all particles to minimal or off, the fps on Patrix 128x/256x doubles or even quadruples… it's an interesting fix that people should be aware of!
Thank you for sharing !
6:56 I love that lighting in the storage area
What's the shader?
@@hopeconfig4072 Minecraft RTX on Bedrock edition
@@hopeconfig4072 bedrock rtx
What is it?@@hopeconfig4072
shader?
1:22 Also, it's very noticeable that using shaders and high-end resource packs shifts even more of the work onto the GPU. You can actually see your CPU load dropping when you turn on shaders, as rendering load is taken off the CPU. I say this running PTGI on an i5-11600KF and RTX 3070.
I run PTGI HRR 2.1 on a 3050 and somehow get 60-85 fps if I use Iris + Sodium
@@ZedDevStuff Thats awesome
@@ZedDevStuff 1080p...
also 2.1 is much less intensive than 3.0.
@@etmezh9073 1080p is by far the most popular resolution (unless that's changed). Plus, even SEUS doesn't recommend people use HRR 3; he says he'll probably rewrite HRR entirely, because the performance gains compared to the normal version are starting to not be visible anymore
Continuum and SEUS are the two heaviest shaders in the world. If you really want a high-quality, customizable shader that's also really good for your FPS, you need AstraLex; you just need to be patient and customize it...
It boggles my mind how far modders have pushed Minecraft in the decade or so it's been out. Sure, it's not exactly going to run well, but the contrast between the baseline visuals and what modders have accomplished is absolutely stunning.
there’s so much to be done when every single component of the game can be isolated into individual “blocks” customization becomes endless
VScode, game edition
Seeing how high the fps could go with low render distance was cool. But it also would have been interesting to see how high the render distance can go with a 4090
Render distance is RAM based. Basically you multiply how many gigabytes of RAM you have by 3 to get your render distance.
@@someone11112 so if I got 16GB of ram I can easily run 48 render distance?
@@GameOver-nm2us how much of that have you allocated to minecraft?
@@someone11112 This is a somewhat misleading response. While render dist does heavily depend on the amount of ram you have allocated, your CPU is heavily taxed at higher render distances. This is often why you'll see a reduction in your GPU usage as you increase your render distance; your CPU bottlenecks your GPU more.
@@someone11112 I should reinstall minecraft now that I upgraded from 16GB to 64GB to try some things out I guess xdd
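The two claims in this thread, the rough "3 chunks of render distance per GB of allocated RAM" rule and the reply's point that CPU work grows with distance, can be sketched as follows. The ×3 multiplier is the commenter's heuristic, not a measured constant, and the chunk count is just geometry:

```python
def heuristic_render_distance(allocated_ram_gb: int) -> int:
    """Commenter's rule of thumb: roughly 3 chunks of render distance per GB."""
    return allocated_ram_gb * 3

def loaded_chunk_count(render_distance: int) -> int:
    """Loaded chunks grow quadratically: a (2r+1) x (2r+1) square around the
    player, which is why chunk generation/meshing taxes the CPU at high r."""
    side = 2 * render_distance + 1
    return side * side

print(heuristic_render_distance(16))   # 48, matching the reply's question above
print(loaded_chunk_count(12))          # 625 chunks
print(loaded_chunk_count(48))          # 9409 chunks, roughly 15x the work
```

The quadratic growth is the key point: quadrupling the render distance multiplies the number of loaded chunks by about fifteen, regardless of how much RAM is free.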
I want to note that I wanted to replicate the highest-framerate-in-Minecraft video, and there is still a lot you can do to optimize the fps even more. When I first got my RTX 3070 Ti I used Lunar Client and turned off everything; I used 1.8.9 and turned everything to low, and I hit a whopping 6200+ fps with an RTX 3070 Ti and a Ryzen 9 5900X. I wanted to make a video out of it, but a week later my Minecraft account got hacked.. I know I have a photo of the fps somewhere lol
@@anonimus11236 yeah i did the exact same thing turning everything off from my pc
@@anonimus11236 maybe it's my drivers but Linux performs about the same for me
@@anonimus11236 wow, does dynamic wallpaper really change performance in a measurable way?
@@lemonaut1 Probably not that much, so I would say no more than 10 fps less, but it does depend on the hardware. Anyway, if you're trying to get max possible performance, every frame matters
@@anonimus11236 or a custom image of windows. Like Tiny11
Man, I always enjoy watching incredible shader videos for Minecraft. The great visual ones that aren't a texture pack and still have vanilla blocks make the game feel so much more immersive.
Imagine showing this to someone in the past playing the alpha version of the game
The Rethinking Voxels shaderpack (or Complementary Reimagined) combined with the Faithless texture pack makes the game look completely incredible, highly recommend
The Barely Default resource pack too, can't forget that!
i think Rethinking Voxels is deffo going to be the go-to shader when it comes out of beta, great visuals with good performance
"With that out of the way, we're going to take things up a 'Notch'"... I see what you did there hahah (3:56)
No? He didn't emphasize it?
Obviously it wasn't intended, but it's still a pun :| @@malindrome9055
Rethinking Voxels (a Complementary Reimagined RT edit) could also be a big thing after a few more updates. The light system on that thing is completely unreal, on par with or above Bedrock indoors and any other RT shader I’ve tried, and there’s a huge plus: it’s still Complementary
I'll need to check it out!
@@AsianHalfSquat I meant Rethinking Voxels, that’s the proper name btw, sorry lol
Rethinking Voxels is probably the best shader I've ever used; the auroras in snowy biomes are crazy
@@TheLivingHuman that's from Complementary Reimagined though, so if you're using it only for that, you'd be better off using Complementary Reimagined for improved performance and general visual fidelity (minus colored light)
@@gri5733 yeah I get halved performance on voxels compared to complementary.
Windowed mode is brutal. The ultimate challenge
Back when I had intel hd graphics 530, I either used windowed mode or fullscreened in an ugly resolution.
What about it? I play Minecraft in windowed mode, because my monitor is 59hz and any game on full screen will cause screen tearing.
@@LostW just cap the FPS.
@@etmezh9073 i use vsync and it still happens. Setting it myself doesn't work either, because my monitor is actually 59.6hz, and obviously i can't put that in the settings
@@LostW set it to 60hz then instead of 59.6
unless there is no "60.000hz" option
Finally started ordering the parts for my PC today with an RTX 4070 and am super excited!
I would love to see this again but with the fabulously optimized modpack :D
I saw it on Modrinth, but you could do better. They put a lot of unnecessary bloat mods in there; you're better off just picking the opti mods out of it and deleting the rest.
@@idedary Yeah, I don't understand the bias people have toward this modpack lol, maybe it's the advertising. There are plenty of better modpacks out there for that
@@Cronalixz I agree. I can get 50% more performance by just selecting the mods myself, as I did for my server. When I look at the mod list there is unnecessary stuff hogging resources. It's a clickbait modpack.
@@idedary you should upload your modpack like that and advertise it as "fabulously optimized without the bloat" lmfao
@@Cronalixz he should actually do that because im stupid and i cant do what he says because im not smart enough
F3 reduces your FPS because the text rendering is not optimized. Every frame it has to rebuild the strings letter by letter as far as I'm aware. I don't think it has any kind of caching.
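The caching this comment says is missing would look something like the sketch below. All names here are hypothetical; this is not Minecraft's actual text renderer, just an illustration of rebuild-every-frame vs. rebuild-on-change:

```python
class CachedTextRenderer:
    """Toy model: cache built glyph lists so identical overlay text
    (like an F3 line that rarely changes) isn't rebuilt every frame."""

    def __init__(self):
        self._cache = {}   # text -> prebuilt glyph list
        self.builds = 0    # counts the expensive per-letter rebuilds

    def _build_glyphs(self, text: str) -> list:
        self.builds += 1
        return [ord(c) for c in text]   # stand-in for per-letter mesh building

    def draw(self, text: str) -> list:
        if text not in self._cache:
            self._cache[text] = self._build_glyphs(text)
        return self._cache[text]

renderer = CachedTextRenderer()
for frame in range(100):          # same overlay text drawn for 100 frames
    renderer.draw("FPS: 120")
print(renderer.builds)            # 1 build instead of 100
```

Without the cache lookup, `builds` would hit 100 for the same 100 frames, which is the per-frame cost the comment is describing.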
Love the extra editing it looks cool
You need to see the load on each CPU core with software like HWiNFO to know if you're maxing it out. Minecraft probably runs on just a couple of cores, so if you have 16 cores it could be pushing 2 cores to the max while Task Manager only shows you 12-13% CPU usage.
It's Java. The main game loop runs on a single thread. It's Windows that makes it appear the load is being shared, but it actually isn't
Interestingly, one thing that's clearly noticeable in the ray-traced, high-resolution versions of any game is the power of the sun! It can now showcase sunlight to a more realistic degree and create enough difference between the regions in bloom and not in bloom, effectively also increasing the f-stops!
This man will never stop upgrading his PC with the latest and best graphics cards, will he? 😂
😄
Yeah, and some shader developers will have to up their ante with the gpus lol
Song from 0:12 is The Ancient Dragon from Dark Souls 1 :)
The reason Bedrock showed less GPU usage and was locked at 72 fps may be vsync: if it can't reach 144 fps (assuming that's your monitor's refresh rate), it locks to half of 144, which is 72. Because it locked the fps, it ends up using less GPU
i don't understand what any of this means, i just know that bedrock runs faster than java, especially when playing in biomes like mangrove swamp / jungle. wanted to cry when i went on java and saw that the things i love exploring in bedrock are laggy af lol
@@nafsii04 if the fps with vsync on doesn't reach the monitor's refresh rate, it makes the fps half the refresh rate.
@@nafsii04 but since they did the Render Dragon update to Bedrock, it's started to get laggy..
@@nafsii04 Bedrock is written in C++; it's just native code, so it's going to be a lot faster. Java runs on a virtual machine, but that's why Minecraft Java has a lot of support for modding and compatibility
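The vsync fallback described in this thread (144 locking down to 72 when the GPU can't keep up) is the classic double-buffered behaviour of snapping to an integer divisor of the refresh rate. A simplified model of that behaviour, not Bedrock's actual logic:

```python
def vsync_locked_fps(refresh_hz: int, achievable_fps: float) -> float:
    """Return the fps a simple double-buffered vsync would lock to:
    the largest integer divisor of the refresh rate the GPU can sustain."""
    divisor = 1
    while refresh_hz / divisor > achievable_fps:
        divisor += 1
    return refresh_hz / divisor

print(vsync_locked_fps(144, 200))   # 144.0 (can keep up, locks to refresh)
print(vsync_locked_fps(144, 100))   # 72.0 (falls back to half refresh)
print(vsync_locked_fps(144, 60))    # 48.0 (next divisor down)
```

This is why a GPU that can render "only" 100 fps shows 72 with vsync on: it must wait for every second refresh, and the idle waiting is also why reported GPU usage drops.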
Yes, this is correct; he'd have to edit some configs manually to disable it! With my 3060 I was stuck at 48 for some reason, but after editing the config I got like 60.
Looking forward towards the 5090 benchmark!!!
4090 + 20fps
👍
this "destroying gpu" series is just getting better every year wow
The 3090 Ti was a serious leap in graphics card technology.
But the 4090 alone is the Neil Armstrong of graphics card leaps.
Minutemen?
@JavaScrapper YES.
More specifically, the fallout 4 modded Minutemen Republic flag.
I'm just saying, ok, don't call me stupid. Just imagine taking this shader and trying to run it on a phone with 2 GB of RAM, and it's on fire, yay
awesome analogy dude.........lol
not really.. the biggest graphics card leap in history was the monster that was the GTX 1080 Ti. That card was so powerful not even the next-gen RTX 2080 could beat it; the 1080 Ti was waaay ahead of its time
This is literally the graphics I expect from a Minecraft movie
4:45 is how my computer runs completely vanilla Minecraft lol
0:22 who else saw those blocks launching in the background?!
*CHICKENS*
They were goats
you should've used the physics mod to see what minecraft could really be like
Keep up the content! I love it!
1:00 What shader is that? Or is it bedrock RTX?
Nvm, it's probably Rethinking Voxels
hey asianhalfsquat, no april fools day video this year? 😩
Fun fact: if you detonate 9 TNTs in Minecraft with these graphics, the 10th TNT will be your PC
When raytracing shaders start working with LOD mods, I’d love to see you visit this idea again.
I struggle to see how ray tracing could be compatible with LOD. Maybe LOD could simplify textures and models in the distance for VRAM benefits? But that doesn't solve the light rendering, which I would call the most intensive part. Maybe it can limit the number of light bounces after a certain range, but even then that would make lighting from the sun and moon janky.
Maybe it's possible, but based on my current knowledge of ray tracing it would be very hard for a developer to implement correctly.
@@legendarylava8775 I'm not even thinking of it being such coherent compatibility, just for there not to be significant visual glitches when both are used at the same time, which currently happens with LOD mods and even most non-ray-tracing shaders.
The intro is half the video
yes
Seeing only 17% or so CPU usage is completely normal and can still be a CPU "limitation", as Minecraft can't use lots of cores like other games can. So even if it's maxing out the 2 cores it uses, your 16-core CPU won't show a high percentage load, but it can still cap the framerate. (The single-core speed is the limiting factor :)
This
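The arithmetic behind that 17% figure: Task Manager reports the average across all logical cores, so a couple of fully pinned cores barely move the total. With illustrative numbers (2 maxed cores out of 16):

```python
def overall_cpu_percent(per_core_loads) -> float:
    """Average per-core loads into the single number a task manager shows."""
    return sum(per_core_loads) / len(per_core_loads)

# 16-core CPU, Minecraft maxing out 2 cores, the rest nearly idle:
loads = [100.0, 100.0] + [0.0] * 14
print(overall_cpu_percent(loads))   # 12.5, i.e. "idle" while the game is CPU-bound
```

So a game pinned on its main thread can be completely CPU-limited while the overall meter reads well under 20%.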
6:42 This is what I remember Minecraft looking like as a kid
I think you should have used the physics mod to really push the GPU and I think we all wanted to see how it looks. Love your videos, please keep posting these amazing videos
Where's your April Fools video this year?
fun fact: there is an advantage to having FPS past your monitor's refresh rate: the monitor ends up showing you more recently rendered frames, which cuts latency, rather than showing frames any faster
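The back-of-envelope math behind this comment: rendering above the refresh rate doesn't display frames faster, but the frame the monitor grabs at each refresh was rendered more recently, so what you see is less stale. Illustrative numbers:

```python
def worst_case_frame_age_ms(fps: float) -> float:
    """At a given render rate, a new frame exists at most this many
    milliseconds after the previous one, a rough bound on frame staleness."""
    return 1000.0 / fps

print(round(worst_case_frame_age_ms(60), 1))    # 16.7 ms between frames at 60 fps
print(round(worst_case_frame_age_ms(300), 1))   # 3.3 ms at 300 fps: fresher frames
```

Even on a 60 Hz panel, 300 fps means the displayed frame reflects input from at most ~3 ms ago instead of ~17 ms, which is the latency advantage the comment describes.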
One thing to note: if your GPU isn't at about 90% utilization or above, it IS the CPU that's the bottleneck. That's why your 4090 was hardly ever fully used in this video. If you upgrade your CPU you should see a big increase in fps.
Hey man, i recently got an i9, do you think that's going to yield better results than the video?
@@TBone.Gaming there are many variables, but it would depend on its generation and your GPU. If you have a 4090 and the newest i9 (I’m not very familiar with these) then it would be better than this video, but still the bottleneck, as it may take another year for CPUs to catch up to the 4090
@@TBone.Gaming prolly not, I heard the 13900K is only about a ~5% increase over the 13700K strictly for gaming. Every CPU on the market is bottlenecking the 4090 at this point, so the only thing you can do is play the waiting game.
@@Minth_aaa yeah, I have a 4090 with the i9-13900K, and I bought it back when it was the highest tier. The waiting game is definitely the move
I'm going to cry, he didn't upload an April Fools video 😭😭
2:51 I choked on my food after hearing that
You should have used a performance overlay like Afterburner to see the GPU usage. When I tested some super-high-quality shaders on my 7900 XTX, lowering the render distance to 12 made the game utilize the GPU and CPU more, raising my fps from 60 to over 300.
6:10 it doesn't even feel like Minecraft anymore
I haven't watched you for a year or two, and I've never heard you this active and happy. Is it just me, or what?
ordered my 4090 a few days ago and it's on its way, got a 4K monitor ready! this video popped up, and seeing it makes me all the more excited to open that box!
damn, you're rich. I wish I could ever get one... I'm stuck with a GeForce GTX 1070
@@Bl0xxy managed to get super lucky and got one. For sure worth setting money aside, even if little by little. A monster worth selling your soul for, haha. But it's also fairly close to overkill, not really necessary. Sometimes I do renders for games my friend makes, though, so I kinda need it.
Imagine this on GT710 💀
I would really like to see how it would run with more optimization mods. The best ones for sure are Sodium, Lithium, ThreadTweaks, More Culling, and some others I can't remember off the top of my head. I'd say these would really improve the performance
I love playing competitive pvp with my 4090 and 13900k with no shaders on 1440P!
You should go for 1080p 500Hz. Yes, 1080p 500Hz is actually a thing now: Asus made the first 500Hz monitor last year
@@bradlyboy328 I don't notice anything above 240Hz, and I really only need a 144Hz one
@@0h_hey944 You spent all that money on the world’s fastest gaming PC, so you might as well get the world’s fastest gaming monitor. You will definitely notice the difference at 500hz.
@@bradlyboy328 I know, I've played on a 1080p 360Hz before, didn't notice a difference, so I returned it and got a 144Hz 1440p. Honestly it's fine.
@@KyjiPurr and other games, but mostly minecraft
wait, what was the game at 0:32?
Deep Rock Galactic
“A pretty respectable 2200 fps” not me playing on 50💀
Have you tried the Rethinking Voxels shader? In my opinion it's one of the best shaders right now.
Yeah, those are really epic.
Yes, except on the max preset. With my 3090 at 1440p, I achieve only 6 fps
@@_Zelkova bro, that's good XD. I get 4 frames, or my PC crashes, when I use them
I completely agree with what you said about Bedrock RTX shaders. Indoors looks absolutely fantastic, especially paired with a good texture pack, but outdoors usually looks just decent to sometimes plain bad for a shader
It wouldn't be so bad if it weren't so glitchy and you could disable the atmospheric blurring crap. It's technically the most advanced in terms of true RT, but it looks surprisingly mid in all but the darkest of scenarios.
1:30 The weird thing on my system is that Minecraft and even Task Manager say it's using 90% GPU, but the GPU itself reports only 30% and pulls only about 25% of its max wattage.
If you're wondering whether this has something to do with the ray-tracing cores: you can push the GPU WAY higher with traditional rasterized graphics, without any ray tracing or upscaling. The 4090 has over 16k regular shader cores but only 512 tensor cores for upscaling and 128 ray-tracing cores, and that's pretty representative of its power draw as well. Minecraft Java shader packs don't actually use the RT cores, so I would expect that 90% to actually draw around 400W. Meanwhile, VRAM usage without extreme texture packs is only about 3 GB.
And yes, that's after allocating 6 GB of RAM (of which it uses 3-4 GB in practice) and without maxing out the CPU.
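To put the core counts quoted above in perspective, a quick sketch (using the figures from this comment; exact counts vary by spec sheet):

```python
# RT units are a tiny slice of the die, so a rasterized Java shader pack
# leaning on the regular shader cores can still draw serious wattage.
shader_cores = 16384  # CUDA/shader cores (figure quoted in the comment)
tensor_cores = 512    # upscaling (DLSS)
rt_cores     = 128    # ray-tracing units

total = shader_cores + tensor_cores + rt_cores
print(f"RT cores are {100 * rt_cores / total:.2f}% of all cores")
```

Under 1% of the core count is RT hardware, which matches the point that a Java shader pack ignoring those cores is still exercising almost the entire GPU.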
if someone could make a mod compatible with the 40 series' frame generation, that would be a game changer
No Optifine 4 this year 😢
Yeah, but look, there's now a 1.20 version of Minecraft :L
Bliss shaders look amazing in the nether with the atmospheric fog
Nah bruh thats hell 5:50
Yeah, like when I first bought Minecraft, I was so dumb, but I watched a video of a player going to the Nether. I tried the tutorial on how to make the portal, and when I got to the Nether and saw it I was like "bro, I just found hell in Minecraft" 😅😅😅
Waiting on the next optifine
Congratulations on 1mil btw
Where optifine 4?
This man is the reason for half of Nvidia's profit
😭
@my account got suspended :/ it's a joke, he isn't actually
2:28 " What you're seeing, now this is my normal state"
2:36 " This is a super Saiyan"
2:46 " And if I push it beyond that you get super Saiyan 2"
3:18 "And This... is to go... EVEN FURTHER BEYOND!"
Where big boi optifine 4 april fool video?
;(
WHERES OPTIFINE 4
Are you kidding me that’s so cool you deserve a sub man.
where's optifine 4?
/??/?
optifine 4?
ok but running Portal RTX, with (I assume) no upscaling or frame generation of any kind, at 100 fps is WILD. this thing's a BEAST (imagine the 5090 😭)
april fools where
You should use Euphoria Patches or the Rethinking Voxels mod (an addon to Complementary Shaders), it adds even better lighting
WHERE IS MY OPTIFINE 4
No april fools joke? 🥺👉👈
Nvdia: The rtx 4090 can run anything
Minecraft: Let me even the odds
Waiting for Optifine 4 update.........
Bro where is optifine 4
Nah, I would go with Iris rather than OptiFine, because Iris has a better shader compiler, costs less performance, and has nicer shaders today. With Iris and other general Fabric performance tools you can triple your default fps, and you get nicer shaders and options too
I love watching videos of stuff I will never be able to afford.
me too
Oh my pccccccccc
no april fools vid :(
Please tell me something I can't find anywhere: when you pay on Patreon for Patrix, do you pay for one month and get to use the texture pack forever, or only as long as you keep paying eeeevery month?
Soon you're gonna have to do this with an RTX 5090.
For those who are curious:
SEUS PTGI is INSANELY fast for a fully path-traced shader, and the visuals are absolutely stunning. It runs at a (marginally) playable 20 FPS on my GTX 1660 Ti at 1080p, and at about 15 fps when using the free 32px version of Patrix. So if you have a 2080 Ti (which, if I recall correctly, was a MASSIVE improvement over the earlier cards), that should run at a pleasantly playable frame rate.
The smoke coming out of my phone after watching this video is too realistic 🔥
How did you get the clouds to look like that? 5:27
And is there a way to get the same result but without the heavy shader?
I really love Rethinking Voxels, it's one of the prettiest packs I've ever used: fully path traced, better performance than PTGI, and it even has colored light.
I would love to see you try rethinking voxels, it might not be the best, but it’s just beautiful in its own way.
3:30 The simple explanation is that Minecraft was only using a single CPU core at the time. The GPU wasn't pinned at 100% or even near it, so you were CPU bound by quite a bit; Minecraft just isn't able to use the whole CPU at once.
I just wish I had a computer that can run Minecraft normally without any mods or shaders
A laptop is the best way into cheap gaming. For less than £500 you can play Minecraft, and if you get a cheap mouse it's no different from a PC.
My favourite shader has to be Rethinking Voxels. It's so cool to actually see Minecraft with that much realistic lighting
When your computer runs hot enough to smelt alloys:
For the 4090 Ti test (whenever that one's coming out), I can recommend the Nvidium mod (along with Sodium), which fully utilizes the power of the GPU, plus the Bobby mod for absurd render distances, which, unlike DH (which is also amazing), are fully rendered. Right now, on my 3090, I'm playing with these mods at a 128-chunk render distance and it rarely drops below 100 fps.
The downside is that Nvidium isn't compatible with any shaders, but even without them you can still reach the full potential of the GPU.
5:26 those clouds are CRAZY
seeing this makes me realize that whenever a new flagship GPU is released, there's that one degen sitting under his sink thinking: oh boy, it's time to make another Minecraft texture pack
why were there chickens getting flung around and just dying? More lag? lol, I love it
You should do a Minecraft conversion with a war mod, including planes, tanks, boats, machinery, or basically whatever you want. I bet you it will be epic. Also, if you make it a modpack, I'm pretty sure it will get popular that way too.
I gotta ask, how do you get your mods all into one world using the same version? Appreciate it.