Thanks, Ben.
I searched a lot for what DDX and DDY are and couldn't find anything on how they work, etc.
You are the only person who explained it well.
Congratulations Ben.. I believe this is the first content that explains this node really well.. I've been looking for it for 4 years.. Thank you.
Ben, another question.. Do you think it would be possible to do a series about the different options in the material Details panel, especially the more complex ones that are hard to find information on, like material domain, blend mode, and shading model? We use the same 30 options every time at my job, but we have no serious documentation about the shading models - hair, clear coat, eye, thin translucent - including the workflow with the new inputs that appear on the main Unreal PBR node. Thanks.. Lots of doubts.. 6 years working hard with shaders in Unreal and every time I have more doubts.. Sorry.. Kind regards, Ben..
I used the DDX and DDY nodes 4 years ago when I was making a mobile VR application with stereo 360 photos. I didn't know why I used them because I was following a tutorial, and I couldn't find anything as useful as this video, so I skipped understanding it. (Apparently it was for the seams 😄😄)
I'm looking forward to the upcoming videos. Thank you so much for teaching me a new thing. You are GREAT!!!!! 😊😊
The examples in this one were very interesting. It's the first time I've seen that seam fix and the included PerturbNormalLQ function. In the past, I've used ddx and ddy to create a "line art" shader based on the calculated curvature. Thank you for this series, we really need more intermediate level content like this with great explanations and examples.
DDX/DDY were really useful to me recently for removing a single-pixel line from a radial gradient material, just as you described. This explains so well why the nodes worked!
I'd love to see you go through how to do 3- or 6-point lighting material setups for clouds/fog/smoke; that's an area I'd really like to understand better for my VFX work. As always, thanks Ben!
Thanks a lot Ben, that was really useful. I've been wanting a good walkthrough of DDX & DDY, and the normal generation from height has opened up so many more shader options for me :D
You're a legend - what this actually does has always been so mysterious to me, never mind the useful things you can do with it!
Whenever I need something, I know you have the answer, dear Ben. ))) I just look at your channel and the video is just there. Thank you very much. PerturbNormalLQ was what I was looking for.
awesome!
I'd love to see you use ddx and ddy to implement a screen-space curvature shader in Unreal too!
It's the one we have been waiting for
I was always sampling the heightmap several times with shifted UVs to get the normal. Looks like now I need to rework those parts.
That method will give you cleaner results than the ddx/ddy method, but it's more expensive - so it just depends on the cost/quality trade-off you want to make.
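For reference, a minimal HLSL sketch of the multi-tap approach being discussed - the extra samples are what make it more expensive than ddx/ddy. The function and parameter names (HeightMap, TexelSize, HeightScale) are placeholders, not anything from the video:

    float3 NormalFromHeightMultiTap(Texture2D HeightMap, SamplerState Samp,
                                    float2 UV, float2 TexelSize, float HeightScale)
    {
        // Sample the height map at the center and at one-texel offsets.
        float hC = HeightMap.Sample(Samp, UV).r;
        float hX = HeightMap.Sample(Samp, UV + float2(TexelSize.x, 0.0)).r;
        float hY = HeightMap.Sample(Samp, UV + float2(0.0, TexelSize.y)).r;

        // Turn the height differences into tangent-space slopes. The extra
        // taps cost more than ddx/ddy, but they're filtered per texel
        // instead of being constant across each 2x2 pixel quad.
        return normalize(float3((hC - hX) * HeightScale,
                                (hC - hY) * HeightScale,
                                1.0));
    }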
@@BenCloward Yeah, it is a landscape in my case which is seen only from the sea so quality is not that important.
Hello Ben, thank you for all your videos; they are really high quality and very instructive. I started with your foliage episodes and I think I went on to watch almost everything on your channel!
I have a topic suggestion: Pixel Depth Offset. I can't really understand how to use it properly, but I understand it can be really powerful for blending. I was looking at the materials from the Rural Australia environment made by Unreal (really gorgeous) and they use it for a lot of things.
Thank you for the suggestion!
Very good video for me, thanks Ben. But how do I use DDX/DDY if I use the VectorToRadialValue node instead of TextureCoordinate?
Thanks for this, really helpful!
I've seen people use ddx and ddy in a triplanar material I found... I don't understand the point of it. Why would they?
Hard to say without seeing the specific implementation. Here's one method that uses derivatives to make triplanar mapping cheaper: iquilezles.org/articles/biplanar/ The code is in GLSL, so it uses the equivalent dFdx() and dFdy(). It uses these to figure out the slope, so it can know which two of the three faces are the most important to the effect. Then it just does biplanar mapping with those two faces instead of the full triplanar - making it perform a little bit better. It's very clever.
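For anyone curious, a hedged HLSL sketch of just the face-selection idea (not a port of the article's GLSL - see the link for the real thing):

    float3 BiplanarWeights(float3 worldPos)
    {
        // Reconstruct the face normal from screen-space derivatives -
        // the same job dFdx()/dFdy() do in the article.
        float3 n = abs(normalize(cross(ddy(worldPos), ddx(worldPos))));

        // Drop the smallest component: that projection contributes least,
        // so we skip its texture samples and blend only the other two faces.
        float3 w = n;
        if (n.x <= n.y && n.x <= n.z)      w.x = 0.0;
        else if (n.y <= n.x && n.y <= n.z) w.y = 0.0;
        else                               w.z = 0.0;
        return w / max(w.x + w.y + w.z, 1e-5);
    }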
Ben, you are a true hero! One thing I want to know about perturb normal: can we increase the final normal quality by increasing the height texture resolution, or by applying a blur after perturb normal?
If you need better quality, the best way is to use the HQ (high quality) version of the node instead. The point of this node that I'm showing is to be as cheap as possible.
@@BenCloward thanks, I will try digging into that.
Thank you! It's very useful.
perfect tutorials!!
_d/dx and d/dy are more fun in Unreal than in calculus_
You're a legend!
Hello, is it possible to fully use this perturb normal in the path tracer? Unfortunately, I get black artifacts with the path tracer...
Hi Ben, really awesome tutorial. Since you mentioned that the cross product of ddx and ddy can get us the normal, I wonder if we can use ddx and ddy to find the curvature of a mesh (e.g. for applying rust on edges). If that could work, should I feed the normal texture or the vertex normal into the ddx/ddy? If it won't work, is there any standardized way of finding curvature in a shader? Thanks so much!
Question... I have a material that uses triplanar mapping with a 2D texture array instead of a regular texture. It works fine with a regular texture, but I'm not sure why the ddx and ddy don't work for the texture array. When I set it up to use the normal mips instead of DDX and DDY it works, but when I leave DDX/DDY connected to a texture array it says: "(Node TexSampleParameter2DArray) Cannot cast from smaller type float2 to larger float3..."
Basically, for an array you would typically append a scalar value to the UV value to choose which texture in the array. Would I need to do something like that with the ddx/ddy?
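A hedged guess at what's happening, sketched in raw HLSL: for a Texture2DArray the UVW input is a float3 (UV plus slice index), but HLSL's SampleGrad still takes float2 gradients - derivatives of the 2D UV only, not of the index - so the derivatives should come from the UV before the index is appended. In the material graph, appending a 0 to the DDX/DDY results so they match the float3 UVW might also satisfy the node's type check, but I haven't verified that:

    float4 SampleArraySlice(Texture2DArray Tex, SamplerState Samp,
                            float2 UV, float SliceIndex)
    {
        // Gradients come from the 2D UV only.
        float2 dx = ddx(UV);
        float2 dy = ddy(UV);
        // The slice index is appended after the derivatives are taken.
        return Tex.SampleGrad(Samp, float3(UV, SliceIndex), dx, dy);
    }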
Could you use DDX and DDY to calculate normals from a height map or texture? If so, is it more expensive than doing a texture fetch for a normal texture?
Yes you can! However, the results are a bit blocky, and there's no filtering or mip mapping, so the quality isn't very good compared with sampling a texture. But it is a technique that's usable in some cases. In Shader Graph, the Normal From Height node works that way - with DDX and DDY.
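For anyone who wants to see the shape of that technique, here's a Mikkelsen-style HLSL sketch of normal-from-height with derivatives - the actual Normal From Height / PerturbNormalLQ code may differ in its details:

    float3 NormalFromHeightSketch(float height, float3 position, float3 normal)
    {
        // Screen-space tangents of the surface, and the height gradient.
        // These are constant across each 2x2 pixel quad, which is where
        // the blocky look comes from.
        float3 sigmaX = ddx(position);
        float3 sigmaY = ddy(position);
        float  dHdx   = ddx(height);
        float  dHdy   = ddy(height);

        float3 r1  = cross(sigmaY, normal);
        float3 r2  = cross(normal, sigmaX);
        float  det = dot(sigmaX, r1);

        // Perturb the vertex normal by the surface gradient of the height.
        float3 surfGrad = sign(det) * (dHdx * r1 + dHdy * r2);
        return normalize(abs(det) * normal - surfGrad);
    }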
@@BenCloward To expand on this, I noticed that the bump map in Unreal does not update in Lumen ray-traced hit-lighting reflections, and I isolated it down to the DDX and DDY nodes. As far as I understand, those are screen-space calculations, so would you know of any alternatives that would work with that?
Thank you so much!
Stupid question... is it better performance-wise to use the alpha channel of my albedo texture and auto-generate a normal from it, or to plug in an entire extra normal texture? I don't mind extra pixelation if I'm going to save space/memory and even gain some extra performance from it.
Fewer texture samples is cheaper. But the quality of normal data that you'll get from a bump map is pretty low, so you'll want to do tests and see if that low level of quality is acceptable for your use case.
@@BenCloward Will do. It's for mobile, so unless it's going to look very weird, I probably won't mind low quality. Thanks for the answer :)
So.. granted you have one texture with RGBA channels, you could technically reconstruct normals from the first and second channels, use another for detail normals (using the DDX/DDY method), and perhaps put some mask or something in the alpha.
Handy!
Yes, you could do that - but detail normals are usually in a separate texture because they’re tiled at a higher frequency than the regular normal. Otherwise you could just bake the normal and detail normal together into a single normal.
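A quick sketch of why the separate texture matters - the detail normal gets its own, higher tiling before the two are blended (the names and the 8x tiling below are just placeholders):

    float3 BlendDetailNormal(Texture2D BaseNormal, Texture2D DetailNormal,
                             SamplerState Samp, float2 UV)
    {
        // Decode both tangent-space normals from [0,1] to [-1,1].
        float3 n1 = BaseNormal.Sample(Samp, UV).xyz * 2.0 - 1.0;
        float3 n2 = DetailNormal.Sample(Samp, UV * 8.0).xyz * 2.0 - 1.0;
        // Whiteout-style blend: add the XY slopes, multiply the Z.
        return normalize(float3(n1.xy + n2.xy, n1.z * n2.z));
    }

If both normals were baked into one texture, they'd be forced to share a single tiling rate - which is why they're usually kept separate, as noted above.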
This method is good for converting intermittent or runtime-generated greyscale data to normals. If you're using an actual external texture map, you'll be better off spending another channel to create an actual normal map rather than paying the ALU cost that comes with this setup.
Awesome
Great... Thank you.
Hi Ben! I was using DDX/DDY to derive world normals from absolute world position for a global snow mask (as a decal), but just as you'd expect, my mask comes out faceted, ignoring smoothing groups. Is there any way to fix this?
If you need smooth normals, ddx ddy is definitely not the technique you want to use. You should use vertex normals or pixel normals if you can.
@@BenCloward thanks Ben!
Is it just me or do the normals look incorrect in the UE version?
I did some investigating. I put the material on a cube, and if you spin the camera around in a bird's-eye view, the normal vectors change. If I swap the positions of the DDX and DDY coming out of the texture sample, it fixes it. It was happening when I tried using PerturbNormalLQ too, so the same would apply to it as well. Can anyone else confirm that I'm right?
@@whyismynametaken123 In the UE example there should not be a TransformVector node.
Thanks for the pixelate material - great tutorial.
Thanks, that was a nice intro to derivatives for me.
One question though - I think you need to normalize the normals after the cross product to make it work correctly. I noticed that from Ned Makes Games when he was implementing a similar effect (th-cam.com/video/J1n1yPjac1c/w-d-xo.html)
If you do a cross product between two normalized, perpendicular vectors, the result is also a normalized vector - so no need to normalize again. It's safer to normalize (because it's always possible that one of the input vectors isn't normalized, or that they aren't quite perpendicular), but faster to not normalize. I usually err on the side of faster. I would only add in a normalize here if there was some visual problem caused by not doing it. If you put in a normalize and see no difference in the result, get rid of the normalize.
@@BenCloward Actually it's true - I'm not sure why, but it works without normalization in the case you showed, where you put the recalculated normals into the Normal pin.
In my case, where I was making an unlit shader that used the recalculated normals in the process, it worked only after normalizing the value after the cross product. Not sure why, though.
i.imgur.com/soS5sKD.png
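To sum up the normalization question above in HLSL terms (a sketch, not the exact graph from the video): the cross product of two unit vectors has length equal to the sine of the angle between them, and ddx/ddy tangents aren't guaranteed to be unit length or perpendicular, so the raw cross product is generally unnormalized:

    float3 n = cross(ddy(worldPos), ddx(worldPos));
    // Swap the arguments if the normal points the wrong way.
    // The renderer may renormalize when this feeds the Normal pin,
    // hiding the difference; an unlit shader that uses n directly
    // should normalize explicitly:
    n = normalize(n);

That would explain why the unlit version only worked after adding the normalize.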
Calculus, ah shit here we go again :)