Note that this will break normal maps; you'll need to rotate them back by the same angle between the "Normal Map" output and the shader input.
do you know why that is the case?
@akoras3222 Rotating normal maps always breaks them (they store vectors, so direction matters). This happens even without any fancy tiling, but it's often harder to notice. In this case, though, each tile will behave as if light is falling from a unique direction per tile.
How do you do that?
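In case it helps anyone here: one common fix is to counter-rotate the normal's X/Y (red/green) components by the same per-tile angle, so the shading direction lines up again. A minimal plain-Python sketch of just that math (not an exact node setup; `unrotate_normal` and the example values are purely illustrative):

```python
# Minimal sketch (plain Python, not actual Blender nodes) of counter-rotating
# a tangent-space normal by the same per-tile angle used on the UVs.
# `angle` is assumed to be that tile's random rotation; (nx, ny, nz) is the
# decoded normal with components in [-1, 1].
import math

def unrotate_normal(nx, ny, nz, angle):
    """Rotate the normal's X/Y components by -angle so lighting stays consistent."""
    c, s = math.cos(-angle), math.sin(-angle)
    return nx * c - ny * s, nx * s + ny * c, nz

# example: a tile whose UVs were rotated by 90 degrees
print(unrotate_normal(1.0, 0.0, 0.0, math.pi / 2))  # ~ (0.0, -1.0, 0.0)
```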
Btw the vector math node also has a scale mode that multiplies all channels by the same float.
So you don't have to connect a value node to the multiply all the time
Man, you are crazy smart with how you manipulate the UVs. The explanations are perfect. Now I'm applying this approach in an Unreal shader. Thank you so much!!
ohh, this is absolutely AWESOME, not just a solution but fantastically articulated so it becomes an understandable process :D, please make more gems like this :)
Very happy to find this. I am much more excited to use my image-based material libraries now without having to worry about the tiling. I snuck another map range between the white noise and the multiply and plugged it into the scale. So, now the white noise is driving the rotation and a controllable scale range of the map for even more (but subtle) variation. This is the closest a tiled image can come to being procedural! Note: You do have to adjust things based on the image itself and I think that's obvious. Micro displacement works perfectly fine as well. Thanks again!
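For anyone curious what that tweak boils down to, here's a rough plain-Python sketch (the 0.9-1.1 scale range and the `vary_tile_uv` helper are illustrative stand-ins, not the exact node values from the video):

```python
# Rough sketch of per-tile random rotation plus a subtle random scale, as
# described above. `rand` stands in for the White Noise value of one tile
# (0..1); the 0.9-1.1 range is what the extra Map Range would produce here.
import math

def vary_tile_uv(u, v, rand, scale_min=0.9, scale_max=1.1):
    angle = rand * 2.0 * math.pi                         # random rotation per tile
    scale = scale_min + rand * (scale_max - scale_min)   # the extra Map Range
    cu, cv = u - 0.5, v - 0.5                            # work around the tile centre
    c, s = math.cos(angle), math.sin(angle)
    ru, rv = cu * c - cv * s, cu * s + cv * c
    return ru * scale + 0.5, rv * scale + 0.5

print(vary_tile_uv(0.25, 0.75, rand=0.37))
```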
This, and any method involving sharp transitions, won't work well with bump maps and will completely break displacement maps.
A more suitable approach is to use a noise output, split it into its color channels, and have each channel mix in a variant (rotation and scale) of the texture.
While you will only get 4 different variations, that's certainly enough to break up repetition while maintaining smooth transitions. I use this all the time for applicable textures.
Interesting! I think you might even get away with 2 variations? AFAIR Dylan Neill has a tutorial where he uses that for ocean wave displacement.
Splitting the Color output of noise textures with a _Separate RGB_ node is definitely an underused trick.
(BTW, Fac is the same as the red channel.)
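If it helps to see this approach outside of nodes, here's a rough Python sketch of the smooth variant blending, with `sample` and `noise_rgb` as made-up stand-ins for the image lookup and the Noise Texture (grayscale only, to keep it short):

```python
# Rough sketch of the smooth-variant approach: blend a few rotated/scaled
# copies of the texture, each mixed in by one channel of a noise colour.
# `sample(u, v)` returns a grayscale value; `noise_rgb(u, v)` returns (r, g, b).
import math

def blend_variants(u, v, sample, noise_rgb, variants):
    """variants: list of (angle, scale) pairs; the first one is the base layer."""
    def sample_variant(angle, scale):
        c, s = math.cos(angle), math.sin(angle)
        return sample((u * c - v * s) * scale, (u * s + v * c) * scale)

    value = sample_variant(*variants[0])
    for fac, (angle, scale) in zip(noise_rgb(u, v), variants[1:]):
        value = (1.0 - fac) * value + fac * sample_variant(angle, scale)  # a MixRGB in "Mix" mode
    return value

# toy usage with made-up stand-ins (4 variants, smoothly blended)
print(blend_variants(0.3, 0.7,
                     sample=lambda u, v: (u * 3.1 + v * 1.7) % 1.0,
                     noise_rgb=lambda u, v: (0.2, 0.5, 0.8),
                     variants=[(0.0, 1.0), (1.0, 1.1), (2.1, 0.9), (3.3, 1.05)]))
```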
That's how far blender has come. Being taken seriously by you guys.
Great tutorial! Here's a quicker (though less understandable for teaching) way to set this up with fewer _Math_ nodes:
To distort the coordinates, plug them and the _Noise Texture_ Color into a _MixRGB_ set to *Linear Light.*
Generate tiles of random values with a 2D _Voronoi Texture._ Split the Color output with a _Separate RGB_ and use one for rotation. If you want squares, set Random to 0.
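For readers who prefer seeing the math, here's a plain-Python sketch of that shortened setup; the hash in `cell_random` is only a stand-in for the Voronoi colour output, not what Blender actually computes:

```python
# Plain-Python sketch of the shortened setup described above (not actual
# Blender nodes): a Linear Light mix distorts the coordinates, and a
# per-cell random value (standing in for the 2D Voronoi colour with
# Random = 0, i.e. square cells) drives the rotation of each tile.
import math

def linear_light(base, blend, fac=1.0):
    # MixRGB "Linear Light": push the base up/down by how far blend is from 0.5
    return base + fac * (2.0 * blend - 1.0)

def cell_random(u, v):
    # stand-in hash giving one random value per unit cell
    return (math.sin(math.floor(u) * 12.9898 + math.floor(v) * 78.233) * 43758.5453) % 1.0

def tile_angle(u, v):
    return cell_random(u, v) * 2.0 * math.pi

print(linear_light(0.4, 0.62, fac=0.25))  # slightly distorted coordinate
print(tile_angle(3.2, 7.8))               # rotation for the tile containing (3.2, 7.8)
```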
You guys are on a whole other level with your tutorials. Thank you!
As a Blender noob, uh yeah, mind blown.
As a tip, you could enable the Node Wrangler addon and Shift+Ctrl+click on a node; this then previews it directly in the viewport.
Wow! This is amazing! Exactly what I needed and more - it blows my mind what you can do in Blender!
Thanks for this awesome tutorial! Exactly what I needed to texture large areas.
Great, Manuel! You don't have to manually connect nodes to the color input to preview them; just enable the Node Wrangler addon and Ctrl+Shift+click to preview them instantly. (It also cycles the preview on the fly through every output of the selected node.)
Fantastic tutorial. More blender tutz pls. Thanks.
That was absurd. Holy hell. Thanks a ton!
This is Great! Right to the point, and clearly explained why things are so.
bro my brain couldn't keep up with this
Bless this tutorial
NICE!, and you have a very relaxing voice too haha
This was the bomb. Gets a like. I also appreciate not having to look at Windows.
Nice. You could add some offsets too, so that you're not always seeing just the center of the texture.
I recommend getting an audio mixer. Maybe use de-plosive, mouth de-click, or de-ess.
Amazing tut, we need more Blender tuts.
outstanding work!
Now Andrew (donut king) has lost his opportunity to sell a seamless texture node 😅. Thanks a lot! Your tutorials are the best!
It was always (and still is) available for download for free in the description of Blender Guru's "How to tile a texture" video. (The .blend file also includes versions for color variation and PBR texture blending.) :P
Though I love that Manuel is actually building it from the ground up and explaining the maths and concepts.
This is insane. Thank you so much!
Nice tutorial. Never seen such method. Thanks 😊
Thank you! This was very helpful!
The Node Preview addon is good for previewing each node's output.
You are the best!!! Thank YOU!
I hope in the future Blender makes a single node for this, like the V-Ray UVW randomizer.
Again a great tutorial 🔥
Excellent video 👌
At 2:57 I am struggling: when I changed Repeat to "Clip", it didn't clip it into the corner. What do I dooooooooo
Great tip. Thanks a mil.
thank you
That's fantastic!
Helpful!
Thank you so much sensei :)
Thanks for the math class
Entagma ❤
Is there a way to rotate the single, whole tile based on the normal direction/fresnel of the object? That would be cool for some scenarios
Could you please somehow take into consideration, for a future Nerd Rant: how to learn Houdini (or any software) effectively?
I mean, how to learn something while watching a tutorial.
Sounds really stupid, but it will be worth it.
Cheers.
BTW, keep this Blender stuff up, since I am starting to learn Blender and Entagma is 😍😍😍
There is a free addon from Blender Guru which does the same. It still works!
Hi, I was wondering if it is possible to smooth the cuts between the generated tiles.
Nice, is it possible to recreate this in Redshift inside Houdini or C4D?
This is very intriguing. Have you seen whether this is possible through Octane or Redshift?
There was a tutorial YEARS ago for Substance Designer where a guy made an "FX map" (basically drag-and-drop math) in which he had Substance calculate how the UVs were situated and had the textures adapt to them.
Is that possible with Blender? Because the dude countered stretching and weirdly rotated UVs in his setup.
I don't understand what I've done wrong. By default my texture was showing as a solid colour, seemingly an average of all the colours in my image, and following these steps didn't fix that. I was only able to make it show as a texture by linking the Position output of a Geometry node to the image's Vector input, which then meant that none of the steps in this tutorial functioned as intended.
great!
Why is bro not using the Vector Math Scale instead of the Vector Math Multiply?
Thanks For Sharing This👌
can you do the same with unreal engine?
Hi, I've successfully created the non-tiling material in UE5 with a similar process; the steps are shown in this video: th-cam.com/video/ToQcPH_gWNk/w-d-xo.html
👍👍
And they say it's not rocket science...
Do you know if there’s a reason that Blender doesn’t have a scale vector node, with a ‘center’ control (like with the vector rotation)? To determine the origin of scaling without extra nodes.
Is it just an inconsistency in the UI?
There is a scale mode on the Vector Math node. Vector Rotate is kinda special and is the only one like that. Rotating has a lot more going on than a simple scale, which simply multiplies all 3 values by the same number.
@GifCoDigital Yes, you're right. I'm specifically interested in having an 'origin point' for scaling vectors. It's very useful inside of geometry nodes (as well as shaders) if you can define the origin point of scaling. I currently have to do this by adding extra nodes, and was wondering if there's a factor preventing a scale vector node with an origin input?
@btn237 The "centering" in Entagma's tutorial only works so nicely and easily because of the modulo, so the subtiles are all 1x1 and of course the center is at 0.5.
But in many other cases the center wouldn't be there, so it makes sense to leave the translation values (with which you do the centering) up to the user.
I think it's a common misconception that a single vector automatically has a point of origin. However, this point of origin would be another vector that needs to be determined somehow.
The only exception is a vector at the world origin (which is still the vector {0,0,0}).
@ironscavenger The 'center' input allows the user to specify any point of origin, which *is* leaving it up to them, as you put it. I'm using this input all the time in geometry nodes to rotate vectors - either by specifying a particular vector to be the origin of rotation, or doing a bit of math to determine (for example) the center of mass of a given mesh island and then rotating according to that. As I see it, allowing this input for scaling as well would have similar benefits, resulting in potentially fewer nodes and less busywork - do you see what I'm talking about?
@btn237 Well, no, there is no technical limitation. And you can even just do it yourself right now by making a node group; the performance would be almost the same as a built-in node, as it's doing the exact same math under the hood.
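For what it's worth, such a node group would only have to wrap this little bit of math (plain-Python sketch; `scale_about` is just an illustrative name):

```python
# Sketch of "scale a vector about an arbitrary centre" - the math a small
# node group (Subtract -> Scale -> Add) would wrap. Plain Python, not bpy.
def scale_about(v, center, factor):
    return tuple(c + (a - c) * factor for a, c in zip(v, center))

# shrink a point halfway towards the tile centre (0.5, 0.5, 0.0)
print(scale_about((0.9, 0.1, 0.0), (0.5, 0.5, 0.0), 0.5))  # -> roughly (0.7, 0.3, 0.0)
```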
Maya bifrost
I can't watch node tutorials anymore, thanks but no thanks.