If you have any questions or suggestions, feel free to leave a comment!
For people who are native English speakers the subtitles are a very nice thing, but when a course has a lot of videos I do feel an indexing feature is needed. Someone learning new techniques or concepts will need to go back and forth sometimes; for example, in the old Creature Factory Workshop, Andy Goralczyk used simple HTML to make navigating between videos easier. It feels more natural to click a hyperlink to find the specific video you need to consult than to open a directory and search by filename among dozens of files.
How can I get these procedural textures for free?
@@nonnomenes1590 lol
Hello Gleb,
I am currently following your Space VFX tutorial, and I am interested in whether this method of baking textures is applicable to the results of the Simple Planet and Advanced Planet - they use a lot of mix shaders and I am really unsure how to bake them properly (not to mention how to export a multi-layered object such as the Advanced Planet with clouds and so on).
I was experimenting a bit and I have results which I need to export into an .osg file for our Planetarium.
Probably the best texture baking tutorial for blender ❤️
That's awesome to hear! :D
@@LunaRood hope to see more stuff from you soon
This is amazing! I was just looking for how to bake textures and wasn't expecting the Eevee displacement part, that's insane! Thank you so much!
Thank you, Luca. These tips were awesome.
Glad you liked it! :D
Can you please make a tutorial on how you did the wall texturing? It looks amazing.
It's from his course
this tutorial is perfect. well explained with good demonstrations. great job
Thank you! :D
simple and clean tutorial
You are a life saver my guy!
Thank you so much for the tutorial. It helped me.
A big thank you! Very helpful. Thanks again!
This is an excellent tutorial. Thank you.
The number of steps is pretty crazy considering 99% of it could be automatic. I'm not complaining about the tutorial, just about Blender.
Marmoset Toolbag
I'd like this texture as a freebie, Gleb!!!
I needed this so much!!!!!!! Thanks!
Hi, thank you for the amazing tutorial - the best one about texture baking I have ever seen.
So if I understood correctly, we have to bake each map individually.
For example, I want to bake 3 things: Diffuse/Normal/Roughness. If there are 5 BSDF nodes, do I need to bake 3×5 times individually?
Thank you! Questions: First, setting the bake mode to bake an emissive map and plugging the texture into the material output - would it give the same result if I used a viewer node with Node Wrangler? The Ctrl+Shift+click method is faster.
Second: did I understand correctly that the height, roughness, etc. are set to color data after baking? Why is it not non-color data?
Bonus question 🙂 When I bake combined maps that include lighting, is it possible to include the effects of Filmic? I'm baking models for AR with lighting baked in. I'd love the soft Filmic look to carry over.
Excellent questions!
1. Yes, sort of. The thing is that the Node Wrangler compensates for scene exposure, so that you always see a consistent output from the viewer node. So if your scene exposure is not 0, the viewer (Emission) node from the Node Wrangler will not have its strength set to 1, and therefore the baked values will be wrong, since emission strength affects the baked values, but the scene exposure does not. This is why I prefer connecting everything directly to the material output for baking, as that ensures the emission strength is always 1. In hindsight, I probably should have mentioned this in the video.
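If a concrete example helps, here's a rough bpy sketch of that "plug it straight into the Material Output" idea. The object and node names are placeholders, and you'd still need an active, unconnected Image Texture node selected in the material as the bake target:

```python
# Minimal sketch (names are placeholders): wire a node's output directly into
# the Material Output's Surface socket, then bake as Emit so no Emission/viewer
# node (and no exposure compensation) is involved.
import bpy

obj = bpy.data.objects["Cube"]                      # placeholder object name
tree = obj.active_material.node_tree
out_node = next(n for n in tree.nodes if n.type == 'OUTPUT_MATERIAL')
src_node = tree.nodes["Noise Texture"]              # the node you want to bake

# Connect the data you want to bake directly to the Surface output.
tree.links.new(src_node.outputs[0], out_node.inputs['Surface'])

# Bake it as emission; the active Image Texture node receives the result.
bpy.context.scene.render.engine = 'CYCLES'
bpy.context.scene.cycles.bake_type = 'EMIT'
bpy.ops.object.bake(type='EMIT')
```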
2. I didn't actually mention anything about color/non-color data, so there might have been a misunderstanding. Which part are you referring to? But in any case, this is a relevant point. The short version is that you don't have to worry about it for images used within Blender that were baked from Blender. Now I warn you in advance that the actual explanation will get a bit rambly...
In Blender, "Linear", "Raw", and "Non-Color" are all treated the same way, namely, the actual numeric values stored in the image file are used as is. On the other hand, "sRGB" mode will apply the reverse sRGB transfer function to the values, to bring them into linear scene referred space. Here's where the distinction is relevant...
When you save your image as EXR, Blender will store the unmodified values, but if you save as PNG for instance (or bake to an 8-bit image in general), it'll apply the sRGB transform, because as far as Blender is concerned, it's just an image and images contain colour (even if that's not our actual intent), and generally you don't want to store 8-bit colour data linearly. This is a bit unfortunate, but it's the reality of how Blender handles images right now.
So basically, for your EXR textures, it'll read them as Linear by default, which is correct (you can also set it to Non-Color, it'll be the same). And for your 8-bit PNGs, TIFFs, etc. it'll read them as sRGB by default, which is also correct, since it needs to reverse the transformation that was applied when saving (which ideally it wouldn't apply at all in our case, but oh well).
The point of all this is that it actually has nothing to do with the baking process, but rather how the colour data is stored, and also, you don't really need to worry about it, since Blender's default behaviour when reading the images matches the way it saves the images. When using the resulting images outside Blender, you can treat EXRs as raw data, and anything else as sRGB.
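To make the practical upshot concrete, here's a tiny bpy sketch (the file names are made up) that just restates the defaults described above:

```python
# Rough illustration: matching the colour space to how the bake was saved.
# EXRs hold the raw scene-referred values; 8-bit PNGs were written through
# the sRGB transform, so Blender's defaults are already correct.
import bpy

exr = bpy.data.images.load("//bake_height.exr")   # placeholder path
exr.colorspace_settings.name = 'Non-Color'        # same as Linear for our data

png = bpy.data.images.load("//bake_color.png")    # placeholder path
png.colorspace_settings.name = 'sRGB'             # reverses the transform applied on save
```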
3. Yes, it is possible, though not advisable. If you check "Save As Render" when saving the texture, it'll apply the transformations set in the Color Management panel (not if you save as EXR though. With EXRs Blender always stores scene referred data, without any transformation). But the thing is that with Filmic applied, the only thing you can do with the image is display it. Any other operation will be broken, as you don't have the actual scene referred data. Ideally, you'd want to bake the textures without Filmic, and use Filmic as the final step in your colour pipeline in your actual application. Basically, you only want to apply Filmic to your final rendered result, be that a scene in Blender, a frame in a game, or indeed an AR application. You might get away with it if your app does literally nothing to the image, though in practice that is pretty much impossible, as even something like texture sampling with any filter other than nearest (say, linear, or cubic), will be ever so slightly wrong.
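If you want a starting point for scripting that, here's a hedged sketch of the "keep Filmic out of the baked data" idea; the image name is a placeholder and this is just one way to do it:

```python
# Sketch: leave the view transform at Standard (or skip "Save As Render") so
# the stored pixels stay scene-referred; Filmic is then applied only at
# display time in the target app. Image name is a placeholder.
import bpy

scene = bpy.context.scene
scene.view_settings.view_transform = 'Standard'   # don't burn Filmic into bakes

img = bpy.data.images["bake_combined"]            # placeholder image name
img.filepath_raw = "//bake_combined.exr"
img.file_format = 'OPEN_EXR'                      # EXR always stores untransformed data
img.save()                                        # plain save, no colour management applied
```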
Well, that turned out a bit longer than I expected. Anything related to colour management gets really dense, haha. Anyway, I hope I answered your questions clearly. Let me know if you have any other doubts :)
@@LunaRood The care and attention you gave to answering my questions is impressive. Thanks a lot, and what a great tutorial! Very good explanation for the questions I had. The node wrangler thing for example. I would have never known that without tons of testing on my own.
@@rileyb3d Glad to help! :)
@@LunaRood Are these materials tileable after baking them? It's hard to tell from the images :D
Thank you for the tutorial! I have a question: I am using a Principled shader with animated emission, how can I bake that?
Awesome, thanks
Thank you!
10:48
wooooooooooow!
It is Mega Cool
Thank you.
Excellent
What should we do if we have multiple UV maps for normal maps?
Thank you
I was dabbling in baking textures for the first time ever tonight and I'm not having much luck. The object I'm working with is a single object with inner and outer walls for a building. The inner walls use a procedural texture, but the outer walls use a couple of image textures. When I go to bake, it seems to work okay - it runs the texture bake process and gives me a progress bar at the bottom of the window - but my procedural texture just ends up black, and my mixed texture for the exterior seems to just grab a small section and turn that into the new texture. I've gone through 3 different tutorial videos and I'm just not able to make this work. No idea what I'm doing wrong.
UVs?
I am going through the same thing - it just won't work for me and I don't know why. I've looked everywhere.
Hi. I'm wondering about something. We can bake and output textures in Blender's Shading workspace. For example, I created a lava flow animation in the Shading workspace, so how do I output it? Also, my main goal is to export this output into Unreal Engine.
Thanks!
When baking these for Unreal Engine, should I enable 32-bit Float or not?
Hi! This helped me a lot already, as I'm trying to learn Blender as a Designer user for video games. This video solved the baking process for me, as I didn't have any idea how to proceed.
However, as Luca mentions in the video, one use of baking (and the one I'm after) is in video games, but the video doesn't explain how to make the output tileable, which is a requirement in games. I don't know if this is possible natively in Blender; I was told it was, but it is not covered in the video.
Could you make a video on that, or point to a source on the matter? Thanks!
I tried to make an Image Texture with transparency by enabling Alpha when creating a new image, but it just turns black for me.
Speaking of baking procedural stuff... can you bake vector displacement to geometry?
And I mean making some crazy procedural shape with displacement, not just a 3D bump.
Yes! You just need to bake out the vector data that gets plugged into the Displacement socket of the Material Output node, by connecting it to the surface output, just like I demonstrated with the colour. You need to bake to a 32-bit image, and save as EXR. Then you can use the texture in the Displacement modifier, by setting the direction to "RGB to XYZ". So it's basically the same as I showed with the height map, except that you bake the final vectors themselves. Sanctus made a great video demonstrating exactly this use-case: th-cam.com/video/fbE9UowvOPM/w-d-xo.html
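For anyone scripting that last step, here's a rough bpy sketch of the Displace modifier setup; the object/image names and the mid level value are assumptions, not something from the video:

```python
# Sketch: use a 32-bit vector displacement EXR in a Displace modifier set to
# "RGB to XYZ". Object, image path, and mid_level are placeholders/assumptions.
import bpy

obj = bpy.data.objects["Cube"]                       # placeholder object
img = bpy.data.images.load("//vector_disp.exr")      # 32-bit EXR baked as described
img.colorspace_settings.name = 'Non-Color'

tex = bpy.data.textures.new("VectorDisp", type='IMAGE')
tex.image = img

mod = obj.modifiers.new("VectorDisp", type='DISPLACE')
mod.texture = tex
mod.texture_coords = 'UV'                            # sample with the same UVs used for baking
mod.direction = 'RGB_TO_XYZ'                         # interpret RGB as an XYZ offset
mod.mid_level = 0.0                                  # assumed: raw vector offsets, adjust if needed
```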
@@LunaRood okay, thank you very much! i will definitely save this video and the one you mentioned somewhere on my hard drive :)
If you plug anything except a shader into the Material Output, baking won't work. I did like you showed and plugged the color into the Material Output, but baking didn't work; then I plugged in the Principled BSDF shader and it worked.
OK, your shader tree is much larger and you seem to have something infinitely more advanced than I have. Why does it say my bake is going to take 24 hours? It is merely a Voronoi with some color and translation changes. It can skip through the generations in a fraction of a second. I find this odd. I'll let it run though.
Can you do this with any model?
Yep, as long as you have UVs, the shape of the model doesn't matter.
I would recommend not using resolutions like 2000×1000, as they are not powers of 2; instead you should use 2048×1024 or 4096×2048, since those are what 1K, 2K, etc. actually mean.
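If it helps, the power-of-two check is easy to do in a throwaway Python snippet (just an illustration):

```python
# Quick illustrative check: is a texture resolution a power of two,
# e.g. 1024 (1K), 2048 (2K), 4096 (4K)?
def is_power_of_two(n: int) -> bool:
    return n > 0 and (n & (n - 1)) == 0

print(is_power_of_two(2000))  # False -> avoid
print(is_power_of_two(2048))  # True
```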
And please make a tutorial on HOW TO MAKE Mossssssssssss ?! 🙄
As a matter of fact, the procedural texturing course covers the creation of that whole wall, including the moss! ;)
@@LunaRood Can you leave a link to it? I can't really find the tutorial...
@@alpersevdin9296 Links are in the video description.
Thank you