Thank you for the info on D-NOISE. I did some research and ended up using the Bone-Studio BoneMaster Blender 2.8x build, which includes OptiX AI denoising that works with GTX cards. Now the render time has been reduced considerably and there is almost no noise, even at 128 samples in Cycles.
Why don't y'all just hop into the program, experiment, and take your own best logical approach to achieving that? You can only follow tutorials step by step for so long. No need to delay and wait for someone else to figure out the steps for you.
Thanks! - That's a really great look at denoising - it's good to hear the ups and downs (so much of AI raytracing tech is sales talk these days). Also good to hear the comparisons you've mentioned - it'll be fascinating to see how this all settles in a few years (to think I used to ray trace on workstations 20+ years ago and we'd wait all night for a basic image...!)
Hey, I just wanted to say that I've never loved another man before, so this is all new to me, but your entire channel is just making me feel very different feelings about you.
Yep. Great timing :P I actually recorded this 6 weeks ago, and when I looked there was no easy way to test the intel denoiser back then. I'll eagerly await 2.81 to do a comparison :)
Blender Guru, have you tried the Bone Studio Blender build on GraphicAll? It comes with the denoiser as well as adaptive sampling, which I think you will love - it decreases render times a lot in some scenes.
16 - 32 GB of RAM, a Ryzen 3600 or above or an Intel i7 (newest gen is good, but last gen is not much worse), an RTX 2080, and a decently sized HDD. *I don't know much about laptops, but if you can find one with these specs it will work great. You could also just invest in a render farm subscription. At work I have an old PC from around 2012, and it works decently for Blender; when I need to render, I send the job off to the render farm.*
People don't render on laptops! At least not those who are serious about it. Laptops are relatively slow and would run hot for long periods if you rendered on them. Not good. The same goes for complex simulations. The ideal rendering machine is one with many CPU cores and lots of RAM, although exactly how much depends on how complex your scenes are and whether you are rendering a single frame or a sequence of frames for animation. Resolution also matters. For a desktop PC, consider 8 cores/16 threads and 16 GB of RAM as a starting point. Enthusiasts will be looking at 16-core/32-thread or 32-core/64-thread machines with 32 - 128 GB of RAM. If you're really serious, then you'll probably build yourself a small render farm. However, if you're determined to manage with just a laptop, then using a cloud render service might be the best option. Rendering on GPUs is also an option, but they are limited by the amount of graphics RAM they have. A big, complex scene with many light sources, materials and complex reflections may not fit into the GDDR available. A GPU cannot access system RAM, which is why studios use render farms comprising Xeon processors and huge amounts of RAM per machine. Very expensive, but effective. That said, most people seem to manage rendering on GPUs most of the time. Right now, I would be more interested in the OpenGL performance of a GPU, as a guide to its EEVEE performance for fast render preview.
About the standard denoiser: I found that the "blotchy patches" you speak of are usually more pronounced when using "randomize grain", which in fact should be turned off when using the standard denoiser.
FYI: D-NOISE doesn't work in Win7; it will freeze your Blender. But if you kill the process after you click Quick Denoise, it will work. A simple trick that worked for me.
Neh-Vidia honestly sounds so much cooler than En-Vidia. Also, great video. Neat Video is seriously magic, almost a required VFX tool - I'm glad you mentioned it!
Am I really the only one wondering about his pronunciation of Nvidia? I mean, "nahvidia"? What?!? Isn't it pronounced "N - vidia"? Kinda like "envy" - dia.? Other than that, interesting video. I believe I saw a video mentioning that 2.81 will get an Intel made denoiser. Be fun to see another comparison then.
No, you're not the only one, and you're right, the correct pronunciation is "en-vidia"
He saved us earlier with Filmic render and PBR materials, and now he shapes Blender again with a better denoiser! Did you notice Cycles improves very slowly, while other parts of Blender are moving very fast? Still waiting for so many things in Cycles: an exclude/include feature, or a better interior rendering algorithm like V-Ray or Corona, etc...
Man I watched that animal motion presentation at the time but only started out in blender and watching your channel a few weeks ago! Nice how things reconnect.
Lol, when you said it doesn't work well with animations, my first thought was "well, then you should get in contact with some video codec people, because quite a few video codecs have temporal denoisers built into their routines." But you figured that one out already. Now may be a good time to get them involved in making a post-processing plugin for Blender that does just that step :)
I think the noise, which appears especially in the shadows during animation, adds more realism. It's like a real camera. Just record something in dark areas and you will notice the noise. I would say noise is not always bad, especially when it comes to realism. And you should know that realism is not perfection, it's the opposite: IT'S IMPERFECTION. Awesome video.
Just in case somebody isn't aware of it, YouTube has a 2x speed function, meaning you'll have to spend just half as much time watching any video. It's especially useful for videos where people speak very slowly
I prefer to remove noise in post-production: you export your pass layers with noise and apply denoising pass by pass, which keeps more information. BUT this is very cool, thanks for the information :D
Might not be the first to mention this, but the two denoisers that I would like to see added to this comparison would be Topaz Labs Denoise AI (for stills), and Red Giant Magic Bullet Denoiser III (for animations).
Hey Andrew, I don't know if you already knew this, but for emission objects (I mean solid objects, not Blender lamps) there is a node called Light Falloff that can be connected to the strength of the light. It works by ensuring the light strength doesn't reach an infinite value (quite a common cause of noise in Blender). It reduces the noise a lot. If you didn't know about it, I suggest you try it - it changed my life 😅. Let me know if this was helpful
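A quick numbers-only sketch of why that trick helps (illustrative only - `smooth_falloff` is an assumed formula for this sketch, not Blender's actual Light Falloff implementation): naive inverse-square light intensity explodes as a shading point gets close to the emitter, which is where fireflies come from, while a smoothed variant stays bounded at zero distance yet matches inverse-square far away.

```python
# Illustrative only, not Blender's exact Light Falloff formula: compares
# naive inverse-square falloff, which explodes near the emitter (fireflies),
# with a smoothed variant that stays bounded at zero distance.

def inverse_square(strength, dist):
    # Physical falloff: intensity -> infinity as dist -> 0.
    return strength / (dist * dist)

def smooth_falloff(strength, dist, smooth):
    # Bounded by strength / smooth at dist = 0; approaches inverse-square
    # at large distances, where dist*dist dominates the smooth term.
    return strength / (dist * dist + smooth)

for d in (0.01, 0.1, 1.0, 10.0):
    print(f"d={d:6}: naive={inverse_square(10, d):12.2f}  "
          f"smoothed={smooth_falloff(10, d, 0.25):8.2f}")
```

The naive values blow up by orders of magnitude near the light, while the smoothed curve caps out - fewer extreme samples means less noise for the renderer to resolve.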
First of all, congrats on the sponsorship. I haven't been keeping the tightest track on your channel, so for all I know you could have had one for months, but good to see you moving up the chain all the same. So I downloaded D-NOISE a few months ago, but I never used it because my friend kept telling me, "just check it and go. You don't need anything else." Needless to say, it didn't work, and I've since learned that if there is a Download button attached in the user preferences, it's usually a good idea to click it. But like you mentioned, it does smudge out edges, which is something I never liked. But, better than nothing. So what sample range do you generally work at? Obviously each project is different, but as a general rule of thumb. Everyone I've talked to insists that it's a different number, and that their number is the universal standard (those being from 30 samples to 1,000+). Currently I have 256 as my default, but I'm not sure if I should bump it up to 500 again, or would that be diminishing returns. I like to go more photorealistic, but I'm definitely not near the point of the kitchen in this video. :P
Nice! I immediately thought of the trial Neat photo denoiser I've used - didn't know they had one for video as well. Also had no idea how good it really is as far as denoisers go
I remember watching this a while back. Such a thorough review of the denoisers. Thanks so much again Andrew. So 3 years later, what do you all think are the better denoisers on the market (or even photo enhancers)... especially with all the ongoing A.I. technology that's out? There are so many A.I. apps on mobile too. Are there any updated videos on the A.I. enhancement topic like this (not the A.I. art generation topic *yuck*)? Would love to find a comparison between Blender, OptiX, Topaz Labs, Intel, etc. 3 years later.
I predict in the future a lot of our workflow and modeling will probably be just setting landmark objects, and giving each object metadata tags for an AI renderer. The 3D interface will most likely only be needed to assist in setting a temporospatial tag for each object and separating them via color for our eyes. The AI renderer will have a generalized architecture, probably generative adversarial, and have a separate word-association layer for us humans to interface with its settings. Datapacks will be downloadable that have been trained on specific images that can set color grading and textural effects. Multiple datapacks will probably use image masking to give only specific objects neurostyle textures and materials.
The V-Ray denoiser works well for animation, as it takes the previous and next frame into account. It is slower, though. Also, I'm not sure how the V-Ray implementation in Blender works, but you can denoise renders with the standalone tool. You only need to generate some passes; if you generate the denoise passes, it can work with renders from other renderers. It's quite fast on GPU - 4 to 10 seconds maximum at 1080. We'll see how it develops. I think Neat Video is an amazing solution for now - it can also deflicker, if I'm not mistaken. And we'll soon see Intel's denoiser, so that will bring more advancement too.
If you don't know already, in Blender 2.8, 32x32 is now the fastest tile size to use with the GPU in Cycles, although the stock denoiser is still faster with larger tile sizes.
4:09 Andrew with Blender Denoiser
4:11 Andrew with Optix Denoiser
He over used the denoiser on himself
Hahahahahaha
lol
lol.. instant haircut. He's even wearing the same shirt.
when I saw Andrew change, I went down to read the comments and found this as the first comment. hahahahhahaha
Since recording, Blender 2.9 was released with the denoising shown in this video included in the default installation (no need to install anything extra), and it supports all hardware! Find it in Render Panel > Sampling > Denoising. OptiX (faster) is for compatible Nvidia cards only, and OpenImageDenoise (slower) works with any CPU. Have fun ;)
Any news in the denoising-for-archviz-animation department? :)
Thanks, appreciate the update!
does it work on gtx cards????????????????????????????????????
@@blackknight47 Works with my GTX 1060, so I guess it works on gtx cards!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
is there any AI denoiser compatible with AMD ?
Since recording this video, some big news was announced:
1. The Optix Denoiser might actually make its way into Blender officially, removing the need for an addon. (nice!)
2. Intel's Denoiser (not shown in this video) will be included in Blender 2.81! Early results look very promising: twitter.com/BlenderBrit/status/1164306780089978881
😍
OIDN is actually already in master.
I have an all AMD build :/
@@nerdroberts1927 Intel's denoiser works with AMD, any CPU that supports SSE4.2
Why wait? It's available already.
And with blender 2.81's AI denoise node (intel's openimagedenoise based) we're all saved!!! Great work as always :)
Ur awesome
* turns on all 3 denoisers at once *
Even with animation?? It would be great
@@fabbrobbaf Not yet, unfortunately. It still treats an animation as a sequence of individual frames, so flickering like in Optix remains. It's actually already in the downloadable alpha version of 2.81.
Andrey Sokolov I made a simple turnaround with Suzanne and it actually worked (far better than the current default denoiser), but I wanted to know if it worked in more complex situations. So you say no...😔
watched this video in 2x to make rendering 6272% faster
I don't think it works that way, but I respect the effort 👍
Help! How do I remove that shadow when setting up the camera?
@@movietheatresherrif7085 could you be more specific?? which part of the video?
@@heyitsmejm4792 I mean my videos when I try to adjust my camera there are some shadow which overcome my objects!
@@movietheatresherrif7085 are you doing an interior scene?
After denoising, don't forget the film grain.
nice nice
Sounds goofy... But I regularly do denoising on video (with Neat)... And then I add just a tad of film grain. :D
In short... While technically grain is basically the noise equivalent in film, there is a big difference between how film grain appears vs video noise. And it does have the perk that it dithers what otherwise could end up as color banding.
This aint a joke. Dithering. Look it up. Makes the scene look more photorealistic. Within reason of course, nobody's advocating the 8mm look
@@jmalmsten Interesting! I want to try it out. Off to go check out Neat.
In After Effects, anywhere between 2 and 5% grain is a good amount - it already dithers but doesn't deteriorate
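The dithering perk mentioned above is easy to demonstrate with plain Python (no image libraries; `LEVELS`, `box_blur`, and the rest are names invented for this sketch): quantizing a smooth gradient to a few levels leaves large flat bands, while adding noise of roughly one quantization step before rounding - dithering - lets the quantized signal track the gradient far better once neighbouring pixels get averaged, which is roughly what your eye does.

```python
import random

random.seed(42)

LEVELS = 8  # quantize a 0..1 gradient down to 8 brightness levels

def quantize(x):
    return round(x * (LEVELS - 1)) / (LEVELS - 1)

gradient = [i / 999 for i in range(1000)]

# Plain quantization: long runs of identical values = visible bands.
banded = [quantize(x) for x in gradient]

# Dithered: add noise of about one quantization step before rounding.
step = 1 / (LEVELS - 1)
dithered = [quantize(min(1.0, max(0.0, x + random.uniform(-step / 2, step / 2))))
            for x in gradient]

def mean_abs_error(a, b):
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

# A small box blur stands in for the eye averaging neighbouring pixels.
def box_blur(sig, r=8):
    out = []
    for i in range(len(sig)):
        window = sig[max(0, i - r):i + r + 1]
        out.append(sum(window) / len(window))
    return out

print("banded error  :", mean_abs_error(box_blur(banded), gradient))
print("dithered error:", mean_abs_error(box_blur(dithered), gradient))
```

The dithered version should come out with a noticeably lower error than the banded one - which is exactly why a touch of grain hides color banding.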
Blender 2.81 will have INTEL's denoiser. It's supposed to be better than Optix.
I've been testing the Intel denoiser myself and I'd say it's on par with Optix - not convinced it's better, though. The downsides of Optix certainly appear to be present in Intel's one in a like-for-like situation. The biggest difference is that the Intel denoiser is implemented as a compositing node, allowing you to work some node magic to preserve details, and, of course, it's open source, so way more Blender friendly.
@@BlenderBrit Sorry I forgot to clarify that it's better for edges, as it uses the normals in the denoising as well.
at first I thought Andrew is talking about that
yes, I assumed he was going to be talking about it in this video but apparently not, lol
So no AMD denoiser? I suppose it's only gonna work on intel CPUs... oh well neat video still seems like a good option
Someday it'll be nice to get back to Cycles, but for now, I'm loving the 2-6 second render times with EEVEE.
Eevee is so much less realistic, with horrible shadows and low-quality materials
@Blank Person What? The RAM doesn't change anything - I have 8GB of RAM too. You need a nice GPU or a nice CPU; I recommend a nice GPU if you want to game too... Also, the GPU I have renders a really big scene in Cycles in 4 seconds, while the viewport renders instantly. My GPU is only $170: a Gigabyte Nvidia GeForce GTX 1650 Super
@@blk9365 Stupidest shit I've ever heard. RAM does matter for anything 3D, 16GB minimum. It's necessary for a few reasons, mainly for rendering mid- to high-poly scenes. I myself have 64GB, which lets me load in up to 128 billion polygons. Granted, that isn't really necessary for small animations, but for larger ones it is.
@@andrexsintoval3776 We were talking about rendering time; for that, the most important thing is the graphics card
@@blk9365 Yeah, but if you're gonna be using mid- to high-poly assets, it's important to have the RAM for it
I can't wrap my brain around the way he says NVIDIA
NAVIDIA
As far as I know, he pronounced it the right way....
Edit:
Alright, he doesnt
th-cam.com/users/redirect?stzid=UgxfpHGcondVv8w9ASZ4AaABAg.8ytJPMmW4HH8ytWq_1V0Aj&event=comments&html_redirect=1&q=http%3A%2F%2Finternational.download.nvidia.com%2Fpartnerforce-us%2FBrand-Guidelines%2FNVIDIA_LogoGuidelines.pdf&redir_token=hVAoCzZiXlcKS2ToBGuIDNoWdNt8MTU2NjQ2OTcyM0AxNTY2MzgzMzIz
I was searching for someone to talk about this. Thank you kind sir!
@GBRL By law of the internet and entropy, this was bound to happen
That's the first time I've ever heard someone say it that way. I had to pause and look in the comments to see if someone else pointed it out.
Sometimes I like to go back to these old videos and see how far has blender gone from here
It’s genuinely cool to see how much it has improved and the amount of features that have been added over the years
How is nobody here freaking out at how he says Nvidia? Neh-vidia? What madness is this?
Nowhere near as bad as mrwhostheboss and his Fuchsia pronunciation.
That’s how I would say it...How would u guys say it?
Patrick Edwards It's actually pronounced "envidiah"
En-vi-di-a
I was actually shocked too. I have never heard someone saying it this way. From all the beginning i have always pronounced it as "Envidia"
I'm not even an animator or artist, I have never used Blender in my life. But I seem to always watch videos of yours that I come across. You do a great job at making me interested in changing career paths into this industry.
+1 for Neat Video plugin. This plugin is simply the best on the market right now. We were able to render 12+ animations via Octane, and with the samples down to as low as 1500 in some shots.
Then a quick denoise, and it was like we had rendered at 25k.
Man, I just read your "This week in 3D" mail for this week... That donut looks insane! I began to learn how to use Blender with your first Blender tutorial and I haven't even downloaded 2.8 yet because I want to do it when your new tutorial comes out! Thank you so much for making these videos free for us :D
5:11 your hair grew considerably through that little transition there.
Saw your AI vid yonks ago and am still drooling ... have used Neat since for-ever for my astrophotography ... I'm fully invested in your superb channel and love your passion for Blender (which I downloaded way back in late 2001... maybe 2002(?) and struggled with until about 5 yrs ago... as a part time hobbyist only). Imagine my genuine surprise to hear I've been incorrectly pronouncing my gpu as "EN-vidia" rather than, as I can now appreciate, should be referred to as "NAH-vidia". Cheers, guru... much love :)
This is an incredibly well timed video for me, as I was literally today and yesterday trying to fight blender's denoiser to not destroy textures. Now I have some new strategies to try. Thanks!
Everytime you say "neh-vidia" an angel loses their wings.
My buddy would call it Ni-vin-dia . Never understood where he saw the second N.
@@LetTheWritersWrite Hahaha! Hell, if you're gonna say it wrong then you might as well go all out.
why
Man, what a lot of work to present the edited comparison content in this video! Hugely impressive.
No one:
Andrew: _NEH VIDIA_
Finally! My laptop has been struggling with renders even though it is pretty good. Denoising is so bad, I couldn't fix those marks, so I had to use huge amounts of samples. And my laptop was burning.
Andrew why don´t you mention two of the most important options in Blender?
- Open Image Denoiser: already in master for 2.81, and available for months in the Theory Studios build and our own build, Bone-Studio, on graphicall.org
- Temporal aware blender denoise, already present in 2.80
Both are way better than the options you present here IMO
Honestly I recorded this a good 6 weeks ago, and I couldn't find any way to make Intel's denoiser work.
@@blenderguru Use our build, it's available in graphicall, in fact tonight (in a few hours in spain) I'll present a new feature that will be implemented in 2.81 or 2.82, but we just implemented it in our branch, stay tuned to our channel, the live will be in english :)
@@qozia1370 calm down, son.
Could you link me to a good temporal aware blender denoise video, article or something? I have never heard about it and in a quick search I couldn't find anything
@@GCAF_ Have you found anything about that topic?
Impressive special effect at 4:11 🤪. Very instructive as usual
What's amazing is that you can use the denoise node in the compositor not just for 3d renders but for 2D Real Life Images too! :O
You upload this the DAY i finish my project and search desperately for rendering help. THANK YOU SOSOSOOSO MUCH
th-cam.com/video/O0BtMVlDXmc/w-d-xo.html
STILL WAITING FOR THE DONUTS THAT I ORDERED TO ARRIVE...
haha
Yeah. We need the doughnuts!
u are butiful. kis me pls
@@terryd8692 weed need donuts too.
@@Shortydesbwa If its coming all the way from the states my donut won't be very fresh. I need a nice local doughnut.
3:18 Andrew "For seemingly no downside."
Me " *watching the reflection distortion* 🤔"
Guru, please do a video explaining different roles in the industry, and their responsibilities. For example, what is 3D generalist, Technical artist, etc. As someone who is interested in entering this field I would love to get some insight and focus on how stuff works. Keep up the Amazing work you do. Thanks for everything!!
Mid video haircut at 4:10 :D
Yup. Only one month apart too :P
It was left on the "cutting room floor". :D
As someone else mentioned, it's due to the difference in denoisers.
@@blenderguru I thought you went to exercise and then took a shower there in the cut
Thanks so much Andrew. It took me almost three days to render a single 13-second animation for my YouTube channel; this trick is all I wanted. Can't wait to use it for my next videos.
Have you tried the Red Giant Magic Bullet Denoiser for Premiere? Seems to work quite well
I have three denoisers
In my tests:
Fastest || BlackMagic Resolve - RedGiant - NEAT || Best results in tough situations
If you don't mind rendering at higher samples, pick the fastest; if you want to render at low samples, use the others.
That's why I like RedGiant. It gives me enough to get by but is still that tad faster on longer sequences.
For hundreds of dollars...no thanks.
4:08 Andrew’s hair is so magical
You didn't mention the new Blender 2.81 INTEL's denoiser node.
How well does that work with animation?
Much more stable but still not fully there.
will it work for animation? or still frames only?
Can confirm that Neat Video is incredible. Worth every penny!
Optix temporal denoise since Blender 3.1 is amazing for animations, more people should be talking about it!
No one seems to have tested it because there is no documentation. I got this far:
Set render output to EXR_Multilayer; turn on vector and denoise_data layers and render an animation. Then open up python console and enter:
bpy.ops.cycles.denoise_animation()
It goes through each saved image file for each frame (so it's best to test with just a few frames), and overwrites the layer "View Layer.Combined.RGB" with a denoised version.
(I think you have to have an RTX card, Optix as the Cycles renderer in System settings, and turn denoise on in the render tab, select Optix, then turn it off again. I don't know if all this is necessary, but when I initially tried with CUDA selected in system settings and OpenImageDenoise in the render tab, bpy.ops.cycles.denoise_animation() worked, but the results were poor. So I did everything I could to ensure Optix was being used, not any other denoiser, and got much better results.)
Open the EXR image sequence in Davinci Resolve, go to Fusion page, and on the MediaIn node, choose the layer RenderLayer.Combined.
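For anyone who wants to try the recipe above without retyping it, here it is collected into one script. This only runs inside Blender (the `bpy` module exists nowhere else), the property names are from recent 3.x builds, and - as the comment says - nobody is sure which of these settings is strictly required, so treat it as a sketch of the described steps rather than a verified recipe:

```python
# Blender-only sketch of the temporal denoise workflow described above.
# Run from Blender's Python console; assumes an RTX card with OptiX
# available. Property names are from Blender 3.x builds.
import bpy

scene = bpy.context.scene
view_layer = bpy.context.view_layer

# 1. Multilayer EXR output with the passes the temporal denoiser needs.
scene.render.image_settings.file_format = 'OPEN_EXR_MULTILAYER'
view_layer.use_pass_vector = True                 # motion vectors
view_layer.cycles.denoising_store_passes = True   # denoising data

# 2. Make sure Cycles renders on the GPU with OptiX as the denoiser.
scene.cycles.device = 'GPU'
scene.cycles.denoiser = 'OPTIX'

# 3. Render the animation to disk (test with just a few frames first).
bpy.ops.render.render(animation=True)

# 4. Post-process the saved EXR sequence in place: each frame's Combined
#    pass is overwritten with a temporally denoised version.
bpy.ops.cycles.denoise_animation()
```

As in the comment, the output sequence can then be opened in DaVinci Resolve's Fusion page by picking the Combined layer on the MediaIn node.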
Please start a new tutorial series using Blender 2.8. I'm struggling with the UI in the donut tutorial.
Human beings are basically just a walking doughnut
yeah i was waiting for an update like this to learn blender
There are plenty of excellent beginner tutorials on YT already, I can recommend Grant Abbitt, he's made some great videos for 2.8 beginners.
In 2.81 there's a denoising node in the compositor that can denoise your render while retaining 90% of the detail. But it takes a slightly tricky node setup to get the best results. What's mind-blowing is that you can render at 1 sample and get output that looks like 200 samples.
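For anyone curious, that node setup can also be scripted. A minimal sketch, assuming Blender 2.81+ with Cycles (node and pass names are Blender's built-in identifiers):

```python
import bpy

scene = bpy.context.scene
# The albedo/normal guide passes below only exist if Cycles stores
# denoising data for the view layer.
bpy.context.view_layer.cycles.denoising_store_passes = True

scene.use_nodes = True
tree = scene.node_tree
tree.nodes.clear()

render_layers = tree.nodes.new("CompositorNodeRLayers")
denoise = tree.nodes.new("CompositorNodeDenoise")   # Intel Open Image Denoise
composite = tree.nodes.new("CompositorNodeComposite")

# Noisy image in, plus albedo/normal guides for better detail preservation.
tree.links.new(render_layers.outputs["Image"], denoise.inputs["Image"])
tree.links.new(render_layers.outputs["Denoising Albedo"], denoise.inputs["Albedo"])
tree.links.new(render_layers.outputs["Denoising Normal"], denoise.inputs["Normal"])
tree.links.new(denoise.outputs["Image"], composite.inputs["Image"])
```

The albedo and normal inputs are optional, but they are what lets the node keep edges and texture detail at very low sample counts.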
1:31 "Na-vidia"
HOLD ME BACK BEFORE I SLAP THIS BOY
He's Australian. He's saying it like that just to piss you off.
Obviously the proper spelling is "novideo"
i say it like that
Thanks for featuring D-NOISE, Andrew!
By the way, Blender 2.81 (or the beta build currently available on the site) will have Intel Open Image Denoise implemented as a compositor effect, meaning you can apply it on top of the "progressive refine" feature. Or you could save your render with the right data (diffuse pass and normal map pass) and denoise it later. Or you could take a render from some other render engine with those passes, feed them into that compositor node, and denoise any render in Blender. OR you can feed it a real camera image and let it try to denoise that. CRAZY STUFF.
Architector #4 or just buy a Threadripper
The idea of being able to apply denoising on a real image is really nice though.
I appreciate all the work you do for the side by sides. You don't get enough credit for that sometimes. Thanks for the info, especially on glass, the white pixel fly aways drive me nuts, so I'm looking forward to these possible solutions.
6:35: RenderMan comes with a 3-frame denoise algorithm, so there's at least that.
In case you are as lost as me: the integrated denoising feature in Blender 2.9+ is actually located in the "View Layer Properties" tab, at the very end, and not in the "Render Properties" tab.
wow im impressed how on earth did you get a sponsor with polygon? Thats amazing!!!!
Thank you for the info on D-NOISE. I did some research and ended up using Bone-Studio's BoneMaster Blender 2.8x build, which includes OptiX AI denoising that works with GTX cards. Now the render time has been reduced considerably and there is almost no noise even at 128 samples in Cycles.
I need the donut tutorial as soon as possible! >_
Ask a police department.
It's in the making :)
@@blenderguru 😠
Why don't y'all just hop into the program and experiment and do your own best logical approach to achieving that, you can only follow tutorials step by step for so long. No need to delay and wait for someone else to figure out the steps for you.
Thanks! - That's a really great look at denoising - it's good to hear the ups and downs (so much of AI raytracing tech is sales talk these days). Also good to hear the comparisons you've mentioned - it'll be fascinating to see how this all settles in a few years. (To think I used to ray trace on workstations 20+ years ago and we'd wait all night for a basic image!)
2:16 Clicking "Install OptiX Binaries" won't install anything. What to do???
me too
lol, it's the old method. It's now integrated inside Blender, no need to install anything anymore.
Hey, I just wanted to say that I've never loved another man before, so this is all new to me, but your entire channel is just making me feel very different feelings about you.
He has a wife you know. /s (Monty Python)
And just as this came out, 2.81 arrives with Intel Open Image AI denoising.
Yep. Great timing :P
I actually recorded this 6 weeks ago, and when I looked there was no easy way to test the intel denoiser back then. I'll eagerly await 2.81 to do a comparison :)
Blender Guru have you tried the bone studio Blender build in GraphicAll, comes with the denoiser as well as adaptive sampling which i think you will love, it decreases render times a lot in some scenes
Thanks for all your videos throughout the years, I wouldn't have been able to learn Blender without them.
5:33 Most impressive paranormal moments caught on camera
I dont even use blender or any 3d software but still watch ur videos cause they are so entertaining :)
Can you please make a video on some of the best laptops for doing all this rendering, simulation, and modelling stuff.
As long as it has the best NVIDIA card available, it is a good laptop for blender.
CPU and RAM also have a big impact depending on what you're doing.
Go for the best components your budget allows.
16 - 32 gb of ram
Ryzen 3600 or above, or an Intel i7 (newest gen is good, but last gen is not much worse)
RTX 2080
and a decently sized HDD
*I don't know much about laptops, but if you can find one with these specs it will work great. You could also just invest in a render farm subscription. At work I have an old PC from around 2012, and it works decently for Blender; when I need to render I send it off to the render farm.*
People don't render on laptops! At least not those who are serious about it. Laptops are relatively slow, and would run hot for long periods of time if you rendered on them. Not good. Same goes for complex simulations.
The ideal rendering machine is one with many cpu cores and lots of ram, although exactly how much depends on how complex your scenes are and whether you are rendering a single frame or a sequence of frames for animation. Resolution also matters.
For a desktop pc, consider 8 cores/16 threads and 16 GB of ram as a starting point. Enthusiasts will be looking at 16 core/32 thread or 32 core/64 thread machines with 32 - 128 GB of ram. If you're really serious, then you'll probably build yourself a small render farm. However, if you're determined to manage with just a laptop, then using a cloud render service might be the best option.
Rendering on GPUs is also an option, but they are limited by the amount of graphics RAM they have. A big, complex scene with many light sources, materials and complex reflections may not fit into the GDDR available. A GPU cannot access system RAM, which is why studios use render farms comprising Xeon processors and huge amounts of RAM per machine. Very expensive, but effective.
However, most people seem to manage rendering on GPUs most of the time.
Right now, I would be more interested in the OpenGL performance of a gpu, as a guide to its EEVEE performance for fast render preview.
BlenderGuru in 4k! Scary!! Great tut Mr. Price!
How the hell YOUR company can sponsor YOU? :P
Great tutorial video BTW ;)
business expense loophole :)
He should rephrase that to "is made possible" or something different..
@@Slickstaff_Stainpants When you want to promote your own company so you sponsor yourself by it :P
@@SuperBialyWilk :) heh, i see no problem with it. Its not like its a secret.
There are different ways to advertise.
There are a few videos on YouTube that are real gems to me, and this is one of them. Thank you so much.
You got me stumped with your pronunciation of Nvidia. It's En-vidia, not Navidia.
It's Leviosa, not Laevioeesaaa
Yes, i agree
perfect timed haircut change while explaining what denoising is
Forget temporal consistency, how about Andrew consistency! 😂
About the standard denoiser: I found that the "blotchy patches" you speak of are usually more pronounced when using "randomize grain", which in fact should be turned off when using the standard denoiser.
I use the ignore denoiser all the time!
Most reliable.
FYI: D-NOISE doesn't work in Win7; it will freeze your Blender. But if you kill the process after you click Quick Denoise, it will work. A simple trick that worked for me.
Nuh-vidia? NUH-VIDIA?
Neh-Vidia honestly sounds so much cooler than En-Vidia. Also, great video. Neat Video is seriously magic and almost a required VFX tool; I'm glad you mentioned it!
Am I really the only one wondering about his pronunciation of Nvidia?
I mean, "nahvidia"? What?!?
Isn't it pronounced "N - vidia"?
Kinda like "envy" - dia.?
Other than that, interesting video.
I believe I saw a video mentioning that 2.81 will get an Intel made denoiser. Be fun to see another comparison then.
No, you're not the only one, and you're right, the correct pronunciation is "en-vidia"
You’re sponsored by poliigon?
*BRO YOU WORK FOR POLIIGON!*
"this video is sponsored by poliigon!"
but... don't you own Poliigon?
I mean. he already has a huge following. It's basically free advertisement. and the service offered is really good.
*THAT'S THE JOKE*
He saved us earlier with Filmic rendering and PBR materials, and now he shapes Blender again with a better denoiser! Did you notice Cycles improves very slowly, while other parts of Blender are moving very fast? Still waiting for so many things in Cycles: an exclude/include feature, a better interior rendering algorithm like V-Ray or Corona, etc...
nevidia? it's pronounced as en-vidia....dude, lol! Living under a rock, have you?
Welcome back Mr Price
*Tells me that the flickering will be very noticeable*
*Continues to put a giant arrow pointing at all the flickering*
*Completely ignored the part where YT compression was mentioned* 😜
Man I watched that animal motion presentation at the time but only started out in blender and watching your channel a few weeks ago! Nice how things reconnect.
*Laughs in intel HD graphics*
*speaks in deep japanese voice*
Fool, No software trick can overcome time
Maybe worth noting - Neat Video is GPU accelerated via both nVidia and AMD cards and renders extremely fast.
Red Giant's Denoiser is amazing as well, but I prefer to support the smaller Neat Video; they just do such a great job.
is it just me or is hitting a huge button in blender really satisfying?
Was waiting for this video!
Thanks for making our lives easier.
4:10 Andrew in 200 samples
4:11 Andrew in 1500 samples
Oh, I am very excited for your tutorial video to be uploaded. I can't wait... Please upload it ASAP!
Lol, when you said it doesn't work well with animations, my first thought was "well, then you should get in contact with some video codec people, because quite a few video codecs have temporal denoisers built into their routines."
But, you figured that one out already.
Now, may be a good time to get them involved in making a post processing plugin for blender that does just that step :)
I think the noise, especially in the shadows during animation, adds more realism. It's like a real camera: just record something in dark areas and you will notice the noise. I would say noise is not always bad, especially when it comes to realism. And you should know that realism is not perfection; it's the opposite: IT'S IMPERFECTION. Awesome video.
Just in case somebody isn't aware of it, YouTube has a 2x speed function, meaning you'll have to spend just half as much time watching any video. It's especially useful for videos where people speak very slowly.
I prefer to remove noise in post-production: you export your render passes with the noise and apply denoising pass by pass, which keeps more information. BUT this is very cool, thanks for the information :D
Might not be the first to mention this, but the two denoisers that I would like to see added to this comparison would be Topaz Labs Denoise AI (for stills), and Red Giant Magic Bullet Denoiser III (for animations).
Hey Andrew, I don't know if you already knew this, but for emissive objects (I mean solid objects, not Blender lamps) there is a node called Light Falloff that can be connected to the strength of the light. It prevents the light strength from reaching extreme values near the source (a common cause of noise in Blender), and it reduces noise a lot. If you didn't know about it, I suggest you try it; it changed my life 😅. Let me know if this was helpful.
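For reference, a minimal sketch of that setup in Python (the material name is a made-up placeholder; node identifiers are Blender's built-ins):

```python
import bpy

# Emissive material whose strength runs through a Light Falloff node,
# as described above. "Emitter" is just a placeholder material name.
mat = bpy.data.materials.new("Emitter")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links

emission = nodes.new("ShaderNodeEmission")
falloff = nodes.new("ShaderNodeLightFalloff")
falloff.inputs["Strength"].default_value = 100.0
falloff.inputs["Smooth"].default_value = 1.0  # tames intensity near the surface

# Quadratic is the physically based falloff; routing it into the emission
# strength keeps values from blowing up close to the emitter (fireflies).
links.new(falloff.outputs["Quadratic"], emission.inputs["Strength"])

output = nodes.get("Material Output") or nodes.new("ShaderNodeOutputMaterial")
links.new(emission.outputs["Emission"], output.inputs["Surface"])
```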
Musics crazy good, crazy smoooooth 🇯🇲🇯🇲🇯🇲
First of all, congrats on the sponsorship. I haven't been keeping the tightest track on your channel, so for all I know you could have had one for months, but good to see you moving up the chain all the same.
So I downloaded D-NOISE a few months ago, but I never used it because my friend kept telling me, "just check it and go. You don't need anything else." Needless to say, it didn't work, and I've since learned that if there is a download button in the user preferences, it's usually a good idea to click it. But like you mentioned, it does smudge out edges, which is something I never liked. But better than nothing.
So what sample range do you generally work at? Obviously each project is different, but as a general rule of thumb? Everyone I've talked to insists on a different number, and that their number is the universal standard (ranging from 30 samples to 1,000+). Currently I have 256 as my default, but I'm not sure if I should bump it up to 500 again, or would that be diminishing returns? I like to go more photorealistic, but I'm definitely not near the level of the kitchen in this video. :P
Uhh... Poliigon is Andrew's company. He is founder and CEO. So when he says "sponsored by...", he is sponsoring himself. Nice tax write-off though.
@@kotronics_arts Wait, wha? So what is he just handing himself money in a mirror?
@@BrownFoxWarrior Yep ...pure genius ... good business ...
Man, now that is awesome.
Blender Guru: modeling artist and mathematician
Awesome stuff Andrew!
Nice! I immediately thought of the trial of Neat's photo denoiser I've used; I didn't know they had one for video as well. Also had no idea how good it really is as far as denoisers go.
Now, I do all my animations inside blender 2.81 with the new amazing Denoiser! It's a huge update!
I remember watching this a while back. Such a thorough review of the denoisers. Thanks so much again Andrew. So 3 years later, what do you all think is considered the better denoisers on the market (or even photo enhancers)...especially with all the ongoing A.I technology that is out? There are so many A.I. apps on mobile too. Are there any updated videos out regarding A.i. enhancing topic such as this (not the A.I. art generating topics *yuck*). Would love to find a comparison between Blender, Optix, Topazlab, Intel, etc. 3 years later.
I predict in the future a lot of our workflow and modeling will probably be just setting landmark objects and giving each object metadata tags for an AI renderer. The 3D interface will most likely only be needed to assist in setting a temporospatial tag for each object and separating them via color for our eyes.
The AI renderer will have a generalized architecture, probably generative adversarial, and have a separate word-association layer for us humans to interface with its settings. Datapacks will be downloadable that have been trained on specific images that can set color grading and textural effects. Multiple datapacks will probably use image masking to give only specific objects neurostyle textures and materials.
Your videos are so helpful! You got me started and I still learn so much from you. Thanks!
The V-Ray denoiser works well for animation, as it takes into account the previous and next frames. It is slower, though. Also, I'm not sure how the V-Ray implementation in Blender works, but you can denoise renders with the standalone tool. You only need to generate some passes; if you generate the denoise passes, it could even work with renders from other renderers. It's quite fast on GPU - 4 to 10 seconds maximum at 1080p. We'll see how it develops.
I think Neat video is amazing solution for now - it can also deflicker if I'm not mistaken. And we'll see soon the Intel's denoiser so that would bring more advancement too.
If you don't know already, in Blender 2.8 32x32 is now the fastest tile size to use with the GPU in Cycles, although the stock denoiser is still faster with larger tile sizes.
Might have messed up time conversions - diff is ~12x not ~31.
Looking forward to the November release's denoiser review.