Really great video, but I have some corrections (not that it's really your fault, I'm just a programmer):
1. OptiX is a ray tracing API, not a GPU compute API; CUDA is the GPU compute API, and AMD's equivalent to it is OpenCL.
2. GPUs are not "less exact" at computing; they are simply highly specialized for 3D computations such as geometric calculations, the rotation and calculation of vertices, programmable shaders, and much more.
3. The difference between a CPU and a GPU is that a CPU is a general-purpose processor: it has a wide range of instructions that it can execute fast, a jack of all trades in a sense. It is also far more sequential, because it has fewer cores, typically 8 to 12.
In contrast, a GPU has literally thousands of cores and computes many more tasks at exactly the same time, in parallel.
Thanks for all this info! I stand corrected; I have an arts background, so a lot of the fine details about programming are way above my pay grade. Thanks a lot for this very useful information.
@@Nexttut Yeah, no worries, and not trying to be that rude guy, just things I've noticed. Great video though
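For anyone who wants to try the CPU vs GPU comparison themselves, here is a rough Python sketch of how the Cycles compute backend can be switched from Blender's scripting console (a minimal sketch, assuming Blender 3.x with an NVIDIA card; property names may differ slightly between versions):

import bpy

# The Cycles add-on preferences hold the GPU backend choice (NONE, CUDA, OPTIX, HIP, ...)
prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "OPTIX"   # OptiX needs an RTX card; use "CUDA" on older GTX cards
prefs.get_devices()                   # refresh the detected device list

# Enable every detected device so they can all contribute to the render
for device in prefs.devices:
    device.use = True

# Tell the active scene to render on the GPU instead of the CPU
bpy.context.scene.cycles.device = "GPU"   # set to "CPU" for a CPU-only render

Switching just that last scene property between "CPU" and "GPU", with everything else identical, is the fairest way to compare the two.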
11:07 I love the way you used that analogy to explain the difference between the GPU and CPU cores. Very good teacher
Thank you!
Another big factor is memory, which is needed for big scenes. System memory is much larger, cheaper, and upgradable; GPU memory is fixed and much more limited. Some GPU renderers have out-of-core memory algorithms that constantly shift data in and out of the GPU to compensate for this limitation, but at a performance cost.
I just tried using a 12900K with 128 GB of DDR4 instead of a 3080 Ti with only 12 GB, and it almost felt like cheating XD
@@emptism So for you the CPU is better and faster than the GPU? Thanks!
@@ricbattaglia6976 The CPU does seem to give better results, but at the cost of speed (a 5-minute GPU render can take up to 45 minutes on the CPU). It might be better now with the new CPU path guiding in Blender 3.4.
@@emptism So the same file on Blender with a 4090 GPU, versus Corona with a 7959X CPU, is much faster with RTX and just a little finer with the multicore CPU? Thanks!
I think you had the denoiser on, which affects the output. A better comparison would have been with the denoiser off, so you can see the raw rendered images. The denoiser always gives weird blurry artefacts if there aren't enough samples, which can be acceptable for stills, but for animation it just looks terrible.
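Agreed; for a fair test the denoiser should be off so the raw samples are what you compare. If anyone wants to script it, this is roughly how it looks in Blender's Python API (a sketch assuming Cycles in Blender 3.x):

import bpy

scene = bpy.context.scene

# Turn off denoising so the raw render samples are compared directly
scene.cycles.use_denoising = False

# If you do want denoising, pick the denoiser explicitly instead of relying on the default
# scene.cycles.denoiser = "OPENIMAGEDENOISE"   # or "OPTIX" on RTX cards

# Keep the sample count identical for the CPU and GPU runs
scene.cycles.samples = 512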
Have you also done a comparison with moving images?
Is it still the case with laptops that cpu will give better results than gpu?
So if you let the CPU do all the render samples, will it eventually get the detail that the GPU had, while adding the better-calculated light and all that?
How do you set up the temperature of the GPU?
Why didn't you compare the price of a GPU with a CPU of equal performance?
A similar thing is seen with video rendering. GPUs render it faster, but the results are worse quality than the CPU version.
It turns out that GPUs are only *TRULY* useful for 3D, and that's where they should be preferred and used!
CUDA is slower than OptiX, and you should save as PNG, not a lossy format like JPG, when comparing image quality.
Not always. OptiX is only better for ray tracing, hence for RTX users; for GTX cards, CUDA is better.
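On the PNG point: lossless output can be forced from Python too, so JPEG compression doesn't hide the differences (a quick sketch; the output path is just an example):

import bpy

render = bpy.context.scene.render

# Save lossless PNG instead of JPEG so compression artifacts don't skew the comparison
render.image_settings.file_format = "PNG"
render.image_settings.color_depth = "16"      # 16-bit channels keep more detail than 8-bit
render.filepath = "//comparison_render.png"   # path relative to the .blend file (example only)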
Thanks, very interesting! For 3ds Max, which is the best GPU or CPU configuration? Thanks a lot.
Is this just Octane, or would it be the same with, say, Arnold?
Question: do you have a course or tutorial for the concepts and models you used for this video in Blender?
@hjf4a2 got it, thanks
Correct, these are not my models, they are from the Blender site. However, we do have some Blender courses that cover techniques to create similar assets :) Cheers!
@@AbeLeal3D For interior design, do you prefer Blender on CPU or GPU for quality and speed? Or Corona (for Blender or for Cinema 4D)? An AMD Threadripper with a 4070 Ti for viewport rendering, or an Intel 12600 and a 4090? Thanks!
@@ricbattaglia6976 Hey Ric! So I don't actually use Corona. However, I try to render as many things on the GPU as possible. The setup that you mention would be excellent as well!
@@AbeLeal3D So it's better to use the GPU for rendering? But Corona uses the CPU, or not? Thanks
"CPU is precise and slow and GPU is brute and fast" is an old statement.
All the render engines nowdays favors the GPU and multi GPU systems.
Need to say, that even on GPU rendering, the CPU still do a lot of work. But if you let you CPU to the job alone, better go on vacation, don't need to wait 1 week for your 8K render...
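True; and Cycles can even let the CPU help the GPUs. Here is a small sketch of enabling hybrid multi-device rendering (same assumptions as the other snippets, Blender 3.x Python; your device names will obviously differ):

import bpy

prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "CUDA"   # the CUDA backend lists the CPU alongside NVIDIA GPUs
prefs.get_devices()

# Enable every GPU *and* the CPU so they all work on the frame together
for device in prefs.devices:
    device.use = device.type in {"CUDA", "CPU"}
    print(device.name, device.type, "enabled" if device.use else "disabled")

bpy.context.scene.cycles.device = "GPU"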
That was very instructive, thanks
You're welcome! ☺
As always, awesome content!
Thanks again!
Can you make a CPU vs GPU particle comparison in UE4?
This is why Hollywood uses render layers: so you can render 8K textures and 4096 samples with all the refraction materials without the GPU's predictions painting all of that at once. That's mostly the reason why your render gets those flickery bubbles; it's not precisely accurate.
Ugh, many YouTube 3D artists still tell us to render with samples as high as possible, but nobody told me there are scene layers where I could just give each linked scene a different sample count.
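If it helps, Blender can override the sample count per view layer, which is roughly what that layered workflow relies on. A hedged sketch (the layer names here are made up; 0 means "use the scene setting"):

import bpy

scene = bpy.context.scene
scene.cycles.samples = 1024   # scene-wide default

# Per-view-layer overrides: a heavy refraction layer gets more samples,
# a simple background layer gets far fewer ("Glass" / "Background" are example names)
scene.view_layers["Glass"].samples = 4096
scene.view_layers["Background"].samples = 128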
But hey, Nvidia just figured out how to render 3 samples and still get crazy good detail, like you mention the CPU does.
Intel is making the Cycles X OIDN denoiser as fast as the OptiX denoiser without losing detail.
Nice information , can you give more details about it ?
Thanks again for this video... but be back soon please bro
Always welcome 👍
Hi, can you do a rigging and animation course for beginners in Blender, same as the Maya course?
A well-made, informative video
Thanks!
@@Nexttut A laptop with a 13th gen i9 without a dedicated GPU vs a 12th gen i5 with a dedicated 50 W RTX 3050: which is good for 4K video editing, 3D modelling, and light gaming?
thank you! very interesting information!
Our pleasure!
Thanks!
No problem!
thank you ~~
You're welcome 😊
Hey Abraham, you really should groom your beard, bro. You'll look fantastic if you trim and groom your beard (not remove it), just groom it. In most of the videos I see, you look messy and you're sweating; you're working hard to deliver such amazing free content to us. Don't forget about yourself, bro!
Haha thanks bro! Yeah, you are totally right, I am always quite lazy with my beard, but I will take your advice, thanks!
gg