I will collect this series of tutorials and carefully read each chapter. Thank you very much for your selfless dedication. These are the knowledge I want to know and need to master. You are great.
Thank you. There's more to come, and I plan to combine the entire series into one full-length video to keep everything together after all the sections are posted.
This is a fair comparison with Phoenix in my opinion, thank you! Just wanted to drop in on the left-handed/right-handed coordinate systems - Max, Maya, and AFAIK Houdini are right-handed, while Cinema 4D is left-handed + Y-up. Cheers!
Thank you. Yes, it would be nice if everyone had just agreed to one standard but alas
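For reference, the practical difference between those conventions, when both systems are Y-up, comes down to a flipped Z axis (plus reversed winding order for meshes). A minimal Python sketch; the function name is my own, not from any of these packages:

```python
# Minimal sketch: converting a point between right-handed and left-handed Y-up systems.
# Flipping Z converts right-handed Y-up (Maya) to left-handed Y-up (Cinema 4D) and back;
# for meshes you would also reverse the triangle winding so normals stay correct.

def flip_handedness_y_up(point):
    """Convert (x, y, z) between right-handed and left-handed Y-up conventions."""
    x, y, z = point
    return (x, y, -z)

# Example: a point one unit "into the screen" in Maya (right-handed, camera looks down -Z).
maya_point = (0.0, 1.0, -1.0)
c4d_point = flip_handedness_y_up(maya_point)
print(c4d_point)  # (0.0, 1.0, 1.0)
```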
Thanks man, I don't know why this only has 1.5k views
I'm just not that popular lol
This is gold bro
thanks for sharing
Thank you!
Thank you 🙏
You're welcome!
Could you please tell me why you chose to bring the VDB into the scene instead of rendering the effect in EmberGen and compositing it? I have a smoke simulation in an animation I've done and am trying to figure out the best workflow. The smoke simulation has several things colliding with it as well as being occluded behind a pot that it's emitting from. My biggest concern with bringing it into Maya is that it won't look the same as it looks in EmberGen with the particles!
Thanks in advance
The dust needs to interact with the environment and the ship, which is more difficult to fake. Also, the ship occludes the dust at the back. It's possible to render from EmberGen and then comp back, but it's not what I wanted to do for this, especially since the point was really to show how to use render layers. I agree that it can be difficult to get the same look outside of EmberGen, though.
@@CGStirk Thanks for the response!
Do you know if there is a way to bring your animated objects and camera into EmberGen and render an EXR sequence that occludes your object in the correct place? Almost like a render mask? In your example it would occlude the smoke behind the ship. Hope that makes sense!
@@sarulaplays6861 You can bring in a mesh in the newer versions of it. I would like to think you can make it a hold-out object, but I'm not sure.
@@sarulaplays6861 I am so sorry. I thought I had replied to this! So, yes you can bring in a model, at least in the more recent releases, and I would assume you can make it a hold-out object but I've not tried it!
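For anyone following the hold-out idea, here is a minimal NumPy sketch of the standard premultiplied "over" composite it relies on. The assumption is that the smoke layer was rendered with the ship as a hold-out, so its alpha is already punched out where the ship occludes it; the array names and values are just placeholders.

```python
import numpy as np

# Minimal sketch of a premultiplied "over" composite (fg over bg).
# If the smoke was rendered with the ship as a hold-out, its alpha is already
# cut out where the ship occludes it, so a plain over lines everything up.

def over(fg_rgb, fg_a, bg_rgb):
    """Premultiplied over: result = fg + bg * (1 - fg_alpha)."""
    return fg_rgb + bg_rgb * (1.0 - fg_a[..., None])

# Placeholder 4x4 images: smoke EXR (premultiplied RGB + alpha) over the beauty render.
smoke_rgb = np.full((4, 4, 3), 0.2)   # premultiplied smoke colour
smoke_a = np.full((4, 4), 0.5)        # smoke alpha, zero wherever it is held out
beauty_rgb = np.full((4, 4, 3), 0.8)  # ship / environment render

comp = over(smoke_rgb, smoke_a, beauty_rgb)
print(comp[0, 0])  # [0.6 0.6 0.6]
```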
Did you render inside EmberGen or export it as a VDB or something?
Exported VDBs
You are back!!!!! Would you do a tutorial where you render it in EmberGen and comp in After Effects!!! Still loving VDBs!!
Thanks! Possibly. I think you can import cameras now, but I've only ever exported the VDBs.
I think Maya starts at 1 because of the way traditional animation works. Zero is considered a frame, like 1, 2 and so on and so forth.
If you set an animation from zero to 500, you will end up with 501 frames. Whereas from 1 to 500 you will get 500 frames.
It must have something to do with the fact that Maya was originally created in collaboration with Disney (which asked Alias|Wavefront to develop more artist-friendly animation software compared to Alias Power Animator).
I remember switching from Power Animator to Maya back then and being really (pleasantly) surprised by how much easier and more straightforward it was to animate and apply all the principles in Maya. That was in 1998.
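The frame-count arithmetic above is easy to verify: an inclusive range contains end - start + 1 frames, so 0 to 500 gives 501 and 1 to 500 gives 500. A quick Python check:

```python
# Inclusive frame ranges contain (end - start + 1) frames,
# so starting at 0 gives you one extra frame compared to starting at 1.

def frame_count(start, end):
    return end - start + 1

print(frame_count(0, 500))  # 501
print(frame_count(1, 500))  # 500
```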
Ah interesting if that's the reason. I usually set it to start at 0 to match After Effects, and the programmer side of me wants it to start on 0 too. I can't remember what other DCCs use.
@@CGStirk
I usually do it the other way around and set the After Effects timeline to start from 1 ahaha 😂
Although I much prefer using Fusion for compositing. Node-based systems suit me better in general.
Hi, I have a question about importing the VDBs from EmberGen into Maya. I created an explosion in EmberGen, and when I try to render it in Maya the explosion is too bright, like the fire is so bright that almost all of the area becomes white. If I switch the render to the alpha I can see all the smoke, so I know that the explosion is there, but I don't know how to actually see the fire and everything instead of just seeing a big white shape.
What renderer are you using? Do you have manual exposure on your camera?
In EmberGen I have two colours in the fire, but in Maya it shows only one colour? And how do I render a multi-colour simulation?
You probably need to remap the colors in Maya. That doesn't transfer seamlessly between EmberGen and other renderers.
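If you prefer to poke at the remap from script rather than the UI, here is a heavily hedged Maya Python sketch. The node name and the "fireMult" attribute are placeholders (the real names differ between V-Ray Volume Grid, Arnold's volume shader, and so on), so list what your node actually exposes first:

```python
import maya.cmds as cmds

# Sketch only: the node name and attribute names below are placeholders.
# List what your volume node actually exposes, then remap the fire/temperature
# range so the emission isn't blown out.

volume_node = "myVolumeGridShape"  # placeholder: whatever node loads your VDB

# Print the fire/temperature-related attributes that exist on this node.
for attr in cmds.listAttr(volume_node) or []:
    if "fire" in attr.lower() or "temp" in attr.lower():
        print(attr)

# Hypothetical example: dial down an intensity multiplier if such an attribute exists.
if cmds.attributeQuery("fireMult", node=volume_node, exists=True):
    cmds.setAttr(volume_node + ".fireMult", 0.1)
```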
I can't believe this software that I've just stumbled across. I was about to build a render/simulation node on a budget and buy a budget NAS, but this software solves the issue of having my workstation locked out while simulating, and it's rent-to-own pricing; it feels like Xmas. One thing I was struggling to find the funds for was tyFlow Pro; the free version is amazing but so slow due to being locked to one CPU core with no GPU acceleration. Can EmberGen do a lot of what tyFlow does? And also, I was going to dual-card my GPU for V-Ray; would EmberGen use the second card too (not SLI), just doubling the CUDA cores? Great video man, thank you
Tyflow is definitely more robust than EmberGen. Don't get me wrong, EmberGen is great, but it is pretty limiting once you need complex interactions and large-scale sims. Glad you found it, though, and I hope it can do what you want!
@@CGStirk Yeah, I became a little excited too quickly after watching this, but it's a good start, and hopefully with LiquiGen in the works it's a huge saving in processing and trial-and-error time
Yeah, the instant response to know how the settings affect the sim is the best part.
@@CGStirk Absolutely, I've fallen back in love with CGI after a looong break, but I'm finding my impatience waiting for sims to calculate unbearable 🤣... One needs a sim/render node ASAP
@@R1PPA-C I've been working on a big sim on and off for a couple of years. It maxes out my RAM and unfortunately EmberGen is not great for large-scale sims.
I have an issue with importing a second VDB into the scene. The first one is similar to yours and only has dust; there were no issues whatsoever. But I made another VDB that's thrusters. It doesn't have smoke, but temperature & fire, and when I import it into the scene everything becomes overexposed. I tried tampering with the settings and it doesn't even seem to load properly, only pixels, like an object. I'm not sure if Maya can't handle two VDBs or I'm not exporting from EmberGen properly. Any idea how to fix this?
I think it's the way the VDB is interpreted for temperature. I've found that you usually have to remap the temperature yourself which is annoying. It's not a limit on the number of imported VDBs.
@@CGStirk Oh I see, is it under the 3rd party channels mapping, voxel preview, or fire (color intensity tab)? Sorry for being so specific, I just can't seem to get a stable result :/
@@johnthunder962 There is that section too. If the channels are listed there, try that. Otherwise, go to Rendering > Fire and remap the values there.
@@CGStirk Thanks a lot!
Greetings Sir, could you please explain why you are not using Bifrost?
EmberGen is realtime and the VDBs are small. I also don't know how to do much in Bifrost 🙃 I'm sure you can make something good in Bifrost and with fewer limitations.
I'm surprised you don't know Bifrost, it isn't that bleeding edge @@CGStirk
@@mfrancisco_850 That was the old me, but I am now all-knowing
Hi, how do you render in PNG format from EmberGen?
On the Export Image node, set the Filepath, and in the file save dialog, change the type there. By default it's PNG. I only have v0.7.5.9, though, so it may have changed since then.
@@CGStirk Hi, I have the same version. But is there something to change in the render section? Like adding emissive, temperature, scattering, alpha, etc... I'm so confused about which modes to select for rendering a transparent PNG
@@seance0 In the Render node, check Alpha in Capture Types, and then connect Render Alpha to the A input of the Export: Image node.
May I ask what the specs of your computer are? After I import the effect, my computer takes a very long time and gets very laggy when displaying the V-Ray render.
TR 3960x CPU
256 GB RAM
RTX 3090 (at the time of recording)
Man, I was playing with the simulation size and accidentally typed 2256 instead of 256, and of course it froze, and trying to load the autosave freezes the entire PC as well
pain
ty for tutorial tho, really nice
Oh no. They really should have a warning for that. "Are you sure that's what you want?" Sorry that the autosave crashes too :(
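For a sense of scale: the voxel count grows with the cube of the resolution, so 2256 is not roughly 9x bigger than 256, it is close to 700x more voxels. A rough back-of-the-envelope sketch, assuming a dense grid at 4 bytes per voxel per channel (only a crude approximation of what EmberGen actually allocates):

```python
# Rough back-of-the-envelope: why a 2256 grid freezes a machine that handles 256 fine.
# Assumes a dense cubic grid at 4 bytes per voxel per channel, which is only a crude
# approximation of what EmberGen allocates internally.

def grid_memory_gb(resolution, channels=1, bytes_per_voxel=4):
    voxels = resolution ** 3
    return voxels * channels * bytes_per_voxel / 1024 ** 3

print(grid_memory_gb(256))    # ~0.06 GB per channel
print(grid_memory_gb(2256))   # ~43 GB per channel
print(2256 ** 3 // 256 ** 3)  # 684 -> roughly 684x the voxel count
```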
Comparing EmberGen to Phoenix FD makes no sense; they are meant to meet two completely different use cases. EmberGen is made to create volume simulations in "realtime" that you might export to an EXR sequence, flipbook, or VDBs. And if you're having overheating issues on a 3090 then your card is defective.
Even though EmberGen is intended for realtime use, the VDBs are not. There are plenty of use cases where EmberGen is a perfectly fine replacement for Phoenix FD, and Phoenix is used for rendering flipbooks too. It's just another tool. In regards to the overheating issues, FE cards don't have direct cooling of the VRAM, and EmberGen hit it hard. I only ever had issues with that card and EmberGen's earlier releases.
@@CGStirk Any realtime rendering engine that supports NanoVDB or newer versions of OpenVDB is capable of realtime playback of VDBs. UE5 and Nvidia Omniverse both have that ability, and I think Blender EEVEE does also. And your heatsink/fan not sitting on the GPU is caused by the thermal paste shrinking and is a manufacturer's defect. You can fix it by taking the heatsink off and reapplying new thermal paste. I have a 2060 Super FE and it takes EmberGen no problem. But a better way to render is to use an Azure GPU instance. You can rent 12 Nvidia A100s for an hour and it costs like $12. You could probably render Sintel th-cam.com/video/c2gB83g_HSc/w-d-xo.html in an hour with 12 A100s.