Interactive Particles with Optical Flow and ParticlesGPU 2022 - TouchDesigner Tutorial 104
- Published Dec 18, 2024
- Get access to 200+ hours of TouchDesigner video training, a private Facebook group where Elburz and Matthew Ragan answer all your questions, and twice-monthly group coaching/mastermind calls here: iihq.tv/Trial
If you’re a TouchDesigner Beginner, check out our TouchDesigner Tutorial Series! We teach you all the basics of how to use TouchDesigner, including mouse controls, hotkeys, parameters, the operator families, and more: interactiveimm...
The feature many developers and artists have been waiting for: optical flow interactivity with particlesGPU 2022! It's finally here, and it gives you the ability to use standard webcams to interact with the GPU-accelerated particle systems created by particlesGPU. This gives you the power of the new particlesGPU with the easy-to-set-up optical flow interactivity of the old one.
I've been playing around with this as well. Instead of lining up the camera in the viewer, you can go into the system, find the camera Base COMP with its parameters, and just hit Reset Camera — it lines up perfectly with the camera, and you can adjust the pivot distance to your liking. It works a lot better, IMO.
thanks !!
Thanks! I got this video recommended because I want to make immersive theatre! Thank you for this tutorial
Our pleasure! Thanks for watching :)
Thanks a lot, very helpful. I keep coming back each time I work with particlesGPU.
Our pleasure, we’re glad to hear it! :)
oh man, you saved my life. I just ran into hardware limitations on my art installation and I don't have time to fix the hardware problems. Thank you so much!
Glad to hear it! Hope you were able to get it sorted out :)
What an amazing tutorial — easy to understand (and I'm a complete newbie to TouchDesigner), and it covered many points you'd see in FAQs 👏
We’re very glad to hear it - thanks for the feedback! 😀 Cheers!
The pleasure is mine 🙏
How do you invert the particles so that they stick to and follow your arm/body movements?
I have so many great ideas now. Thank you.
Glad to hear it! Thanks for watching :)
This is wonderful, thank you! Is there a more precise or methodical way to align the optical flow with the particle grid yet?
Have you tried using the Optical Flow Size Remap parameter on the Forces page? Setting this to the same aspect ratio as the Optical Flow COMP's output can help with alignment. See Markus' comment from the forum here: forum.derivative.ca/t/particlesgpu-with-optical-flow-overlay-is-wrong/191921/2
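For anyone who'd rather script the fix, here's a minimal sketch in Python. The operator paths and the internal parameter names are assumptions — hover over the Optical Flow Size Remap parameter on the Forces page to find the real ones:

    # Minimal sketch: match the Optical Flow Size Remap to the flow texture's aspect.
    # Operator paths and parameter names below are assumptions; verify in your network.
    flowOut = op('opticalFlow1/out1')        # output TOP of the Optical Flow COMP (assumed path)
    particles = op('particlesGpu1')          # particlesGPU COMP (assumed name)

    aspect = flowOut.width / flowOut.height  # aspect ratio of the flow texture
    particles.par.Opticalflowsizeremapx = aspect   # assumed parameter names on the Forces page
    particles.par.Opticalflowsizeremapy = 1.0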
@TheInteractiveImmersiveHQ Thank you — yes, I have corrected the aspect ratio, but there is still some transforming required to align the optical flow and particlesGPU in the Composite TOP to visualize the interaction of the two. If I wanted to create an interactive floor, for example, seeing these two things overlaid visually the way they are processed would be very helpful for a quick and precise calibration, as opposed to nudging and checking. As he says at 11:34, "There is not an easy or automatic way to make these two line up," and since the two are working on a coordinate plane, I wonder if that will ever be implemented or available.
Super inspiring and wonderfully explained for a newbie.
Love to hear that! Thanks for watching :)
Grateful for this helpful tutorial! It wasn't working for some time at first, but then I figured out that the bottom input port of the latest version of particlesGPU is no longer for Optical Flow but for something called Effector. Would you be able to tell me when to use Optical Flow with webcams vs. a depth sensor like the Kinect? In other words, for which use cases will the Kinect be the best and only feasible option?
Since the Kinect provides a user index texture, which shows the detected user's silhouette, it'd be a great choice for use with the Optical Flow COMP/particlesGPU. You could still get similar results with a webcam depending on the environment/processing, but the Kinect would bypass the experimentation required to get it right. If you have an Nvidia card, the Nvidia Background TOP can provide similar results, but the benefit of the Kinect is that it doesn't add any extra performance tax to the computer to provide those results like the Nvidia API does.
@@TheInteractiveImmersiveHQ wonderful 🥰
Thank you
Our pleasure! Thanks for watching :)
Hi! Thank you for the great tutorial!! However, I have questions about setting up the NDI In TOP the way you do at the beginning of the video — should I watch anything to catch up? Thank you!
Our pleasure! Elburz was using the Nvidia Broadcast software on his camera feed to remove the background from the image, and then routing that video signal into TouchDesigner via NDI. It's not a necessary step to create this effect, so you don't have to worry about adding it to your network. You can use the Video Device In TOP instead. Hope that helps!
Hello, I'm writing to ask you about datamoshing. I saw that you have 3 articles on the Interactive & Immersive blog — are you going to continue explaining how to achieve the effect? You mentioned something about a continuation in a fourth part, but I can't find it yet.
Thank you very much for your excellent content.
Greetings
Yes, this is actually on our list for future content! Stay tuned :)
Fantastic! I want to ask how to change the render view in 3D, like changing the "display bounds" in the particlesGPU operator. Since the output is a TOP, it can't be controlled by a camera. Thanks!
Great question! If you click on the "Viewer Active" button in the bottom right of the particlesGPU operator, you can click + drag on the viewer to modify the view of the render. If that doesn't offer enough customization, you can also head inside the particlesGPU operator and modify the Camera COMP directly.
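If you'd rather do the same from a script, here's a rough sketch. The internal path to the camera is an assumption — open the COMP to confirm what it's actually called:

    # Rough sketch: adjust the internal camera of particlesGPU from a script.
    # The path 'particlesGpu1/cam1' is an assumption; check inside your copy of the COMP.
    cam = op('particlesGpu1/cam1')
    cam.par.tz = 8     # pull the camera back
    cam.par.rx = -15   # tilt it down slightly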
YES FINALLY!
It would be awesome if we could simply send the aspect ratio to some input as a reference, or have a toggle switch that automatically detects the output resolution from the particle flow, so that the bounding box aspect ratio matches the input. And when that toggle is turned on, it could switch the camera to a front orthographic view and position the particles in a normalized fashion, where (0,0) is at the bottom left and (1,1) is at the top right, just like UV coordinates. Then we could easily do more creative processing by combining the optical flow with other stuff.
One of the great things about particlesGPU is that it's built with TouchDesigner operators under the hood, meaning that all of these features could be implemented 🤩
Thanks again for this great tutorial! I've been a fan since I encountered your channel ;) Is it possible to make multiple looks for the particles? Could it rain not only bananas, but also apples and oranges, all together?
Yes! To do this, you can use a technique called Instance Texturing. Check out this documentation page for more info: docs.derivative.ca/Geometry_COMP#Instance_Texturing
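As a rough sketch of what that setup looks like in Python — the parameter names on the Instance pages are assumptions here, so confirm them against the docs page above:

    # Rough sketch: enable instancing on a Geometry COMP with multiple textures.
    # Parameter names (instanceop, instancetexs, ...) are assumptions; verify in the docs.
    geo = op('geo1')
    geo.par.instancing = True
    geo.par.instanceop = 'positions'               # CHOP/SOP/DAT supplying instance transforms
    geo.par.instancetexs = 'banana apple orange'   # space-separated TOP paths, one per look
    # A per-instance index channel (e.g. 0, 1, or 2) then picks which texture each
    # particle gets, so bananas, apples, and oranges can all rain together.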
Amazing. Can these be audio reactive?
anything can be audio reactive in touchdesigner
Yes, definitely! You could trigger numerous different effects with the audio signal, whether changes in color, scale, or positioning of the particles
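As a minimal sketch, you could smooth the audio into a single level channel and reference it from a parameter expression. The CHOP name 'audio_level' and the target parameter here are hypothetical:

    # Minimal sketch: drive a particlesGPU parameter from an audio level.
    # 'audio_level' is a hypothetical CHOP chain (e.g. Audio Device In -> Analyze -> Lag)
    # whose single channel holds a smoothed loudness value; the parameter name is assumed.
    op('particlesGpu1').par.Pointscale.expr = "1 + op('audio_level')[0] * 4"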
How do we mask it out so that the subject is in front of the particles?
To place the webcam feed over the particlesGPU output, the order in which they're connected to the Composite TOP needs to be reversed. Connect the webcam to the comp1 TOP first and then connect the particlesGPU output. Hope that helps!
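If you'd rather make the swap in a script, this sketch does the reordering (operator names are assumptions; match them to your network):

    # Sketch: put the webcam feed over the particles by reordering the Composite TOP inputs.
    comp = op('comp1')
    op('videodevin1').outputConnectors[0].connect(comp.inputConnectors[0])    # webcam on top
    op('particlesGpu1').outputConnectors[0].connect(comp.inputConnectors[1])  # particles underneath
    comp.par.operand = 'over'   # Over composites the first input over the second (assumed menu token)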
OK, basically my last question was: why do you have the NDI In TOP with the camera on there? The same thing doesn't happen for me when I have the NDI In TOP :) thank you!
See response to other comment :)
Hi, for the first part, how do you get the camera working with the NDI In TOP? When I put down an NDI In TOP, nothing appears for me 😢😅
You can actually use the Video Device In TOP instead! :) Elburz was using the Nvidia Broadcast software on his camera feed to remove the background from the image, and then routing that video signal into TouchDesigner via NDI. It’s not a necessary step to create this effect, so you don’t have to worry about adding the NDI In TOP to your network. Hope that helps!
Hi teacher, what a great job! I really like your videos!
Would you be able to create a tutorial on luminous particles like Disney's? 👍👏
We're glad to hear it! Thanks for the suggestion, I've sent it to the team to review as a potential topic for a future video 🙂
Is it possible to randomize the Birth Rate over time, so that, say, it starts at 1, jumps to 5-15 every 30-40 seconds, then goes back to 1? Is this expression built in?
You could control this parameter with a CHOP channel. I'd suggest looking into something like the Timer CHOP, which could allow you to change the parameter over a specific period of time. To add randomness, you could use a Noise CHOP and a Hold CHOP, to create a sample + hold functionality that outputs a random value each time the system resets to 1.
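Here's a minimal sketch of that sample-and-hold network built in Python. These are all standard CHOPs, but the menu tokens and timing values are assumptions to adjust to taste:

    # Minimal sketch: sample-and-hold a random value for the Birth Rate.
    # Network: noise1 (random signal) + lfo1 (periodic pulse) -> hold1.
    n = parent().create(noiseCHOP, 'noise1')   # continuous random signal
    l = parent().create(lfoCHOP, 'lfo1')       # periodic trigger
    l.par.type = 'pulse'                       # assumed menu token for a pulse wave
    l.par.frequency = 1 / 35                   # one pulse roughly every 35 seconds
    h = parent().create(holdCHOP, 'hold1')
    n.outputConnectors[0].connect(h.inputConnectors[0])  # value to sample
    l.outputConnectors[0].connect(h.inputConnectors[1])  # trigger
    # On the Birth Rate parameter, an expression like this maps the held
    # 0-1 value into the 5-15 range:
    #   5 + op('hold1')[0] * 10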
Hi! I was having an issue: after I drag and drop the particlesGPU, the particles don't show up at all — only the bounding box shows up. Is there any way to fix that? Thank you!
First thing to check is that you're running the latest version of TouchDesigner. At the moment, the latest stable release is 2022.33910 and can be downloaded at: derivative.ca/download. ParticlesGPU has seen a few updates over the last few years and this could be a bug if you're not running the most recent version. Start there and let us know how you fare!
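You can confirm which build you're running straight from the Textport:

    # Check the running TouchDesigner version from the Textport.
    # app.version and app.build are built-in App members; between them
    # you'll see the full release number, e.g. 2022.33910.
    print(app.version, app.build)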
Hey, I'm running version 2022.33600 on a mid-2015 MBP (Intel Iris Pro, i7 quad-core) and still having the issue that particles don't show up after I put the particlesGPU node on the screen.
Do you have any suggestions?
This sounds like it could be a bug specific to your hardware, I'd recommend posting a bug report on the TouchDesigner forum: forum.derivative.ca/. They'll be able to provide more insight into your specific situation. Hope that helps!
Hi sir, I need some help. I need to do something like this, but with 5 projectors and 5 Kinect sensors. Can you help me understand how to connect 5 projectors and 5 Kinect sensors to this?
That question is a bit outside the scope of this video; the best thing would be to join the TouchDesigner community or The HQ PRO and get support for your specific technical question there: interactiveimmersive.io/lp/hq-pro-full-trial/
How do you set up NDI on a mac? It keeps showing 'No sources found yet.' Any advice?
Hi Janet! Since you're using a Mac, you can actually skip working with NDI for this effect and use the Video Device In TOP instead to grab an image from your webcam.
Elburz was using NDI so that he could use the output of Nvidia Broadcast, which can be used to cut out the background of your webcam feed. Unfortunately, Broadcast is Windows-only and requires an Nvidia graphics card.
That said, the nice thing about optical flow is that it's based on movement, so you don't actually need to have a transparent background in your webcam feed for this effect to work! Using the Video Device In TOP instead will work just fine.
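If it helps, here's a one-liner sketch of that swap (operator names are assumptions):

    # Sketch: use a Video Device In TOP instead of NDI for the webcam feed.
    # The Optical Flow COMP name and its input are assumptions.
    cam = parent().create(videodeviceinTOP, 'videodevin1')  # grabs the default webcam
    cam.outputConnectors[0].connect(op('opticalFlow1').inputConnectors[0])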
So collision with a static SOP is out of the question, I suppose — with this system, at least.
Yes, unfortunately - that said, if you're looking for that kind of functionality definitely check out the Bullet Solver COMP or Nvidia Flex!
i love u
I get lines instead of leaves
Double-check that you've updated to the latest version of TouchDesigner; particlesGPU has been updated and should start out with the leaf material when you add it to the network.