Awesome video! The variables for absolute alpha, beta, gamma, etc. only work when all 4 sensors are touching your head. One of the problems of working with EEG data in general is that you need to constantly normalize the data yourself, as sensors tend to disconnect when you move. My approach, specifically for the Muse headband, is to take muse/eeg1, muse/eeg2, muse/eeg3 and muse/eeg4 and normalize them so that if one of the sensors disconnects, your overall input signal is not affected. From this new signal you can then derive the different frequency bands (alpha, beta, delta, etc.) based on the literature. Hope it helps :))
Thanks so much for sharing these useful tips! Sounds like a great technique, as consistent connectivity is always a struggle with these sorts of things. Cheers :)
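For anyone who wants to try that approach, here's a rough sketch as a TouchDesigner Script CHOP callback. It's a minimal sketch, assuming an OSC In CHOP named 'oscin1' carrying Mind Monitor's raw channels 'muse/eeg1' through 'muse/eeg4'; the valid-range check is a tunable assumption, not a definitive threshold:

```python
# Script CHOP callback (TouchDesigner Python): average the raw Muse EEG
# channels, skipping any sensor that looks disconnected, so the combined
# signal stays stable when a sensor loses contact.
# Assumptions: an OSC In CHOP named 'oscin1' with channels named
# 'muse/eeg1'..'muse/eeg4'; the 0-1682 valid range is a rough guess
# based on Mind Monitor's raw scale and should be tuned per setup.

def onCook(scriptOp):
    scriptOp.clear()
    scriptOp.numSamples = 1
    src = op('oscin1')
    vals = []
    for i in range(1, 5):
        chan = src.chan('muse/eeg' + str(i))
        if chan is None:
            continue
        v = chan.eval()
        # A railed or flatlined sensor sits at the extremes of the range,
        # so treat those samples as "disconnected" and skip them.
        if 0.0 < v < 1682.0:
            vals.append(v)
    combined = scriptOp.appendChan('eeg_combined')
    combined.vals = [sum(vals) / len(vals)] if vals else [0.0]
    return
```

From that stabilized signal you can then move on to the band calculations the commenter describes.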
Cool! I collaborate with a neuroscientist who worked on the Muse app and is also an OSC wizard, so I'm excited to see you getting into this. We work with Muse and OpenBCI in Unity/Unreal, and of course TD. :D
Wow, that's really cool! Sounds like it must be a fun and interesting collab. We're definitely excited to see what kinds of cool things creative minds can come up with when given access to tools like this!
This is super helpful! I'm currently working on my graduation project which includes the Muse EEG headset and TouchDesigner, even though I've never worked with either before. So it's quite the challenge for me!
Glad to hear it was helpful! :) Sounds like a fun project! Sometimes pushing yourself to try out new tools can help jumpstart the creativity and lead to some amazing results. If you do end up running into any issues, the TouchDesigner forum (forum.derivative.ca/) and the TouchDesigner Help Group on Facebook (facebook.com/groups/touchdesignerhelp) are great resources for getting assistance, whether that's looking through related posts from the past or asking questions. There are a lot of very friendly and knowledgeable folks who are happy to help!
Great job, Crystal, I learned a lot, thanks! Looking forward to working with TouchDesigner on a new EEG collaboration in Lisbon with another artist, Ana Fonseca, and biocybernetic engineer Hugo Ferriera.
We’re glad to hear it! Sounds like an amazing project :)
Just now watching, thank u for the wonderful content
No problem, thanks for watching! :)
Is there a way to record the data coming in off of the Muse to allow me to work with it at a later date?
Yes! You can use a Record CHOP to capture CHOP data over a specific period of time and use it later. Check out the wiki: docs.derivative.ca/Record_CHOP
To see it in use, right click on the Record CHOP and click "OP Snippets"
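If you'd rather script it, here's a minimal sketch of toggling a Record CHOP from Python; the operator name and the Record menu values ('on'/'off') are assumptions, so check them against the parameter dialog in your build:

```python
# A minimal sketch, assuming a Record CHOP named 'record1' wired after
# the OSC In CHOP. Run from the Textport or a callback.
rec = op('record1')
rec.par.record = 'on'    # start capturing incoming samples (menu value assumed)
# ...capture for as long as you need, then:
rec.par.record = 'off'   # stop; the recorded data stays in the CHOP
```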
Thanks as always! BTW, do both the Muse 2 and Mind Monitor work with Android, or just with Apple?
Our pleasure! Yes, Mind Monitor is available for Android and supports Muse 2: mind-monitor.com/ Hope that helps :)
To clarify, the motion of the audience member would trigger a transition from one palette selection to another, e.g. DreamMagnet to LoversInJapan, etc.
See other message :)
Having a bit of an issue connecting via OSC/Mind Monitor. I've changed the IP to my computer's and tried a few different ports, but I just get a blank OSC In CHOP with no data. I'm running everything on the same wifi network. Any suggestions?
Are you running this on Windows or Mac? If Windows, the Windows Firewall can cause issues with connectivity/receiving data, and it's recommended that it be disabled: docs.derivative.ca/OSC_In_CHOP
@TheInteractiveImmersiveHQ How about on Mac? I'm having the same problem.
What happens if the Target IP can't be found in TouchDesigner? When I adjust it, it shows a yellow warning sign.
First, double check that you've set both the Local Address _and_ the Network Port to the values you see on your phone. If those are correct, your computer's firewall might be the problem. Windows computers are known to have connectivity issues due to Windows Firewall, so if you are running this on Windows it's recommended to disable Windows Firewall.
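You can also sanity-check those settings from the Textport. A quick sketch, assuming a default OSC In CHOP named 'oscin1'; the Python parameter names (especially 'localaddress') are assumptions to verify in the parameter dialog:

```python
# Inspect and set the OSC In CHOP's network settings from Python.
osc = op('oscin1')
print('active:', osc.par.active.eval())
print('port:', osc.par.port.eval())  # must match the port Mind Monitor shows
osc.par.port = 5000                  # hypothetical port number
osc.par.localaddress = ''            # which interface to listen on (name and behavior assumed)
```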
Great showcase/tutorial; thanks a lot!
Our pleasure, thanks for watching!
Curious if you or anyone has gotten this to work via direct connection to the computer? I found a python tool called "muselsl" which looks promising but doesn't output OSC.
I don't like the idea of having to go through the phone (additional point of failure/latency).
Unfortunately it looks like a lot of the desktop apps are meant for logging the information, rather than transmitting it to another program. We've found Mind Monitor + mobile device to be pretty reliable in production settings, although you have to make sure to keep the Muse and mobiles charged!
Hi, can I use the Muse 1 instead of the 2?
Yes! If you're using the app Mind Monitor to transmit OSC data from the Muse to TouchDesigner, it supports Muse headset models back to 2014: mind-monitor.com/FAQ.php
Nice Tutorial, I am thinking of getting the Muse EEG headset for my project as well. Is it actually also possible to record eeg data and implement that in TD instead of visualizing live?
Great question! Yes, you can actually use the Record CHOP to record the OSC channels over a set period of time, and then the File Out CHOP to save that information to a file. Hope that helps!
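As a rough sketch of that record-then-save workflow (the operator names and file path here are hypothetical; the File Out CHOP route works just as well):

```python
# Save the recorded channels to disk, then read them back later for
# offline playback. Assumes a Record CHOP named 'record1'.
op('record1').save('muse_session.chan')   # .chan is TD's text channel format

# Later, or in another project: play the file back in place of live OSC.
playback = parent().create(fileinCHOP, 'eeg_playback')
playback.par.file = 'muse_session.chan'
```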
Thanks Crystal. Always love your sessions. I've been playing around with a differently formatted output for a wide projection surface (5760x1080). I'd like to make the color palette interactable, with several Kinects set up to detect an audience member's motion as they walk past a large wrap-around projection in the hall. To achieve this, I'm wondering how you might handle it? That is, at which part of the network is it best to add the motion detection select variable: at the top network level, or within the COLORS container where the original Mitzner color module sits?
Unfortunately, the Colour Lovers Picker is set up in such a way that it's intended to be used with the mouse/via the panel specifically, and it would take a not insignificant (though definitely not impossible) amount of work to be able to control it/select new palettes via CHOP channel or something similar. As it is, there are no CHOP-controllable parameters available within the .TOX; it's all based on mouse interaction with containers.
A possible route would be to have custom parameters available at the top level of the TOX, which you could control from outside of the TOX with the Kinect channels, that could allow you to select different palettes and crossfade between them. Hope that helps!
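As a starting point, here's a sketch of adding those top-level custom parameters in Python; the component name 'colour_lovers', the parameter names, and the Kinect channel reference are all hypothetical:

```python
# Add a custom parameter page to the picker component so outside CHOPs
# (e.g. Kinect-derived motion) can drive palette selection and blending.
picker = op('colour_lovers')               # hypothetical component name
page = picker.appendCustomPage('Control')
page.appendInt('Paletteindex', label='Palette Index')
page.appendFloat('Blend', label='Palette Blend')

# Then export or bind a Kinect motion channel to the Blend parameter,
# or reference it in the parameter's expression field, e.g.:
#   op('kinect_motion')['tx']
```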
Where do we find the color picker tool?
Great question! You can download it here: derivative.ca/community-post/asset/colour-lover-palette-picker/62697
The link is about halfway down the page
I have the Muse S, will this work with the S too?
Yes! As long as you can get the Muse sensor to send OSC into TouchDesigner, you'll be able to map the channels it sends to create this effect. For sending the OSC information to TouchDesigner, we've had success with the Mind Monitor app that Crystal uses in the video. Hope that helps!
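Once the data is flowing, a Select CHOP makes the mapping straightforward. A tiny sketch, assuming an OSC In CHOP named 'oscin1'; the wildcard patterns are assumptions based on Mind Monitor's typical channel names:

```python
# Pull just the band-power channels out of the incoming OSC stream.
sel = parent().create(selectCHOP, 'select_bands')
sel.par.chop = 'oscin1'
sel.par.channames = '*alpha* *beta* *theta* *delta* *gamma*'
```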
Thanks for sharing this! Have you tested how the sensor reacts to different tasks? Is it controllable to some degree, or plain chaotic?
Our pleasure! We haven't had a chance to play with the data doing different tasks yet, but have seen some projects that use it to see how the brain reacts to watching different visuals or hearing different tracks of music with pretty good results. We'll keep you posted as we experiment more!
Thanks for the great share! BTW, could you leave the link to the color picker tutorial? Many thanks.
Our pleasure! Here's the link: derivative.ca/community-post/asset/colour-lover-palette-picker/62697
Wow, TouchDesigner is so versatile
It really is! The flexibility of the environment and workflow allows it to be used for a lot of different purposes
@TheInteractiveImmersiveHQ Yeah, it's amazing how you can input anything and then output anything with it
Dope.
Thanks!