I've watched tons of sound design tutorials and I think you are particularly thorough here. Thank you and keep doin' what you're doin'!
hidden gem channel
Yes
High quality stuff, really wish I could incorporate more stuff like this as a music producer... I see other sound designers usually work with recordings, so how would you use this patch for a real-world use case? Would you use MIDI to make a performance synced to an animation/video clip? Or would you record yourself improvising with the patch and then manipulate the recordings afterwards?
@KriAsb great question! I’m primarily a game sound designer, so my use cases might be different from music production.
Short answer: I’d use MIDI to perform as close to the game event as I can get, then polish it up by manipulating the audio files.
Long ramble: 98% of the time I’d choose to work with recordings from the patch rather than dialling it in using MIDI, because it’s much faster and more accurate to sync to the game events.
I could use MIDI to “perform” and sync, but I’d wanna configure the MIDI mapping so it’s not just sending chromatic pitch information. If my client uses Unreal Engine and has the time and budget, I’d consider building patches directly in the engine using MetaSounds. It can be a fun/effective way of generating sounds without loading in big WAV files. But again, this takes a lot more time, and not all game events will benefit from having generative patches.
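(To make “not just sending chromatic pitch” concrete, here’s a minimal sketch, assuming Python with the mido library; the port names and the CC-1 mapping are hypothetical illustrations, not something from the video. The idea is to remap note input so the keyboard performs a patch parameter instead of a melody.)

    import mido

    IN_PORT = "Keyboard"   # hypothetical controller port name
    OUT_PORT = "PatchIn"   # hypothetical virtual port the synth patch listens on

    # Instead of letting note numbers drive chromatic pitch, forward each
    # note's velocity as a mod-wheel value (CC 1) to perform a patch parameter.
    with mido.open_input(IN_PORT) as inp, mido.open_output(OUT_PORT) as out:
        for msg in inp:
            if msg.type == "note_on" and msg.velocity > 0:
                out.send(mido.Message("control_change", control=1, value=msg.velocity))

(Most DAWs can do the same thing with a MIDI transform/remap; the point is that the controller drives timbre, not pitch.)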
I see these patches as similar to mixing colours on a palette 🎨 rather than a finished painting. So they’re more for source creation than the actual design. Let me know if this makes sense!
@jamieleesounds Yeah that makes sense, thanks for the reply!