Let's be realistic: AI is going to kick everyone's ass. All that "high-skill" talk, blah blah. AI can learn in an hour what takes a lifetime; the mastering guy will just prompt it, and sooner or later AI will have the best taste, or at worst offer a few options, like it already does. It's only going to get better and faster. We've got to move on.
I want to know how it works on EDM and how loud in LUFS this plugin can push a track without any distortion, because -14 LUFS is not realistic. Every top EDM track sits around -4 to -6 LUFS.
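Claims like this are easy to check offline. A minimal sketch, assuming the pyloudnorm and soundfile Python packages and a placeholder file name, that reads a rendered master and reports its integrated loudness and sample peak:

```python
# Measure integrated loudness (ITU-R BS.1770, the basis of LUFS) and sample
# peak of a rendered master. "master.wav" is a placeholder file name.
import numpy as np
import soundfile as sf
import pyloudnorm as pyln

data, rate = sf.read("master.wav")            # float samples, mono or stereo
meter = pyln.Meter(rate)                      # BS.1770 loudness meter
lufs = meter.integrated_loudness(data)
peak_dbfs = 20 * np.log10(np.max(np.abs(data)))
print(f"Integrated loudness: {lufs:.1f} LUFS")
print(f"Sample peak: {peak_dbfs:.2f} dBFS (not oversampled true peak)")
```

That makes it straightforward to see whether a plugin's output actually lands near -14 LUFS or in the -4 to -6 LUFS club range.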
@schlawpyJ You can't learn creativity. You can copy workflows as much as you want, but then you're only working with "presets". Creativity is not programmable; it's natural to the individual and emotion-fuelled.
We got rid of the drummer, the bassist, keyboardist, in-tune vocalist, the real console and audio gear. Mastering engineer: you didn’t think they would get rid of the engineers, too?
I tried it and it's pretty impressive, but I'd still rather do the work myself because I like the process of making a song sound how I want it to sound. It is definitely useful as a reference tool, though: if I'm not sure whether my master is good as it is, I can put it through this little baby and listen for anything I still need to tweak. But I wouldn't buy it just for that reason, so for me it's still not something I need, if I'm being honest. Nevertheless, it is pretty incredible what this plugin can do without any settings made by the user.
I very much dislike how the saxophone sounded in the AI mastering - like it was coming from a... toilet. That's besides the distortion, which I clearly heard even with cheap earbuds on the compressed YT quality. With all the automatic plugins (Neutron, Ozone, etc.) I have a similar problem: the result is almost always distorted. I have to disable the compressors or back off the gain by 6-12 dB.
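For reference, backing the input off by a fixed number of dB before an automatic plugin is just a linear gain change; a small sketch, assuming the soundfile package and placeholder file names:

```python
# Trim the input by N dB before feeding it to an auto-mastering plugin:
# a gain of -N dB is a multiplication by 10**(-N/20).
import soundfile as sf

trim_db = -9.0                                 # somewhere in the 6-12 dB range
data, rate = sf.read("mix.wav")
sf.write("mix_trimmed.wav", data * 10 ** (trim_db / 20), rate)
print(f"{trim_db} dB trim = linear gain {10 ** (trim_db / 20):.3f}")  # ~0.355
```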
I learnt to make banana cake, and after a few times I slightly changed the recipe based on taste and got it super fluffy - a really nice cake, actually. I was rather proud of myself, and gave some to my friends who popped over to record a song. They said it was better than one from the supermarket, and I agreed. I didn't actually realise I had a secret passion for baking; I might try making other things. It's a pretty fun process, actually - it reminded me of making music, adding all the ingredients and kinda jamming out in the kitchen. The moral of this story, if there even is one: it's fun making things and doing it manually, to taste. Mastering is a bit like that, actually. (P.S. Sonny from I, Robot likes your jazz!)
I would say mixing is a more accurate metaphor. Mastering is, and always was, a final preparation for the intended media - not just putting a final sheen on, but ensuring everything is as smooth and clear to the audience as possible. That's why AI can handle it fairly objectively. Mixing? That's a whole different, creative thing, and AI will only succeed at it by strictly referencing the music that came before it. It will further homogenize music when trying to be creative, eventually pushing true creatives to escape the matrix.
@@scohills I feel ya. I think I meant more that mastering, like mixing, is a human thing, and tbh I think there will always be people who love making things because it's really fun and cool to get it right. AI will end up being good at the things AI does. Honestly, I just had a great time making cake and realised most creative things have that enjoyable process and result - why let AI do that? Lol
Hehe!! I actually asked you to check this out a few years ago! It does an interesting job very quickly. 😀 I found it usually needed a little more processing either before or after, depending on the song. I also found that I preferred working into it and making small tweaks in the mix to really polish off the sound. 👍🏻 So basically this was a microscope to surgically polish off the few last pieces in the mix. It really depended on the song, though. Cool vid... you can get much deeper with it if you use it the way it was not intended - like a limiter on the master. 😀👍🏻
I tried it but... DEFINITELY not close to what Ozone can do. After three different tries on different tracks with both Ozone and this one... not for me. Keep on trusting your ears and your human work!
You are fine for the time being. While this AI might be acceptable, it's incredibly limited in what it can do compared to the AI tools in the art sector like Stable Diffusion, where you can instruct the AI exactly what you want and make refinements through prompts - which is basically a dark art in itself. I think in the future you'll still have a job as a mastering engineer if you get ahead of the curve by using AI to help you master, becoming a promptsmith who excels at giving the AI the perfect prompts to get the sound you and your clients are aiming for. Most importantly of all, you already have the ears to tell a good master from a bad one, something the AI will likely never have, because you cannot code subjectivity, and what is a good master on one track may not be a good master on another. That's where the refinement comes in and where your ears are just as important, and I'm not sure AI can compute the subtle nuances between masters that an actual engineer can.
A.I. will become a very serious problem in the future. People will become too lazy to do things themselves, and years later they won't know how to do those things anymore if the electronics fail. It's the same as how most people can't tune their own guitar by ear anymore because they use a tuner all the time. I still use a tuning fork for the A and tune the other strings by ear; that way you keep yourself in training instead of getting lazy. Btw, I tested the plugin to know what it does, otherwise I couldn't comment on it. I have to say: it sucks, because it made my mixes way too harsh, without any emotional feeling. I'd rather keep more air in the mix, and this plugin doesn't.
Meh, I tried the demo with an unmastered track against its mastered version, and this plugin lost big time! The funny thing is that the AI version was not as loud as the human-mastered version - you would think the plugin would crank up the loudness. But it was modern club music, and there's no way this plugin has a big enough parameter count to cover all styles. Be sure, though, that the next plugins are going to be better.
I bought this one about a year ago but ended up not using it. There hasn't been any update since then either, which doesn't speak well for its future, as it is not perfect by any means. In the "one more song" chapter I can clearly hear a loss of transients - especially the snare is crushed compared to your master - and the bass boom it added is disturbing. With no parameters to adjust anything, this should be a set-and-forget experience, but it's not, so at least to me it is unusable. Your master is way better and definitely not boring.
With automated technologies, it feels like people are willing to cut costs if the results are "good enough." Even if something isn't better, people will still use it if it's significantly more convenient and less expensive.
Lots of good comments here. Thanks for the demo. The AI mastering seems more "sensational" and in-your-face, and the original masters are more polite / less offensive.
I tried the demo. I think for a lot of people it will be more than good enough, especially for the price, which is crazy, because I remember trying this kind of stuff before and feeling like just putting on a limiter would have been better. A good engineer can still beat it, but I've gotten way worse masters from real people lol
Just tried it because I was really curious, and I ran a song I had mastered for a band through it. In conclusion, I can't say it was bad, but it was far from what I did myself. I mainly use an analog chain for mastering, and I can say for sure that the sound lost life and color and was quiet compared to mine. I know that at this point everyone can do everything without a sound engineer, but I have a taste in sound that is really personal, so I will keep doing it myself. I don't like the idea of a McDonald's approach to sound where everything tastes the same.
I think you have no worries. The fact that there was distortion on the bass in the jazz example is completely unacceptable, and the pop example was very harsh relative to the original mix (but one person's harsh is another's aggressive). If you master and the end results don't show those characteristics, your method of making a living is still safe!
Run a file through and save the result. Then drop the file by 3 dB and run that through. Compare the results: they are different. The AI will produce a different master depending on the level of the pre-master. That does not seem correct, or very intelligent.
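A minimal sketch of that test, assuming the pyloudnorm and soundfile packages; the plugin renders themselves still have to be bounced from the DAW, and all file names are placeholders:

```python
# Level-consistency test: export the same mix at unity and at -3 dB, run both
# through the plugin in your DAW, then compare the two processed renders.
# A level-independent "master" should come out essentially identical.
import numpy as np
import soundfile as sf
import pyloudnorm as pyln

mix, rate = sf.read("mix.wav")
sf.write("mix_minus3.wav", mix * 10 ** (-3 / 20), rate)   # -3 dB copy

a, _ = sf.read("plugin_out_from_mix.wav")                 # plugin render of mix.wav
b, _ = sf.read("plugin_out_from_minus3.wav")              # plugin render of the -3 dB copy
meter = pyln.Meter(rate)
print("Loudness A:", round(meter.integrated_loudness(a), 2), "LUFS")
print("Loudness B:", round(meter.integrated_loudness(b), 2), "LUFS")
n = min(len(a), len(b))
print("Max sample difference:", float(np.max(np.abs(a[:n] - b[:n]))))
```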
Not really fair because you are already giving it something that has passed through your expert hands. Give it a balanced mix that's clearly not a master. Something from a less experienced engineer.
Agree with using it as an assist, and also for training: if it does a better job, find out what it did and learn from it. Also, a lot of us musicians want to stay in the vibe and not worry about the technical side of the whole process - that can destroy the entire art, the entire human part. But as an assist it can get us to that next level of standard until we can find, afford, or collab with a real engineer. AI isn't taking engineer jobs; it's assisting a diamond in the rough to be heard, eventually maybe inspiring a third-party investor to fund a human engineer ❤ Great video
Well, it will not only take over from mastering engineers, it will take over studios, instruments, mixing consoles, expensive mics... this will simply end in real-time orchestration. You hum a song or a melody and it will be instantly orchestrated, mixed, etc., and ready to upload - I think within 4 years. So grab your guitar and become a real troubadour again; the future is live.
"A.I. mastering seems to be a thing for years already. And every time I test it I get mixed results..." I see what you did there.
what did he do
what?
"mixed results" haha :D
😂he got bars
Nice one, i see what YOU did there 😉👌🏼
I will always work with a mastering engineer - a second pair of ears is irreplaceable, in mixing but also in questions of art. I have more confidence in a track when I can discuss it with someone else who is seriously working on it.
Yea, no you won't.
You will still be able to do that, but your mastering engineer will just use AI as a tool.
Call me old fashioned, but I really miss the days of collaborating with other humans and going on a journey together, in order to enrich our lives by doing the things that bring us joy, while simultaneously celebrating the quirks, imperfections, and flaws that make each of us unique. Excuse me now, while I suck on some Werther’s Originals.
you can still do that :)
@@bit1856 that is certainly true, correct answer here
Life is a journey you should go on with other humans, to discover and explore. Nice point, and it's absolutely true!
The first time I saw that robot, he was working for ESET NOD32 antivirus a few years ago... Who would have thought that today he would become a prominent audio engineer!
Considering that programs like Ozone and Lurssen do a great job without you needing deep mastering knowledge, it seems natural. Plus, people aren't as picky about audio quality as engineers and producers are - not as drawn to 'perfect' fidelity as the industry might think.
Before you retire, save a lot of money before your mastering business goes down. After that, go with the change of life: try skydiving, zip-lining, or touring around the world. Forget about music processing; all you have to do is listen to music while traveling. Smooth jazz, soul, etc. Build a palace, etc.
It does 'a thing'.
But I'm a mastering engineer and I still think there are things that just can't be noticed by programs.
I have a 20-point checklist that I make sure every mix I get goes through before it gets mastered. I often send tracks back to the client with some suggested changes they might want to make before I go ahead with the final master. I check for clicks, pops, distortion, clips, export errors... all kinds of things.
Also, being a mastering engineer helped all my mixes because I know what happens at the final stage. If nobody is working backwards from the finished product, then they won't know how to mix. And nobody should rely on mastering to fix a bad mix.
@@thank_you_thank_you I checked the program code and it checks a lot... that's why you have to play the full song. It does more than Ozone, for example.
you are scared of losing your job
@@osamabinladen2075 Not really. Humans will always be in demand for authentic, artisanal analogue work or creative tasks. AI is incredible and only getting better, but can it feel like a human and give a mix the emotional touches, not just a balanced result?
Companies will say they have "AI" in things that are not really AI. Ozone has had a mastering application for many years. AI has become a catchphrase to sell products.
7:00 Do you realize you have just come up with a new piece of software by putting this idea out there? Could be a great time for you to develop one, before someone else does. As you said, you can't stop it, but at least you can get on that train.
Reducing the output gain to stop the distortion works for me. I've only used this on an old album of mine, and it improved a 1920 Stieff upright piano in a bar on a live recording pulled from a half-track master tape. I'll probably stick with Ozone for everything more recent.
This plugin is crazy good. Best of all, it doesn't overprocess or add artifacts to your original track.
I did some testing with this one. I would love to be able to make some changes myself if I want, but I must admit this one is really good. I definitely had better results with it than with the other solutions available on the market. Nice one - let's see how it evolves over time.
Version 1.5 just dropped and it's pretty amazing! I just tested it on a few unmastered songs and it really brought them to life. It sounds a bit less bright than the previous version - a smoother top end, which is always a positive in my book.
Hi Wytse,
Cool that you covered this plugin. The great thing about 1.5 AI Master is that it sounds good everywhere - in the car, on the phone, on headphones, on Onkyo gear and other hi-fi equipment, and on today's playback devices. It is completely different from iZotope or T-RackS. I'm not saying those aren't good, just different.
The minimal differences are for a trained ear, but I notice that the regular listener doesn't hear them at all. What you should not do is make AI Master angry (red) - then the mix is too loud.
Of course, as always, it depends on how the mix sounds without a master. A bad mix will never be turned into a wonder by a master. Yes, if you could tweak the plugin, then... But I don't think it's made for mastering engineers.
In my opinion, Guy has designed a great plugin that many musicians can really use. It's great for a demo anyway. 😉 Jack.
Back in the dark age [late 70s and 80s], the "mastering engineer" was the person who cut the master disc that the vinyl records were made from. Since this required a lot of science, alchemy, outright voodoo, and highly specialized equipment that was VERY expensive, it was a very different job. These daze, it's mostly about making sure that a mix translates to all of the delivery media for the consumer. I see no reason this job can't be done by a computer.
Technology marches on, we no longer record on wire, or tape for that matter. Oh well, we've had a pretty good run...
In most UK studios it was the third job up the ladder - after tea boy and tape op came lathe op.
😂 when the dead-faced robot started dancing to the quirky jazz whilst staring at the viewer😂
I've only had 2 albums professionally mastered. The engineer did far more than hit all the industry standards: he clearly heard the intentions of the music and brought it all home, exceeding all my expectations, and of course he bounced approaches off me. A.I. gonna do that?
No, A.I certainly cannot do that unless it becomes sentient, then maybe.
There is a possibility that advanced AI can do that. Perhaps. Hopefully not quite yet, haha. Even if that happens, I still think a dedicated mastering engineer will give any AI decent competition. Yes, our tasks are certainly challenged. But as long as we love music, really listen, and really communicate, then my vote still goes to mastering engineers, be it myself or someone else with good ears and musical souls. Do AIs Dream of Electric Sheep?
In a previous post here he compared some online mastering services and an engineer. At one point in the song the engineer discovered a subtle drop in energy and sonic space - very slight, but it affected the impact of the listen negatively - and fixed it. A.I.?
I've had clients in the past describe what they were looking for and when they finally got it, it wasn't really what I thought they were asking for. Examples:
[what they asked] Can you make the snare sound fatter?
[what they really wanted] Less high hat.
[what they asked] I think my voice sounds kind of thin.
[what they really wanted] More reverb.
[what they asked] Don't you hear that distortion on the bass?
[what they really wanted] Less distortion on the guitar.
So I don't think even AI could sort those things out. If we let AI start doing all this stuff the art is gone from music. Tragically, we may be on that pathetic path.
It's one of those things that GPT doesn't do yet: asking counter-questions to clarify and zoom in on the exact issue... it just assumes.
Yeah, communication is so important. Often I find clients have trouble putting their thoughts into words.
I use it all the time for demos; it's 2 minutes and job done!
I feel like half of these "AI" apps and programs are actually just recording data from users and essentially combining all that aggregated data into an "assistant". There is, and was, no AI - just a data collector summarizing user inputs.
I kinda want to throw Dan's "I won the loudness wars" at it to see/hear what it does to it xD
It'll go all Skynet and nuke us all. 😂
That track is sooo dangerous!
I think you summed it up best with, "This gives me a little bit of a glimpse into the future." There is a basic proof of concept in this software, but it's extremely limited without the ability to receive prompts. Until AI matures, it will keep proving that mastering engineers are still very much needed.
So, another 2-3 years..
I still haven't heard an automatic AI mastering service make any master sound amazing.
The more people use AI tools, the more my work will stand out. So, go for it!
Just wait 5 years...
I'm a bit worried we could reach a point where hi-fi systems and streaming services compete with their own mastering,
kinda like what they do with EQ curves and loudness nowadays, but on a much more complex scale.
I can totally imagine this "AI mastering" being done on the fly - no need to buy it as a product or an additional step, just throw it right into the listener's "personalized FX chain" (room corrections, headphone corrections, spatial augmented-reality stuff, etc.).
It's less than a month away before this happens.
@@Right_in2 It's been over a year since the EBU suggested embedding loudness metadata in all audio content, so streaming services could properly play normalized audio based on the target device's specs, and it's still not a thing yet :)
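For what it's worth, the playback side of that idea is tiny once the loudness value travels with the file; a hypothetical sketch, assuming the soundfile package and made-up metadata values and file names:

```python
# Metadata-driven playback normalization: if the file carries its integrated
# loudness, the player only needs a static gain to hit the device target.
# The stored value, target, and file names are hypothetical.
import soundfile as sf

stored_lufs = -9.5        # integrated loudness carried as metadata (assumed)
target_lufs = -14.0       # e.g. a streaming / mobile playback target

gain_db = target_lufs - stored_lufs            # -4.5 dB in this example
data, rate = sf.read("track.wav")
sf.write("track_normalized.wav", data * 10 ** (gain_db / 20), rate)
```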
Definitely needs some ability to tweak; how about if you want a hotter master or a more dynamic one?
I tried it; it feels a lot like a mastering preset in a suite like T-RackS. I compared it to my own master and it killed the punch of all the drums and made everything sound flat, like basic limiter-and-EQ work. The way I did it myself was by feeding the material into a clipper/limiter combo; fine-tuning the clipper allowed my drums to keep their punch. Maybe this is "ok" for people who just want a quick demo, or can't get any better themselves, or for some styles that only need a straightforward limiter and EQ job.
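For anyone curious, here is a rough sketch of that clipper-into-limiter idea, assuming numpy and soundfile; the drive and ceiling values are taste, not a recipe, and the hard clip at the end only stands in for a real limiter:

```python
# A soft clipper shaves the sharpest transient peaks first, so the limiter
# that follows works less and the drums keep more of their punch.
import numpy as np
import soundfile as sf

def soft_clip(x, drive_db=3.0):
    """tanh soft clipper; more drive pushes the signal harder into the curve."""
    g = 10 ** (drive_db / 20)
    return np.tanh(x * g) / np.tanh(g)         # keeps +/-1.0 in, +/-1.0 out

mix, rate = sf.read("mix.wav")                 # placeholder file name
clipped = soft_clip(mix, drive_db=3.0)
ceiling = 10 ** (-1.0 / 20)                    # -1 dBFS ceiling
out = np.clip(clipped, -ceiling, ceiling)      # crude stand-in for a limiter
sf.write("mix_clipped.wav", out, rate)
```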
Alright, let's start with the usual suspects: no autogain, which would normally draw serious complaints at this point - not worth dwelling on here, fair enough. Far more serious: you can only turn this plugin on or off, nothing in between. If anything ever screamed "call me snake oil", this is it. Not even delay compensation works properly. I tested the demo version and this is what I found: it makes the signal louder, compresses it, and adds saturation in the higher frequencies. That's it. I intentionally bent an already-mastered mix with extreme EQ, and nothing was done about the deliberately "bad" EQ, apart from the above. Honestly, not really trustworthy.
This is why REAPER is so awesome: it puts a dry/wet mix knob in the top-right corner of every plugin interface you open.
Legit the biggest snake oil plugin I’ve ever seen, and your comment nails it.
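Observations like the ones above ("it just gets louder, compresses, and saturates the highs") can be backed up with a simple null test; a sketch assuming numpy, soundfile, and pyloudnorm, with sample-aligned renders bounced from the same session and placeholder file names:

```python
# Null test: loudness-match the bypassed and processed renders, subtract them,
# and look at where the residual energy sits in the spectrum.
import numpy as np
import soundfile as sf
import pyloudnorm as pyln

dry, rate = sf.read("render_bypassed.wav")
wet, _ = sf.read("render_processed.wav")
n = min(len(dry), len(wet))
dry, wet = dry[:n], wet[:n]

meter = pyln.Meter(rate)
match_db = meter.integrated_loudness(dry) - meter.integrated_loudness(wet)
wet = wet * 10 ** (match_db / 20)              # loudness-match before subtracting

residual = wet - dry
mono = residual.mean(axis=1) if residual.ndim > 1 else residual
spectrum = np.abs(np.fft.rfft(mono))
freqs = np.fft.rfftfreq(len(mono), 1 / rate)
hf_share = spectrum[freqs > 5000].sum() / spectrum.sum()
print(f"Share of residual energy above 5 kHz: {hf_share:.1%}")
```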
Every year I hear "AI will replace XXXXXX". So far, AI can really only replace people doing work that isn't tied to artistic aesthetics. If the goal is "make it fit the industry standard", AI is quite helpful.
What song is that at the end?
I asked GPT-4 for you ;) What will be the probable development of AI audio mastering in the future?
Enhanced Audio Quality: AI algorithms will continue to improve in their ability to analyze and understand audio content. This could lead to advancements in audio restoration, noise reduction, and overall sound quality enhancement. AI models might be able to identify and correct audio artifacts, imperfections, or inconsistencies with greater accuracy and precision.
Personalized Audio Mastering: AI systems may become more adept at tailoring audio mastering to individual preferences. By analyzing listening habits, user feedback, and physiological data, AI could automatically adjust the mastering process to optimize the sound experience for each listener. This could result in highly personalized audio output that adapts to individual preferences and playback devices.
Real-Time Audio Mastering: Currently, audio mastering is typically done after the recording process. In the future, AI could be integrated into live audio processing systems, enabling real-time mastering and audio optimization during live performances or streaming sessions. This could provide artists and engineers with immediate feedback and control over the sound, enhancing the overall live audio experience.
Collaborative AI Audio Mastering: AI models might be designed to work collaboratively with human audio engineers, serving as intelligent assistants. These AI assistants could offer suggestions, automate repetitive tasks, and provide creative insights, enabling engineers to focus on more complex and artistic aspects of the audio mastering process. This collaboration could lead to greater efficiency and creativity in audio production.
Improved Genre-Specific Mastering: AI algorithms can be trained on large amounts of audio data from different genres. This allows them to develop genre-specific mastering techniques that understand the unique characteristics and requirements of each musical style. Future advancements may lead to AI models that excel in mastering specific genres, resulting in more authentic and tailored audio production.
Ethical Considerations: As AI audio mastering continues to evolve, ethical considerations will become increasingly important. Questions around copyright, intellectual property, and the potential for AI-generated music will need to be addressed. Additionally, ensuring transparency and accountability in AI algorithms to avoid bias or unintended consequences will be crucial for the responsible development of AI audio mastering systems.
I see a huge market for plugins that can handle live performances, adjusting to changing sounds/songs in realtime.
Imagine you could train your plugin by feeding/prompting it with the style you want to hear. The same plugin could create completely different mastering results in different hands. And you could make a living by letting others use your well-trained puppy.
It is what it is, as you said. People did not stop the Industrial Revolution, so we won't stop AI. We need to adapt to the future. In my opinion there is still room for a real mastering engineer, and even more room for a real mixing engineer.
All I hear is a dynamic mid-side EQ set up to expand certain frequencies into the mid or the sides, like FabFilter Pro-Q 3. The pop track is kind of ok, apart from the weird RealPlayer-style phasing in the mids, but the jazz track was destroyed by it: the smoky club vibe was lost and replaced with a cheap, metallic 'modernizer' sound. I think you're safe for another little while yet.
Maybe you are right and AI will replace technicians and engineers. Chances are that, through its learning, AI will end up making everything sound like the same uniform sausage ("eenheidsworst" 🇳🇱). Artists will stay creative, though, so big opportunities are coming for new directions!
A lot of people have a basically poor understanding of the mastering domain. As Bob Katz once put it, "Mastering is not about processing; it can be about how not to process." The processing side of mastering is secondary to the nature of the discipline itself; there are a lot of considerations to take into account before you do anything. After a good conversation and a grasp of the context, you can start to think about how to get from point A to point B. Remember: a good mastering engineer not only knows how to process, but also knows when not to touch anything.
The thing with AI is that you don't need to understand things to create them, which can be a scary thought. People will still be making the decisions: how to train the AI, how to decide what we like and what we don't, what we ask the AI to do, etc. But the technical bits of a lot of jobs, AIs will be able to do.
Like you mention, the human part - the fresh pair of ears listening to a new piece of music, being able to give advice to an artist - is a lot more valuable than some fancy "mastering studio" with $200,000 speakers and golden cables.
@@elowine In order to make good use of AI tools you need a good grasp of audio fundamentals. AI can do all kinds of things, but if you are unable to evaluate the results, those tools are useless, because you, the human engineer, are the one who pulls the trigger, not the AI. Your customers and artists are the ones who decide on and approve the final master you hand off to them, not the AI. Mastering is more than processing and implementing AI tools; AI will speed up the process for sure, but you still need solid criteria.
I like the idea of AI as a help tool. What I fear, though, is AI making the human race lazy and not wanting to push the boundaries any further!
In the end it will just make a big jelly out of us to lubricate the joints of the AI machinery.
@@Vestu Terminator springs to mind.
Exactly
Actually, AI does push the boundaries... and on its own, too. Much faster and more efficiently.
@@yomamaa13 AI pushes the boundaries for AI though, not humans! That's my fear
Idk about engineers, but AI will definitely replace those "5 must-have plugins", "secret plugin to make your mixes killer", "omg when I tried this plugin my life changed" type of YouTubers.
As you said, communication with the mastering plug-in is what could make it of some use, but nonetheless, from what I hear, it will still sound artificial and plasticky.
Anyone who has installed and tried out one of the smaller GPT-4-like models locally knows how sluggishly it runs. Currently, language models are still too computationally intensive to offer as a function in such a plugin. However, the current progress in chatbot technology lies in shrinking the models as far as possible while still achieving similar quality. It is entirely possible that in the next few years, specialized models will quickly produce results even on medium-performance hardware. However, we are not there yet. We still need powerful server farms to ensure that there is not a cup of coffee between prompt and result.
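As an illustration of the direction this points at, running a small quantized model locally already fits in a few lines; a sketch assuming the llama-cpp-python package and a locally downloaded GGUF model, where the model path and prompt are placeholders:

```python
# Run a small quantized instruction model locally via the llama.cpp bindings.
# The model path and prompt are placeholders, not recommendations.
from llama_cpp import Llama

llm = Llama(model_path="models/small-instruct.Q4_K_M.gguf", n_ctx=2048)
out = llm("Suggest one mastering EQ move for a dull-sounding rock mix:",
          max_tokens=64)
print(out["choices"][0]["text"])
```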
AI mastering can only reproduce (at the moment) things it heard and analysed before. A human being is always able to "break the rules" and invent different approaches.
I really wish more people understood this. Things like ChatGPT just regurgitate what an army of people making $1 an hour tell it to do. The entire system is limited by the ability to train the network on existing "free" content (or just using copyrighted material and a rather tenuous legal loophole). If humans stop making new music, then there's nothing to train the AI with.
Mastering engineers rarely break the rules anymore.
@@craigadamjohnston8783 I am probably "rarely". At least sometimes :)
Did you try something that was already mastered, to see if it "notices" and pretty much leaves it alone? It seemed to add a smile EQ and chuck the material through a limiter.
Hi, I tested it out and it's really really bad.
I tried putting a LOT of high-end EQ before this plugin, and it didn't compensate for it at all. It seems like it just pushes a limiter very hard, up to around -10 LUFS, and doesn't do much about EQ. It also distorts a lot, and it doesn't seem to do much about stereo width either.
It worked nicely for me.
You are asking for real AI, but all we currently get is machine learning or deep learning. These two terms get mixed up over and over again - mostly just to scare people. IMO there is still no need to be worried. Real AI would mean the software has some kind of taste / consciousness, and I don't see that for the next decade(s?).
_"this I do not see for the next decade(s?)"_
If it's even possible... but yes, everything you say, agreed. So many people at the moment are throwing 'AI' around, but AI doesn't actually exist; it hasn't been invented. Remember the headline "scientists make massive breakthrough in efforts towards AI"? No, me neither, because it never happened. There's not even agreement within research as to whether AI is even possible. Like you say, what we do have is better machine learning, algorithmic efficiency, and much larger datasets... but no AI. At the moment it's simply this year's marketing term.
I would be careful with that. You can already create different personas in ChatGPT and have it answer as that persona. The same could be applied to any AI plugin: you describe what you like, and the AI translates your frequency preferences into a mastering approach. You save it as a preset, and suddenly every track sounds the same, or has a similar vibe, if you only ever use that one preset. But imagine you could have several presets for different song types, just like iZotope's Ozone has.
Stable Diffusion and Midjourney can already do this with images. For audio, it's just a matter of identifying the words and learning what needs to be boosted and by how much. It will be here faster than you think.
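A purely hypothetical sketch of that word-to-frequency idea: a tiny vocabulary mapped to per-band gain offsets, with a text prompt folded into a reusable "preset". The vocabulary, bands, and gains are invented for illustration and not taken from any product:

```python
BANDS = ("low", "low_mid", "high_mid", "high")

VOCAB = {                      # word -> per-band gain offsets in dB (invented)
    "warm":   (+1.5, +1.0,  0.0, -0.5),
    "punchy": (+2.0,  0.0, +1.0,  0.0),
    "bright": ( 0.0, -0.5, +1.0, +2.0),
    "smooth": ( 0.0,  0.0, -1.0, -1.5),
}

def prompt_to_preset(prompt: str) -> dict:
    """Sum the band offsets of every known word found in the prompt."""
    gains = [0.0] * len(BANDS)
    for word in prompt.lower().split():
        for i, g in enumerate(VOCAB.get(word.strip(",.!"), (0.0,) * len(BANDS))):
            gains[i] += g
    return dict(zip(BANDS, gains))

print(prompt_to_preset("warm but punchy, not too bright"))
# {'low': 3.5, 'low_mid': 0.5, 'high_mid': 2.0, 'high': 1.5}
```

Note that this naive matching ignores negation ("not too bright" still adds the bright move), which is exactly the kind of gap a real prompt-driven system would have to close.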
I thought that until recently too, until I discovered that AI is now teaching itself. It no longer needs us. We are on the brink of real AI being hundreds of thousands of times smarter than the smartest person on the planet. It will be able to consider things that we as human beings won't even be able to comprehend.
Hold on to your hat, everything is about to change.
@@Yuusou. this 100%
A human who can interpret your creativity will always be the better way to go. AI can be a great tool to assist with specific tasks, but not for truly creative purposes. I think this plugin is, at best, a smart EQ.
I did a test review of this a year (or so) ago and had to admit that it was actually pretty good. I then did a second take to confirm my first impression. The second round I was able to be more sciencey, and again it handled itself very well, especially compared to the usual contender websites. In both cases, though, I felt that the human (me) could do better, BUT the automated result was eminently usable (assuming a good mix). This is the ONLY "AI" thing I have ever had any time for; the others just leave me worse than cold. 🙂
And that was a year ago
@@AndyRoidEU Thanks for checking 😎
I can see AI helping bedroom producers like myself, but they definitely won't be able to replicate the individual ears, tastes, and choices of experienced mastering engineers. If anything, they could probably give some producers more appreciation for mastering in general, who might then be more tempted to chase after a real engineer in the future instead of just ignoring mastering or doing a quick loudness bump.
Edit: I'm not saying AI won't get really good, but different engineers have different styles and tastes. Unless the AI is trained off of each engineer, it won't be able to replicate them individually. But who knows, maybe someone will try that in the future - like an AI trained with the help of five iconic mastering engineers, whose styles you could then choose between. That would definitely beat a more generic one-size-fits-all solution like this one.
give it 5 years
The artist will be the one with taste and the result in mind that the AI will achieve. That's the reality. Anyone with great taste... including well-trained AI. Anyway, wouldn't it be time to get back into playing music? It's always about sound, technique and blablabla. Let AI be and let's get back to music; that's my way of not getting into a psychosis 😂. I mean please... a mainstream comeback to some real stuff is my wish.
Not yet, maybe, but eventually it will be better and definitely faster than us.
@@wayback1010 That's what they were saying about AI image generation 12 months ago. Now it's winning first prize in art exhibitions around the world.
Same thing people said about programmers and other jobs that are now being done by ai.
Honesty is the best policy...It will happen!
The AI sounds really great. Wow.
But if you send it a file that's already been mastered... how can you expect it to do a proper job? That's just wrong. You have to feed it the original MIX and compare the results between the professional master and the plugin on separate tracks. Or did you just goof up the terminology here? ;)
It was fed the MIX and I compared it to the approved master
What I noticed is that all it does is make it so you can turn your Focusrite all the way up without distortion on the audio, in a nutshell. Am I the only one who noticed that, especially on the SoundCloud one that's been there for a while?
The graphics are horrific; please pass that feedback along. Yes, it needs to be able to converse. This is a cool idea in its infant stage imo.
I always said that mastering engineers are the ones who can be replaced the most easily. Theoretically they are there to enhance what is there and make it loud, not to be creative. In my opinion, if a mastering engineer does more than that, the mix was bad. Thus, AI can theoretically do it. And yes, you're right: once this kind of AI gets better AND learns to take text input, it will be good enough for most cases.
Your observations demonstrate a lack of understanding of the mastering domain. Mastering is not primarily about processing; the act of changing something in the signal is secondary. There are a lot of high-level architectural decisions and considerations you need to take into account, and a lot of conversation you need to have with your client before doing anything, including about the context of the program you're being given to master. Lastly, a good mastering engineer knows when to do something, but also knows when not to do something.
@@PabloMessier Agreed, a mastering engineer definitely has a deep, fine-tuned creativity. He has to use his ears first. Measurements don't make stuff sound great.
In 5 years it'll be better than most mastering engineers. It may not nail it every time, but the ability to get a different result within minutes will be invaluable.
Let's be realistic: AI is gonna kick anyone's ass. High-skill stuff, blablah. AI can learn in 1 hour what takes a lifetime... the mastering guy will just prompt it, and sooner or later AI will have the best taste, or in the worst case offer a few options. Like it already does... it's just gonna get better and faster. We gotta move on.
@@PabloMessier If the end result is the same, then nobody cares. AI also knows when not to do anything.
What’s the song at 8:20?
I want to know how it works on EDM music, and how loud (in LUFS) this plugin can push a track without any distortion, because -14 LUFS is not realistic. Every top EDM track sits around -4 to -6 LUFS.
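For anyone who wants to check loudness claims like this themselves, here is a rough sketch using the pyloudnorm and soundfile packages; "ai_master.wav" is a placeholder path for the plugin's rendered output, not a real file.

```python
# Rough sketch for measuring integrated loudness of a rendered master,
# assuming pyloudnorm and soundfile are installed. File name is a placeholder.
import soundfile as sf
import pyloudnorm as pyln

data, rate = sf.read("ai_master.wav")
meter = pyln.Meter(rate)                      # ITU-R BS.1770 loudness meter
lufs = meter.integrated_loudness(data)
print(f"Integrated loudness: {lufs:.1f} LUFS")

# Static gain needed to reach a typical EDM target instead of -14 LUFS;
# pushing that hard will almost certainly need limiting afterwards.
target = -6.0
print(f"Gain to reach {target} LUFS: {target - lufs:+.1f} dB")
```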
It's not that 'it could happen'. It will happen... The question is WHEN, not IF
I tested this plugin a while ago, and it sounds really good.
Thanks for a great video but what is the name of the second song you tested with? Is it out yet?
where can I buy this version of Sandstorm by Darude?
As soon as A.I. figures out how to buy a real tape machine, it's over.
Creativity is irreplaceable.
lol not really
@@pleiadi666 yeah Really, lol.
@schlawpyJ You can't learn creativity. You can copy workflows as much as you want, but you will only be working with "presets". Creativity is not programmable; it's natural to the individual and emotion-fuelled.
We got rid of the drummer, the bassist, keyboardist, in-tune vocalist, the real console and audio gear. Mastering engineer: you didn’t think they would get rid of the engineers, too?
Does it do variations every time you analyze?
I tried it and it's pretty impressive, but I'd still rather do the work myself, because I like the process of making a song sound how I want it to sound. It is definitely useful as a reference tool, though: if I'm not sure whether my master is good as it is, I can put it through this little baby and listen for whether I still need to tweak something. But I wouldn't buy it just for that reason, so for me it's still not something I need, if I'm being honest. Nevertheless, it is pretty incredible what this plugin can do without any settings made by the user.
People like yourself will always be needed!!🙏😀✌💅👍Rock on my friend!
What about using a duplicate mastered track alongside the original to mix the effect lower than 100%?
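That parallel trick is easy to try offline. Here is a minimal sketch, assuming the rendered master and the original mix are sample-aligned and at the same rate; file names are placeholders.

```python
# Minimal sketch of blending an AI-mastered render with the original mix at
# less than 100% wet. Assumes both files are sample-aligned (no plugin latency
# offset) and share a sample rate; file names are placeholders.
import numpy as np
import soundfile as sf

dry, rate = sf.read("original_mix.wav")
wet, wet_rate = sf.read("ai_master.wav")
assert rate == wet_rate, "sample rates must match"

n = min(len(dry), len(wet))          # trim to the shorter file
mix_amount = 0.6                     # 60% mastered, 40% untouched mix
blend = (1.0 - mix_amount) * dry[:n] + mix_amount * wet[:n]

# Note: the master is usually much louder than the mix, so level-match the
# two first if you want the blend ratio to mean anything perceptually.
sf.write("blended_master.wav", blend, rate)
```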
I very much dislike how the saxophone sounded in the AI mastering; it sounded like it was coming from a... toilet.
That's besides the distortion, which I clearly heard even with crappy earbuds on the compressed YT quality.
With all the automatic plugins (Neutron, Ozone, etc.) I have a similar problem: it's almost always distorted. I have to disable the compressors or back off the gain by 6-12 dB.
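For reference, backing the input off by a fixed number of dB before such a chain is just a static gain. A trivial sketch of the conversion, using a dummy signal:

```python
# Trivial illustration of the workaround above: pad the input down by a fixed
# number of dB before it hits an automatic compressor/limiter stage.
import numpy as np

def trim(signal: np.ndarray, trim_db: float) -> np.ndarray:
    """Apply a static gain in dB (negative values attenuate)."""
    return signal * (10.0 ** (trim_db / 20.0))

if __name__ == "__main__":
    tone = np.sin(np.linspace(0.0, 2.0 * np.pi * 440.0, 48000))  # dummy signal
    padded = trim(tone, -6.0)
    print(round(padded.max() / tone.max(), 3))  # ~0.501: -6 dB roughly halves the amplitude
```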
I learnt to make banana cake, and after a few times I slightly changed the recipe based on taste and got it super fluffy. Really nice cake actually; I was rather proud of myself. I gave some to my friends who popped over to record a song, and they said it was better than one from the supermarket, and I agreed. I didn't actually realise I had a secret passion for baking. I might try and make other things; it's a pretty fun process actually, and it reminded me of making music, adding all the ingredients and kinda jamming out in the kitchen. It was really fun. The moral of this story, if there even was one: it's fun making things and doing it manually, to taste. Mastering is a bit like that actually. (P.S. Sonny from I, Robot likes your Jazz!)
I would say mixing is a more accurate metaphor. Mastering is, and always was, a final preparation for the intended media: not just putting a final sheen on, but ensuring everything is as smooth and clear to the audience as possible. That's why AI can be really objective about it. Mixing?? That's a whole different, creative thing that AI will only succeed at if it is able to strictly reference music that has come before it. It will further homogenize music when trying to be creative, thus pushing true creatives to escape the matrix (eventually).
you had me at "banana cake"
@@scohills I feel ya. Yeah, I think I was more meaning that mastering, like mixing, is a human thing, and tbh I think there will always be people who love making things because it's really fun and cool to get it right. AI will end up being good at the things AI does. Honestly, I just had a great time making cake and realised most creative things have that enjoyable process and result, so why let AI do that? Lol
@@BRIGGS2710 I might have a new calling. Jeramiah’s cakes!! Here we go 😊🍰
Hehe!! I actually asked you to check this out a few years ago! It does an interesting job very quickly. 😀 I found it usually needed a little more processing either before or after depending on the song.
I also found that I preferred working into it and making small tweaks in the mix to really polish off the sound. 👍🏻 So basically this was a microscope to surgically polish off the few last pieces in the mix. It really depended on the song though. Cool vid.. you can get much deeper with it if you use it the way it was not intended, like a limiter on the master. 😀👍🏻
and I’ll just add this… like a limiter on the master while mixing. I’ve just personally always preferred to work that way.
I spat out my drink when the robot looked at me. Wonder if it analyzes my soul as well
It’s more likely that people will use professionals to use ai for them.
I tried it but... DEFINITELY not close to what Ozone can do...
After 3 different tries on different tracks with both Ozone and this one... not for me.
Keep on trusting in your ears and your human work!
Can you create a video on how to use Reaper to create a 360 music video for YouTube? (Ambisonic 360 music video)
What would you do with the time if you weren't mastering? Would you do more recording?
It won't happen if people don't let it happen.
This plugin said hold my beer and don't talk to me. I'm busy. 😂
Wooooow the 2nd song sounded crazy good
You are fine for the time being. While this AI might be acceptable, it's incredibly limited in what it can do compared to the AI tools in the art sector like Stable Diffusion, where you are able to instruct the AI exactly what you want and make refinements through prompts, which is basically a dark art in itself. I think in the future you'll still have a job as a mastering engineer if you get ahead of the curve by utilising AI to help you master, becoming a promptsmith who excels at giving the AI the perfect prompts to get the sound you and your clients are aiming for. Most importantly of all, you already have the ears for a good master vs a bad master, something the AI will likely never have, because you cannot code subjectivity: what is a good master on one track may not be a good master on another. That's where the refinement comes in and where your ears are just as important, and I'm not sure AI can compute the subtle nuances between masters that an actual engineer can.
You got it! Times change, new possibilities arise. Prompt engineer is already a thing. You can't stop development, but you can be part of it.
Finally, a year after I commented that you should try it.
I've looked at it a few times, and didn't feel like doing it... I was wrong...
A.I. will become a very serious problem in the future. People will become too lazy to do things themselves, and years later they won't know how to do things themselves anymore if the electronics fail. It's the same with how most people can't tune their own guitar by ear anymore, because they use a guitar tuner all the time. I still use a tuning fork for the A, and the other strings I tune by ear. That way you keep yourself in training instead of getting lazy. Btw... I tested the plugin to know what it does, otherwise I could not write a comment about it. I have to say: it sucks, because it made my mixes way too harsh, without any emotional feeling. I'd rather keep more AIR in the mix, and this plugin does not.
Meh. I tried the demo with an unmastered track against its mastered version, and this plugin lost big time! The funny thing is that the AI version was not as loud as the human-mastered version; one would think that the plugin would crank up the loudness. But it was modern club music, and there is no way this plugin has a big enough parameter count to cover all styles. Be sure, though, that the next plugins are gonna be better.
Plugin developers might worry too that people will stop buying their compressors, limiters, EQs, anything used in mixing and mastering.
Big time!!
Maybe it's also better for the people, because they don't know how to use these plugins anyway. 😅 Me included.
Most songs on the radio make me feel it already has...
I bought this one about a year ago, but ended up not using it. There hasn't been any update since then either, which doesn't speak for its future, as it is not perfect by any means. In the "one more song" chapter I can clearly hear a loss of transients (especially the snare is crushed compared to your master), and the bass boom it added is disturbing. With no parameters to adjust anything, this should be a set-and-forget experience, but it's not, so at least to me it is unusable. Your master is way better and definitely not boring.
Will we finally get a review of the freqtube FT1? I can see it creeping in the background 😋
With automated technologies, it feels like people are willing to cut costs if the results are "good enough."
Even if something isn't better, people will still use it if it's significantly more convenient and less expensive.
Lot of good comments here. Thanks for demo.
The AI mastering seems more “sensational” and in-your-face, and the original masters are more polite / less offensive.
What about your AI mastering service?
I tried the demo. I think for a lot of people it will be more than good enough, especially for the price, which is crazy, because I remember trying this kind of stuff before and feeling like just putting a limiter on would have been better. A good engineer can still beat it, but I've gotten way worse masters from real people lol
Just tried it because I was really curious... I mastered one song from a job I did for a band... In conclusion, I can't say it was bad, but it was far away from what I did myself. I mainly use an analog chain for mastering, and I can say for sure that the sound lost life and color, and it was quiet compared to mine... I know at this point everyone can do everything without a sound engineer, but I have a taste in sound that is really personal... I will keep doing it myself... I don't like the idea of a McDonald's approach to sound, where everything tastes the same.
And what is the AI benefit?
Maybe mastering will be replaced, but mixing won't be. Definitely.
I think you have no worries. The fact that there was distortion on the bass in the jazz example is completely unacceptable. The pop example was very harsh relative to the original mix (but one person's harsh is another's aggressive). If you master and your end results don't show those characteristics, your method of making a living is still safe!
Run a file through and save the results.
Then drop the file by 3 dB and run that through.
Compare the results; they are different.
The AI will produce a different master depending on the level of the pre-master.
That does not seem correct or very intelligent.
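A sketch of how that comparison could be checked offline, assuming you have already bounced two renders: one from the original pre-master and one from the same pre-master padded down 3 dB. File names are placeholders.

```python
# Null-test sketch for the level-dependence check described above. If the
# processing were purely linear and level-independent, boosting the second
# render by 3 dB would cancel against the first; a large residual suggests the
# result depends on input level. (Any limiter is nonlinear, so expect some
# residual regardless; this is only a rough indicator, not proof.)
import numpy as np
import soundfile as sf

a, rate = sf.read("master_0dB.wav")        # render of the original pre-master
b, _ = sf.read("master_minus3dB.wav")      # render of the 3 dB padded pre-master
n = min(len(a), len(b))

residual = a[:n] - b[:n] * (10.0 ** (3.0 / 20.0))   # undo the 3 dB pad
peak_db = 20.0 * np.log10(np.max(np.abs(residual)) + 1e-12)
print(f"Peak of null-test residual: {peak_db:.1f} dBFS")
```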
It introduces a weird aliasing artifact if you pay close attention, like when you use a headphone spatial virtualizer or something.
Not really fair, because you are already giving it something that has passed through your expert hands. Give it a balanced mix that's clearly not a master, something from a less experienced engineer.
Agree with using it as an assist, and also for training: if it does a better job, find out what it did and learn from it. Also, a lot of us musicians want to stay in the vibe and not worry about the technical side. Handing over the whole process destroys the art, the entire human part, but as an assist it can get us to that next level of standard, so we can eventually find, afford, or collab with a real engineer. AI isn't taking engineers' jobs; it's assisting a diamond in the rough to be heard, eventually inspiring a third-party investor in a human engineer ❤ Great video
Moronic graphics.
Beyond moronic, purely idiotic.
U can say that again
You can say that again
You can say that again
Just wanted to say that when you use one of those mastering websites and you get mixed results, it means they're doing absolutely nothing.
This whole time we thought you were the AI
Well, it will not only take over from mastering engineers; it will take over studios, instruments, mixing consoles, expensive mics... This will simply end in real-time orchestration: you hum a song or a melody, and it will be instantly orchestrated, mixed, etc., and ready to upload. I think within 4 years.
So grab your guitar and become a real troubadour again; the future is live.
Is it free ?