If art jobs don't exist anymore in the future: well, it sucks, but it is what it is. I still get to keep my art for myself, though these mfers will probably scrape personal websites as well.
AI will eventually figure out a way around this anyway. Either way, instead of building tools to fight it, artists need to embrace it and see how they can use AI and AI tools themselves to make money.
Well, that scenario goes both ways: the tools will also get more refined. I think the key is to have the image tags scrambled so that only people you allow, with a local key, see it for what it really is. So you can write "this is my cat" but the AI sees "pickup truck" from the tags, confusing the AI into looking elsewhere for a cat!!
Painfully clear that this newsman knows nothing about the subject. Unlikely the activist does either. Look, this process won't do a thing. AI can very easily learn to overcome this process simply by taking a million photos, applying this "Nightshade" process, and learning to undo the process by comparing with the originals.
Ay-Eye is terrible and it's going to get worse. All the films warned about it. I work as an animator and I'm job searching again now. I'm still starting my career in animation after going to school for ten years. This is all disappointing. That's why I'm trying to start side online businesses as I continue to grow as an animator.
I think the researchers will create an AI to generate stylish character animation, since it's costly to make. I believe a WALL-E era is inevitable, where people don't have to work laborious jobs anymore.
@@Tea-zu2he That's highly suspect, cause you can change an image a lot with photoshop and overlays, even using img2img. I'll read up when I get a chance, but I'm very skeptical.
@@JustinWiggins this is literally on their website they address it: "As with Glaze, Nightshade effects are robust to normal changes one might apply to an image. You can crop it, resample it, compress it, smooth out pixels, or add noise, and the effects of the poison will remain. You can take screenshots, or even photos of an image displayed on a monitor, and the shade effects remain. Again, this is because it is not a watermark or hidden message (steganography), and it is not brittle."
Screeching about AI being some kind of threat is peak inceldom. Just admit you’re afraid that its art, skill and desirability are all greater than yours. 🤷🏼 Cue all the DeviantArt bros in the replies in 3… 2…
Wait till artists realize that anyone can “steal” their art by simply explaining it to an audience. As long as I’m training my audience, it’s perfectly legal fair use. Now show me where in the law says I have to train people… I’ll wait.
People like you usually cannot put two and two together. Artists are not against AI; they just don't want their hard work being trained on and used for profit, hard work without which none of these models would even produce the output they do. People like you will never understand this until you have worked to create something.
Software engineers are about to be decimated by AI (and I mean that in the original sense of the word, the work of 10 can now be done by 1 with an AI assistant). The only real answer to AI is universal basic income.
Sure, they'd be glad to have you - you just need speak and write Mandarin Chinese fluently and have a PhD in computer science from a major university. Then you can work 12 hours a day, 6 days a week at a Chinese university as a postdoc. The salary isn't bad - around $US 50k/yr. Of course, you could make almost that much working the same amount of hours here in the US at a fast food restaurant with no college education or foreign language skills whatsoever.
The AI companies shouldn’t have an opt-out policy, they should have an opt-in policy. Don’t steal!
No, there's no need for that. Training AI is Fair Use, and in no way remotely close to the definition of "stealing". Using an AI tool to generate images that resembles someone else's, and selling it for profit, would be stealing, but in that case, the individual is responsible for how they use the tool, not the tool itself.
it's not stealing, at the most basic level machine learning doesn't copy or store anything internally. it's a tool and it would take further actions from a human to constitute copyright infringement let alone theft.
@@GrumpDog If the tool itself is made by using everyone's work without their consent, then that is stealing. Plain and simple.
I think generative AI companies that scrape data off the internet must pay a fee for every image taken from an artist, with that fee going towards a UBI fund for artists so they get compensation. The ideal is $60 per image scraped.
I'm actually surprised that they didn't get cancelled over such an amount of stolen work. I heard about a lawsuit or something against these companies, though.
I'm certain those companies are terrified of being forced to make restitution. Even at one dollar per image, the number of images they scraped likely runs into the trillions, which alone may be enough to bankrupt a company. So... good riddance.
@@ryanartward What are your takes on syndicalism?
Great job to Dr. Zhao and his team of PhD students from the University of Chicago.
Finally! A way for artists to fight back
If you can program artificial intelligence to steal large amounts of data from particular sources, then you can program it to provide a digital bibliography of its primary source materials.
AI art is no more of a "theft" than any other artwork.
It doesn't mean they'll actually do it, and punishing them carries the cost of actually finding out whether they're doing it, or doing it incorrectly. Large AI companies have better software and hardware for that, but the technique is pretty standard machine learning and LLMs; anyone can do it, and open-source models just get better as time passes.
Training is fair use.
@@kozad86 Here is the issue... analogy time. AI models using assets that were not sold or given freely to the model is like a game developer using assets without paying for them. The assets provide literal one-to-one value and are not really transformative. It's as if they went to a public library, copied all the books, and then charged for a simulation of the combined knowledge, stealing the library's operating costs, the authors' works, and the end users' own real education.
People don't program AI.
Opt-Out, aka, having to ask a company to "please not steal my stuff".
This "opt out" nonsense by these companies needs to stop. By taking that stance they are, in effect, placing all the burden on the victims that are being impacted by their efforts. This is a great opportunity for laws to be put in place that places the burden back on those doing the damage. It should always be "opt in".
This is same as placing the recycling burden on consumers instead of the producers using more environmentally friendly packaging and processes.
By the law, they don't even have to do the opt-out. They are just being nice by allowing people to opt out.
Silicon Valley owns Washington.
@@themartdog Then we have to change that. Clearly this isn't for science, it's for lining their wallets.
I support the artists, whatever the medium.
The ' creators ' of AI generated media just lay back and let something else do all the work.
Where's the creativity in that?
Is Nightshade purposely inserting cat images as the noise? What better than the ultimate troll? 😺
Did the cats opt in! lol
That OR the government could just regulate AI. Just like they should have been regulating US social media platforms for years now
I think it's so funny that you think the government would be able to regulate it. The whole world is involved, USA can't stop it. NATO can't stop it.
Regulate social media companies how?
Poisoning the well for AI bots is smart, but I feel that these companies will find a way around it. 😮💨
I suppose it's an arms race. AI companies find a solution, Nightshade comes out with countermeasures, and it goes back and forth.
Especially with the money behind AI. Some of these folks might disappear.
🤞 they 2️⃣❗️
People forget that Photoshop announced they copy and record artists' drawing styles for AI, and PS is something most artists use! So when you paint, it'll all be for AI.
What if you pirate the software?
I believe it's going to be a cat-and-mouse game between the AI and anti-AI researchers
Anti-AI researchers don't exist, though.
I would use the term 'arms race' and I would not use the term anti-AI researchers - I would use the term anti-exploitation AI researchers. I think an AI will be needed to generate unique filters for each new image, since the Nightshade filter could be 'learned'. But it's possible even that won't work as there would be general constraints on any filter so as to keep the images looking normal to the human eye and that might be enough for an AI to learn and then undo the filtering.
AWESOME ! Can they do that for music, also ?
It's only awesome because you don't understand it. This will do nothing.
@@Jianju69 This video is on how to protect artwork from AI. My AWESOME comment was directed to all of the people (including the artist) on how they found a way to protect the art from being stolen by AI. Stealing it using AI or using any other method is NOT awesome. You misunderstand my comment.
@@rupertpupkin2493 Unfortunately, I don't think this specific solution for protecting artwork would work the same way in the music world, assuming you mean for copyright issues.
That is just due to the nature of the two industries and how they work. Basically, AI models "create" new artwork/pictures that have never existed before, based on what you enter as the text prompt. They learn how and what's what from a huge training data set, basically everything on the internet. So the issue is art being used in training data without permission or even an acknowledgment.
Music is just streamed; nothing new is created out of the file. Idk if this is making sense, but basically it wouldn't work, because even if you did something to corrupt a song file until it was paid for, that doesn't stop one person from buying it and then uploading it to a file-sharing site for free for everyone.
Anyone else remember limewire? lol good times
FYI, Nightshade does NOT work on any model other than the one they developed it for. They can demonstrate it works, on their own version of that model, but it has no impact on the actual generative AI out there. And the images are easy to filter out from training data.
There are also screenshots, scans, right-click save-as... Not sure how this is really going to help if you can put a couple hundred humans in the middle to filter these out in the interim.
How much will that cost them, though, especially when there are billions of artworks out there and you don't need many poisoned images to get through to damage the model? Doing so adds a major bottleneck to the automated web-scraping process; if you have to pay humans to filter out poisoned images, you might as well pay the actual artists instead. Not to mention, once the majority of artists start using this, if you just filter out everything that's poisoned, there isn't much data left to train your model on. So they can't simply filter it out; they'd have to try to remove the filter from each image individually, since each filter is random. And no, they specifically mentioned that this still works even if you downscale or upscale the image. You can try applying a blur filter, but that damages the quality of the training data itself. And even if upscaling and downscaling did remove the poison, downscaling reduces data quality significantly, while upscaling requires AI, and feeding AI-processed images back into an AI is notoriously bad because you're basically inbreeding it.
@@dpptd30 Last I checked, there are at least 15,000+ people in the Philippines doing this every day.
It only costs about $0.005 to $0.007 per 20 images most of the time. Yeah, lower than a cent, and there are people out there desperate enough to crunch this type of work.
Not counting Africa and South America.
Stealing is way, way cheaper for them. Slavery exists, even in the digital age, unfortunately.
I do not know enough about tech or AI to understand the logistics and programming behind this, but I get the concept, and it's awesome they are fighting back. I think AI is incredible and so useful in countless applications; art, however, I don't really think is one of those spaces.
That said, I've seen so much AI-generated drone footage of actual places, and it looks completely real. In that regard, for movies, I think it's a great way to cut costs. Sending out a drone pilot to a location to shoot a five-second clip, just zooming into a location so the audience knows where the scene takes place, is inefficient time-wise and logistically. But I think that's where the line needs to be drawn for the time being, until we learn more. It should be a tool to aid in cutting time and costs on the types of shoots that really don't require going out to a location to film. AI is so advanced, and it accesses so much data. If you tell it to do a pan in from the top left of the Denver metro area, it will do it and be accurate, basing the generated content on actual footage of the Denver metro area that already exists.
That, I see no problem with. But using it as a full-on replacement for a film crew, screenwriters, and even actors is, for me, where I have to oppose it. Every new idea, story, song, or movie has some influence from past creators' work. At its core, AI isn't really all that different from us; it just does everything billions of times faster with its learning model. The crediting and compensation of actual human artists, though, is a huge area of concern.
Like, in theory, all artists take inspiration or techniques from the artists whose work they study or learned from. That's how we grow. That isn't plagiarism in my mind. But this is so accelerated and hard to monitor that it's going to take a lot of collaboration and cooperation by many leaders in tech and regulation to figure out where this is going, and how it's going to fit into our society moving forward. It's scary too, with the deepfakes and the US election coming up...
Assume Nightshade takes the Fourier transform of an image's bytes and adds Gaussian noise to the amplitudes. The idea would be: the distribution of the sum of two independent random variables is the convolution of their distributions, and the Fourier transform turns convolution into a product, so the spectrum of the Gaussian noise and the spectrum of the image bytes can simply be multiplied together. In other words, you can scale the two together. The result is that a human just sees whatever the image is, while the computer just sees the Gaussian noise.
(This is just a wild guess - don’t take this too seriously if I’m totally wrong.)
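If that guess were right, the scheme could be sketched in a few lines of NumPy. To be clear, this is purely hypothetical: nothing here is taken from Nightshade's actual implementation, and the function name, image size, and sigma value are invented for illustration. It perturbs the FFT magnitudes with Gaussian noise, keeps the phases, and inverts back to pixel space.

```python
import numpy as np

rng = np.random.default_rng(0)

def perturb_spectrum(img, sigma=0.05):
    """Hypothetical sketch: noise the FFT magnitudes, keep the phases."""
    F = np.fft.fft2(img)
    mag, phase = np.abs(F), np.angle(F)
    # Multiplicative Gaussian perturbation of each frequency amplitude
    noisy_mag = mag * (1.0 + sigma * rng.standard_normal(mag.shape))
    F_noisy = noisy_mag * np.exp(1j * phase)
    out = np.real(np.fft.ifft2(F_noisy))
    return np.clip(out, 0.0, 1.0)

img = rng.random((64, 64))        # stand-in "image" in [0, 1]
poisoned = perturb_spectrum(img)
mean_change = float(np.abs(poisoned - img).mean())
```

With a small sigma the per-pixel change stays tiny, which is the intuition behind the guess: a human sees nearly the same picture, while the frequency content a model learns from has been noised.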
As an MSDS student, thank you for that explanation.
Remember: AI can learn any process for which it has a sufficient number of before-and-after examples. AI will quickly learn to circumvent this process.
@@Jianju69 I don't think you understand the complexity of removing Gaussian noise. Assuming my guess is correct, it's computationally infeasible to remove Gaussian noise from images. The topic is too deep for this forum, but the Learning With Errors cryptosystem is based on the extreme difficulty of doing exactly that.
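As a toy illustration of the "before-and-after pairs" claim a couple of replies up, here is how an inverse can be fit from example pairs with plain least squares. Everything here is invented for the demo, and it rests on a very strong simplifying assumption (a fixed linear perturbation), which real poisoning noise is not.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 64  # size of a flattened toy "image"

# Invented fixed *linear* perturbation; real poisoning is nonlinear,
# so this only illustrates the learn-an-inverse-from-pairs idea.
A = np.eye(d) + 0.1 * rng.standard_normal((d, d))

before = rng.random((500, d))   # original examples
after = before @ A.T            # their perturbed versions

# Fit B so that after @ B approximates before, using the paired data
B, *_ = np.linalg.lstsq(after, before, rcond=None)

# Undo the perturbation on a fresh, unseen example
x = rng.random(d)
x_recovered = (x @ A.T) @ B
err = float(np.max(np.abs(x_recovered - x)))
```

Against a randomized, nonlinear perturbation this exact trick fails, which is roughly what the reply above about Gaussian noise and Learning With Errors is getting at.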
@1:01 Training is fair use. The same rule lets this video use examples without paying the artists or getting their consent.
Artists in the video are allowing them to broadcast their artwork. Also, it is not legal to earn money from copyrighted content that you do not own or are not allowed to use.
@@musicapereira2091 Wrong, they do not require the artist's permission to train their audience. Learn fair use law.
It's the same law that protects people from Disney suing them for using its IP in YouTube videos, as long as they're transforming the content by explaining it to (training) their audience.
@@musicapereira2091 Fair use says otherwise. Pretty much every video on youtube uses someone else's content. All you have to do is add something to the original work and it's fair use.
No F@B_C_
Soy boy beta.
0:40
What about the potential medical molecules that generative AI can produce?
What about it
Fantastic!
AI should only be used in a closed and controlled environment. This just proves the system is garbage in, garbage out. You don't even need the Nightshade software: if everyone in the world who uploaded pictures started calling pictures of their grandma a house, and their baby a crocodile, AI imagery would be really bad!!
Wrong, because there would be no consensus of people calling babies crocodiles. The strongest correlation will always remain aligned with social truth.
You can caption your uploaded image "xyz", but AI already has huge datasets of preexisting images that it will relate the new image to, and it will figure out that your dog image is a dog and not a Toyota Corolla.
Waste of time. AI will simply filter out these altered images. 🙄
It won't know whether it's a cat (or whatever was used to poison it) or the original image. It's similar to how YouTube (or anything that requires you to be logged in) has an algorithm to authenticate you while not knowing your input password (hashing).
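The hashing analogy can be sketched with Python's standard library. This is a generic password-hashing pattern, not anything specific to YouTube's systems: the server keeps only a salt and a digest, and can verify a login without ever storing or recovering the password itself.

```python
import hashlib
import hmac
import os

def make_record(password: str):
    """Store only salt + digest, never the password itself."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the digest and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)

salt, digest = make_record("hunter2")
print(verify("hunter2", salt, digest))   # True
print(verify("wrong", salt, digest))     # False
```

The digest is practically one-way: knowing it does not reveal the password, just as (per the comment's analogy) a poisoned image does not reveal its original to the model.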
Not true; forgery is nothing new, this is just the latest form of it. People will know what to look for, and as always, people prefer the authentic over being scammed or duped.
That's kinda the point though. These artists don't want their images to be taken without consent.
Mission accomplished...
🤞 🤖 ♾️ 🚀
GOOD! It's about time someone spoke up about this. It's not "AI-generated work," it's a downright bad remix of other ppl's work, and that is a problem. It's not "generating" anything. It's a collage of other ppl's work. Just like a real person can't take two books from two authors, collage them together, and say "I wrote this," "generative" AI shouldn't be able to either.
You do realize that arriving at those similar pictures takes careful crafting of a prompt, editing and refining it until it gives you what you want: images similar to an arbitrary image.
Good. Keep after em.
Bravo!
I feel this is only temporary. Tech is always evolving and artists need to accept it.
Why should they accept not being compensated for their work?
So if big tech takes over your income, and/or your loved ones' income, you and your loved ones should just accept it? Hackers are going to get more skilled. Are you just going to get over it and accept it if they steal your identity?
@@AFloridaSon you're not even talking about the same thing and are resorting to hyperbole. shove off.
@@kelvinmorris1991 For the same reason they already do for fair use training purposes.
@@kelvinmorris1991 Because it's just going to happen whether they want it or not. The artists are fighting against the stream of a river right now.
They can be angry about it, but they need to make peace with the fact that generative AI is a thing that exists now. Just as landscape painters made peace with the invention of photography.
Kinda like the strips you find in money that tell you if it's fake, and just another reason why I have never liked AI to begin with.
Oh, now that it's digging into your pockets, you care. Cry.
It doesn't do anything. These folks aren't ahead of the models, and training them now requires highly curated data in the first place. It's basically snake oil and becomes more irrelevant every day.
I like the cat photos more
Doesn't work. Just take a Glazed picture from their website and feed it into an online AI image-expand tool: it can generate an expansion of that photo. That means it can read it, so it is not working at the moment.
Let's go!
Fight against generative AI? Delaying a "fight" against the inevitable, it seems.
The inevitable is either the models so poisoned from other AI images or AI companies actually paying for their training data so that the images are unique to their model. The brazen stealing was never going to last, as it was meant for research purposes not unethical commercialization.
You sound like such a clown right now
@@rodrigopetunio the poison is snake oil, it's not effective. it's also not stealing, it doesn't copy or store anything internally.
🙄
I can bet that back when cars were taking over the roads, horse-and-buggy businesses tried to sabotage car manufacturers too. I can imagine the scribes wringing their hands and protesting when the printing press came into full production. Progress will take industries down. It may not be fair to the people on the losing end, but that's how it is.
Art jobs don't exist anymore in the future: well, it sucks, but it is what it is. At least I still get to keep my art for myself, though these mfers will probably scrape personal websites as well.
AI will eventually figure out a way around this anyway. Either way, instead of building tools to fight it, artists need to embrace it and see how they can capitalize on AI and AI tools themselves to make money.
Well, that scenario goes both ways: the tools will also get more refined. I think the key is to have the image tags scrambled so that only people you allow, holding a local key, see them for what they really are. So you can write "this is my cat," but the AI sees "pickup truck" from the tags, confusing it into looking elsewhere for a cat!
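The key-gated tag idea above can be sketched with a toy XOR scheme (illustrative only, not cryptographically secure; the function names are hypothetical): the real caption is recoverable only with the local key, while a scraper without it sees opaque bytes it may mislabel.

```python
import hashlib

def _keystream(key: bytes, length: int) -> bytes:
    # Derive a repeatable byte stream from the key (toy construction).
    stream = hashlib.sha256(key).digest()
    while len(stream) < length:
        stream += hashlib.sha256(stream).digest()
    return stream[:length]

def scramble_tag(tag: str, key: bytes) -> bytes:
    # XOR the caption with the key-derived stream; without the key,
    # a scraper only sees opaque bytes where the caption should be.
    data = tag.encode()
    return bytes(b ^ s for b, s in zip(data, _keystream(key, len(data))))

def unscramble_tag(blob: bytes, key: bytes) -> str:
    return bytes(b ^ s for b, s in zip(blob, _keystream(key, len(blob)))).decode()

blob = scramble_tag("this is my cat", b"local-key")
print(unscramble_tag(blob, b"local-key"))  # this is my cat
```

In practice this only hides metadata; it does nothing about a model captioning the pixels themselves.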
Oh? Is “starving artists” no longer a thing now? Aren’t they always starving….
Yes, it is about time we fight back against Skynet!
Sample! Sample! Get more! 😅😂😅
She needs to understand.
Everything is remix
thank you
You cannot stop someone from TAKING HI-RES PHYSICAL PICTURES and train from that. NOTHING will. GAME OVER.
Painfully clear that this newsman knows nothing about the subject. Unlikely the activist does either.
Look, this process won't do a thing. AI developers can easily learn to overcome it simply by taking a million photos, applying the "Nightshade" process to them, and training a model to undo it by comparing the results with the originals.
Unless you use an AI to generate a new noise filter unique to each image.
Yeah, not fond of near-algorithmic ripoffs without any real creativity or any homage intended.
AI is terrible and it's going to get worse. All the films warned us about it. I work as an animator and I'm job searching again now. I'm still starting my career in animation after going to school for ten years. This is all disappointing. That's why I'm trying to start side online businesses as I continue to grow as an animator.
I think researchers will create an AI to generate stylish character animation, since it's costly to make. I believe the Wall-E era is inevitable, where people won't have to work laborious jobs anymore.
And the AI figures it out anyway.
This can be bypassed by taking a screenshot 🤦♂
No it can't. Maybe actually read up on how it works; they wrote a paper.
@@Tea-zu2he That's highly suspect, because you can change an image a lot with Photoshop and overlays, even using img2img. I'll read up when I get a chance, but I'm very skeptical.
@@JustinWiggins this is literally on their website they address it: "As with Glaze, Nightshade effects are robust to normal changes one might apply to an image. You can crop it, resample it, compress it, smooth out pixels, or add noise, and the effects of the poison will remain. You can take screenshots, or even photos of an image displayed on a monitor, and the shade effects remain. Again, this is because it is not a watermark or hidden message (steganography), and it is not brittle."
@@JustinWiggins Just create your own stuff. Sorry you can't do the basics that even a 6-year-old can do.
This is so important!
This is evil and wrong, for real. They have their work online, so they will fail, for real.
lol
This is the best solution, hope they keep up with it!
Rent needs to go lower or go extinct. I'd rather swim to Africa than work that job.
Screeching about AI being some kind of threat is peak inceldom. Just admit you’re afraid that its art, skill and desirability are all greater than yours. 🤷🏼
Cue all the DeviantArt bros in the replies in 3… 2…
Incel ai bro triggered 😂
Wait till artists realize that anyone can “steal” their art by simply explaining it to an audience. As long as I’m training my audience, it’s perfectly legal fair use.
Now show me where the law says I have to train people… I'll wait.
@@sameyo Threatened by AI besting you at art when you can’t even compose an original insult. Go lube your glans and cry. 🗿
shocked at how fast the replies proved ur point 😳
Just admit that you have no skills and this messes your plans to steal other people's work and profit off it.
The hero we need
Spies
Good! And music made by A.I. needs more transparency when I am listening to it.
No one cared when automation came for the assembly line workers. Now it strikes a highly privileged class and there's an outcry.
Ah yes the famous privileged class of... artists? Were you born stupid or did you have to learn it, like a skill?
People like you usually cannot put two and two together. Artists are not against AI; they just don't want their hard work being trained on and used for profit, hard work without which none of these models would even produce the output they do. People like you will never understand this until you have worked to create something yourself.
I'm down with this. My unconditional support.👍👍👍
Ah too late for 100 yr old artist lol
Guys, all the new music sounds like Steamboat Willie this year.
We must fight against our AI overlords by poisoning them.
I'd love to be a part of this.
Anti-AI people are using AI too... Wow.
People won't be able to make a living drawing pretty little pictures anymore because of AI. Oh, the humanity.
Learn to code.
Troll
exactly, it's a no-brainer! but then again, these luddites clearly don't have brains 😂😂😂
Software engineers are about to be decimated by AI (and I mean that in the original sense of the word, the work of 10 can now be done by 1 with an AI assistant). The only real answer to AI is universal basic income.
"Learn to code" as ai coding is becoming more and more a reality. Have fun when that job is also replaced.
Shoutout to academic nerds doing the right thing love it
Luddites
Someone seems salty
A cute and pathetic attempt at making a new slur, bot
Good artists borrow. Great artists use AI. ~Gandhi
Hi
Pay our artists or become a cat. Easy.
I wonder as a white guy if I could go to China and do the same thing????
Sure, they'd be glad to have you - you just need to speak and write Mandarin Chinese fluently and have a PhD in computer science from a major university. Then you can work 12 hours a day, 6 days a week at a Chinese university as a postdoc. The salary isn't bad - around $US 50k/yr. Of course, you could make almost that much working the same amount of hours here in the US at a fast food restaurant with no college education or foreign language skills whatsoever.
Oh yeah 😏
My art and writing being part of AI means my work will be remembered far into the future while your work will be lost and forgotten.
And they'll parade your legacy as they see fit, claiming it as their own.
“Remember the fool who sold his soul to live forever”.
Uhh, that's not how AI works; it uses your data to train itself. It could never give a 1-to-1 copy unless specifically asked.
It needs to stop.
good luck wit dat.
Well done! 👏🏻👏🏻👏🏻🔥🔥🔥👍🏻💐