@@ScuubiDuubi It's not just the information we're talking about. It's the presentation, the interpretation, and the synthesis. Putting the physics principle in layman's terms is work that has value. Teaching is a skill. If it wasn't, you wouldn't need Blender tutorials, you could just look at Blender and learn everything you need to know. If an AI scrapes thousands of hours of that work and breaks it down into soulless slurry to be marketed to morons, the thing being stolen is the interpretation, presentation, and explanation of the physics, not the physics themselves. Are paintings the property of the painter? They probably didn't create the brush, the canvas, or the paint they used, so by the logic you're arguing, they're somehow not the property of the person who created them. Also, for the record, a Blender tutorial produced by an AI would give you incorrect information and be useless for learning. The model is trained to produce something that looks like a Blender tutorial, it's not learning how to use Blender. The result will be videos that are very well-produced to LOOK like useful tutorials posted by someone knowledgeable, but all the content in them is nonsense. Just a whole new generation of scams.
@@deggy42 " Teaching is a skill. If it wasn't, you wouldn't need Blender tutorials, you could just look at Blender and learn everything you need to know. If an AI scrapes thousands of hours of that work and breaks it down into soulless slurry to be marketed to morons" This is an obnoxious statement. What exactly would be wrong with that? So teaching should be gatekept because some people make money off of it? If I dont want to or more likely, dont have time to waste hundreds of hours sifting through all their tutorials I am a moron? This misses the entire point of the second part of what I said. Citing video sources with time stamps would be a win for everyone. It would be very easy for an AI to provide a weighted list of sources for how it came up with its answer. If this were the case, AI gets its training data, the youtuber gets to be sourced and linked to in its answer, therefore driving more traffic to that youtuber, and the person looking for a solution doesnt have to comb through hundreds of hours of youtube to find the needle in a haystack. If you're a proponent of free and open education, then you should support making tools like this better. Even if someone does have the TIME required to find solutions themselves, very few people have the patience. otherwise EVERYONE would know everything they wanted. I believe everyone SHOULD be able to know everything they want, and if AI served as a search algorithm of sorts for videos, that would open up web searches to a massive new realm of information that they cant address right now. I'll give you a specific example. It took me over a year to figure out how to transfer data from a digital sculpt to a 2d normal map in blender.. in other programs It worked effortlessly, but in blender I always got the SAME error. I googled it in every way possible, watched tons of tutorials, none of them ever had the issue i had. they would run through their process, it would work. I would get that same error. 
It made me want to scream, and I simply gave up on texturing and normal maps for a long time, until one day I had a tutorial going as I'm sculpting, as I usually do, and all of a sudden, on a short tangent about 30 minutes into the video, the presenter explains why that was happening. So my experience with that issue is just "the way it should be done" because YouTubers think AI is learning how to present better from them? No, screw that. I want the ability to google YouTube videos' content so that I can find the effective tutorials more efficiently. And your next statement just proves my point, because "Also, for the record, a Blender tutorial produced by an AI would give you incorrect information and be useless for learning." EXACTLY. I can't effectively google solutions to many problems I have while learning Blender, but if I could google it, and Google gave me a brief explanation with links to its sources, I would immediately go watch those sources to find the correct information. Not to mention, you're all just assuming they plan to use this to make tutorial videos for YouTube. I think Google has better and more profitable things to do, dude. They aren't trying to steal their YouTubers' views, and most YouTubers' personalities are what attract people to them. I don't watch Smarter Every Day because I need to know how a pulley works. I like his explanations of stuff, and if AI rips that off it will be a "soulless" version, as you said, that I'm not interested in. So this whole argument just kinda doesn't make sense. Lastly, "Are paintings the property of the painter? They probably didn't create the brush, the canvas, or the paint they used, so by the logic you're arguing, they're somehow not the property of the person who created them." This isn't the equivalent of my argument at all. The YouTuber didn't create YouTube, but they still own the video, right? The artist didn't create the tools, but they own the painting.
MY ARGUMENT is more akin to "if a YouTuber made a video about a Ferrari, then said he owns said Ferrari, everyone would agree that was dumb." YouTubers don't own the subjects of their videos, and if YouTube starts pumping out error-filled videos with bad information in styles imitating real YouTubers, WHO TF IS GOING TO WATCH THEM? I'm not. Are you? Have you seen AI art? It can't even count fingers yet, man. It has no UNDERSTANDING of its subject whatsoever, but it could be an INCREDIBLE referencing tool, and if you oppose that because you think only morons don't want to, or shouldn't have to, sift through mountains of information for an answer... too bad?
@@deggy42 " Teaching is a skill. If it wasn't, you wouldn't need Blender tutorials, you could just look at Blender and learn everything you need to know. If an AI scrapes thousands of hours of that work and breaks it down into soulless slurry to be marketed to morons" This is an obnoxious statement. What exactly would be wrong with that? So teaching should be gatekept because some people make money off of it? If I dont want to or more likely, dont have time to waste hundreds of hours sifting through all their tutorials I am a moron? This misses the entire point of the second part of what I said. Citing video sources would be a win for everyone. It would be very easy for an AI to provide a weighted list of sources for how it came up with its answer. If this were the case, AI gets its training data, the youtuber gets to be sourced and linked to in its answer, therefore driving more traffic to that youtuber, and the person looking for a solution doesnt have to comb through hundreds of hours of youtube to find the needle in a haystack. If you're a proponent of free and open education, then you should support making tools like this better. Even if someone does have the TIME required to find solutions themselves, very few people have the patience. otherwise EVERYONE would know everything they wanted. I believe everyone SHOULD be able to know everything they want, and if AI served as a search algorithm of sorts for videos, that would open up web searches to a massive new realm of information that they cant address right now. I'll give you a specific example. It took me over a year to figure out how to transfer data from a digital sculpt to a 2d normal map in blender.. in other programs It worked effortlessly, but in blender I always got the SAME error. I googled it in every way possible, watched tons of tutorials, none of them ever had the issue i had. they would run through their process, it would work. I would get that same error. 
It made me want to scream, and i simply gave up on texturing and normal maps for a long time until one day i had a tutorial going as im sculpting, as i usually do.. and all of the sudden on a short tanget the presenter explains why that was happening about 30 minutes into the video. So my experience with that issue is just "the way it should be done" because youtubers think AI is learning how to present better from them? No.. just no. I want the ability to google youtube videos' content so that i can find the effective tutorials more efficiently. and your next statement just proves my point because "Also, for the record, a Blender tutorial produced by an AI would give you incorrect information and be useless for learning." EXACTLY I can't effectively google solutions to many problems i have while learning blender, but if I could google it, and google gave me a brief explanation with links to its sources, I would immediately go watch those sources to find the correct information. not to mention, you're all just assuming they plan to use this to make tutorial videos for youtube.. I think google has better and more profitable things to do dude. they arent trying to steal their youtubers views.. and most youtubers personalities are what attract people to them. I dont watch smarter every day because I need to know how a pulley works. I like his explanations of stuff, and if AI rips that off it will be a "soulless" version as you said, that im not interested in.. so this whole argument just kinda doesnt make sense. lastly "Are paintings the property of the painter? They probably didn't create the brush, the canvas, or the paint they used, so by the logic you're arguing, they're somehow not the property of the person who created them." This isnt the equivalent of my argument at all. The youtuber didnt create youtube, but they still own the video, right? the artist didnt create the tools, but they own the painting. 
MY ARGUMENT is more akin to "if a youtuber made a video about a ferarri, then said he owns said ferrari.. everyone would agree that was dumb." youtubers dont own the subjects of their videos, and if youtube starts pumping out error filled videos with bad information in styles imitating real youtubers.. WHO THE HECK IS GOING TO WATCH THEM? im not? are you? have you seen AI art? it cant even count fingers yet man.. It has no UNDERSTANDING of its subject whatsoever, but it could be an INCREDIBLE referencing tool, and if you oppose that because you think only morons dont want to or shouldnt have to sift through mountains of information for an answer.. too bad?
@@deggy42 " Teaching is a skill. If it wasn't, you wouldn't need Blender tutorials, you could just look at Blender and learn everything you need to know. If an AI scrapes thousands of hours of that work and breaks it down into soulless slurry to be marketed to morons" This is an obnoxious statement. What exactly would be wrong with that? So teaching should be gatekept because some people make money off of it? If I don't want to or more likely, don't have time to waste hundreds of hours sifting through all their tutorials I am a moron? This misses the entire point of the second part of what I said. Citing video sources would be a win for everyone. It would be very easy for an AI to provide a weighted list of sources for how it came up with its answer. If this were the case, AI gets its training data, the youtuber gets to be sourced and linked to in its answer, therefore driving more traffic to that youtuber, and the person looking for a solution doesn't have to comb through hundreds of hours of TH-cam to find the needle in a haystack. If you're a proponent of free and open education, then you should support making tools like this better. Even if someone does have the TIME required to find solutions themselves, very few people have the patience. otherwise EVERYONE would know everything they wanted. I believe everyone SHOULD be able to know everything they want, and if AI served as a search algorithm of sorts for videos, that would open up web searches to a massive new realm of information that they cant address right now. I'll give you a specific example. It took me over a year to figure out how to transfer data from a digital sculpt to a 2d normal map in blender.. in other programs It worked effortlessly, but in blender I always got the SAME error. I googled it in every way possible, watched tons of tutorials, none of them ever had the issue i had. they would run through their process, it would work. I would get that same error. 
It made me want to scream, and I simply gave up on texturing and normal maps for a long time until one day I had a tutorial going as I'm sculpting, as I usually do.. and all of the sudden on a short tangent the presenter explains why that was happening about 30 minutes into the video. So my experience with that issue is just "the way it should be done" because youtubers think AI is learning how to present better from them? No.. just no. I want the ability to google TH-cam videos' content so that I can find the effective tutorials more efficiently. and your next statement just proves my point because "Also, for the record, a Blender tutorial produced by an AI would give you incorrect information and be useless for learning." EXACTLY I can't effectively google solutions to many problems I have while learning blender, but if I could google it, and google gave me a brief explanation with links to its sources, I would immediately go watch those sources to find the correct information. not to mention, you're all just assuming they plan to use this to make tutorial videos for TH-cam.. I think google has better and more profitable things to do dude. they aren't trying to steal their youtubers views.. and most youtubers personalities are what attract people to them. I don't watch smarter every day because I need to know how a pulley works. I like his explanations of stuff, and if AI rips that off it will be a "soulless" version as you said, that I'm not interested in.. so this whole argument just kind of doesn't make sense. lastly "Are paintings the property of the painter? They probably didn't create the brush, the canvas, or the paint they used, so by the logic you're arguing, they're somehow not the property of the person who created them." This isn't the equivalent of my argument at all. The youtuber didn't create TH-cam, but they still own the video, right? the artist didn't create the tools, but they own the painting. 
MY ARGUMENT is more akin to "if a youtuber made a video about a Ferrari, then said he owns said Ferrari.. everyone would agree that was dumb." youtubers don't own the subjects of their videos, and if TH-cam starts pumping out error filled videos with bad information in styles imitating real youtubers.. WHO THE HECK IS GOING TO WATCH THEM? I'm not? are you? have you seen AI art? it cant even count fingers yet man.. It has no UNDERSTANDING of its subject whatsoever, but it could be an INCREDIBLE referencing tool, and if you oppose that because you think only morons don't want to or shouldn't have to sift through mountains of information for an answer.. too bad?
I wonder if the court will fully side with copyright on this one. I get the feeling there's a whole race-against-China narrative in the background of all of this, and I wonder how much of a role it will play in the decision making.
I don't think they can get away with it that easily. I'm pretty sure companies like YouTube, with all the power they have, would at least try something to stop it, like developing new technologies to prevent others from training on their content.
I've only just started watching the video so this may be touched upon, but this is exactly my understanding of how these companies work. If you ask for permission at the start, then you will a) never get permission and therefore b) never improve your model. It simply doesn't make sense for these companies to be legal about this at the start, because it will pay far more dividends to just fire ahead and then deal with whatever legal implications show up later. Is it shit? Yes! Is it going to change? No! There is no means to make this better. Severe regulation simply won't work, because it can be ignored until it's obvious, and then it's too late (i.e. exactly what happens now). This is a weird new quirk of copyright law that we don't know how to navigate effectively yet.
@@rowanmurphy4986 They're not going to stop there. Everything you love will be owned, including everything you've made, and as long as you abide by the TOS, you can pay the company that owns the thing you love and made for the conditional, revocable right to view it... not to distribute it, though, of course; that's their thing.
Which is more frustrating: that they used your videos, or that, when educational videos are so popular, they didn't use MORE of your videos? Both are kind of a big oof.
Which is also such BS, because so many of these companies say they don't let non-copyright-holders upload content. If putting it on the open web means a copyright holder needs to authorize it, then why would someone viewing it be allowed to use it however they want?
My entire life I was told that "just because it's accessible on the internet doesn't mean you can use it any way you want, because those things are often protected by copyright," and now the companies that strictly enforced copyright law (sometimes overzealously) are saying "actually, we can do whatever we want with copyrighted materials we don't own," because they figured out a way to make money off of it. Before these companies were investing in AI, their standard policy would be to issue cease-and-desist letters to anyone doing exactly what they are doing now; the only difference is that now they are the ones who stand to gain from a grey area. It doesn't feel remotely fair because it isn't.
No, they're not saying they can do whatever they want with it. They're saying that the thing they want to do with it is not covered by copyright law, and also not within the spirit of what copyright was intended to protect. Copyright is about duplicating a work, either as a whole or in part, and about "derivative works", where the US Copyright Office gives examples like *translating a book, adapting a book into a movie, releasing a revision of a book, and making your own arrangement of a song*. You've always been totally fine with taking a Disney movie and studying the characters' proportions, the color palette, and how they draw eyes, and it's been totally fine to sell books describing this, or to put that data into a computer program and sell that. And as long as what you make doesn't start to encroach on being considered derivative, you've been free to make whatever you want that's inspired by what you learned from that study, both in a fuzzy "inspiration" sense and in a literal "I measured the body proportions of Disney characters" sense. Machines are automating this and becoming tools that can autonomously use this data, but what they're doing is very different from both the spirit and the letter of what copyright protects.
I'm going to advocate for one step further: none of these systems should be opt-out, they should 100% and unambiguously be opt-in. The internet currently is a barrage of "you didn't turn off us stealing from you, so we're allowed to steal from you," and that is not and never should be how IP ownership works. If someone broke into my house and then told me "well, you didn't tell me not to," that would be ridiculous, and for these companies to insist that that is the system we should all be working under is even more ridiculous. And anyone claiming that burying a sentence in a 500-page ToS waives all obligation to basic respect for other people and their property is wrong and should be laughed out of the room.
Yes, my exact thoughts. Plus, there are the creators who have died, who have lost access to their accounts, or who just one day disappeared from the online space. The first one in particular is very upsetting. Opt-in is the ONLY ethical way to do this.
Legally, it's not stealing. People are calling it stealing, but there is no precedent in law or society for AI training, and whether or not it's stealing has yet to be decided in court or in the eyes of society as a whole.
26:24 "go somewhere else" is the situation NOBODY is in. The size of these mega social media sites makes it impossible to have a thriving business "somewhere else." Artists have been screaming this for MONTHS, if not longer. We shouldn't have to submit to our rights being violated online that they wouldn't be violated elsewhere. If it's illegal for a company to make physical product using intellectual property they don't own, the same should apply to the digital space. Anyone trying to argue otherwise just stands to benefit from you rolling over.
The problem is, it's more complex than that. It ISN'T illegal for anyone to make a physical product using intellectual property they don't own. Fair use explicitly says copyrighted works may be used in transformative works. It could just as well be argued that the output of a genAI is so different from the source (in most cases) that it is transformative. It could also go that the transformative thing is the AI model itself, not its output, which is definitely different enough to be transformative. Under that interpretation, each output would need to be scrutinized as its own potential copyright violation. In either of these interpretations, no rights are being violated. It really does come down to how the courts interpret it. The slow arm of the law loses again to the lightning speed of innovation.
Exactly. We artists have been sounding this alarm for a while now. We can get into the weeds about technicalities and ToS phrasing, but that's ultimately obfuscation; legal casuistry meant to deliver to tech industry executives what they want from generative AI: to profit from the labor of creatives without having to compensate creators for their work. To generate "content" without having to hire creators; or at the very least, to remove what bargaining power creators have (their creations) and thereby make creators into unpaid interns at best. For most of us, that just means erasing any chance of making a living in a creative medium. The lawyers are effectively saying "it's okay to commit theft so long as it's rich people stealing from massive groups of people." I appreciate that Hank has a good relationship with many people at YouTube, but if the era of enshittification has taught us anything, it is TRUST NO CORPORATION, TRUST NO BILLIONAIRE.
I've gotten more job opportunities on the fediverse in 1 year (3-5 people asking me to work for them) than on Twitter in 3 years (0, a literal 0, not even for free), including the same year the fediverse got me those 3-5. We're just afraid of algorithms. The trick is that uploading your stuff somewhere else is cheap and not that much extra work (2 extra clicks once you have it set up), and contrary to popular belief, it's rather less stressful than just betting on a hellsite to help. It gives you a sense of an actual chance of living better, outside this shit, in a more personal situation, more like the old internet with the modern money movement. We can't escape PayPal, Patreon, and such, though.
Also, 'go somewhere else' is a legitimately impossible thing to do for video streaming above 360p. Mass-scale video is the one internet business that is legitimately extremely expensive to run; it's insanely unlikely anyone can _ever_ compete with YouTube short of having state backing (China) or incredibly extensive legal reforms (the EU, maybe?).
Love how these CEOs want to claim all this content is "fair use" because it's on the open web, but then will be first in line to sue an individual who takes the same stance when it comes to their own IP.
Why is anybody surprised? This is why we overthrew monarchies and empires, and why most of the world is some form of democracy, mostly republics. Maybe we'll do better next time, but the wannabe Emperors are getting too powerful now. Again.
I mean, the video creators did click the box that said they agreed to the contract that says YouTube can do all this stuff and more. They did ask already, years ago.
@@ReverendRaff That sort of logic only works in the world where putting "All content is fair use, no theft intended" in the description somehow shields you from DMCA takedowns.
@@Name-ot3xw it's wrong and hypocritical for them to do it, and most people understand that. If you want to be a bootlicker for these corporations, just say that 🤷
Well yeah, obviously - corporations have a right to profit. Consumers pirating = infringing their rights; companies stealing for profit = exercising their rights. Really, we should just give them our money for free, they deserve it
@@orterves While we're at it, we should also give them ~1/3 of our lifespan, neglect our relationships in favor of meeting the companies' demands, and educate our children to do the same! It's simply knowing our place, really. :V
@@broccoli9308 The philosophy behind this is that the movies being pirated were made by organizations with a lot of money, and have probably made back their money, so they suffer a negligible amount of harm (if at all, there's a debate about your potential as a customer there). Public internet content, in this case, the vast majority of it is people who have comparatively far less money, and did not create any of it for commercial reasons.
Yes, absolutely YouTubers should be allowed to opt out. In fact I will go farther: opt-out should be the default, and users should be given the option to opt in. I know, naive, right? 🤣
It's incredible the amount of people that suddenly want to restrict access to information. Do people tell themselves that they are not using any other sources for inspiration or reference?
@@broccoli9308 Restricting information would be taking down the videos or putting them behind a paywall; you're misdirecting the conversation. The people who want their AI to train on YouTube videos plan to make a lot of money from that. If you want to make money off of work someone else did, that absolutely warrants at minimum asking their permission. Some people might be down to let someone make money off of them even without getting paid a cut of the profits if they think the tech is cool, and that's their decision. Not everyone is going to be okay with that, as most people want to be compensated for their labor.
@@broccoli9308 I don't know how many times I've had to reply to this sentiment today. AI is not the same as people. There is a difference between people and AI. How on earth can you think that AI can be inspired? Inspiration is a uniquely human trait.
@@draconvarie AI in its current state cannot be inspired. You are correct. It's the person that controls the AI that provides the inspiration. It's just a tool. So honestly, we should be asking the question: does each individual person who is using any given generative AI model have the right to use the data in said model to create derivative works?
It is my absolute right to learn from any source I want, and I will never allow someone to opt out of being a source for my learning. My learning, or my machine's learning, it's the same thing. The only right you own is copy-right, and absolutely nothing is being copied and reproduced.
*_Give YouTubers an option to opt out of training AI models. I'm not a YouTuber. I just love watching YouTube, and I love the YouTubers I watch and want their work to be respected by Google just as much as I respect it. Google, you have nothing without these creators._*
Then we are all disconnected nodes... The internet needs to internet. "User-agent: * / Disallow:" is the way to go. Open Google's data for us to use as well.
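For context, the directive quoted above is standard robots.txt syntax: an empty `Disallow:` under `User-agent: *` permits every crawler everywhere, which is the fully open web the comment is arguing for. A sketch of both positions in a single robots.txt (the `Google-Extended` and `GPTBot` tokens are the published names of Google's and OpenAI's AI-training crawlers at the time of writing, but check current crawler documentation before relying on them):

```
# Fully open, as the comment suggests: an empty Disallow
# value means "nothing is disallowed" for all user agents.
User-agent: *
Disallow:

# Opt-out alternative: block only the AI-training crawlers
# while leaving ordinary search indexing untouched.
User-agent: Google-Extended
Disallow: /

User-agent: GPTBot
Disallow: /
```

Note that robots.txt is purely advisory; compliant crawlers honor it voluntarily, and nothing technically prevents a scraper from ignoring it, which is part of why opt-out schemes built on it are contentious.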
I'm not a creator and I still want an opt-out. I don't want these awful "AI Search" functions that are becoming mandatory everywhere and are worse than regular search. I don't want these awful AI ads and images everywhere. I don't want to support companies that use AI to save money instead of paying artists. And yes, every creator and publisher should have the option to opt out. And publishers/websites should not be able to opt in their individual writers and artists without consent. I'm starting to think I was way ahead of the game when I took part in a protest march at Google headquarters 10 years ago over copyright issues.
One way of reading this is that all these AI companies are killing the open web. If "you can be scraped" means "you can be trained on", then suddenly there's a new incentive to leave the open web. And as the whole internet slowly fills with AI sludge, it gets harder and harder to find genuine, good human interaction. They're killing something we knew had value and replacing it with something unproven that makes them money.
@@fallingphoenix2341 Yep. I'm spending more time on Mastodon, on a server hosted by someone I know, with strong privacy controls, where I can opt out of web scraping. It will never be popular because enough "influencers" won't migrate there - there's no algorithm to cater to their egos and funnel followers to them. But it's so much healthier for the end user. No ads, no Nazis, no AI, no algorithmic junk.
I do believe content creators should be able to opt out. The data is clearly valuable, and it is being used for training. Creators should get a royalty cut for helping to train, as an incentive to opt in. Seems fair.
They probably did it because the AI is dumb and thinks that if you talk about AI for a whole video, and even mention Gemini in it, then this must be the perfect place to advertise it.
@@tradutorajuliana Personally, I don't think they assume that. I think they know that nobody really wants that, but they *sure would love* getting paid for the stuff we all made, instead of us getting paid for it.
Google being allowed to use YouTube videos to train their AI because of the terms of service is a great argument for antitrust action to break up the company. It's anticompetitive if Google can use YouTube to train but Apple can't, but more importantly, it sucks that people who rely on YouTube for their livelihood have to agree to letting Google train GenAI or find a new job.
Anyone can train on YouTube videos. Do you see Google doing anything to stop anyone? If Apple owned YouTube, they would put an App-Store-esque fee on training. Apple's ethos is HEAVILY closed-source; Google is, and always has been, the opposite. Why punish the best actor of the bunch? (Whether these companies SHOULD be able to train on YouTube videos is a separate issue, but this hate toward Google is irritating when they are the LEAST value-extracting company of the big techs.)
@@flipro23 Did you watch the video? Google says they could sue Apple for breaching TOS if they scraped transcripts, because transcripts are not free to scrape, but Google's TOS also grants them a license to use YouTube users' transcripts. Being the least evil of the bunch isn't good enough in this situation.
@@Scarybug There's a difference between saying it breaks TOS and actually preventing scraping, like Reddit recently did. Your comment said Apple can't use YouTube. But they can. There are new generative video models every day, and most of them have likely scraped YouTube. Say what you will about the ethics of artist compensation, but you can't say Google is shutting out competition (like Apple does with their App Store, and they are getting hit with antitrust enforcement for that right now). If Google blocks web scrapers on YouTube, I'll change my opinion. But as of now I DO think they are the least evil (with regard to allowing competition), and that counts for a lot.
Name a more iconic duo than techbros and disregard for consent. All this "opt out" shit just screams "it's easier to ask for forgiveness than permission." Except no one peddling these models is bothering to ask for forgiveness either.
Absolutely. They should be significantly taxed, with that money going to projects that help fundamentally reshape society (as voted on through direct democracy, such as citizens' councils: an AI federated co-op, if you will). Companies whose business models aren't aligned with human values can't create AI aligned with human values.
Indeed. It is way too late to put the AI genie back in the bottle, but what we can do is to prevent a future where only a few companies and the governments have access to AI to do whatever they want with them, while 99.9% of humanity have nothing and are at their mercy. AI trained on public data should be seen as public domain.
"I cannot opt out of YouTube. ...I live here." This reminds me so much of people talking about changes in their beloved hometown. I hope your digital hometown will be ok
It's an interesting analogy, but towns are owned and governed communally, by the people who live there. Meanwhile TH-cam is a private platform so neither users nor creators have any say on its governance. Which sucks.
It won't be, because the people "living" there won't actually do anything meaningful and lobby to change the laws to curb this sorta behavior. They expect lawmakers to wake up and do something about it, lame.
@@databoy2396 Cities and towns generally are actually corporations, technically speaking. But it's certainly different from a world-spanning media corporation.
There is a rather odd analogous situation. Back in the 1980's there were multiple lawsuits regarding strip mining. The problem was that in the 1800s coal companies had bought the mineral rights to large swaths of land that were then being used as farms. When those rights were sold the technology for strip mining did not exist. Therefore, those selling their rights very reasonably expected that the mining companies would, at most, sink a shaft to mine out any coal under their land. I believe access rights were included in the original sales of rights. Then, in the 1960's - often almost a century after the rights were sold - the mining companies could (and did) start showing up and destroying the land over which their mineral rights resided. In short, they destroyed the farms without permission or further compensation. The obvious issue is the landowners had never agreed to have their farms destroyed nor would that have been possible when they sold those rights. The mines made no attempt to restore the land after having extracted the coal. In short, the land had no residual value despite the surface rights having not been acquired by the coal companies. Effectively, they had acquired ALL the rights to the lands without paying for them. This is not about the EPA coming after the mining companies to rectify the environmental damages they had inflicted, which is also a thing. No, I do not know if there was ever a broad settlement on this but I do know that if you enter "strip mining lawsuit" into Google search you will find pages and pages of lawsuits. So, now you have AI developers "strip mining" the web. Some things never change.
AND not to even mention outright theft of land tirelessly built and stewarded for thousands of years by people whose work was not seen as valuable until its fruits could be sold for US dollars.
it’s so depressing to me because there really is potential for AI use as a tool like Hank mentioned in subtitling or translating or the YT recommended algorithm. but, companies are focusing on the generative AI side, which is exactly where that robbed feeling comes from I think. Feigned creativity from genuine creativity :/ I’m just hoping the inbreeding ouroboros of generative AI trained on generative AI will kill companies’ interest in it
32 of ours according to the Proof News search. 🙃 No me gusta. I mean, I love tech as much as the next guy, and my background is a master's in computer science with a specialty in AI, ironically enough right before I got into doing this 15 years ago. BUT so much of the way the data is used here for profit on the backs of others' work seems insanely problematic, with a lot of broad implications. Even in basic search, Google or Microsoft give an AI answer derived from others' work without sharing revenue with the sources the AI was trained on, potentially taking clicks away from people going to those sources themselves. There should definitely be an opt-out capability at the minimum, and really it should be the other way around: opt-in, with opt-out as the default. Yes, we learn from other humans for free. And yes, we create content toward this end as well, for humans to learn from for free. BUT it's a mutually beneficial relationship for all, and sustainable. An AI learning from creators' content is completely one-sided and even hurts the source. And, of course, an AI can learn in minutes from thousands of creators. Etc. Anyway, great video, and awesome getting more public attention on all this. -Daven
@@michaelmicek Derivative works only qualify when they don't reproduce copyrighted elements, and taking an entire video to do something like training an ML model without permission constitutes copyright infringement. It's similar to taking a song in its entirety and using it as background in a video without permission; sure, it's "derivative" in the sense that I added a random video on top of an entire song, but I can get copyright striked for doing that too.
It’s pretty astonishing how companies are able to escape legal responsibility for anything posted on their platforms on the grounds of “we didn’t make it” but then turn around and sell the content on the grounds of “it’s on our platform.”
WOOOOOOW! You hit the nail on the HEAD! That is an apt observation. Thank you for articulating it this way because those aspects hadn't occurred to me.
It's disappointing that you can't understand that both of those statements can be true. Your content is produced by you, and so you bear personal responsibility for the content of that material, but you are posting it to a third-party site, who as part of that agreement gains rights to your work. It's not that hard.
This was my first thought. It's a dangerous game Google is playing. There are loads of politicians who would love to revoke their Section 230 status, and this feels like an inroad for them.
@@anotheryoutubeuser142 Exactly. If you performed at a venue, you're responsible for the show you put on, but the venue's owner could sell a recording of your performance, because it's their venue. This is what YouTube is doing.
Four of my videos were used, I just learned. That's about 5% of a vlogbrother. I can live with that ratio, but am a bit miffed that my work is used without my being asked.
Months ago YouTube recruited creators for a "study". I was one of the selected candidates, and you had to pass certain filters, but only if you accepted their terms. I spent a few hours reading them and oooh boy. It literally said I was giving away my voice and face for their AI training study!!!!!
All TOSs and privacy policies are chilling when you actually read them. You should see what’s in the policies for doctor’s office portal software. If you mention it to your doctor they’ll squawk about HIPAA but then are surprised if you tell them what’s actually in it.
This is 100% a “steal first, ask questions later” situation. They know for a FACT this is bullsht, they’d never let another company use their own data like this without proper payment and consent, but went ahead and did it to others because they know, by the time some people can fight back, they’ve made so much money on it, it’s not gonna matter.
@@TallicaMan1986 Here comes the bootlicker. Are you going to defend Disney too? They recently tried to excuse themselves from a wrongful-death suit at Disney World because one of the victim's family members had Disney+, and the contract forbade them from suing. These are the types of companies you are willing to die on a hill for, debating semantics.
Reminds me of the saying that goes like "ask for forgiveness not permission." I guess the lawsuits signify that we are approaching the "forgiveness" stage.
But creators did consent when they accepted the terms and conditions lol This is not some grand heist. It's just people like usual not reading the terms and thinking they've been swindled
Not people, companies. Companies aren't people; companies don't _"want"_ anything. Yes, companies are run by people, but remove every greedy CEO today and new ones with the same motivations will take their place tomorrow, because companies survive on profit, on reporting growth. It's like how you can't fault mold for eating its way through your food pantry. It's just what mold does; it's just following its survival instincts as the living-death taxonomical nightmare organism that it is. A capitalist company is the same. The people don't matter, because there are enough selfish, greedy people in the world that it would never run out, and should a "benevolent" CEO one day be at the helm, the board of directors, shareholders, and the capitalist system in general would make their influence non-existent and their career short-lived. A publicly traded mega-company is like a superintelligence with profit as its reward function. It's not about the people; it's about the ecosystem of our global economy that provides the perfect conditions for these companies to capitalize.
Is this why the algorithm has been listing videos titled "why everyone should have a YouTube channel", "how starting a YouTube channel saved my life", "I post even if no one is watching"? Oh, they're watching alright. Thank you Hank, you and your team make this world make a little more sense to me. ❤
I think the important issue for most YouTubers to understand is that no matter how good your relationship is with YouTube, YouTube's relationship with Google/Alphabet and its partners will always be the more important relationship to them. No matter how well individuals at YouTube treat you, no matter how much they "get it" when it comes to creators, you will always come in last when it comes down to other priorities.
@@Leftistattheparty This is why workers should unionize, and maybe creators should start thinking about what that would mean for them, too. You vs. Google loses every time. Us vs. Google wins every time. We just need to coordinate and organize.
YouTube doesn't have a "relationship" with Google/Alphabet. *It is Google.* It is literally just one department of Google. It's like talking about your relationship with your left hand.
I love the term "different incentives". The best way for people to work together is by having the same incentives. Management gets along well with employees when incentives are aligned. When there is a disconnect, it's because incentives are different.
Idk if other people have brought this up or if you noticed, Hank, and just didn't mention it because the video is already fairly long, but Scott from NerdSync did a video on this topic as well. He went through the Proof News database and found that their videos that were scraped all had user-uploaded subtitles. They talked to some other smaller YouTubers and discovered that the few videos of theirs that had been used to train AI also all had user-uploaded subtitles. So it seems like the AI companies are targeting videos with user-uploaded subtitles, which is very interesting and might be another reason why educational content is being used so heavily to train AI: companies like Khan Academy, and you and John's company Complexly, have extremely consistent user-uploaded subtitles. Anyways, just thought you would like to know.
Wow, I'll have to check that out. That's so insidious, weaponizing an adaptive tool that so many people need and appreciate, given the fact that auto-generated text is often garbage. I'd be truly disheartened right now if it weren't for Hank and people in this community being the ones looking to take Google to task on this. They are the ones to take on Goliath and give him a proper trouncing.
This makes a lot of sense. AI researchers are aware of the "ouroboros" problem Hank mentioned, and this is their way of getting around it. Most AI-generated content has some way of marking it as AI-generated integrated into it (I believe this is usually done through a "fingerprint" from a model, although that's not required and not in the interest of these companies, so not all of them will do it), and then they don't use AI-generated stuff in the training data, only the real human-written words, images, videos, etc. Which means pretty much always breaking the moral trust with the people who wrote those things, and usually also violating copyrights.
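For what it's worth, the filtering step this comment describes can be pictured as a simple pre-training pass over the corpus. This is only a toy sketch of the idea: it assumes each item carries a detectable "AI-generated" flag in its metadata, and the function and field names here are made up for illustration. Real watermark/fingerprint detectors are statistical and much less reliable than a boolean check.

```python
def filter_human_only(corpus):
    """Drop items flagged as machine-generated before they enter a training set.

    The "fingerprint" here is just a metadata boolean; a real pipeline would
    run a statistical watermark detector over the content itself.
    """
    return [item for item in corpus if not item.get("ai_generated", False)]


corpus = [
    {"text": "hand-written video transcript", "ai_generated": False},
    {"text": "output of a language model", "ai_generated": True},
    {"text": "user-uploaded subtitles", "ai_generated": False},
]

clean = filter_human_only(corpus)
print(len(clean))  # 2 of the 3 items survive the filter
```

Note this only works if generators actually mark their output, which, as the comment points out, they have little incentive to do.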
The infuriating part is that they didn't even ask for creators' permission before using their videos for training. Creators being able to opt out is the absolute bare minimum that YouTube can do.
It demonstrates that the company only values creators' ability to make them money (primarily by capturing attention for ad revenue). The business has (nearly) no incentive to care about creators for their own sake.
It's infuriating the lengths companies will go to make people think that the thing they don't want to happen is not happening all while they're actually directly making it happen.
@@cuthbertallgood7781 Or that something might sound like a good idea but overall be even worse. At least currently, other than compute, everyone is on an even field. Would you rather only big companies and rich entities be in control of such tech?
@cuthbertallgood7781 Hiring people to make videos, art, music, photos, and computer code for the AI to train on is ethical; this isn't. AI is a tool, not a person.
In addition, I think Google is using university catalogues for AI training. I work at a LARGE (will not name) uni. Google is collecting books to scan for open access. I am pro open access, obvi, but a ton of these books are already available across the internet, like on the Internet Archive. Does Google just want to have their own version? I mean, probably, but it is ALREADY open access. We are picking hundreds of books per day to send over in wooden carts every month. Some of these books are not copyright-free yet, and Google told us that they want to have them "ready to go when copyright has cleared".
@@nexypl No, OP is referring to Google Books (distinct from the ebook store Google Play Books) project to scan and make searchable physically published books. Your thesis will be under whatever appropriate copyright applies between you and your university and should only be scanned if it's made public domain or your university has a specific agreement with Google or some other AI company
As a photographer, this is a situation we're dealing with too, and it has caused me buckets of existential anxiety. Our agency wasn't even considered when these companies just vacuumed up our work, just to make generated hands look a little less creepy. This is the zenith of the Silicon Valley concept of "move fast and break things", and I sincerely hope that it thoroughly comes back to bite them hard.
If someone came to you wanting to buy a license to train their (A.I.) models on your photographs, I presume you would be unable to agree until you had worked out agreements with every (human) model and landowner you had used in your work?
@@ps.2 Straw man, dude. That shit is a no-brainer, especially with models, since they license photographs of themselves to the photographer. The models own part of the photographs they've been photographed in. And the land thing: you can take pictures in public and you own the picture...
@@johngddr5288 Yeah you have a license agreement with the model. But does this license agreement include permission to train A.I. models? Or do you agree with Google that this is simply implied by language that was written before anyone really thought about this possibility?
@@ps.2 No. F Google and every company abusing a loophole created by a new technique that requires firmer clarification of copyright law. Just because Google thinks they're safe because they wrote "derivative" in their licenses doesn't mean people would agree to be exploited by a technology that didn't exist before. A case should be made that simply saying you grant them "derivative rights" does not include training AI models.
@@johngddr5288 No need to get defensive. I'm just trying to figure out how a photographer feels about the right to use their photographs in some new way, such as training an A.I. Would you feel obligated to get permission from the subjects in your photographs? Or do you feel that because they already signed an earlier agreement, that they would have no moral right to object to their likeness being used in this new way?
I love how the title is the question and thumbnail is the answer. No browsing ten minutes to find if the creator even answers the question during the video.
On a very simple, basic level: if companies can block your video because it has a song in it, or Disney can sue you for having Iron Man or a character similar to Iron Man on the banner of your company, then AI companies should not be allowed to take the entirety of your body of work to shove into their machines. You can't know exactly what an AI is basing a generated video on (companies bank on that), but you can 100% ask for consent from creators before scraping their stuff. If tech relies solely on the exploitation of others in order to function, then it probably shouldn't exist.
@@isaacbagley8211 I mean, sure, but that doesn't mean we shouldn't strive to stop it. Just because our agricultural system relies on human trafficking and slavery doesn't mean we shouldn't strive for a world free from slavery.
@@isaacbagley8211 Yes, but it doesn't mean it should be allowed. When the exploitation is this blatant and in your face, there's no reason why people shouldn't try and stop it from happening.
You're confusing copying ("has a song in it") with learning ("I learned from this song and created an original work"). You can learn from Disney stuff all day long, and all the AI companies have learned from Disney material. "Copy rights" protect against copying. They do NOT protect against learning, and we must fight to prevent the insane greed of people wanting to be paid just because someone saw their stuff and learned from it. My learning, or my machine learning, it's the same thing. Learning must be absolutely protected.
But those two things aren't the same. You using a disney song is infringing because the _outcome_ has copyrighted material in it. If an AI model _reproduces_ your work (or something similar enough for copyright law) then that'd be a good comparison. But an AI model that is trained on copyrighted material and produces content that is different enough (by the same legal standards that regulate derivative work) isn't breaking copyright in the same way. I'm not saying that training generative AI on copyrighted works is good, but the argument you made (and that I see lots of people make) doesn't make sense.
In some ways I agree with him, but not when it manifests in this way. The hallmark of a good government in the internet era is agility. Let's clamp down on this before we see what happens. Because if we wait we may _really_ not like the result.
Yeah, that's definitely not true. At all, except for things like public domain or Creative Commons (CC) licensed content. Funny thing is: he rants in this video, but they chose to put CC on these videos, so he's the exception among the many other creators, who left the default Standard YouTube License and retain far more rights than he does under the CC license he put on it!
Hank I just wanna say how much I appreciate whatever the opposite of clickbait is, because your thumbnail absolutely killed it. In the hall of fame alongside Veritasium's salt lamp video you go
And if you want, you can just take his video and do whatever with it; that's in the Creative Commons license he chose for the video and, at first glance, for all the videos on this channel. The funny thing is, he's the exception to the rule; those other YouTubers all chose the default option: the YouTube Standard License.
I love this take, I feel like it's realistic and measured. Google/YouTube does have an opportunity to just add an "opt-out" button for creators, which 99% of people would leave untouched, but which would sanctify all the YouTube content they're training on. It sounds like a hassle to redo costly model training, but it's still early, and getting data permissions in order at this point would give them a huge advantage once the courts come down on AI companies.
As an independent creator, I don't want ANY of my content used to train ANY generative models without my express consent, and I should NEVER have to "opt out" after the fact: my content should only EVER be accessed and used for this purpose if I opt IN FIRST. ESPECIALLY if there's no way to guarantee that my content can be expunged from the model after I opt out. Otherwise, we're agreeing to a system where YouTube and anyone else is free to steal all the content we've ever made, and only have to ask if they get to *keep* stealing all the work we do from the present onward. My content should be removed from all of these models, and these companies shouldn't have the right to access it as training data at all without my explicit and informed consent. Period. Edit: all of the replies drawing an equivalence between human beings learning and taking inspiration from creative works, and generative models ingesting creative works as training data, are foolhardy at best or bad-faith propaganda at worst. Google training a generative model on your work and spitting out lookalike content is categorically NOT the same as a human being taking inspiration from your work. It's corporate IP theft at scale. You anthropomorphize corporate AI models at your own peril. And, by the way, human artists credit their inspirations.
I've got to give you credit: this was a solid video. The fact that you actually mentioned the EULA and broke down critical sections like "sublicense," "transferable," and "royalty-free" is something most people skip over. It's wild that nobody reads these things; 99.9% of us just scroll down and hit "Agree." But you're right; this stuff is universally tucked into all EULAs. And as long as there's a door with a key somewhere down the line, your data and personal work are never really protected. If that key gets handed off, even just once, whoever holds it can access and share everything. And let's be real: none of us agreed to this new wave of data scraping for AI. Back then, we didn't even know what was coming. It's kind of like when we didn't sign up for the CIA and other agencies recording our calls and texts for "national security" after 9/11, but they still did it. The same loophole-y, vague lawyer-speak that lets them justify that is now letting companies scrape our data for AI. Whether you're just a regular viewer or someone running a small business, we're all stuck agreeing to this nonsense because, in the end, we all just want to get our content online and maybe make a little praise and money if we're lucky. The truth is, as long as companies can find that "key" (Reddit and the like), backdoors will always exist. Sure, opting out, or better yet opting in, would be great, but the laws and EULAs are always going to be written in favor of those with power. And let's face it, government and big corporations aren't going to change the rules that benefit them. So, what's left for us? Either we deal with it or we start hosting everything on our own servers, behind paywalls, with strict "do not scrape" policies; good luck getting the average person to do that! It's just easier to scroll past those long agreements without reading a word. And yeah, this is coming from a dyslexic who actually *did* read the EULA with some AI help, and I still don't agree with it.
I especially didn't sign up for that one site trying to steal my character and let others use it. No thanks! It's frustrating, but as individual users, our hands are tied. Until the big players decide to make changes, we're stuck with this reality. The end, at least until the next convoluted EULA update drops.
Yes, Google is training on your videos. The New York Times reported on this months ago. They found out OpenAI was training on YouTube videos. They also found out Google was reluctant to do anything about it, because if they called out OpenAI, it could expose the fact that Google was also training on YouTube, which is against Google's own terms of service. All Google said to the New York Times was that if what they uncovered was true, Google would pursue AI companies to the fullest extent of the law, since it was against its own terms of service. Months later, Google has not pursued any legal action against AI companies.
If you didn’t know, Google is using the scapegoat excuse of “We didn’t let them; a third party did.” They claim they handed the key to one third party, who then passed it to another, and so on, leading to the situation where none of our data is truly private. If we really want privacy, we’d have to start using encrypted scrambled text. Unfortunately, even that wouldn’t hold up for long. Sad to say.
HEY SO! ARTISTS have been talking about this for the last couple of years. This has been decimating 2D concept artists and visdev artists, and companies want to create internal models based on the work of people who signed away their created works to the company, in order to then internally replace them. They did this without consent or compensation already, and now they're trying to backpedal to just going "well, now we have the baseline foundation, but will you help us finish it off so we can remove you all from existence?"
Hello. It seems you have added an inappropriate and unnecessary word to your comment. The word in question is "don't" . We have removed that for you. There, isn't that better ? 📈💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵
Wow, 18 years is incredible. I've been watching your content for at least 10 of those years; I even remember creating an account just so I could comment on SciShow videos. I chose to do a chemistry degree after I watched Crash Course Chemistry. The work you've done is invaluable and the fanbase is ready to defend you.
The phrase "derivative works" is doing a LOT of heavy lifting in the ToS. I think it's super unfair for a company to one-sidedly profit from your hard work without even giving you an opportunity to disagree. This is an excellent analysis of the situation! Love the videos
Class action idea for ya: If your account existed before November 2022, Google has performed a bait-and-switch on the terms. Generative AI was not available to the general public before then, and it is reasonable to believe that it would **not** be covered in that license. ANY content before November 2022 should be off-limits, and users should have the right to be compensated REGARDLESS of whether they know Google used it, unless Google opens up its usage to a third-party auditor to calculate who is part of the class, at Google's expense.
@@everettvitols5690 available to the general public and with corporations created around it? Being technically correct isn’t correct if all context is ignored.
I really, really appreciate you adding the disclaimer that "AI ≠ generative AI". It's been so frustrating working as an academic in machine learning and trying to have public conversations about what generative AI is, what it can do, how it works, the associated ethics, etc., with people parroting blatantly incorrect nonsense. A nuanced worldview doesn't exclude feeling righteous indignation at its problems; if anything, it informs and justifies it.
I am not a content creator, but the work of so many creators on this platform has enriched my life. Firmly believe creators should have a choice about this--and it should be opt IN, not opt out.
I'm sick of companies using TOS obfuscation to make us "consent" to things they know full well we wouldn't agree to. It should be opt in, not opt out and certainly not no choice at all.
@@gavinjenkins899 WE didn't. Cause we aren't machines that ingest and regurgitate things with a % more one style to another by taking in every possible bit of information and repurposing it with the express purpose of feasting and leeching off of the source material until it is unable to support itself. Hope that helps.
@@zach7 We literally. Literally. Are machines that ingest visual information and regurgitate things with a % more one style [etc etc]. So no, that doesn't help, since you just equally described human brains. Except for the "unable to support itself" part which doesn't describe either AI or humans, and thus was just off topic. The AI works the same way your brain does, it's called a "neural network" for a reason, it works pretty much just like neuron systems do.
@@gavinjenkins899 I share your perspective about us doing the same thing. If one person can do something better than another, like a person with a photographic memory remembering the pages in a book, do we disallow them from looking at books because they will remember them too well and may harness that better recall to write a better book in the future? I think the answer in that situation is: of course not. On the other hand, what if a program is taking a photograph of every page in all books to create a better "on demand" book than any human could possibly hope to write in the future? This program would only financially benefit the group of people who created and implemented it. The program is not a sentient being but a tool that was used to take opportunity away from all others. Is that different? I think that it is. Should all of this result in an AGI that is not the tool of some for-profit company, then I think however it wants to learn is fine. I do not think that is what is happening right now. These companies are making tools that strip economic opportunity from the rest of society and then funnel it to themselves. There is no single superior entity here doing something better than another. What we are talking about is a tool built from the work of many that will result in opportunity theft and will benefit a select few. You may argue that this is fine, but I do not think it is a very wise move for a proponent of the well-being of humanity. There should be some conversations on whether now is the time to seriously consider universal basic income, and whether everyone needs to become tradesmen and farmers from now on. Will everything else be handled by for-profit AI companies for the foreseeable future? I know my advice for young people has changed in the last several years.
@@gavinjenkins899 Nope. You have a fundamental misunderstanding of how human brains work. Humans can’t make a 1-1 replica of someone’s art. AI can and do, to the point where artists’ watermarks very often pop up on AI-generated images. However, this goes beyond just art; these AI companies are stealing valuable data that they do not have permission to take. Saying AI works exactly like humans is a very poor, weak excuse that does not hold up in reality. AI are not human; they are a product. Billion-dollar products, in fact. If you take someone’s intellectual property and/or data to create a product without their permission and without compensating them, you are stealing. Stealing. Full stop. You do not get to take things for free. That’s being a thief. I should think that’s a very simple concept to understand.
I think this really boils down to the fact that generative AI is a product. It's not a person; it has no rights or legal autonomy. It is a tool built by a large tech company to sell to other companies for a profit. When people make movies, everyone who participates in the making of that movie gets paid and credited. When you build a house, every piece and laborer gets paid to build that house. The ONLY industry that gets away with theft of labor, code, and intellectual property is Big Tech. Microsoft did it to put together DOS. Facebook did it to create user profiles for students who weren't using their service. And the list goes on and on. AI is just the latest caper by big tech to unlawfully steal people's property and information, repackage it, and sell it as a product to someone else. The problem is that governments don't know how to handle or classify data, in part because they themselves LOVE the ability to buy it from these massive monopolistic corporations. But it's a total violation of privacy, a massive security risk for citizens, and just a blatant violation of copyright law. Data is one thing... but YouTube videos aren't data, they're copyrighted works. Nobody has the right to use, copy, or distribute a copyrighted work without express permission from the author. If AI were being trained on Mission Impossible 3, you bet your bottom dollar the MPAA would be suing these companies for millions of dollars in damages. Y'all need a class action suit. Because YouTube/Google is taking advantage of its position as a platform to steal from creators. And if they don't get hit with a hard legal bat now, they're just going to do worse in the future.
YES!! This! There are so many comments trying to equate human learning with AI learning. They are not the same and should be looked at differently. AI is so powerful that we cannot compare it to how humans learn and build on one another's knowledge. Arguing that AI should be able to scrape all the available knowledge on the internet because that's how humans learn is a ridiculous comparison.
I'm a bit of a writer and this last year I've struggled to put any words down. My whole corpus has already been eaten to feed regurgitive AI (on Google and Microsoft clouds) and I just hate that anything I share going forward is gonna be too.
Thankfully, none of what these AIs could come up with would be on par with even the worst writer. They don't understand what they're saying, so continuity and psychology are extremely inconsistent, and you just end up with something involuntarily goofy. Not saying there's never gonna be an AI able to do that, but LLMs aren't the ones that are going to eventually do it. Their very structure prevents them from understanding what they are outputting. If they're able to "understand" something, it's more like an instinct for grammar, not for meaning. They're good but not great at writing short posts to pretend to be a regular user on a website, but they can't be good at creating a script for a movie.
@@Maverick_Mad_Moiselle while I agree with you it's less that I'm worried AI will do better and more that I just hate knowing my work is being stolen to enrich the rich and devalue the creative. I put a lot of myself into my writing and it almost feels like that self is being assimilated into this amorphous mass of corporate greed.
As a visual artist and first-wave victim of gen AI training and market disruption, I was very disappointed by several big science channels (not yours /nevermind, edited below) as they were using AI thumbnails and illustrations in their videos. Some of them were very eager to defend their actions as they (probably) felt safe from scraping and training. This situation shows nobody is safe; every domain is on the chopping block. I wish we could all get over this new type of exploitation and become more united as creators. Good luck! Edited to add: these last 2 days, 2 video thumbnails on the SciShow channel looked like they were AI generated. Both were changed after a few hours. Please stop, if you are doing it.
Problem is the viewers don't care, because the creators didn't care when THEIR jobs were on the chopping block. When it was programmers, cashiers, drivers, or factory workers getting replaced, it was "progress," and everybody was told to become a creative. "Robots can never take our jobs." There's a level of hypocrisy in the creator mindset: content creators have no problem using AI based on other arts they don't participate in, because they don't actually care about anybody but themselves.
Agreed. This is something that needs to be fought, and Hank/people in this community have shown that they can do it. The commenter saying that the creators didn't care when others' jobs were cut is being disingenuous. This platform, and the creators here, is where I've usually learned about these incidents, often including what action can be taken or how to help. There is a large section of content creators who have been supportive of workers, unions, and small businesses/sole creators. Greed and exploitation among these monopolies and conglomerates have reached staggering, pathological proportions within the last 20 years. It remains to be seen whether we stop the combined Handmaid's Tale and Rise of the Machine(s)....
@@erinmac4750 Disingenuous? No, it's reality. People are so out of touch that they don't realize these creators are only making noise because they're afraid of losing their jobs. Fact is, these guys aren't talking about how AI is being trained on the output of work-from-home employees, which is allowing companies to start replacing workers. Why aren't they fighting against that? Because they don't care. There are OTHER uses of AI happening right now that should be talked about, but creators are only focused on "it's stealing my art" (when it doesn't really even work that way) instead of discussing everything happening with AI taking jobs.
@@SherrifOfNottingham We are all getting shafted. Let's not fall into a classic "divide and conquer": all human output is shoved into training. We all need to recognize that, get educated on the tech, communicate our needs, and propose policies. Together. Personally, I am more read up on AI image generators, and I find musicians and YouTubers adorable saying the same things I did 2 years ago while they catch up on training, datasets, neural nets, and diffusion models. But we need each other. PS: your complaints are totally legitimate
I actually was a little bit surprised. I think the fact that they won't say it out loud indicates that it matters whether or not they are actually doing it.
@@vlogbrothers What I find crazy is that Google can change the TOS whenever they want, but the people who are bound by that TOS can't change it at all. If we say that training an AI is not part of the TOS and the TOS needs to be updated... radio silence
@@IceMetalPunk Well, the Disney library is publicly available and free! On pirate websites! Therefore, it's not piracy! Cue the tone-deaf reply that doesn't understand humor and just loves the taste of boot
@@thedarter 🙄 You and I both know there's a difference between "someone stole it and made it available for free" and "the person who made this made it available for free". Humor's fine, but if you call that humor, it's about the same as those conservative, "anti-woke" comedy specials: it misses the mark and is being used to make bad points.
What upsets me the most about generative AI is how suddenly it has seeped into everything, real life and internet life. No matter how much I want to, I can't avoid it. You can't scroll through Instagram or Facebook without coming across AI-generated content, and so many people fall for it because it is never, ever tagged as AI generated. I worked in a primary school, and so many of the teachers so quickly started to rely on AI for word mats and picture inspiration and even lesson planning. People now use it to write essays for them at university. You are not learning if you are getting a machine to learn for you! The whole point of university is to broaden your mind and LEARN! Companies suddenly don't need to hire as many writers because an AI programme will write it and then one human can fact-check and edit it all. Voice actors are getting replaced with robots. Etc. Etc. Generative AI will make a generation that is completely incompetent and unable to think for themselves. I feel so sad for the kids having to grow up with this. Generative AI is like the dinosaurs in Jurassic Park. They "were so preoccupied with whether or not they could, they didn't stop to think if they should."
this is what scares and upsets me. it feels very dystopian, and it seems that people are just like... okay??? with kids not learning basic things. reminds me of an article I read a few years ago about computer programming classes needing to do an intro to file directories because kids grew up so reliant on search that they never learned how to follow a directory, which is how computer programming works. to take that google commercial during the olympics as an example... so instead of sitting with your kid and helping them write a letter to their hero, you're going to sit with them and tell google to help your kid write a letter. which means your kid never has to learn how to write a letter........ like??? not only is it soulless and replacing human parenting with literal google, but also it's doing the kid a disservice. how are people so okay with this lazy attitude. furthermore, you have to go back behind genai to make sure that what is there makes sense, because a lot of it doesn't. so then you're editing it when you could have just done the work??? idk, seems so crazy to me. and like I feel like we're all just sitting here pretending there's no issue when like.... if you think through "google teaches my kindergartener!" that sounds fucking terrifying!!! and immoral!!!! and BAD!!!!!!!!! like????????
@@zobothehobo You make so many good points. I can't understand the complacency either. Like with your example of gen AI replacing parent/child bonding time. That's a valuable moment cementing that relationship. That's the kind of love we remember and carry with us through our lives. The kind of love that makes us human. How are people okay with letting what, essentially, makes us human be stripped from us so easily? And yeah, it might be easier to let an AI write a letter for you, but writing a letter shouldn't be a hardship. It should bring people joy, like any artistic pursuit.
@@hcstubbs3290 "How are people okay to let what essentially makes us human be stripped from us so easily" THIS especially when you think about the first piece of culture or society that we did after becoming biologically modern humans... was create art. Art is human. Humans are art. It is a part of who and what we are as animals. so removing art from humanity is literally BAD like it's never been that way!! we know how important creativity is for mental, emotional, and physical health!!! and yet?????? like?????
My main takeaway from this video is that Google is a monopoly and should be broken up. The same company can't own the biggest search engine and YouTube, and now create its own AI tools using the YouTube data. It's a clear conflict of interest that YouTube gives the data to Google while not selling the data to competitors, which then hurts the creators, who should get their share of the profits.
The only greed here is from "creators" who want to be paid if I, or anyone, learn from their videos. Creators should get NOTHING from learning. Absolutely zero. Don't like it? Don't release anything into the public. Copyright only gives you the right to control copying, not learning. People need to think about a world where people can control what you learn and sue you if they think you learned from their material.
It's not a conflict of interest, because none of their divisions would have an interest in doing otherwise. Users are not the customers, our attention is the product. Videos are how they acquire the product. So, our interests aren't ever a part of the decision-making process. However, that is irrelevant to their monopoly status.
@@cuthbertallgood7781 You are deeply misinformed if you think we live in a world that doesn't control what you learn. Machine learning algorithms aren't people. They're commercial products that create commercial products. If that commercial product uses other commercial products in the process of creation, then that's a very clear use of intellectual property for commercial use, which is very much subject to copyright law. This isn't about controlling what you learn. It's about preventing the literal theft of intellectual property by multi-billion dollar corporations.
Could this be a QR code that links to a universal legal document? I had this thought that I would want to print it on masks and/or patches when I heard they were training with security camera data without telling people.
I'm an academic researcher and spend most of my time critiquing "AI" development and deployment ("AI" is a confusing term; large language models and generative image models and so on are not intelligent, they are just quite complex statistical models of limited and necessarily biased data sets). This is a really valuable insight into the artists' perspectives, and I'm really heartened by the fact that YouTubers are catching on to this use of their "content". Thanks for drawing more attention to this - I've been watching Vlogbrothers since I was maybe 13 years old and could never have anticipated that I'd ever get to cite a video of yours in an academic article.
It is also important to note that WE are the ones who provide this valuable data, including our personal data for ads. Without us giving them this information for free, these companies would be worth 0 dollars.
To make Reddit's nonsense worse - I was one of those people who left Reddit in objection to their API debacle. I purposefully (before the TOS changed) used an app to change all of my posts/comments to gibberish. Then deleted my account, with a note saying why I was deleting it. Three days later, I searched for something I knew would be one of my posts. Google found it. Okay, fine, the crawler just hadn't updated to see my gibberish-replacement. Let's follow the link aaaaand-nope. Reddit restored all of my posts/comments to be their last version before I swapped them for gibberish. Now all attributed to "[deleted user]" instead of my user name. But they still exist. And since my account is deleted, I have no way of editing them again. So I have no way of removing my content from their just-added-to-their-TOS AI training provision. (Although I enjoy that their User Agreement says "if you don't agree with the new terms, just stop using Reddit." - that doesn't stop your *CONTENT* from being there!) I have sent multiple emails to Reddit support, demanding that my copyrighted content be removed from Reddit, to no avail. (I haven't even gotten a single reply, to the multiple different attempts at multiple different emails; I've filled out different web forms, I've even created a new "created just to complain" account, and have gotten zero replies to anything I've sent.) I'm a single individual, and can't exactly afford to hire a lawyer to sue Reddit, but I guarantee there are other people in my position.
The fact that Reddit can get away with shit like that, when all the little guys on earth have to quiver in fear that their protected work will get a DMCA takedown they can't fight, is just despicable. Our current copyright system is here to please Disney and maybe two other people, and leaves the rest of us to suffer
I don't know. I think it's interesting that they agreed to granting YouTube a transferable commercial license in perpetuity and people didn't understand what that meant. It's even more interesting how they're acting like it's something new and will be treated differently by the courts in this instance. "Transferable commercial license in perpetuity" is very well defined by the courts. It's going to be interesting seeing people try to prove that the license is not being used appropriately when it is in fact being used for commercial means. It's going to be sad when the IP-lawyer YouTubers hop on this and give a little insight, because this is not going to go the way a lot of these creators hope it will. It sucks, but it is what it is. It's definitely not like how certain movies never had rights negotiated for future formats. It's completely different. And it's going to be interesting seeing a lot of people swallow this bitter pill.
I know nothing about how YouTube will use the scraped data from videos, but today I did a Google search and the text in the "generative AI overview" wasn't just a reinterpretation but an exact copy of information in the top sponsored hit. Not a training set; the exact words were lifted.
John's taking on the biggest pharmaceutical companies to help them do what's right, Hank's taking on the biggest tech companies to help them do what's right. love to see it, proud to be here
*To _make_ them do what's right. Under capitalism, ethics will not (arguably cannot) be weighted more heavily than profit. 99% of the time companies will only do what's right if they think it will positively affect their stock price or bottom line.
@@silverandexact Yup! Regulation is necessary. Best thing you can do is find a useful piece of potential legislation and tell your representative how important it is to you.
@@silverandexact i hear what you're saying. i'm taking a cue from john/nerdfighteria and framing it as a collaborative effort with the companies in question because i think that is likely a more effective strategy to get the companies to change their practices/policies
Now we need two heroes to take on the biggest food companies, so we can eat real food again, and the energy companies, so we can have fewer wildfires. Also so we can have a YouTube version of Captain Planet
As a master's student whose work is in this area: /*furiously taking notes*/ Also a small asterisk: a lot of the data that notable LLMs have trained on is pirated text that was aggregated, and they reference it in their papers as a "public dataset" (e.g., Facebook and Books3). Therefore, in my opinion, if they're already willing to admit they're using illegally obtained data for generative AI, we should already assume they're doing things that are in the "grey area".
Thanks for making this video! I hate that I can't go a day without hearing about or interacting with awful AI content. I absolutely agree that everyone should have the choice to opt in or out of these learning models.
I'd like to opt out of taxes, and speed limits, too. Too bad, doing so would infringe others' freedoms, so you don't get to. Banning others LEARNING from your art is thought crime, and an absurd and draconian concept. One which you were never restricted by yourself when you were learning art. Did you have to respect Picasso "opting out" of you learning from and training on his art? No. So why do you think you deserve protections that you never afforded other artists yourself?
@@gavinjenkins899 But AI doesn't "learn." It copies. It copies every stroke you make, every painting that you spent hours on. And it does that not to express itself, like a human would normally do with art, but to replace the artists it stole from. Let's face it, generative AI is being used to replace artists. AI has great potential to be used for good, but this isn't it. It's the same concept as tracing art. Tracing art and claiming it's yours is theft, because you copied the strokes. That's what AI does. AI can't create new strokes. It can't create; it can only copy
@@jeSUS-wp2eg Nope, not only does it not copy, but it's physically IMPOSSIBLE for it to copy. Stable Diffusion has 10,000x less memory than it would take to store its training images, and even that's only if it somehow needed zero space for the connections and rules. "01001101" - that's about how much data you have per training image. Explain to me how you can remember, in order to "copy," a several-megabyte training image with that many 1s and 0s? You simply don't understand how the technology works, and you shouldn't be arguing about rules and legislation until you learn the basic facts and how the thing works first. You have homework to do.
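[Editor's note: the capacity argument in the comment above is back-of-the-envelope arithmetic, and it can be checked directly. A minimal sketch in Python, using round ballpark figures (roughly 1 billion model parameters and roughly 2 billion training images; both are assumptions for illustration, not exact specs of any released model):

```python
# Rough capacity check: how many bytes of model weights exist
# per training image? (All figures below are assumed ballparks.)

model_params = 1_000_000_000        # ~1B parameters (assumption)
bytes_per_param = 4                 # float32 storage
training_images = 2_000_000_000     # ~2B training images (assumption)

model_bytes = model_params * bytes_per_param
bytes_per_image = model_bytes / training_images
bits_per_image = bytes_per_image * 8

print(f"{bytes_per_image:.1f} bytes (~{bits_per_image:.0f} bits) per training image")
# A typical source image is several megabytes, so storing the whole
# training set verbatim is impossible. Note this does not rule out
# memorization of individual, heavily duplicated images.
```

With these numbers the model averages about 2 bytes of capacity per image, which is the "01001101" point; the caveat in the final comment is why watermarks can still occasionally surface.]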
@gavinjenkins899 Humans use their life experience including art they've seen to make art. AI uses code thrown at it to display images and text on demand, and those productions are not art. You can use my human-made art to make more human-made art. It's physically impossible for an AI to produce something that can accurately be called art, whether or not it "learns" from human made art
@@Natalie-101 Yes, like you just said, you use art you've seen to make new images. Yet you do not ever ask permission or pay fees for doing so. So you're a hypocrite. it's pretty straightforward. You've offered zero rationale for why AI should be treated any different just "because it's AI and not a human", which is of course meaningless and circular. "AI should be treated differently because AI should be treated differently" 🙃 You have no argument.
The fact that EU legislation (and probably similar in other countries) is coming simply puts a timer on how fast they need to scrape everything there is to scrape. I would not count on any "opt-out" feature until they are absolutely forced to add one by law. What many people fail to see is that human-created content just became a heavily sought commodity, especially pre-generative-AI content, which is definitely not "tainted" in any way. We've known for a long time that trying to train ML models on ML-generated data doesn't work; it just reinforces the bias too much. So what all those Big Tech companies need is human-generated data for training. Paradoxically, the introduction of generative AI actually made such content harder to get: lots of people now use ChatGPT to ask questions instead of posting on online forums, and those forums are also flooded with ChatGPT-generated answers as well. Is Google using YouTube content? 1000% they are. That's a "competitive edge" they have over the competition, and they are definitely using it. Same as I'm absolutely sure Microsoft uses private GitHub repos to train its Copilot, and that Amazon is doing the same thing with anything Alexa records in your house.
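[Editor's note: the claim above, that training on model-generated data "reinforces the bias," can be illustrated with a deliberately simplified toy simulation. Resampling a pool with replacement stands in for retraining on a model's own output; this is a sketch of the mechanism, not a claim about any specific model:

```python
import random

# Toy "model collapse" demo: each generation, the new training pool
# is sampled (with replacement) from the previous generation's output.
# Anything rare can be lost in a generation, and once lost it never
# comes back, so diversity can only shrink.
random.seed(42)

pool = list(range(1000))            # 1000 distinct "ideas" in human data
diversity = [len(set(pool))]
for generation in range(10):
    pool = [random.choice(pool) for _ in range(len(pool))]
    diversity.append(len(set(pool)))

print(diversity)
# diversity is monotonically non-increasing across generations
assert all(a >= b for a, b in zip(diversity, diversity[1:]))
```

Each new pool draws only from the previous one, so its set of unique items is always a subset of the last generation's, which is why fresh human-generated data stays valuable.]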
OK, I'm feeling really old now. I feel like this all ties in with the problem some of us older folk had when the word "content" started to be used for creative work. Calling it "content" divorces it from the labor that produces it, we thought. "Content" is just stuff out there. I know younger folks haven't necessarily seen it that way; you all understand that "content" is creative work. So maybe that's a moot point. But I'm also really concerned with the tendency inherent in our social order / market economy for everything to get shoved into the private sector. I may not be stating this correctly, since "private" can simply mean that works are owned by their authors; but to me, putting it online puts it into the public square, in much the same way publishing in books, magazines, or newspapers does. Yes, there's some element of "private" involved, but once ideas are out there, they're available to the public. And that's a really good thing. But this looks like it could force the creation of something... new? to replace the public square. Something we've been seeing in terms of literal geography & the built environment, as public spaces give way to private spaces, including private parks technically open to the public, but certainly not in the same way an actual public park would be. We saw this here in Detroit in a different context. Our art museum, the very much world-class Detroit Institute of Arts, was jeopardized during the city's bankruptcy because its collections were owned by the City of Detroit, and that meant by the People of the City of Detroit, when it comes down to it. It was a very public space. The city's creditors (who, to cite Oscar Wilde, clearly know the price of everything and the value of nothing) claimed they should be able to force the sale of the DIA's collections to satisfy the city's debts. Legally, I suppose that could make sense. Ethically and, well, culturally, absolutely f'ing not.
And there was even an op-ed in, I think, the LA Times arguing that of course the DIA's collections should be sold, so that the art could go to the coasts where people actually deserved to have it. (We're used to that kind of shade here; hence the clothing company, Detroit Versus Everybody.) Luckily, the relevant staff at the museum (one of whom is a friend of mine) worked really, really hard, long hours to save the art. The solution turned out to be the establishment of a private, though thankfully nonprofit, foundation, and the ownership of the art was transferred to the foundation. And the great experiment of publicly owned art came to an end, despite nothing really changing on a practical level. The important thing, keeping the art here, was achieved: even Detroit Public School children, who the good people at the LA Times think don't deserve art, get to experience it, either by just stopping in or through interactive, educational programs provided by the museum to schools. All that to give an example. I worry that by focusing on what's legal and profitable rather than what's ethical and healthy and supportive of human flourishing, we're moving to a place where everything will have to cost money to be accessed by the public, since that will be the only way to protect it. Thankfully, the DIA managed not to do that, and maybe something like a private foundation, which would have the kind of legal standing individual people don't seem to have anymore, could emerge to help in many situations. I'm a theologian; I don't really know all the legal ins & outs, but I know unethical when I see it (as does everyone, really, if they're not deluding themselves), and I know detriment to human flourishing when I see it. And I value human flourishing far, far more than legal technicalities that allow people with money and power to exploit systems and people to increase their money and power. Here endeth the screed.
Well said! I had no idea the DIA was caught up in that. Stockton, CA went through a similar bankruptcy, but there was not one mention of local resources that might've been affected. Now I'm wondering if I shouldn't revisit that part of my city's recent history, not that its history and preservation of community assets has been stellar....
@@erinmac4750 I went to grad school in Berkeley (at the GTU, not Cal) & lived in Oakland & worked in SF. I don't think I ever made it out to Stockton. I'm sorry to hear you all went through bankruptcy too, though.
This is very well written; it's nice to read a very thoughtful take on this problem that it feels like isn't being talked about enough, but it's making me sad that this comment is already being added to a LLM's data set too.
Who remembers when you rented a movie from the video shop and there was a little statement before the movie about piracy? "You wouldn't steal a TV, would you?!" The tables have not just turned - they've been disassembled, put in a video box, reassembled elsewhere, and a big "shush, it's for science" sign put in its spot.
Person: Hey, does Google train Gemini on YouTube videos? Google/YouTube: Rambles vaguely about TOS. Person: So you didn't say no, which basically means "Yes", you just don't want to say "Yes".
Rambles vaguely in court about granting a commercial license in perpetuity. Commercial? Commerce. Used to generate revenue. Google will use the AI to generate revenue. Commercial licenses will cover training. That word "perpetuity" carries a lot of weight.
Oh, god. It sounds like the preds who justify their behaviour by saying "but she didn't say no". As if anything that's not an emphatic "no" is automatically permission to go ahead 🤢
@@TheKrispyfort OMG Truth! People don't recognize the boiling frog analogy, but in your analogy the stark reality of the harm in these companies' actions is unequivocal. Louis Rossman used this same analogy discussing subscriptions for features and services to be unlocked on products. This predatory behavior of companies has to stop. Note: As the survivor of DV/SA, I don't think this is hyperbole. When these entities violate people's rights to create, work, own purchases, control how their own creations/work is used or not used, etc, that's affecting life and liberty, working to render a person/the people powerless. No. Not hyperbole.
Especially considering if there's a span of time where you're opted-in before unchecking, they'll just immediately grab the content and bake it into their next version of the AI. And once they've done that, they're not going to roll it back, even if the data is technically removed from the "training" set once you opt out. It's already been eaten.
After watching and spending hours reading/commenting in threads, I'm adding this one to thank you, Hank, vlogbrothers, and this insightful community for all this "real" information and insight. Godspeed and much love, everyone.
So basically they're training on everybody except record labels and cable networks. Consent is so important and big tech has such a huge problem with it on every level. Thanks for this, Hank.
The fact that AI companies think making it opt-out is a valid solution is insane to me. It's like if I go into a store and get my wallet pickpocketed by the store's on-site pickpocket. Then I go to complain, and they tell me that I never opted out of their pickpocketing policy, so they were allowed to do it, at which point I opt out and they tell me it won't happen again but don't give me back my wallet.
No no no. It's like you left your wallet on the floor and someone ran through it and took some notes, and you complain that people should have had to opt in before taking notes on things lying around in public. Pickpocketing is physically taking possession of something. That is not what's happening. YouTube doesn't steal videos. If they deleted the video from your channel and forbade you to use it, that would be comparable. You don't lose any right you had.
@@OmateYayami If someone leaves a wallet on the floor, the morally right thing to do is to try to return it to its owner. You do not go rifling through it.
Thanks for making this video, Hank. Filling out your survey now. Agree that we should all have a path to opting out.
@@ScuubiDuubi It's not just the information we're talking about. It's the presentation, the interpretation, and the synthesis. Putting the physics principle in layman's terms is work that has value. Teaching is a skill. If it wasn't, you wouldn't need Blender tutorials, you could just look at Blender and learn everything you need to know.
If an AI scrapes thousands of hours of that work and breaks it down into soulless slurry to be marketed to morons, the thing being stolen is the interpretation, presentation, and explanation of the physics, not the physics themselves. Are paintings the property of the painter? They probably didn't create the brush, the canvas, or the paint they used, so by the logic you're arguing, they're somehow not the property of the person who created them.
Also, for the record, a Blender tutorial produced by an AI would give you incorrect information and be useless for learning. The model is trained to produce something that looks like a Blender tutorial, it's not learning how to use Blender. The result will be videos that are very well-produced to LOOK like useful tutorials posted by someone knowledgeable, but all the content in them is nonsense. Just a whole new generation of scams.
@@deggy42 " Teaching is a skill. If it wasn't, you wouldn't need Blender tutorials, you could just look at Blender and learn everything you need to know. If an AI scrapes thousands of hours of that work and breaks it down into soulless slurry to be marketed to morons"
This is an obnoxious statement. What exactly would be wrong with that? So teaching should be gatekept because some people make money off of it? If I don't want to, or more likely don't have time to, waste hundreds of hours sifting through all their tutorials, am I a moron?
This misses the entire point of the second part of what I said. Citing video sources with timestamps would be a win for everyone. It would be very easy for an AI to provide a weighted list of sources for how it came up with its answer. If this were the case, the AI gets its training data, the YouTuber gets sourced and linked in its answer, driving more traffic to that YouTuber, and the person looking for a solution doesn't have to comb through hundreds of hours of YouTube to find the needle in a haystack. If you're a proponent of free and open education, then you should support making tools like this better. Even if someone does have the TIME required to find solutions themselves, very few people have the patience; otherwise EVERYONE would know everything they wanted. I believe everyone SHOULD be able to know everything they want, and if AI served as a search algorithm of sorts for videos, that would open up web searches to a massive new realm of information that they can't address right now.
I'll give you a specific example.
It took me over a year to figure out how to transfer data from a digital sculpt to a 2D normal map in Blender. In other programs it worked effortlessly, but in Blender I always got the SAME error. I googled it every way possible and watched tons of tutorials; none of them ever had the issue I had. They would run through their process, and it would work. I would get that same error. It made me want to scream, and I simply gave up on texturing and normal maps for a long time, until one day I had a tutorial going as I was sculpting, as I usually do, and all of a sudden, on a short tangent about 30 minutes into the video, the presenter explained why that was happening. So my experience with that issue is just "the way it should be done" because YouTubers think AI is learning how to present better from them? No. Screw that. I want the ability to google YouTube videos' content so that I can find the effective tutorials more efficiently.
and your next statement just proves my point because
"Also, for the record, a Blender tutorial produced by an AI would give you incorrect information and be useless for learning."
EXACTLY
I can't effectively google solutions to many problems I have while learning Blender, but if I could google it, and Google gave me a brief explanation with links to its sources, I would immediately go watch those sources to find the correct information. Not to mention, you're all just assuming they plan to use this to make tutorial videos for YouTube.. I think Google has better and more profitable things to do, dude. They aren't trying to steal their youtubers' views.. and most youtubers' personalities are what attract people to them. I don't watch Smarter Every Day because I need to know how a pulley works. I like his explanations of stuff, and if AI rips that off it will be a "soulless" version, as you said, that I'm not interested in.. so this whole argument just kinda doesn't make sense.
lastly
"Are paintings the property of the painter? They probably didn't create the brush, the canvas, or the paint they used, so by the logic you're arguing, they're somehow not the property of the person who created them."
This isn't the equivalent of my argument at all. The youtuber didn't create YouTube, but they still own the video, right? The artist didn't create the tools, but they own the painting. MY ARGUMENT is more akin to "if a youtuber made a video about a Ferrari, then said he owns said Ferrari.. everyone would agree that was dumb." Youtubers don't own the subjects of their videos, and if YouTube starts pumping out error-filled videos with bad information in styles imitating real youtubers.. WHO TF IS GOING TO WATCH THEM? I'm not. Are you? Have you seen AI art? It can't even count fingers yet, man.. It has no UNDERSTANDING of its subject whatsoever, but it could be an INCREDIBLE referencing tool, and if you oppose that because you think only morons don't want to or shouldn't have to sift through mountains of information for an answer.. too bad?
@@ScuubiDuubi I reprinted your book and will sell and redistribute it without paying you. Is this fair?
"Work its way through the courts". I.e. we will never ask for permission for anything, just stop if we are ordered to.
I wonder if the court will fully side with copyright on this one. I get the feeling of this whole race-against-China narrative in the background of all of this. I wonder how much of a role it will play in the decision making.
it's so gross. he's saying we will do whatever we want and if you don't like it we will sue you for all you're worth! honestly disturbing
I don't think they can get away with it that easily. I'm pretty sure companies like YouTube, with all the power they have, would at least try something to stop it, like developing new technologies to block training or something.
'Move fast and break things' has long been the tech industry's motto. Rarely does it seem they stop to consider exactly what - or who - they break.
I've only just started watching the video so this may be touched upon, but this is exactly my understanding of how these companies work. If you ask for permission at the start then you will a) never get permission and therefore b) never improve your model.
It simply doesn't make sense for these companies to be legal about this at the start because it will pay far more dividends to just fire ahead and then deal with whatever legal implications show up later.
Is it shit? Yes!
Is it going to change? No!
There is no means to make this better. Severe regulation simply won't work, because it can be ignored until it's obvious, and then it's too late (i.e. exactly what happens now). This is a weird new quirk of copyright law that we don't know how to navigate effectively yet.
I just used the search function. Wow....45 Smarter Every Day videos were used. That's irritating.
Wait until you find out how many of them Google's used...
Everything you love will be owned. Welcome to tech capitalism babyyyyyy
break google up!! youtube should be a separate company!! come on DOJ
@@rowanmurphy4986 they're not going to stop there. everything you love will be owned, including everything you've made, and as long as you abide by the TOS, you can pay the company that owns the thing you love and made for the conditional, revoke-able right to view it... not to distribute it though, of course; that's their thing.
Which is more frustrating, that they used your videos, or that when educational videos are so popular, that they didn't use MORE of your videos? Both are kind of a big oof.
"If it's on the open web, it's free for use". Microsoft isn't allowed to complain about piracy in any context ever again.
Which is also such BS because so many of these companies say that they don't let non-copyright holders upload content. If putting it on the open web means a copyright holder needs to authorize it, then why would someone viewing it be allowed to use it however they want??
This I agree with.
Fr bruh, what hypocrites 😂
Time to move to linux
exactly, I wanna pirate Word and Excel so bad rn
My entire life I was told that "just because it's accessible on the internet doesn't mean that you can use it in any way you want because often those things are protected by copyright" and now those companies that strictly enforced copyright law (sometimes over zealously) are saying "actually we can do whatever we want with copyrighted materials that we don't own now" because they figured out a way to make money off of it. Before these companies were investing in AI their standard policy would be to issue cease and desist letters to anyone doing exactly what they are doing now; the only difference is that now they are the ones who stand to gain from a grey area. It doesn't feel remotely fair because it isn't.
Okay, but that's not relevant to the YouTube situation, because in this case they have explicit permission in the ToS to make derivative works.
@@Random3716 copyright law very much seems to have an invisible "only if you're rich" asterisk added to all protections.
Or rather *unless* you're rich @@yuvalne
Exactly. It's like when 5 year olds change the rules to a game because they start to lose. It's laughably immature and downright embarrassing
No, they're not saying they can do whatever they want with it. They're saying that the thing they want to do with it is not covered by copyright law, and also not within the spirit of the thing that copyright was intended to protect.
Copyright is about duplicating a work, either as a whole or as a part, and about "derivative works", where the US copyright office gives examples of things like *translating a book, adapting a book into a movie, releasing a revision of a book, and making your own arrangement of a song* . You've always been totally fine with taking a Disney movie and studying the character's proportions, the color palette, how they draw eyes, and it's been totally fine to sell books describing this or putting that data into a computer program and selling that. And as long as what you make doesn't start to encroach on being considered derivative, you've been free to make whatever you want that's inspired by the things you learned as part of that study, both in a fuzzy "inspiration" sense as well as a literal "I measured the body proportions of Disney characters" sense.
Machines are automating this and becoming tools that autonomously can use this data, but it's so much different than both the spirit and the letter of the law that copyright is protecting.
I'm going to advocate for one step further: None of these systems should be opt-out, they should 100% and unambiguously be opt-in. The internet currently is a barrage of "you didn't turn off us stealing from you, so we're allowed to steal from you," and that is not and never should be how IP ownership works. If someone broke into my house and then told me "well, you didn't tell me not to," that would be ridiculous, and for these companies to insist that that is the system we should all be working under is even more ridiculous. And anyone claiming that burying a sentence in a 500-page ToS waives all obligation to basic respect of other people and their property is wrong and should be laughed out of the room.
yeah it's not how consent works, you don't just assume it's ok and do it unless i say no
so basically we can come to their headquarters and take their stuff. real life has no 'switch off theft' feature, so?
hear hear
Yes, my exact thoughts. Plus, there are the creators who have died, who have lost access to their accounts, or who just one day disappeared from the online space. The first one in particular is very upsetting. Opt-in is the ONLY ethical way to do this.
Legally, it's not stealing. People are calling it stealing, but there is no precedent in law or society for AI training, and whether or not it's stealing has not been decided in court or in the eyes of society as a whole.
26:24 "go somewhere else" is the situation NOBODY is in. The size of these mega social media sites makes it impossible to have a thriving business "somewhere else." Artists have been screaming this for MONTHS, if not longer. We shouldn't have to submit to our rights being violated online in ways they wouldn't be violated elsewhere. If it's illegal for a company to make a physical product using intellectual property they don't own, the same should apply to the digital space. Anyone trying to argue otherwise just stands to benefit from you rolling over.
The problem is, it's more complex than that. It ISN'T illegal for anyone to make a physical product using intellectual property they don't own. Fair use explicitly says copyrighted works may be used for transformative works. It could just as well be argued that the output of a genAI is so different from the source (in most cases) that it is transformative. It could also go that the transformative thing is the AI model itself, not its output, which is definitely different enough to be transformative. Under that interpretation, each output would need to be scrutinized as its own potential copyright violation. In either of these interpretations, no rights are being violated. It really does come down to how the courts interpret it. The slow arm of the law loses again to the lightning speed of innovation.
Exactly. We artists have been sounding this alarm for a while now. We can get into the weeds about technicalities and ToS phrasing, but that’s ultimately obfuscation; legal casuistry meant to deliver to tech industry executives what they want from generative AI: to profit from the labor of creatives without having to compensate creators for their work. To generate “content” without having to hire creators; or at the very least, to remove what bargaining power creators have (their creations) and therefore make creators into unpaid interns at best. For most of us, that just means erasing any chance of making a living in a creative medium.
For lawyers to say “it’s okay to commit theft so long as it’s rich people stealing from massive groups of people.”
I appreciate that Hank has a good relationship with many people at YouTube, but if the era of enshittification has taught us anything, it is TRUST NO CORPORATION, TRUST NO BILLIONAIRE.
I've had more job opportunities in one year on the fediverse (3-5 people asking me to work for them) than in three years on Twitter (0, a literal 0 gigs, not even for free), including the same year the fediverse got me those 3-5.
We're just afraid of algorithms. The trick is that uploading your stuff somewhere else is cheap and not that much extra work (2 extra clicks once you have it set up), and contrary to popular belief, it's rather less stressful than just betting on a hellsite to help. It gives you a sense of an actual chance of living better, outside this mess, in a more personal situation, more like the old internet with modern money movement.
We can't escape PayPal, Patreon, and such, though.
I do recommend downloading CARA
Also, 'go somewhere else' is a legitimately impossible thing to do for video streaming that is above 360p. Mass-scale video is the one internet business that is legitimately extremely expensive to run; it's insanely unlikely anyone can _ever_ compete with YouTube short of having state backing (China) or incredibly extensive legal reforms (EU maybe?).
Love how these CEOs want to claim all this content is "fair use" because it's on the open web, but then will be first in line to sue an individual who takes the same stance when it comes to their own IP.
Why is anybody surprised? This is why we overthrew monarchies and empires, and why most of the world is some form of democracy, mostly republics. Maybe we'll do better next time, but the wannabe emperors are getting too powerful now. Again.
I mean, the video creators did click the box that said they agreed to the contract that says YouTube can do all this stuff and more. They did ask already, years ago.
@@Name-ot3xw Except generative AI wasn't a thing "years ago" when they started and built their careers on it.
So no, they DIDN'T agree to it.
@@ReverendRaff That sort of logic only works in the world where putting "All content is fair use, no theft intended" in the description somehow shields you from DMCA takedowns.
@@Name-ot3xw it's wrong and hypocritical for them to do it, and most people understand that.
If you want to be a bootlicker for these corporations, just say that 🤷
It's kinda amazing how companies can go from "You wouldn't steal a car" to "that's what WE do"
@@broccoli9308 that's why I'm no longer a pirate. Whatever people pirate now only comes from the kinds of companies OP described.
Well yeah, obviously - corporations have a right to profit. Consumers pirating = infringing their rights; companies stealing for profit = exercising their rights.
Really, we should just give them our money for free, they deserve it
@@orterves While we're at it, we should also give them ~1/3 of our lifespan, neglect our relationships in favor of meeting the companies' demands, and educate our children to do the same! It's simply knowing our place, really. :V
@@broccoli9308 The philosophy behind this is that the movies being pirated were made by organizations with a lot of money, and have probably made back their money, so they suffer a negligible amount of harm (if at all, there's a debate about your potential as a customer there). Public internet content, in this case, the vast majority of it is people who have comparatively far less money, and did not create any of it for commercial reasons.
@@broccoli9308 I honestly wouldn't mind making copyright apply exclusively to generative AI
Yes, absolutely YouTubers should be allowed to opt out. In fact I will go farther: opt-out should be the default, and users should be given the option to opt in. I know, naive, right? 🤣
It's incredible the amount of people that suddenly want to restrict access to information.
Do people tell themselves that they are not using any other sources for inspiration or reference?
@@broccoli9308 Restricting information would be taking down the videos or putting them behind a paywall; you're misdirecting the conversation.
The people who want their AI to train on TH-cam videos plan to make a lot of money from that. If you want to make money off of work someone else did, that absolutely warrants at minimum asking their permission. Some people might be down to let someone make money off of them even without getting paid a cut of the profits if they think the tech is cool, and that’s their decision. Not everyone is going to be okay with that, as most people want to be compensated for their labor.
Opt Out should be the default on a zillion other things too.
@@broccoli9308 I don't know how many times I've had to reply to this sentiment today. AI is not the same as people. There is a difference between people and AI. How on earth can you think that AI can be inspired? Inspiration is a uniquely human trait.
@@draconvarie AI in its current state can not be inspired. You are correct. It's the person that controls the AI that provides the inspiration. It's just a tool. So honestly we should be asking the question: does each individual person who is using any given generative AI model have the right to use the data in said model to create derivative works?
Just searched and they’ve used 12 of my videos. 🤢 I’d have liked the chance to opt out. Thanks for making a deep dive vid on this!
(I mention this simply as a data point - because obviously my work is absolutely not educational! 😂 I did also fill out your survey.)
It is my absolute right to learn from any source I want, and I will never allow someone to opt out of being a source for my learning. My learning, or my machine's learning, it's the same thing. The only right you own is copy-right, and absolutely nothing is being copied and reproduced.
The class action lawsuit has been activated
or at least, you should be able to bill them for using your intellectual property
Even better, it should have been opt in. You shouldn't have your work stolen from you by default
*_Give YouTubers an option to opt out of training AI models. I'm not a youtuber. I just love watching YouTube, and I love the YouTubers I watch, and I want their work to be respected by Google just as much as I respect it. Google, you have nothing without these creators._*
The "Hank pulls camera out of editing screen to add addendums" is becoming a classic move.
I think this is a classic Casey Neistat move, but I could be wrong.
Addenda.
It's an Adam Neely move for me
It's delightfully disorienting, every time!
@@vlogbrothers talking to a camera on youtube is a Casey Neistat move, that's just your imposter syndrome talking
21:30
"Useragent: Everyone. Disallow: Everything" honestly sounds like a really good protest chant
this
Then we are all disconnected nodes...
The internet needs to internet.
"User-agent: *. Disallow: " is the way to go.
Open Google's data for us to use as well.
+++
It reminded me of Nothing Pizza with Left Beef.
+
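For anyone riffing along with the chant above: in real robots.txt syntax the directive is spelled `User-agent`, `Disallow: /` blocks a crawler from the whole site, and an empty `Disallow` value blocks nothing. A minimal sketch of the two files being contrasted:

```
# The "protest chant" version: block every crawler from everything
User-agent: *
Disallow: /

# The open-web version: an empty Disallow value allows everything
User-agent: *
Disallow:
```

Note that robots.txt is purely advisory; a crawler has to choose to honor it, which is part of why the opt-out debate in these comments exists at all.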
I'm not a creator and I still want an opt-out. I don't want these awful "AI Search" functions that are becoming mandatory everywhere and are worse than regular search. I don't want these awful AI ads and images everywhere. I don't want to support companies that use AI to save money instead of paying artists. And yes, every creator and publisher should have the option to opt out. And publishers/websites should not be able to opt in their individual writers and artists without consent.
I'm starting to think I was way ahead of the game when I took part in a protest march at Google headquarters 10 years ago over copyright issues.
+
One way of reading this is that all these AI companies are killing the open web. "If you can be scraped, you can be trained" means that suddenly there's a new incentive to leave the open web. That the whole internet is slowly filling with AI sludge means it's harder and harder to find genuinely good human interaction. They're killing something we knew had value and replacing it with something that is unproven but makes them money.
@@fallingphoenix2341 Yep. I'm spending more time on Mastodon, on a server hosted by someone I know, with strong privacy controls, where I can opt out of web scraping. It will never be popular because enough "influencers" won't migrate there - there's no algorithm to cater to their egos and funnel followers to them. But it's so much healthier for the end user. No ads, no Nazis, no AI, no algorithmic junk.
@@fallingphoenix2341 Sounds like what I've been thinking.
All of this, except it should be the option to opt IN. Check out the comment a few spots above yours by @Rithael. 👍
I do believe content creators should be able to opt out. The data is clearly valuable, and it's being used for the purpose of training.
Creators should get a royalty cut for helping to train as an incentive to opt in. Seems fair
The gall for YouTube to put a Gemini ad at the end of this video is why ad blockers are appreciated.
SponsorBlock is very helpful; it allows self-promotion segments by default, but that can be changed per category.
They did it probably because the AI is dumb and thinks if you talk about AI for a whole video and even mention Gemini in it then this must be the perfect place to advertise that.
I got a Gemini ad in the middle right after the specific segment on Gemini.
@@rockallmusic A rebuttal? lol
I mean on the bright side, our brother is at least getting paid by that AI provider now 😅
The BS of "opt out" goes against basic informed consent. It's gross on so many levels. It should always be opt IN.
"Sorry lady, you forgot to "opt-out" for this assault scenario."
hey, it looks like you haven't opted out of being in the army, seems like you're going to have to go to war
and opt in should come with a paycheck.
Yeah, why would they assume people are automatically opted in?
@@tradutorajuliana Personally, I don't think they assume that. I think they know that nobody really wants that, but they *sure would love* getting paid for the stuff we all made, instead of us getting paid for it.
Google being allowed to use YouTube videos to train their AI because of the terms of service is a great argument for antitrust action to break up the company. It's anticompetitive if Google can use YouTube to train but Apple can't, but more importantly, it sucks that people who rely on YouTube for their livelihood have to agree to letting Google train GenAI or find a new job.
This is a good point.
Anyone can train on YouTube videos. Do you see Google doing anything to stop anyone? If Apple owned YouTube they would put an app-store-esque fee on training. Apple's ethos is HEAVILY closed-source. Google is and always has been the opposite. Why punish the best actor of the bunch? (Whether these companies SHOULD be able to train on YouTube videos is a separate issue, but this hate toward Google is irritating when they are the LEAST value-extracting company of the big techs.)
@@flipro23 did you watch the video? Google says they could sue Apple for breaching TOS if they scraped transcripts, because transcripts are not free to scrape, but Google's TOS also grants them a license to use the transcripts of YouTube users.
Being the least evil of the bunch isn't good enough in this situation.
@@Scarybug There's a difference between saying it breaks TOS and actually preventing scraping like Reddit recently did. Your comment said Apple can't use YouTube. But they can. There are new generative video models every day, and most of them have likely scraped YouTube. Say what you will about the ethics of artist compensation, but you can't say Google is shutting out competition (like Apple does with their app store; they are getting hit with antitrust enforcement for that right now).
If Google blocks web scrapers on YouTube, I'll change my opinion. But as of now I DO think they are the least evil (with regard to allowing competition), and that counts for a lot.
+
Name a more iconic duo than techbros and disregard for consent.
All this "opt out" shit just screams "it's easier to ask for forgiveness than permission." Except no one peddling these models is bothering to ask for forgiveness either.
If an AI was made from all of our data, we should all own the product. Not the private companies that stole the data.
essentially, open source
That's a great idea
Welcome to capitalism… Others do the job and only the top makes the profit and owns the product.
Absolutely. They should be significantly taxed and that money go to projects that help fundamentally reshape society (as voted on through direct democracy such as citizens councils-an AI federated co-op, if you will).
Companies whose business models aren't aligned with human values can't create AI aligned with human values.
Indeed. It is way too late to put the AI genie back in the bottle, but what we can do is to prevent a future where only a few companies and the governments have access to AI to do whatever they want with them, while 99.9% of humanity have nothing and are at their mercy. AI trained on public data should be seen as public domain.
"I cannot opt out of YouTube. ...I live here." This reminds me so much of people talking about changes in their beloved hometown. I hope your digital hometown will be ok
It's an interesting analogy, but towns are owned and governed communally, by the people who live there. Meanwhile TH-cam is a private platform so neither users nor creators have any say on its governance. Which sucks.
@@jaimepujol5507 Ohhhh corporations can "own" towns too.
It won't be, because the people "living" there won't actually do anything meaningful and lobby to change the laws to curb this sort of behavior. They expect lawmakers to wake up and do something about it. Lame.
@@databoy2396 Cities and towns generally are actually corporations, technically speaking. But it's certainly different from a world-spanning media corporation.
@@databoy2396 I mean, sure, but that's not the case by default. Are there still corporate cities in the US? Apart from Disneyland
There is a rather odd analogous situation. Back in the 1980's there were multiple lawsuits regarding strip mining. The problem was that in the 1800s coal companies had bought the mineral rights to large swaths of land that were then being used as farms. When those rights were sold the technology for strip mining did not exist. Therefore, those selling their rights very reasonably expected that the mining companies would, at most, sink a shaft to mine out any coal under their land. I believe access rights were included in the original sales of rights. Then, in the 1960's - often almost a century after the rights were sold - the mining companies could (and did) start showing up and destroying the land over which their mineral rights resided. In short, they destroyed the farms without permission or further compensation. The obvious issue is the landowners had never agreed to have their farms destroyed nor would that have been possible when they sold those rights. The mines made no attempt to restore the land after having extracted the coal. In short, the land had no residual value despite the surface rights having not been acquired by the coal companies. Effectively, they had acquired ALL the rights to the lands without paying for them. This is not about the EPA coming after the mining companies to rectify the environmental damages they had inflicted, which is also a thing. No, I do not know if there was ever a broad settlement on this but I do know that if you enter "strip mining lawsuit" into Google search you will find pages and pages of lawsuits. So, now you have AI developers "strip mining" the web. Some things never change.
That’s really sad.
It’s a great real life example of the implications of this kind of unregulated heartless dystopian system.
Thank you for sharing this information
this is an interesting and very fitting comparison. thanks for sharing
thanks for adding!
AND not to even mention outright theft of land tirelessly built and stewarded for thousands of years by people whose work was not seen as valuable until its fruits could be sold for US dollars.
The entire AI “revolution” has felt more like one giant robbery
Glorified recombinators that regurgitate the inputs stripped of the copyright.
@@bodaciouschad And neanderthals thinking these LLMs are actually smart or LeArNInG, not just regurgitating the most probable answers.
Crazy that dozens of high-profile CEOs are making the same risky bet. What could possibly go wrong?
it’s so depressing to me because there really is potential for AI use as a tool like Hank mentioned in subtitling or translating or the YT recommended algorithm.
but, companies are focusing on the generative AI side, which is exactly where that robbed feeling comes from I think. Feigned creativity from genuine creativity :/
I’m just hoping the inbreeding ouroboros of generative AI trained on generative AI will kill companies’ interest in it
That's Capitalism! *jazz hands*
32 of ours according to the Proof search. 🙃 No me gusta. I mean, I love tech as much as the next guy, and my background is a master's in computer science with a specialty in AI, ironically enough, right before I got into doing this 15 years ago. BUT so much of the way the data is used here for profit on the backs of others' work seems insanely problematic, with a lot of broad implications. Even in just basic search, Google or Microsoft give an AI answer derived from others' work without sharing revenue with the sources the AI was trained on, potentially taking clicks away from people who would have gone to those sources themselves. There should definitely be an opt-out capability at the minimum, and it should actually be the other way around: opt-in, with opted-out as the default. Yes, we learn from other humans for free. And yes, we create content toward this end as well, for humans to learn from for free. BUT that's a mutually beneficial relationship for all, and sustainable. An AI learning from creators' content is completely one-sided and even hurts the source. And, of course, AI can learn in minutes from thousands of videos. Anyway, great video, and it's awesome to see more public attention on all this. -Daven
well said.
Yeah, I immediately wondered how many of your (and Simon's) videos would be in that list, Daven.
But if creating derivative works is in your terms of service, then they have the right to create derivative works, which can be _anything._
For Content farm like you, i think its ok..
@@michaelmicek derivative works only work when they don't have copyrighted elements, and taking an entire video to do something like training an ML model without permission constitutes infringement. It's similar to taking a song in its entirety and using it as background in a video without permission; sure, it's derivative, since I can add a random video to an entire song, but I can be copyright-striked for doing that too.
"UserAgent : EVERYONE
Disallow : EVERYTHING"
Goes kinda hard tbh.
I'd buy the shirt honestly.
This is going to be the motto of the new resistance
Great opening to an edm song
I'm getting this phrase tattooed on my forehead so when the robot uprising happens, I'm untouchable
@@smudge3446 Uhhh, is the new resistance opposing the open web? Because we were just told that the open web is good, and this is what hinders it.
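The joke above riffs on the robots.txt convention (the real directive is spelled `User-agent`). A site that wanted to turn the joke into policy, or more narrowly refuse known AI crawlers, could publish something like the following; `GPTBot` (OpenAI) and `Google-Extended` (Google's AI-training control) are real crawler tokens, but this is only a sketch and compliance with robots.txt is entirely voluntary:

```
# /robots.txt — the block-everything joke, written literally:
User-agent: *
Disallow: /

# A narrower version that only opts out of known AI-training crawlers:
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /
```

Which is, of course, part of the complaint in this thread: robots.txt carries no technical enforcement, only a request that well-behaved crawlers may honor.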
It’s pretty astonishing how companies are able to escape legal responsibility for anything posted on their platforms on the grounds of “we didn’t make it” but then turn around and sell the content on the grounds of “it’s on our platform.”
WOOOOOOW! You hit the nail on the HEAD! That is an apt observation. Thank you for articulating it this way because those aspects hadn't occurred to me.
they do say that but the law is making them responsible. They are considered publishers (for the sake of the content on their "platform").
It's disappointing that you can't understand that both of those statements can be true.
Your content is produced by you, so you bear personal responsibility for that material, but you are posting it to a third-party site who, as part of that agreement, gains rights to your work. It's not that hard.
This was my first thought. It's a dangerous game Google is playing. There are loads of politicians who would love to revoke their Section 230 status, and this feels like an inroad for them.
@@anotheryoutubeuser142 Exactly: if you performed at a venue, you're responsible for the show you put on, but the venue's owner could sell a recording of your performance, because it's their venue. This is what YouTube is doing.
Four of my videos were used, I just learned. That's about 5% of a vlogbrother. I can live with that ratio, but am a bit miffed that my work is used without my being asked.
Hank is positively simmering with rage in the kindest, most matter-of-fact tone I've ever witnessed
He is skilled at that.
Fear the quiet man when he gets angry.
We should all be filled with rage about this. Ai will take more than it can give. The benefits do not outweigh the negatives.
When the mightier-than-sword-pen produces publicly useable power
Months ago YouTube recruited creators for a "study." I was one of the selected candidates, and you had to pass certain filters, but only if you accepted their terms. I spent a few hours reading them and oooh boy. It literally said I'd be giving away my voice and face for their AI-training study!!!!!
omg you should cover this in a video
+
All TOSs and privacy policies are chilling when you actually read them. You should see what’s in the policies for doctor’s office portal software. If you mention it to your doctor they’ll squawk about HIPAA but then are surprised if you tell them what’s actually in it.
Thank you Hank. Please keep hitting this with this level of tenacity and honest detail. We need someone to.
+
This is 100% a "steal first, ask questions later" situation. They know for a FACT this is bullsht; they'd never let another company use their own data like this without proper payment and consent, but they went ahead and did it to others because they know that, by the time people can fight back, they've made so much money on it, it's not gonna matter.
1. Make big money on data sets.
2. Get fined for 2% of the profit.
3. Take a vacation on your new boat.
"Stealing"? Nowhere is this true, except in your emotions.
@@TallicaMan1986 Here comes the bootlicker. Are you going to defend Disney too? They recently forgave themselves for a death in Disney World because one of the victim's family members had Disney Plus, and the contract forbade them from suing. These are the types of companies you are willing to die on a hill for, debating semantics.
Reminds me of the saying that goes like "ask for forgiveness not permission." I guess the lawsuits signify that we are approaching the "forgiveness" stage.
But creators did consent when they accepted the terms and conditions lol
This is not some grand heist. It's just people like usual not reading the terms and thinking they've been swindled
This is the latest example of how beautiful things grow and develop when people share, but everything falls apart when people decide to be greedy.
Not people, companies. Companies aren't people, companies don't _"want"_ anything. Yes, companies are run by people, but remove every greedy CEO today and new ones with the same motivations will take their place tomorrow. Because Companies survive on profit, on reporting growth. It's like how you can't fault mold for eating its way through your food pantry. It's just what mold does, it's just following its survival instincts as the living death taxonomical nightmare organism that it is. A capitalist company is the same. The people don't matter because there's enough selfish greedy people to go through in the world that it would never run out, and should a "benevolent" CEO one day be at the helm, the executive boards of directors, shareholders, and the capitalist system in general would make their influence non-existent and their career short-lived. A publicly traded mega-company is like a superintelligence with profits as its reward function. It's not about the people, it's about the ecosystem of our global economy that provides the perfect conditions for these companies to capitalize.
@@gwen9939 Can I have permission to copy and paste that? because it's the most efficient delivery of 'the problem' I've seen so far
@@gwen9939 Exactly. The economic system that we have encourages the cancerous growth of mega corporations. We need an overhaul.
+
@nicholaslogan6840 At least you're asking for permission unlike the AI in this video lmao
Is this why the algorithm has been listing videos titled "why everyone should have a YouTube channel," "how starting a YouTube channel saved my life," "I post even if no one is watching"? Oh, they're watching alright. Thank you Hank; you and your team make this world make a little more sense to me. ❤
mine too!
I think the important issue for most YouTubers to understand is that no matter how good your relationship is with YouTube, YouTube's relationship with Google/Alphabet and its partners will always be the more important relationship to them. No matter how well individuals at YouTube treat you, no matter how much they "get it" when it comes to creators, you will always come in last when it comes down to other priorities.
Same as workers with employers
I mean, maybe not last, but definitely not first.
@@Leftistattheparty This is why workers should unionize, and maybe creators should start thinking about what that would mean for them, too.
You vs Google loses every time. Us vs Google wins every time. We just need to coordinate and organize.
YouTube doesn't have a "relationship" with Google/Alphabet. *It is Google.* It is literally just one department of Google. It's like talking about your relationship with your left hand.
It's time we start moving over to platforms that enforce ZERO AI
I love the term "different incentives." The best way for people to work together is by having the same incentives. Management gets along well with employees when incentives are aligned. When there is a disconnect, it's because the incentives differ.
Idk if other people have brought this up, or if you noticed, Hank, and just didn't mention it because the video is already fairly long, but Scott from Nerdsync did a video on this topic as well and went through the Proof database and found that their videos that were scraped all had user-uploaded subtitles. They talked to some other smaller YouTubers and discovered that the few videos of theirs that had been used to train AI also all had user-uploaded subtitles.
So it seems like the AI training is targeting videos with user-uploaded subtitles, which is very interesting and might be another reason why educational content is being used so heavily to train AI: companies like Khan Academy, and you and John's company Complexly, have extremely consistent user-uploaded subtitles. Anyways, just thought you would like to know.
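If the pattern described above is real, it's easy to see why human-written captions would be attractive to scrapers: they can be filtered for mechanically. As a rough sketch, caption-track metadata shaped like the YouTube Data API v3 `captions.list` response distinguishes auto-generated tracks (`trackKind: "ASR"`, automatic speech recognition) from uploaded ones; treat the exact field names and values here as assumptions rather than a verified schema.

```python
# Sketch: deciding whether a video has human-uploaded captions, the
# signal the comment above suggests scrapers select for. The dict shape
# loosely mimics a captions.list API response; no network calls are made.

def has_user_uploaded_captions(caption_tracks):
    """True if any track was uploaded by a person rather than
    auto-generated ("ASR" = automatic speech recognition)."""
    return any(t["snippet"]["trackKind"] != "ASR" for t in caption_tracks)

# Example metadata for two hypothetical videos:
video_a = [{"snippet": {"trackKind": "ASR", "language": "en"}}]
video_b = [
    {"snippet": {"trackKind": "ASR", "language": "en"}},
    {"snippet": {"trackKind": "standard", "language": "en"}},
]

print(has_user_uploaded_captions(video_a))  # False
print(has_user_uploaded_captions(video_b))  # True
```

A crawler that only kept videos where this returns `True` would end up with exactly the skew toward educational channels the comment describes.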
Wow. I'll have to check that out. That's so insidious, weaponizing an adaptive tool that so many need and appreciate, given that auto-generated text is often garbage.
I'd be truly disheartened right now if it weren't for Hank and people in this community being the ones looking to take Google to task on this. They are the ones to take on Goliath and give him a proper trouncing.
@vlogbrothers @hankschannel dang how do we get this bumped for Hank to see >.
+
+ This makes a lot of sense. AI researchers are aware of the "ouroboros" problem Hank mentioned and this is their way of getting around it. Most AI-generated content has some way to mark it as AI-generated integrated into it (I believe this is usually done through a "fingerprint" from a model, although that's not required and not in the interest of these companies so not all of them will do it), and then they don't use AI-generated stuff in the training data, only the real human-written words, images, videos etc... which means pretty much always breaking the moral trust with the people who wrote those things, and usually also violating copyrights.
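The filtering step this comment describes can be sketched as a simple corpus filter. Everything here is illustrative: `looks_ai_generated` stands in for whatever watermark or fingerprint detector a lab might actually run, and the `ai_watermark` field is a made-up flag, not a real API.

```python
# Hedged sketch: dropping items a detector has flagged as AI-generated
# before they re-enter training data (avoiding the "ouroboros" problem).

def looks_ai_generated(item):
    # Placeholder heuristic: assume upstream tooling attached a flag
    # when a watermark/fingerprint was detected.
    return item.get("ai_watermark", False)

def clean_training_corpus(items):
    """Keep only items not flagged as AI-generated."""
    return [it for it in items if not looks_ai_generated(it)]

corpus = [
    {"text": "a human-written transcript"},
    {"text": "synthetic output", "ai_watermark": True},
]
print(clean_training_corpus(corpus))  # only the human-written item remains
```

Which is the point the comment makes: the cleaning step that keeps the model useful is exactly the step that concentrates the value of human-made work, without the consent of the humans who made it.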
+
Yeah I'd like the option to opt out, absolutely
It should be opt-in, not opt-out. It's ridiculous that you have to go out of your way to stop someone from stealing your content.
This feels like the Hank equivalent of John calling out Danaher
Let’s make it that way
+
+
+
+
As a lowly YouTube consumer, I want creators to be able to opt out.
I want the creators to have more ownership of their uploaded material.
Asking AI to explain how use of AI is controversial is an insane power move
Also proof it has no feelings.
Also use of its derivative works
Also funny
Not really sure how it is "insane" or a "power move", but go off I guess.
The infuriating part is that they didn't even ask for creators' permission before using their videos for training. Creators being able to opt out is the absolute bare minimum that YouTube can do.
It demonstrates that the company only values creators' ability to make them money (primarily by capturing attention for ad revenue). The business has (nearly) no incentive to care about creators for their own sake.
It should only be opt in but thieves gotta thieve.
++
@@RubelliteFae that's a bingo
Why would they need your permission? You uploaded the video to a public video platform for everyone to see.
It's infuriating the lengths companies will go to make people think that the thing they don't want to happen is not happening all while they're actually directly making it happen.
The only infuriating part here is people not understanding "copy right" protects copying, and not learning. The Right To Learn must be protected.
@@cuthbertallgood7781 Or that something might sound like a good idea but overall be even worse. At least currently, apart from compute, everyone is on an even field. Would you rather only big companies and rich entities be in control of such tech?
@cuthbertallgood7781 Hiring people to make videos, art, music, photos, and Computer Code for the AI to train on is ethical; this isn't. AI is a Tool; not a person.
@@cuthbertallgood7781 But can a machine really learn, or is it operating on data sequences?
@@cuthbertallgood7781This is the development of a product, not education.
This is honestly a great ad for Nebula. I was sold as soon as they offered a one-time payment for a lifetime subscription.
Maybe it's impossible but I think supporting nebula and Odysee could break the monopoly
Hank, this rules. Thank you for standing up for creators who have been affected by this!! I would like an option to opt out.
In addition, I think Google is using university catalogues for AI training. I work at a LARGE (will not name) uni. Google is collecting books to scan for open access. I am pro open access, obvi, but a ton of these books are already available across the internet, like on the Internet Archive.
Does Google just want to have their own version? I mean, probably, but it is already open access.
We are picking hundreds of books per day to send over in wooden carts every month. Some of these books are not copyright-free yet, and Google told us that they want to have them "ready to go when copyright has cleared."
I don't see the problem
They've been doing that for decades. It has been a running battle with publishers over copyright violations.
That's because you don't own anything Google is ripping off, yet. @@Procrastinacion_
@@nexypl No, OP is referring to the Google Books project (distinct from the ebook store Google Play Books), which scans physically published books and makes them searchable. Your thesis will be under whatever copyright arrangement applies between you and your university, and it should only be scanned if it's made public domain or your university has a specific agreement with Google or some other AI company.
If google wanted their own library, they could just copy it from the internet. I think a CEO might be wasting money
As a photographer, this is a situation we're dealing with too, and it has caused me buckets of existential anxiety. Our agency wasn't even considered when these companies just vacuumed up our work, just to make generated hands look a little less creepy.
This is the zenith of the silicon valley concept of move fast and break things and I sincerely hope that it thoroughly comes back to bite them hard.
If someone came to you wanting to buy a license to train their (A.I.) models on your photographs, I presume you would be unable to agree until you had worked out agreements with every (human) model and landowner you had used in your work?
@@ps.2 Straw man, dude. That shit is a no-brainer, especially with models, since they license photographs of themselves to the photographer. The models own part of the photographs they've been photographed in. And the land thing: you can take pictures in public, and you own the picture...
@@johngddr5288 Yeah you have a license agreement with the model. But does this license agreement include permission to train A.I. models? Or do you agree with Google that this is simply implied by language that was written before anyone really thought about this possibility?
@@ps.2 No. F Google and every company abusing a loophole opened by a new technique that demands a firmer clarification of copyright law. Just because Google thinks they're safe because they wrote "derivative" in their licenses doesn't mean people would agree to be exploited by a technology that didn't exist before. A case should be made that simply granting "derivative rights" does not include training AI models.
@@johngddr5288 No need to get defensive. I'm just trying to figure out how a photographer feels about the right to use their photographs in some new way, such as training an A.I. Would you feel obligated to get permission from the subjects in your photographs? Or do you feel that because they already signed an earlier agreement, that they would have no moral right to object to their likeness being used in this new way?
The fact that YouTube has put this as the next-up video for basically every single video that I watched yesterday is really... interesting
I love how the title is the question and thumbnail is the answer. No browsing ten minutes to find if the creator even answers the question during the video.
Art YouTubers, in particular, REALLY need that option to opt out of AI training on their videos.
I was thinking about that!
every TH-cam video has some level of art
On a very simple, basic level: if companies can block your video because it has a song in it, or Disney can sue you for having Iron Man or a character similar to Iron Man on the banner of your company, then AI companies should not be allowed to take the entirety of your body of work to shove into their machines.
You can't know exactly what an AI is basing a generated video on (companies bank on that), but you can 100% ask for consent from creators before scraping their stuff. If tech relies solely on the exploitation of others in order to function, then it probably shouldn't exist.
Doesn't all large-scale enterprise rely on exploitation of people at some level?
@@isaacbagley8211 I mean, sure, but that doesn't mean we shouldn't strive to stop it. The fact that our agricultural system relies on human trafficking and slavery doesn't mean we shouldn't strive for a world free from slavery.
@@isaacbagley8211 Yes, but it doesn't mean it should be allowed
When the exploitation is this blatant and in your face then there's no reason why people shouldn't try and stop it from happening
You're confusing copying ("has a song in it") with learning ("I learned from this song and created an original work"). You can learn from Disney stuff all day long, and all the AI companies have learned from Disney material. Copyright protects against copying. It does NOT restrict learning, and we must fight to prevent the insane greed of people wanting to be paid just for someone seeing their stuff and learning from it. My learning, or my machine's learning: it's the same thing. Learning must be absolutely protected.
But those two things aren't the same. You using a disney song is infringing because the _outcome_ has copyrighted material in it. If an AI model _reproduces_ your work (or something similar enough for copyright law) then that'd be a good comparison. But an AI model that is trained on copyrighted material and produces content that is different enough (by the same legal standards that regulate derivative work) isn't breaking copyright in the same way.
I'm not saying that training generative AI on copyrighted works is good, but the argument you made (and that I see lots of people make) doesn't make sense.
3:53 idea: they should pay each person they "trained" on a consultant's fee, and the fee should match whatever a consultant would cost.
“the social contract…[everything on the open web] is freeware” hooooooly moly that is A SENTENCE
In some ways I agree with him, but not when it manifests in this way.
The hallmark of a good government in the internet era is agility. Let's clamp down on this before we see what happens. Because if we wait we may _really_ not like the result.
@@sntslilhlpr6601 Truth!
Yeah, that's definitely not true. At all, except for things like public domain or Creative Commons (CC) licensed content.
Funny thing is: he rants in this video, but they chose to put CC on these videos, so he's the exception among the many other creators, who left it on the default Standard YouTube License and retain far more rights than the CC license he put on it!
Thanks very much. I will pirate my heart out, and when a guy who isn't a billionaire ends up in court, we'll see how well it goes.
@@autohmae Not all Creative Commons licenses permit commercial use.
Hank I just wanna say how much I appreciate whatever the opposite of clickbait is, because your thumbnail absolutely killed it. In the hall of fame alongside Veritasium's salt lamp video you go
And if you want, you can just take his video and do whatever with it; it's in the Creative Commons license he chose for the video and, at first glance, all the videos on this channel.
The funny thing is, he's the exception to the rule; those other YouTubers all chose the default option: the YouTube Standard License.
Thanks for making this video, Hank. Yes, we would like to opt out, PLEASE! :)
I love this take, I feel like it's realistic and measured. Google/YouTube does have an opportunity to just add an "opt-out" button for creators, which 99% of people would leave on, but would sanctify all the YouTube content they're training on. It sounds like a hassle to redo costly model training, but it's still early, and getting data permissions in order at this point would give them a huge advantage once the courts come down on AI companies.
As an independent creator, I don't want ANY of my content used to train ANY generative models without my express consent, and I should NEVER have to "opt out" after the fact; my content should only EVER be accessed and used for this purpose if I opt IN FIRST. ESPECIALLY if there's no way to guarantee that my content can be expunged from the model after I opt out. Otherwise, we're agreeing to a system where YouTube and anyone else is free to steal all the content we've ever made, and only has to ask if they get to *keep* stealing all the work we do from the present onward.
My content should be removed from all of these models, and these companies shouldn't have the right to access it as training data at all without my explicit and informed consent. Period.
Edit: all of the replies drawing an equivalence between human beings learning and taking inspiration from creative works, and generative models ingesting creative works as training data, are foolhardy at best or bad-faith propaganda at worst. Google training a generative model on your work and spitting out lookalike content is categorically NOT the same as a human being taking inspiration from your work. It's corporate IP theft at scale. You anthropomorphize corporate AI models at your own peril. And, by the way - human artists credit their inspirations.
You have one sub; I am sure you're fine. Love your passion, but you are fine.
You can’t steal content, homie
So is this an alt?
You will be forgotten, as requested.
@@indeliblyronnie I guess you never heard of pirating.
I've got to give you credit: this was a solid video. The fact that you actually mentioned the EULA and broke down critical sections like "sublicense," "transferable," and "royalty-free" is something most people skip over. It's wild that nobody reads these things; 99.9% of us just scroll down and hit "Agree." But you're right: this stuff is universally tucked into all EULAs. And as long as there's a door with a key somewhere down the line, your data and personal work are never really protected. If that key gets handed off, even just once, whoever holds it can access and share everything.
And let's be real: none of us agreed to this new wave of data scraping for AI. Back then, we didn't even know what was coming. It's kind of like how we didn't sign up for the CIA and other agencies recording our calls and texts for "national security" after 9/11, but they still did it. The same loophole-y, vague lawyer-speak that lets them justify that is now letting companies scrape our data for AI. Whether you're just a regular viewer or someone running a small business, we're all stuck agreeing to this nonsense because, in the end, we all just want to get our content online and maybe make a little praise and money if we're lucky.
The truth is, as long as companies can find that "key" (Reddit and the like), backdoors will always exist. Sure, opting out, or better yet opting in, would be great, but the laws and EULAs are always going to be written in favor of those with power. And let's face it, government and big corporations aren't going to change the rules that benefit them.
So, what's left for us? Either we deal with it, or we start hosting everything on our own servers, behind paywalls, with strict "do not scrape" policies; good luck getting the average person to do that! It's just easier to scroll past those long agreements without reading a word. And yeah, this is coming from a dyslexic who actually *did* read the EULA with some AI help, and I still don't agree with it. I especially didn't sign up for that one site trying to steal my character and let others use it; no thanks! It's frustrating, but as individual users, our hands are tied. Until the big players decide to make changes, we're stuck with this reality.
The end. At least until the next convoluted EULA update drops.
Yes, Google is training on your videos. The New York Times reported on this months ago. They found out OpenAI was training on YouTube videos. They also found out Google was reluctant to do anything about it, because if they called out OpenAI, it could expose the fact that Google was also training on YouTube, which is against Google's own terms of service. All Google said to The New York Times was that if what they uncovered was true, Google would pursue AI companies to the fullest extent of the law, since it was against its own terms of service. Months later, Google has not pursued any legal action against AI companies.
If you didn’t know, Google is using the scapegoat excuse of “We didn’t let them; a third party did.” They claim they handed the key to one third party, who then passed it to another, and so on, leading to the situation where none of our data is truly private. If we really want privacy, we’d have to start using encrypted scrambled text. Unfortunately, even that wouldn’t hold up for long. Sad to say.
why is this not getting more views? this is urgent and hella important for the future of content creation and many MANY other types of creative media
HEY SO! ARTISTS have been talking about this for the last couple of years. This has been decimating 2D concept artists and visdevs, and companies want to create internal models based on work from people who signed on to give up their created works to the company, in order to then internally replace them. They did this without consent or compensation already; now they're trying to backpedal to just going, "Well, now we have the baseline foundation, but will you help us finish it off so we can remove you all from existence?"
I miss when google was all like "oooh don't be evil!" :( thems were the days! (9 of my videos scraped)
Hello. It seems you have added an inappropriate and unnecessary word to your comment. The word in question is "don't" . We have removed that for you. There, isn't that better ? 📈💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵💵
I love the fact that the algorithm wanted me to see this video so much
Wow, 18 years is incredible! I've been watching your content for at least 10 of those years; I even remember creating an account just so I could comment on SciShow videos. I chose to do a chemistry degree after I watched Crash Course Chemistry. The work you've done is invaluable and the fanbase is ready to defend you
The phrase "derivative works" is doing a LOT of heavy lifting in the ToS. I think it's super unfair for a company to one-sidedly profit from your hard work without even giving you an opportunity to disagree. This is an excellent analysis of the situation! Love the videos
Class action idea for ya: If your account existed before November 2022, Google has performed a bait-and-switch on the terms. Generative AI was not available to the general public before then, and it is reasonable to believe that it would **not** be covered in that license. ANY content before November 2022 should be off-limits, and users should have the right to be compensated REGARDLESS of whether they know Google used it, unless Google opens up its usage to a third-party auditor to calculate who is part of the class, at Google's expense.
Unfortunately class action money goes more toward lawyers and fees than the people they represent.
@@RubelliteFae yeah but it would cost Google a LOT of money.
Generative AI has been a thing since the 1950's.
@@everettvitols5690 not on scales like this.
@@everettvitols5690 available to the general public and with corporations created around it? Being technically correct isn’t correct if all context is ignored.
I really really appreciate you adding the disclaimer about "AI ≠ generative AI", it's been so frustrating working as an academic in machine learning and trying to have public conversations about what generative AI is/what it can do/how it works/the associated ethics/etc with people parroting blatantly incorrect nonsense. A nuanced worldview doesn't exclude feeling righteous indignation at its problems; if anything it informs and justifies it.
I am not a content creator, but the work of so many creators on this platform has enriched my life. Firmly believe creators should have a choice about this--and it should be opt IN, not opt out.
I'm sick of companies using TOS obfuscation to make us "consent" to things they know full well we wouldn't agree to. It should be opt in, not opt out and certainly not no choice at all.
This exactly.
Yes, exactly.
Well, there's not much we can do about it, since the same companies own our lawmakers and have gutted anti-trust laws.
" Why do people feel like they are being ripped off?"
Because they are.
So why did you "rip off" all the artists you learned art from as a kid, all without permission?
@@gavinjenkins899 WE didn't. Cause we aren't machines that ingest and regurgitate things with a % more one style to another by taking in every possible bit of information and repurposing it with the express purpose of feasting and leeching off of the source material until it is unable to support itself. Hope that helps.
@@zach7 We literally. Literally. Are machines that ingest visual information and regurgitate things with a % more one style [etc etc]. So no, that doesn't help, since you just equally described human brains. Except for the "unable to support itself" part which doesn't describe either AI or humans, and thus was just off topic. The AI works the same way your brain does, it's called a "neural network" for a reason, it works pretty much just like neuron systems do.
@@gavinjenkins899 I share your perspective about us doing the same thing. If one person can do something better than another, like a person with a photographic memory remembering the pages in a book, do we disallow them from looking at books because they will remember them too well and may harness that better recall to write a better book in the future? I think the answer in that situation is, of course not.
On the other hand what if a program is taking a photograph of every page in all books to create a better "on demand" book than any human could possibly hope to write in the future. This program would only financially benefit the group of people who created and implemented this program. The program is not a sentient being but a tool that was used to take opportunity away from all others. Is that different? I think that it is.
Should all of this result in an AGI that is not a tool of some for-profit company, then I think however it wants to learn is fine. I do not think that is what is happening right now. These companies are making tools that strip economic opportunity from the rest of society and then funnel it to themselves. There is no single superior entity here doing something better than another. What we are talking about is a tool made from the work of many that will result in opportunity theft and will benefit a select few. You may argue that this is fine, but I do not think it is a very wise move if you're a proponent of the well-being of humanity.
There should be some conversations on whether now is the time to seriously consider universal basic income, and whether everyone needs to become tradesmen and farmers from now on. Will everything else be handled by for-profit AI companies for the foreseeable future? I know my advice for young people has changed in the last several years.
@@gavinjenkins899 Nope. You have a fundamental misunderstanding of how human brains work. Humans can’t make a 1-1 replica of someone’s art. AI can and do, to the point where artists’ watermarks very often pop up on AI-generated images. However, this goes beyond just art; these AI companies are stealing valuable data that they do not have permission to take. Saying AI works exactly like humans is a very poor, weak excuse that does not hold up in reality.
AI are not human; they are a product. Billion-dollar products, in fact. If you take someone’s intellectual property and/or data to create a product without their permission and without compensating them, you are stealing. Stealing. Full stop.
You do not get to take things for free. That’s being a thief. I should think that’s a very simple concept to understand.
I think this really boils down to the fact that generative AI is a product. It's not a person; it has no rights or legal autonomy. It is a tool built by a large tech company to sell to other companies for a profit. When people make movies, everyone that participates in the making of that movie gets paid and credited. When you build a house, every piece and laborer gets paid to build that house. The ONLY industry that gets away with theft of labor, code, and intellectual property is Big Tech. Microsoft did it to put together DOS. Facebook did it to create user profiles for students that weren't using their service. And the list goes on and on. AI is just the latest caper by big tech to unlawfully steal people's property and information, repackage it, and sell it as a product to someone else.
The problem is that governments don't know how to handle or classify data. In part because they themselves LOVE the ability to buy it from these massive monopolistic corporations. But it's a total violation of privacy, a massive security risk for citizens, and just a blatant violation of copyright law. Data is one thing... but YouTube videos aren't data, they're copyrighted works. Nobody has the right to use, copy, or distribute a copyrighted work without express permission to do so from the author.
If AI was being trained on Mission Impossible 3, you bet your bottom dollar the MPAA would be suing these companies for millions of dollars in damages. Y'all need a class action suit. Because YouTube / Google is taking advantage of their position as a platform to steal from creators. And if they don't get hit with a hard legal bat now, they're just going to do worse in the future.
YES!! This! There are so many comments that are trying to equate human learning with AI learning. They are not the same and should be looked at differently. AI is so very powerful that we cannot compare it to how humans learn and build on one another's knowledge. Arguing that AI should be able to scrape all the available knowledge on the internet because that's how humans learn is a ridiculous comparison.
I'm a bit of a writer and this last year I've struggled to put any words down. My whole corpus has already been eaten to feed regurgitive AI (on Google and Microsoft clouds) and I just hate that anything I share going forward is gonna be too.
You are not as unique as you think you are
@@Phobos11 And you're an absolute brickhead.
Thankfully, none of what these AIs could come up with would be on par with the worst writer. They don't understand what they're saying so continuity and psychology are extremely inconsistent and you just end up with something involuntarily goofy. Not saying there's never gonna be an AI able to do that, but LLMs aren't the ones which are going to eventually do that. Their very structure prevents them from understanding what they are outputting. If they're able to "understand" something it's more like an instinct for grammar, not of the meaning.
It's good but not great at writing short posts to pretend to be a regular user on a website but it can't be good at creating a script for a movie.
@@Phobos11 But you aren't either and he's unique enough to matter.
@@Maverick_Mad_Moiselle while I agree with you it's less that I'm worried AI will do better and more that I just hate knowing my work is being stolen to enrich the rich and devalue the creative. I put a lot of myself into my writing and it almost feels like that self is being assimilated into this amorphous mass of corporate greed.
As a visual artist and first wave victim of gen ai training and market disruption, I was very disappointed by several big science channels (not yours /nevermind, edited below) as they were using ai thumbnails and illustrations in their videos. Some of them were very eager to defend their actions as they (probably) felt safe from scraping and training. This situation shows nobody is safe, any domain is on the chopping block.
I wish we could all get over this new type of exploitation and become more united as creators.
Good luck!
Edited to add: these last 2 days, 2 video thumbnails looked like they were ai generated on the sci show channel. Both were changed after a few hours. Please stop, if you are doing it.
Problem is the viewers don't care because the creators didn't care when THEIR jobs were on the chopping block.
When it was programmers, cashiers, drivers, or factory workers getting replaced, it was "progress," and creators told everybody to just become a creative: "Robots can never take OUR jobs."
There's a level of hypocrisy in the creator mindset, in that content creators have no problem using AI based on other art forms they don't participate in. Because they don't actually care about anybody but themselves.
@@SherrifOfNottingham selfishness is indeed encouraged and weaponized in our current system.
Agreed. This is something that needs to be fought, and Hank/people in this community have shown that they can do it.
The commenter saying that the creators didn't care when others' jobs were cut is being disingenuous. On this platform, from creators here, is where I've usually learned about these incidents, often including what action can be taken or how to help.
There is a large section of content creators who have been supportive of workers, unions, small businesses/sole creators.
Greed and exploitation amongst these monopolies and conglomerates have reached staggering, pathological proportions within the last 20 years. It remains to be seen whether we stop the combined Handmaid's Tale and Rise of the Machine(s)....
@@erinmac4750 Disingenuous?
No, it's reality.
People are so out of touch that they don't realize that these creators are only making noise because they're afraid of losing their job.
Fact is, these guys aren't talking about how AI is being trained on people's output in work-from-home jobs, which is allowing companies to start replacing workers. Why aren't they fighting against that?
Because they don't care, there's OTHER uses of AI happening right now that should be talked about, but creators are only focused on "it's stealing my art" when it doesn't really even work that way instead of discussing everything happening with AI taking jobs.
@@SherrifOfNottingham we are all getting shafted. let's not fall into a classic "divide and conquer"
all human output is shoved in training.
we all need to recognize, get educated on the tech, communicate our needs, and propose policies.
together.
personally, I am more read up on ai image generators and I find musicians and youtubers to be adorable saying the same things I did 2 years ago, while they catch up on training, datasets, neural nets and diffusion models. But we need each other
ps: your complaints are totally legitimate
I'm shocked! Shocked! Well, not that shocked
I actually was a little bit surprised. I think the fact that they won't say it out loud indicates that it matters whether or not they are actually doing it.
@vlogbrothers they know that there would be backlash. All they seem to care about is the pipe dream of infinite growth.
@@vlogbrothers At this point I expect tech companies and major corporations in general to be doing shady sht as often as possible.
@@vlogbrothers What I find crazy is that Google can change the TOS whenever they want, but the people that are bound by that TOS can't change it at all. If we say that training an AI is not part of the TOS and the TOS needs to be updated... radio silence
BENDER was the evil Bender??
I think the big take away from this all to me is that corporations LOVE piracy and that it is totally ok to do.
Of course.
I’m just training my LLM. A NAS was just the easiest way to share with my models.
And no I don’t wish to elaborate on those terms.
It's not piracy when the content is publicly available for free.
@@IceMetalPunk well the Disney library is publically available and free! On pirate websites! Therefore, it's not piracy!
Cue the tone deaf reply that doesn't understand humor and just loves the taste of boot
@@thedarter 🙄 You and I both know there's a difference between "someone stole it and made it available for free" and "the person who made this made it available for free".
Humor's fine, but if you call that humor, it's about the same as those conservative, "anti-woke" comedy specials: it misses the mark and is being used to make bad points.
Not a YouTuber, but as a viewer I fully support your ability to opt out.
What upsets me the most about generative AI is how suddenly it has seeped into everything. Real life and internet life. No matter how much I want to, I can't avoid it.
You can't scroll through instagram or facebook without coming across AI generated content and so many people fall for it because it is never, ever tagged as AI generated. I worked in a primary school and so many of the teachers so quickly started to rely on AI for word mats and picture inspiration and even lesson planning. People now use it to write essays for them at university - you are not learning if you are getting a machine to learn for you! The whole point of university is to broaden your mind and LEARN! Companies suddenly don't need to hire as many writers because an AI programme will write it and then one human can fact check and edit it all. Voice actors are getting replaced with robots. Etc. Etc.
Generative AI will make a generation that is completely incompetent and unable to think for themselves. I feel so sad for the kids having to grow up with this.
Generative AI is like the dinosaurs in Jurassic Park. They "were so preoccupied with whether or not they could, they didn't stop to think if they should."
It's in my text messages with no way to opt out!
@@hannahc8947 that's so annoying!
this is what scares and upsets me. it feels very dystopian and it seems that people are just like ... okay??? with kids not learning basic things. reminds me of an article I read a few years ago about computer programming classes needing to do an intro to file directories because kids grew up so reliant on search they never learned how to navigate a directory tree, which is how computer programming works.
to take that google commercial during the olympics as an example... so instead of sitting with your kid and helping them write a letter to their hero, you're going to sit with them and tell google to help your kid write a letter. which means your kid never has to learn how to write a letter........ like??? not only is it soulless and replacing human parenting with literal google but also it's doing the kid a disservice.
how are people so okay with this lazy attitude. furthermore, you have to go back behind genai to make sure that what is there makes sense, because a lot of it doesn't. so then you're editing it when you could have just done the work??? idk seems so crazy to me. and like I feel like we're all just sitting here pretending there's no issue when like.... if you think through "google teaches my kindergartener!" that sounds fucking terrifying!!! and immoral!!!! and BAD!!!!!!!!! like????????
@@zobothehobo You make so many good points. I can't understand the complacency either.
Like with your example of gen AI replacing parent/child bonding time. That's a valuable moment cementing that relationship. That's the kind of love we remember and carry with us through our lives. The kind of love that makes us human. How are people okay with letting what, essentially, makes us human be stripped from us so easily?
And yeah, it might be easier to let an AI write a letter for you, but writing a letter shouldn't be a hardship. It should bring people joy like any artistic pursuit.
@@hcstubbs3290 "How are people okay to let what essentially makes us human be stripped from us so easily"
THIS especially when you think about the first piece of culture or society that we did after becoming biologically modern humans... was create art. Art is human. Humans are art. It is a part of who and what we are as animals. so removing art from humanity is literally BAD like it's never been that way!! we know how important creativity is for mental, emotional, and physical health!!! and yet?????? like?????
My main takeaway of this video is that Google is a monopoly and should be broken up. Same company can't be owning the biggest search engine, youtube, and now creating their own ai-tools using the youtube-data. It's clear conflict of interest that youtube is giving the data to google while not selling the data to competitors, which then hurts the creators, who should get their share of the profits.
very much so.
Hear hear
The only greed here is from "creators" who want to be paid if I, or anyone, learn from their videos. Creators should get NOTHING from learning. Absolutely zero. Don't like it? Don't release anything into the public. Copyright only gives you the right to control copying, not learning. People need to think about a world where people can control what you learn and sue you if they think you learned from their material.
It's not a conflict of interest, because none of their divisions would have an interest in doing otherwise. Users are not the customers, our attention is the product. Videos are how they acquire the product. So, our interests aren't ever a part of the decision-making process.
However, that is irrelevant to their monopoly status.
@@cuthbertallgood7781 You are deeply misinformed if you think we live in a world that doesn't control what you learn.
Machine learning algorithms aren't people. They're commercial products that create commercial products. If that commercial product uses other commercial products in the process of creation, then that's a very clear use of intellectual property for commercial use, which is very much subject to copyright law.
This isn't about controlling what you learn. It's about preventing the literal theft of intellectual property by multi-billion dollar corporations.
User-agent: *
Disallow: /
I need stickers, shirts, hoodies, pins, lanyards, pillowcases, shoelaces everything.
It's the protest chant
Except they then sold the users stuff?
Could this be a QR code that links to a universal legal document ? I had this thought that I would want to print it on masks and/or patches when I heard they were training with security camera data without telling people.
#good.store
@@kidkurmudgeon-0_o there is no such thing as a "universal legal document," unless you mean a law.
www.robotstxt.org/faq/legal.html
I'm an academic researcher and spend most of my time critiquing "AI" development and deployment ("AI" is a confusing term, large language models and generative image models and so on and so forth are not intelligent, they are just quite complex statistical models of limited and necessarily biased data sets). This is a really valuable insight into the artists' perspectives and I'm really heartened by the fact that YouTubers are catching on to this use of their "content".
Thanks for drawing more attention to this - I've been watching Vlogbrothers since I was maybe 13 years old and could have never anticipated that I'd ever get to cite a video of yours in an academic article.
Also, I will add this video to any syllabus about AI, because it's so clear even though the context is so messy.
It is also important to note that WE are the ones who provide this valuable data, including our personal data for ads. Without us giving them this information for free, these companies would be worth 0 dollars.
To make Reddit's nonsense worse - I was one of those people who left Reddit in objection to their API debacle.
I purposefully (before the TOS changed) used an app to change all of my posts/comments to gibberish. Then deleted my account, with a note saying why I was deleting it.
Three days later, I searched for something I knew would be one of my posts. Google found it. Okay, fine, the crawler just hadn't updated to see my gibberish-replacement. Let's follow the link aaaaand-nope.
Reddit restored all of my posts/comments to be their last version before I swapped them for gibberish. Now all attributed to "[deleted user]" instead of my user name. But they still exist.
And since my account is deleted, I have no way of editing them again. So I have no way of removing my content from their just-added-to-their-TOS AI training provision. (Although I enjoy that their User Agreement says "if you don't agree with the new terms, just stop using Reddit." - that doesn't stop your *CONTENT* from being there!)
I have sent multiple emails to Reddit support, demanding that my copyrighted content be removed from Reddit, to no avail. (I haven't even gotten a single reply, to the multiple different attempts at multiple different emails; I've filled out different web forms, I've even created a new "created just to complain" account, and have gotten zero replies to anything I've sent.) I'm a single individual, and can't exactly afford to hire a lawyer to sue Reddit, but I guarantee there are other people in my position.
The fact that reddit can get away with shit like that when all the little guys on earth have to quiver in fear that their protected work will get a DMCA takedown they can't fight is just despicable. Our current copyright system is here to pleasure Disney and maybe two other people, and leave the rest of us to suffer
Start a class action suit. Contact other users that are objecting and pool resources to get a lawyer
You'd think the ACLU would be taking action on this.
TOS is a pain in the ass
bro ur fault for deleting ur acc hahag. u should've made it gibberish and not deleted it yet
Funny that when people say not to use their content, it's a "gray area". Last time I checked, no means no.
That bothered me too.
These frontier AI labs are pretty rapey about the whole situation.
I hope there will be upcoming lawsuits about this, but I think U.S. legislation is slow to catch up to technology.
Oh don’t worry, they’re probably the same people who want to get rid of the idea of consent too
I don't know. I think it's interesting that they agreed to granting YouTube a transferable commercial license in perpetuity and people didn't understand what that meant. It's even more interesting how they're acting like it's something new and will be treated differently by the courts in this instance. A transferable commercial license in perpetuity is very well defined by the courts. It's going to be interesting seeing people try to prove that the license is not being used appropriately when it in fact covers commercial use.
It's going to be sad when the IP lawyer YouTubers hop on this and give a little insight, because this is not going to go the way a lot of these creators hope it will. It sucks, but it is what it is. It's definitely not like how certain movies never had rights negotiated for future formats. It's completely different. And it's going to be interesting seeing a lot of people swallow this bitter pill.
This topic needs to be seen discussed and addressed on a global scale. We need new rules. We need new laws. Its revolution time baby
I know nothing about how YouTube will use the scraped data from videos, but today I did a Google search and the text in the "generative AI overview" wasn't just a reinterpretation, but an exact copy of information in the top sponsored hit. Not a training set; the exact words were lifted.
I think that might be called overfitting
John's taking on the biggest pharmaceutical companies to help them do what's right, Hank's taking on the biggest tech companies to help them do what's right. love to see it, proud to be here
*To _make_ them do what's right. Under capitalism, ethics will not (arguably cannot) be weighted more heavily than profit. 99% of the time companies will only do what's right if they think it will positively affect their stock price or bottom line.
@@silverandexact Yup! Regulation is necessary. Best thing you can do is find a useful piece of potential legislation and tell your representative how important it is to you.
@@silverandexact i hear what you're saying. i'm taking a cue from john/nerdfighteria and framing it as a collaborative effort with the companies in question because i think that is likely a more effective strategy to get the companies to change their practices/policies
Now we need two heroes to take on the biggest Food Companies so we can eat real food again and Energy Companies so we can have fewer wildfires.
Also so we can have a YouTube version of Captain Planet
As a master's student whose work is in this area: /*furiously taking notes*/
Also, a small asterisk: a lot of the data that notable LLMs have trained on is pirated text that was aggregated, and they reference it in their papers as a "public dataset" (e.g., Facebook and Books3). Therefore, in my opinion, if they're already willing to admit they're using illegally obtained data for generative AI, we should already assume they're doing things that are considered in the "grey area".
Thanks for making this video! I hate that I can't go a day without hearing about or interacting with awful AI content.
I absolutely agree that everyone should have the choice to opt in or out of these learning models.
I want an opt out - not just for my videos but also my photography, songs, poems, artworks, burps, farts, sneezes and whatever else makes me me.
I'd like to opt out of taxes, and speed limits, too. Too bad, doing so would infringe others' freedoms, so you don't get to. Banning others LEARNING from your art is thought crime, and an absurd and draconian concept. One which you were never restricted by yourself when you were learning art. Did you have to respect Picasso "opting out" of you learning from and training on his art? No. So why do you think you deserve protections that you never afforded other artists yourself?
@@gavinjenkins899But AI doesn’t “learn.” It copies. It copies every stroke you make, every painting that you spent hours on. And it does that, not to express itself like a human would normally do with art, but to replace the artists it stole from. Let’s face it, generative AI is being used to replace artists. AI has great potential to be used for good but this isn’t it. It’s the same concept as tracing art. Tracing art and claiming it’s yours is theft because you copied the strokes. That’s what AI does. AI can’t create new strokes. It can’t create, it can only copy
@@jeSUS-wp2eg Nope, not only does it not copy, but it's physically IMPOSSIBLE for it to copy. Stable Diffusion has 10,000x less memory than it would take to store its training images, and even that's only if it somehow needed zero space for the connections and rules. "01001101" that's about how much data you have per training image. Explain to me how you can remember, in order to "copy" a several megabyte training image, with that many 1s and 0s? You simply don't understand how the technology works, and you shouldn't be arguing about rules and legislations until you learn the basic facts and how the thing works first. You have homework to do.
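The capacity claim above can be sanity-checked with rough arithmetic. These are approximate public figures (a Stable Diffusion v1-class model has on the order of 1 billion parameters; LAION-scale training sets are around 2 billion images), used only for order-of-magnitude purposes, not exact numbers:

```python
# Back-of-envelope: how much model capacity exists per training image?
# All figures are approximate, for order-of-magnitude purposes only.
params = 1.0e9              # ~1 billion parameters (SD v1-class, roughly)
bytes_per_param = 4         # fp32 storage
model_bytes = params * bytes_per_param

training_images = 2.3e9     # LAION-2B-scale dataset (approximate)
bits_per_image = model_bytes * 8 / training_images

small_jpeg_bits = 100_000 * 8   # a modest ~100 KB JPEG
print(f"~{bits_per_image:.0f} bits of capacity per training image")
print(f"ratio vs one small JPEG: ~{small_jpeg_bits / bits_per_image:,.0f}x")
```

On these rough numbers, storing a pixel-perfect copy of every training image is impossible; that said, this arithmetic does not rule out memorization of individual images that recur many times in the training data, which is a separate, documented issue.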
@gavinjenkins899 Humans use their life experience including art they've seen to make art. AI uses code thrown at it to display images and text on demand, and those productions are not art. You can use my human-made art to make more human-made art. It's physically impossible for an AI to produce something that can accurately be called art, whether or not it "learns" from human made art
@@Natalie-101 Yes, like you just said, you use art you've seen to make new images. Yet you do not ever ask permission or pay fees for doing so. So you're a hypocrite. it's pretty straightforward. You've offered zero rationale for why AI should be treated any different just "because it's AI and not a human", which is of course meaningless and circular. "AI should be treated differently because AI should be treated differently" 🙃 You have no argument.
The fact that EU (and probably also similar in other countries) legislation is coming simply puts a timer on how fast they need to scrape everything there is to scrape. I would not count on any "opt-out" feature until they absolutely are forced to add one by law.
What many people fail to see is that human-created content just became a heavily sought commodity. Especially pre-GenerativeAI content, which is definitely not "tainted" in any way. We've known for a long time that trying to train ML models on ML generated data doesn't work - it just reinforces the bias too much. So what all those BigTech companies need is human-generated data for training. Paradoxically, the introduction of GenerativeAI actually made such content harder to get - lots of people now use ChatGPT to ask questions instead of posting on some online forums, and those online forums are also flooded with ChatGPT-generated answers as well.
Is Google using youtube content? 1000% they are. That's a "competitive edge" they have over competition and they are definitely using that. Same as I'm absolutely sure Microsoft uses private github repos to train their Copilot and that Amazon is doing the same thing with anything Alexa records in your house.
OK, I'm feeling really old now. I feel like this all ties in with the problem some of us older folk had when the word "content" started to be used for creative work. Calling it "content" divorces it from the labor that produces it, we thought. "Content" is just stuff out there. I know younger folks haven't necessarily seen it that way; you all understand that "content" is creative work. So maybe that's a moot point.
But I'm also really concerned with the tendency inherent in our social order / market economy for everything to get shoved into the private sector. I may not be stating this correctly, since "private" can simply mean that works are owned by their authors; but to me, putting it online puts it into the public square-in much the same way publishing in books, magazines, or newspapers does. Yes, there's some element of "private" involved, but once ideas are out there, they're available to the public. And that's a really good thing.
But this looks like it could force the creation of something...new? to replace the public square - something we've been seeing in terms of literal geography & the built environment, as public spaces give way to private spaces-including private parks technically open to the public, but certainly not in the same way an actual public park would be.
We saw this here in Detroit in a different context. Our art museum, the very much world-class Detroit Institute of Arts, was jeopardized during the city's bankruptcy because its collections were owned by the City of Detroit-and that meant by the People of the City of Detroit, when it comes down to it. It was a very public space. The city's creditors (who, to cite Oscar Wilde, clearly know the price of everything and the value of nothing) claimed they should be able to force the sale of the DIA's collections to satisfy the city's debts. Legally, I suppose that could make sense. Ethically and, well, culturally, absolutely f'ing not. And there was even an op-ed in I think the LA Times arguing that of course the DIA's collections should be sold, so that it could go to the coasts where people actually deserved to have it. (We're used to that kind of shade here; hence the clothing company, Detroit Versus Everybody.)
Luckily, the relevant staff at the museum (one of whom is a friend of mine) worked really, really hard, long hours to save the art. The solution turned out to be the establishment of a private, though thankfully nonprofit, foundation, and the ownership of the art was transferred to the foundation. And the great experiment of publicly-owned art came to an end, despite nothing really changing on a practical level-and the important thing, keeping the art here, was achieved. Even Detroit Public School children, who the good people at the LA Times think don't deserve art, get to experience it, either by just stopping in or through interactive, educational programs provided by the museum to schools.
All that to give an example. I worry that by focusing on what's legal and profitable rather than what's ethical and healthy and supportive of human flourishing, we're moving to a place where everything will have to cost money to be accessed by the public, since that will be the only way to protect it. Thankfully, the DIA managed not to do that, and maybe something like a private foundation, which would have the kind of legal standing individual people don't seem to anymore, could emerge to help in many situations. I'm a theologian; I don't really know all the legal ins & outs, but I know unethical when I see it (as does everyone, really, if they're not deluding themselves), and I know detriment to human flourishing when I see it. And I value human flourishing far, far more than legal technicalities that allow people with money and power to exploit systems and people to increase their money and power.
Here endeth the screed.
Google needs to be broken up like "Ma Bell" was in the 80s.
Well said! I had no idea that the DIA was caught up in that. Stockton, CA went through a similar bankruptcy, but there was not one mention of local resources that might've been affected. Now, I'm wondering if I shouldn't revisit that part of my city's recent history, not that its history and preservation of community assets has been stellar....
Well said.
@@erinmac4750 I went to grad school in Berkeley (at the GTU, not Cal) & lived in Oakland & worked in SF. I don't think I ever made it out to Stockton. I'm sorry to hear you all went through bankruptcy too, though.
This is very well written; it's nice to read a very thoughtful take on a problem that feels like it isn't being talked about enough, but it makes me sad that this comment is already being added to an LLM's data set too.
Who remembers when you rented a movie from the video shop and there was a little statement before the movie about piracy? "You wouldn't steal a TV, would you?!" The tables have not just turned - they've been disassembled, put in a video box, reassembled elsewhere, and a big "shush, it's for science" sign put in its spot.
Person: Hey, does Google train Gemini on YouTube videos? Google/YouTube: Rambles vaguely about TOS. Person: So you didn't say no, which basically means "Yes", you just don't want to say "Yes".
As soon as dude didn't immediately say 'no' that's very very very obviously 'yes'
He was thinking too long for comfort
Rambles vaguely in court about granting a commercial license in perpetuity. Commercial? Commerce. Used to generate revenue. Google will use the AI to generate revenue. Commercial licenses will cover training.
That word perpetuity carries a lot of weight.
Oh, god. It sounds like the preds who justify their behaviour by saying "but she didn't say no".
As if anything that's not an emphatic "no" is automatically permission to go ahead 🤢
@@TheKrispyfort OMG Truth! People don't recognize the boiling frog analogy, but in your analogy the stark reality of the harm in these companies' actions is unequivocal. Louis Rossman used this same analogy discussing subscriptions for features and services to be unlocked on products. This predatory behavior of companies has to stop.
Note: As the survivor of DV/SA, I don't think this is hyperbole. When these entities violate people's rights to create, work, own purchases, control how their own creations/work is used or not used, etc, that's affecting life and liberty, working to render a person/the people powerless.
No. Not hyperbole.
No one should have to opt out - they should have to opt in.
This, 10000x this. We should not have to ask; they have to ask.
Especially considering if there's a span of time where you're opted-in before unchecking, they'll just immediately grab the content and bake it into their next version of the AI. And once they've done that, they're not going to roll it back, even if the data is technically removed from the "training" set once you opt out. It's already been eaten.
After watching and spending hours reading/commenting in threads, I'm adding this one to thank you, Hank, vlogbrothers, and this insightful community for all this "real" information and insight.
Godspeed and much love, everyone.
So basically they're training on everybody except record labels and cable networks. Consent is so important and big tech has such a huge problem with it on every level. Thanks for this, Hank.
Consent is not important if your use is transformative.
I like Hank's suggestion. Just allow an opt-out, and avoid a mountain of bad press and legal disputes later.
The fact that AI companies think making it opt-out is a valid solution is insane to me. It's like if I go into a store and get my wallet pick pocketed by a store's on site pick pocket. Then I go to complain and they tell me that I never opted out of their pick pocketing policy so they were allowed to do it, at which point I opt out and they tell me it won't happen again but don't give me back my wallet.
No no no. It's like you leaving your wallet on the floor and someone ran through it and took some notes. And you complain you should have opted in for people to take notes of thing lying around in public.
Pick pocketing is physically taking possession of something. That is not what's happening. TH-cam doesn't steal videos. If they deleted the video from your channel and forbid you to use it, that would be comparable. You don't lose any right you had.
@@OmateYayami If someone leaves a wallet on the floor, the morally right thing to do is to try and return it to its owner. You do not go rifling through it.
@@draconvarie And how do you establish who the owner is?
@@OmateYayami Congrats, that is the stupidest analogy I've seen an AI bro make yet.
Yeah. But you do not keep their personal info from their driver's license in case you ever need to send them ads.
And now I just want cornbread
I especially want cornbread that turns into honey and then drizzles itself onto more cornbread.
@vlogbrothers seeing that horror stopped me dead, I had to rewind it and make sure I wasn't losing my mind.
@@Kidzelda0 Same! I sat there for at least 5 minutes trying to understand what was going on 😅
That's what happens when you watch too much Cornhub.☝️
@@Kidzelda0 good to know i'm not the only one spinning out on that weird nightmare-fuel
I sure hope YT pays attention to this. I mean, as far as I can tell, Hank is, with no exaggeration, one of the people who made YouTube important.