Let’s start the countdown timer. Congress should be able to address this in about 100 years or so.
This is so sad and true that it's actually funny
A hundred years? Quite the optimist, aren't you? My money's on a brief chat in 250 years and legislation 60 years after that.
*Acknowledge. It will take another 200 years or so to actually do anything
unless the people there become popular faces all of a sudden
😂😂
Did anyone else think that Kristen Bell at the start was a deepfake?
Glad you told me
Haha yea
That would have made it sooo much better. Like at the end it's revealed it's a deepfake.
Yah
Yeah D:::
This is basically the reason why you should care about privacy of your data. You can't imagine what will be done with it.
A reason. One reason of many. But it'd happen anyway; this is contingent on a few factors, and it is a particular kind of person being targeted by another kind. So, not as neutral as you perhaps are mistaking it to be.
@@liang2492 There is still a risk of it happening.
What's the name of your first pet?
Where were you born?
What's your mother's maiden name?
@@liang2492 That's what you think.
Then it happens.
In b4 Facebook starts selling all your photos to deepfake makers. Hey a buck is a buck to Zuck.
It is said that some aboriginal peoples, when first introduced to photographs, thought their souls were being stolen.
In a sense, now that is possible.
The Amish have the same feeling towards pictures of their faces.
Why is everyone on this site so dramatic? It is not soul thievery; it is a form of identity theft. If you exaggerate everything, people tend to take you less seriously after a while ... which is exactly what is happening to people who are considered progressives.
It suddenly clicked for me: stealing someone's face is bad, but stealing someone's soul, manipulating it as you see fit, is horrific.
@@LightningbrotherG :(
i mean i get what you're going for but this just sounds so pretentious
It won’t be long until video, photographs, or audio recordings are no longer considered evidence in a court of law.
Agrim Gupta you can just limit the internet
@@tasibsharar7357 that is true, but in a lot of places, even security footage is now simply moved to the cloud and not kept on physical drives (near the cameras). Limiting the internet probably won't work, but laws about the digital signatures/prints of video evidence will now have to be rethought.
The response would be a stronger focus on metadata and device identifiers.
Imagine how easy it will be for authoritarian governments to frame people now with deepfakes as evidence
Where I live they are already not considered evidence because they can easily be tampered with
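To make the "digital signatures/prints of video evidence" idea above concrete, here is a minimal sketch in Python of a content fingerprint plus capture metadata. It uses only the standard library; the file name, device identifier, and field names are illustrative assumptions, not any real system's format.

```python
# Illustrative sketch only: a "digital print" of footage is just a cryptographic
# hash of its bytes, stored alongside capture metadata. Any later edit to the
# file changes the hash, which is what makes tampering detectable.
import hashlib
import json
import time
from pathlib import Path

def fingerprint_video(path: str, device_id: str) -> dict:
    """Hash the raw bytes of a video file and bundle capture metadata with it."""
    digest = hashlib.sha256(Path(path).read_bytes()).hexdigest()
    return {
        "file": path,
        "sha256": digest,
        "device_id": device_id,      # hypothetical camera/device identifier
        "captured_at": time.time(),  # capture time, epoch seconds
    }

def matches_record(path: str, record: dict) -> bool:
    """Re-hash the file and compare it to the stored fingerprint."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest() == record["sha256"]

if __name__ == "__main__":
    # Demo with a dummy file so the sketch runs anywhere; "clip.mp4" is made up.
    Path("clip.mp4").write_bytes(b"pretend these are video bytes")
    record = fingerprint_video("clip.mp4", device_id="cam-lobby-01")
    print(json.dumps(record, indent=2))
    print("untampered:", matches_record("clip.mp4", record))  # True until the file changes
```

A record like this only helps if it is stored somewhere the person holding the footage cannot quietly rewrite it, which is where the metadata and device-identifier point above comes in.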
Imagine if all the people they had in the video were also deepfakes
That'd be soooo meta
Oh you took the Red Pill 👍
Duncan Kim Can someone please elaborate on the “red pill, blue pill” thing? Thanks
Tbh if we live in a simulation we may all be deepfakes in a computer. Oh wait I didn't want to go this deep.
MimixLight it's from the story line of the movie "The Matrix"
This is why I've never been a fan of showing off my life online.
The suspect could've used her face from her movies..
Nobody cares about ur life
Who tf is going to deepfake you?
@@SR009s That's probably what Noelle thought too, but she got deepfaked anyway.
Yeah well that is only going to help people for now. This technology will advance and in the future it won't require a big social media presence, maybe even 1 photo will do.
And even then. A big social media presence doesn't justify this at all.
Celebrities are the ones most at risk but not the ones who will suffer the most. They have the means to clarify the situation and reach people through the media. Other people don't. They won't have experts analysing the video and popular websites and media outlets clarifying the situation. Most people will most likely believe it's the real deal, or at least consider the possibility of it being real. But it's not that you couldn't do it already to pictures with Photoshop and similar software. It's just that most people didn't know you could now do it to videos as well, but it's a matter of time until they do.
Sadly the more awareness this topic gets there will be even more traction to those websites. It's a lose lose situation to fight against the internet.
Yup, the more you discuss it and try to ban it, the more it grows. We should use this time to develop tools to identify deepfakes as its explosion is right around the corner.
Exactly mate, it's time to think about how to live with that problem, not how to prevent it
I agree
@@tc2241 Already exists.
hetmonster2 ya, but it's not in heavy use like detecting counterfeits. It should be built into video players, uploaders, browsers, etc. We're nowhere near that.
Abusive exes could use this on their victims
That's the terrifying part.
Could be a worthy punishment
@@fleshtaffy I hope you just misread that. 😬
By what I've seen, that's already happening. It should have been made illegal yesterday, but it's not. Allowing deepfakes of people like this will only feed into potential rapists and abusers, making them more likely to commit a serious crime later on with the real deal.
@@fcgHenden Nah. It's quite interesting seeing how white people justify things.
I think my Dad is a deepfake, I haven't seen him in 9 years.
Spooderman lol
I really needed that milk.
huh
Can I come to the top with you?
Stay strong men
Tiktok: booming
Deepfakes: it's free real estate
🤣
@@Lennard222 Still not ok buddy
Do people even make original jokes anymore?
*social media: Booming*
From social media to solipsism
imagine victim blaming someone for having a face ☹️ taking a selfie is 'asking for it' now?
and you can be guaranteed men are currently jumping through the mental hoops required to both justify it and blame women for it.
T K yes
It's obviously not the victim's fault, but yes, everybody should be ready for stuff like this when making their data public.
The feminists are rampant in 2020. You guys are like little flies.
@@fleshtaffy you're a misogynist, and a poor excuse of a man
Good thing I have no face.
Stonks
Good Man
Nufes
Lol. Me too
Good Man
This is sick on so many levels.
It's disrespectful, but how is it an urgent threat?
welcome to the internet
Someone clearly isn't familiar with Rule#34
@@a2pabmb2 Cartoons aren't the same thing as a deepfake.
@@canti7951 these people did not consent to this. that's the problem. it could ruin their lives
Man this has the makings of an incredible Black Mirror episode. This is some messed up sh*t!
They've already done it: "Be Right Back"!
This is disgusting. Even though I'm not a woman, this makes me want to delete or at least minimise the use of my social media.
I agree. If it makes you feel better though, @iamdeepfaker on Instagram uses the technology pretty responsibly. So this is a pretty double-edged sword, although I believe all of us can live without this technology.
or just don't post pictures of yourself. i just post cats lol. do a deepfake of them
If you're concerned, you don't have to get off social media completely, but deleting any pictures or videos that aren't important off your social media account could minimize the chances of getting deepfaked.
It won't be just women in danger because of these deepfakes.
Just delete it dude.
I think this gets into the question of “just because you can, should you?” Novel technology has allowed for scenarios that we could scarcely imagine just a few years ago. We have seen this technology used by Hollywood to bring back stars that have passed away. The ethics of that are still being debated, but at least the parties involved (to my knowledge) have given consent. So, there are arguably benevolent uses for this particular technology. What mechanisms, be they legal, technological or otherwise, can we use to protect people’s privacy and self-ownership while not abridging the pursuit of knowledge upon which technology is built? How do we get people to take a step back and think, “should I do this?”
much too late for that
We're providing way too much data on the internet.
But that's not the problem here, the problem is that people abuse this data
@@luziealyssa5677 you can't stop people, but you can stop uploading data.
@@Sovereignless_Soul Which just defeats the point of the internet.
@@Sovereignless_Soul Interesting point there, but unfortunately some people do their job through social media presence. So, I think to blame the victim is really not the final answer we want to pursue.
@@Sovereignless_Soul The internet wouldn't be all that interesting then, would it
kristen bell is so calm, I like how she was interviewed
I have huge respect for Kristen and Noelle for talking so calmly and authoritatively about something that must have been so upsetting.
“I’d be thrilled that someone found me attractive”
I so badly want to put this claim to the test.
The ones who say that are probably incels.
@@d4vian398 100%
@@d4vian398 or someone with a different opinion?
@@temtem9255 who will be degraded more?
it's because they're imagining an attractive woman behind that "someone", not an obese, sweaty, 45-year-old man. the latter is our reality.
I'm old. I didn't grow up with computers. Never, in a million years, would a 25-year-old me believe that many years in the future, something like this would come about. Just beyond belief.
have you never heard of photoshop? XD
About time that somebody talks about it. It's been happening for years.
This has been in headlines for, like, three years.
@@harrylane4 never saw it. Only individual cases in which stars talked about it
I feel so bad for this woman. People are so cruel
The people who discovered the deepfakes were obviously looking for deepfakes
Actually the same Bill Hader clip they showed in this came up in my YouTube recommendations because I was watching Bill Hader clips. I'm sure I had heard of DeepFakes before, but that was the first time I was really exposed to them.
Referring to Ashton Kutcher "stumbling upon" Kristen Bell's deepfake?
Sebastian-Benedict Flore he has an organisation named Thorn, look it up
Yeah lol her husband...
@@sebastian-benedictflore Ashton Kutcher has a foundation called Thorn that specializes in tracing kidnapped children on the dark web to save them from child prostitution and trafficking, so it is entirely possible that he knew about Kristen Bell's deepfake because of this
The sad smile on Kristen Bell's face at the end got me :(
The people who made those deepfakes are definitely going to the bad place
Dude, EVERYONE is going to the "bad place".
Yeah man, seeing how 2020 is going, we have all gone to the bad place.
This sounds like satire. I hope it is.
This is sarcasm. The vast majority of us will be there.
@@bidbux9500 He didn't know. 😨
We were so excited when we got the rainbow barfs and dog ears... and now here we are.
Imagine if this is used to frame someone for a crime
china is making good use of it
more like imagine evidence no longer being credible when all you have is some media recordings
like... imagine people logically evolving
Videos and audio would need to have a certified and trusted source in the future, rather than having the video or audio prove itself on its own.
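A rough sketch of what "a certified and trusted source" could mean in practice, assuming the third-party cryptography package: the source signs the clip with a private key, and anyone holding the published public key can check both origin and integrity. The key handling here is deliberately simplified and hypothetical.

```python
# Minimal provenance sketch: sign a clip at the source, verify it anywhere.
# Requires `pip install cryptography`; keys are generated inline only to keep
# the example self-contained (a real source would guard its private key).
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

source_key = Ed25519PrivateKey.generate()     # held by the camera/publisher
public_key = source_key.public_key()          # published for verifiers

video_bytes = b"...raw bytes of the clip..."  # placeholder for real file contents
signature = source_key.sign(video_bytes)      # shipped alongside the video

def from_trusted_source(data: bytes, sig: bytes) -> bool:
    """True only if the data verifies against the source's public key."""
    try:
        public_key.verify(sig, data)
        return True
    except InvalidSignature:
        return False

print(from_trusted_source(video_bytes, signature))                 # True
print(from_trusted_source(video_bytes + b" tampered", signature))  # False
```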
Of course it’s mostly happening to women. I’m annoyed that I’m not surprised.
as always we are the ones being abused by whatever new thing men find to objectify us and use our bodies against our will for their pleasure with no consent as usual... not shocked, sadly 😔
@Bugler55 yeah... so an isolated case defines it all doesn't it... think about the majority of the cases and not only about what you choose to see
@Bugler55 would you judge a guy the same way if he was doing the same thing? just asking, really
well it couldn't be a surprise. it's not like more evil came into the world when deepfakes were invented. it's all the same old evil (people) playing with its new toys, not thinking about others.
Scary.
Indeed.
Definitely.
And people made fun of me when I was super concerned about everyone sharing everything online
We should have all listened to you
I thought staying inside was safe enough for me, turns out I'm wrong
01:20
“This is my face, this belongs to me”
Deepfakers:
“All your face are belong to us”
Soviet union vibes
HAHAHAHA
*Our Face*
I feel super bad for the victims. Imagine it going viral and toxic people bullying them. It will deeply impact them in a negative way and may bring depression into their lives. And the people who are making these deepfakes are very inhumane. Humanity is messed up smh
these people might feel the impact of this for their entire lives... it's simply sad
it won't be as impactful once everyone knows that media (previously text, photo, now video) is mostly not credible
Wow, what has humanity come to? This is dark and unacceptable.
this isn't close to the worst people have done lol
@@Terjecs lol no, this is pretty fkn malicious.
@@Terjecs still, this is horrible. Nobody should be doing this to anyone.
No, this is mankind. Again.
It's actually almost a historical thing. Back in medieval times you had pretenders (people who looked and sounded like heirs to the throne of a country) imitating them for personal gain of power and wealth.
This feels just like the concern and outrage we had when Photoshop became a big thing. Turned out there wasn't much that could be done about it and we all got used to it. Sadly this is most likely what will happen with this technology as well. No putting the genie back in the bottle.
Yeah, right now there are ups and downs to this technology that are emerging; still, we have the opportunity to learn from the past and control this. I wouldn't be surprised if something big happens again by this point. Hopefully nothing bad will happen. :/
Yeah I agree that it's not nice, and I'm sorry that this is happening, but there is really no way to stop this completely, except in ways that would be too extreme. You can discourage it, but not stop it entirely. It's just not worth it to try and make a big deal about this, because the ONLY way to stop this is to have an extremely controlling internet that isn't free anymore. It's sad, but you need to be resilient and just deal with it.
You could put regulations over it. No need to get all "we can't do anything about it", because we can say that about literally everything. "We can't do anything about stealing because people are gonna do it anyway" or whatever are just weak excuses.
Another reason to protect your children’s faces from social media.
Parents plaster their pages with pics of their kids. If they do this with adults why not children? Very scary.
I'm just impressed you got Tom Cruise's consent to having his face digitally edited into this video.
Did they really get the consent tho?
Exactly, only for women VIPs is there a problem; for everyone else, shut up and let the internet flow
But Tom Cruise is a man, so he and how he feels do not matter.
good point actually.
Why am I afraid this vid actually gave ideas to some people who haven't thought about this..
You're afraid because you know you're right.
@@2NVS basically make sure there are no videos or multiple candid photographs of you online-- which is almost impossible when social media is so prevalent in our society. It's frustrating that the video's conclusion was essentially "you could be next lol good luck"
@@2NVS not much is needed.
Nah, deepfakes are well known to everyone even remotely interested in machine learning.
Seeing the comments, I'm actually surprised by the number of people who didn't know about this. It's been talked about and documented for years now. I guess some people have to wait for things to hit their 'social bubble'.
Most people are only aware of what's being talked about in their lunch room or what's in the headlines. I envy them though. Ignorance truly is bliss.
I'm not surprised by the number of people who either aren't taking this seriously or don't really see it as an issue
yeah me too
Bugler55 I'm not surprised by your lack of surprise. The nonchalance toward women's images being sexually exploited and degraded online without their consent is expected.
William Gibson wrote about this 20 years ago; people thought it was exaggerated...
"I found your deepfake"
"Errr actually that was real"
:O
People don't look at it from this side. It will give everyone a way to deny embarrassing videos.
".....yes.... right... the deepfake..."
It's not "the internet" that needs to be kinder or more considerate. It's people. Specific people making specific decisions to do something that's scumbaggy. Many people are scumbags, and the internet allows them to be their worst selves anonymously, but it doesn't MAKE them be scumbags, it only gives them a vehicle to have an audience.
This raises some serious and concerning questions about the nature of the digital self. Who owns images of us? Do we own the idea of our likeness?
If so, do people have the right to take our photo in public?
But if NOT, would we also not have the right to stop deepfakes made from those photos?
Some celebrities took steps to patent their likeness etc. An early bid to be proactive in this new era.
But can a bright up-and-comer beat a celebrity to patenting their likeness? $$$
You absolutely do not own your likeness
@@RampageG4mer Religiously, maybe, but in scientific terms, yes
Good thing I only post memes.
This is one of the reasons I don't ever post pics of myself online. I was even fired from my job of 15 years because I told my new boss, who wanted to post pics of his staff, that I don't allow my picture online. Others declined, but I got fired.
You should never be forced to put yourself online.
I think HR would've been on your side for that
it may be seen as very incriminating now, but I think that as the technology gets more common, people will start to distance identities from images, videos, etc., and in time nobody will regard any image or video as relevant, regardless of its realism. this will likely be very detrimental to many systems that rely on identifying people based on images, videos, etc., like the legal system.
This sounds philosophical.
@@sentinel9651 Sounds very plausible actually. Once fakes become common, people will start doubting visual material; they have to.
It would be ideal if people could stop caring (different value system around sexuality), but that might run too much against human nature.
me when I read the title:
vox: The most urgent threat of deepfakes isn't politics
me: oh I know vox, I know
ty
The Amish were right
god yeah
You cannot stop this. Just stop posting thousands of photos of yourself online. For a flawless deepfake you need a huge amount of data to train your model with. Celebrity photos can be found easily online; that's why it's easier to produce deepfakes of them.
Even if it becomes illegal, it's still going to happen. It is just sad.
Deepfaking technology is getting better and better. From my phone, I could deepfake someone with just one photo of them. It's not that good NOW, but 2 years ago it was completely unthinkable.
@@kwirro That's why I added the word 'flawless', or you could say indistinguishable. Not posting photos is just prevention; there is no cure. The only possible solution I see is to design algorithms that identify deepfakes, the same way we tackle viruses with antiviruses.
@@aryant1884 Ok, true.
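As a sketch of the "antivirus for deepfakes" idea in the comments above: sample frames from a clip, score each one with a detector, and flag the clip if the average score is high. The detector below is a stub rather than a real model, and opencv-python is assumed only for decoding frames; the path in the usage note is hypothetical.

```python
# Frame-sampling detection sketch. score_frame is a placeholder; a real system
# would run a trained face-forgery classifier on each sampled frame.
import cv2  # pip install opencv-python

def score_frame(frame) -> float:
    """Placeholder: probability that this frame is synthetic (stub returns 0.0)."""
    return 0.0

def looks_like_deepfake(video_path: str, threshold: float = 0.5,
                        sample_every: int = 10) -> bool:
    cap = cv2.VideoCapture(video_path)
    scores, index = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:                       # end of stream or unreadable file
            break
        if index % sample_every == 0:    # sample frames to keep it cheap
            scores.append(score_frame(frame))
        index += 1
    cap.release()
    return bool(scores) and sum(scores) / len(scores) >= threshold

# Hypothetical usage; "clip.mp4" is just an example path.
# print(looks_like_deepfake("clip.mp4"))
```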
And people wonder why I have a cat as avatar...
I still think politics is more of an URGENT threat as that could potentially threaten every person in the world, but sure, on a personal level this is incredibly damaging.
The world feels like a giant dumpster fire right now, it is scary.
My dream is to be off the grid within 10 years.
you are forgetting that you are living in the best times ever, judging by materialistic standards. before, it was much, much harder.
When you give people tools this powerful, it sheds a light on the darkest crevices of humanity.
Crevice 🤣 great choice of wording
I like Cr1tikal's take on this: "now if I ever get caught doing anything, I can just say it's a deepfake".
I guess there is an upside to most things. But I'm still way more concerned about this being used to frame people for heinous crimes.
What's happening to adult women is horrifying but now I'm thinking of all the naive, ignorant parents uploading reams and reams of photos and videos of their young kids... :(
This is an unfortunate situation to be in. Having your identity abused like this.
Unfortunate but unsurprising. It started with Photoshop, and then it was only a matter of time before it moved to video.
Your editor deserves a hike!
His YouTube is Johnny Harris
@@c4ssiop3ia Whoa, he is the same person who did the border series!
One of the best series.
tiktok is literally a breeding ground for deepfake videos, all these teens. sigh.
Christ, you're right
It’s also a national security threat
This is nowhere near the same thing, but it reminds me of when some people would put on realistic masks of black people and rob banks. Black people were actually getting arrested for crimes they didn't commit.
Privacy will become a thing of the past, in our high tech future.
I thought there was an expert in the thumbnail
she is an expert in acting though
Why would you expect that from vox
@@xxDOTH3DEWxx There is a researcher from Deeptrace at 1:30
@@perisaizidanehanapi7931 yes, but Kristen Bell is not
Actors have a huge ego and they think they're experts on everything... especially politics and climate change.
I just like the fact that people are no longer asking me why I chose not to have any children. It’s just odd that it had to get this bad for some to kind of start getting it.
This is one of the many reasons I quit social media. My privacy is more important than validation.
I rest my case.
I admire Kristen’s resilience in how she handled this and remained strong
I guess women are gonna be the ones mostly impacted by this
6:15 I wish the internet was more responsible and kinder... 😔
i wonder how they find these links
research probably. Don't ask what the research was.
There's so much twisted stuff in the world that I just don't know how to live my life anymore.
As a kpop fan this was brought to many people's attention recently through some channels i watch. it's really something that everyone should know about, because it can really affect every celebrity and person with their face online, honestly. It's just disgusting that people think they can do it, and i really hope people don't think this is actually okay. Gotta give it to Vox for giving informed information that many people should know about; the internet's just a dark, scary place.
bruh moment am i right guys
edit: this has nothing to do with the video why did people like this ._.
bruh
bruh
bruh
bruh
bruh..
that moment when i saw the title
Don't sort by newest first, DON'T SORT BY NEWEST FIRST
Y
The number of people who think they're entitled to sexualize a woman just because she put her face online is honestly sickening
Omg ur tempting them
You made me curious though
i did and i regret it. disturbing comment 😔
I feel like if this was happening to men the issue would be taken way more seriously. But as 99% of the victims are women, it’s just swept under the rug like all of our other issues.
Of course, now go to the kitchen
Why would that be the case?
It does happen to men, just in different ways, and honestly you can't really stop deepfakes, unfortunately
Why is this the first thing some men think of? And don't come for me because I said "men". We all know that's where it started.
delor b and who says it doesn’t happen to men as well
@@theflamethrower867 JEEBUS. Please go back and RE-READ what I posted.
delor b after you read what I said
Blunt accusations/statements don’t mean anything
Rule 34 is always in effect
@jami0070 oh you sweet summer child....
@jami0070 I can't destroy something so precious.
Stay golden Ponyboy, stay golden
Only drawn content exists in rule34...
Yeah yeah.. we all know your "friend" didn't tell you about those videos of your wife
this has been a problem for like 4 years already ..
More than 4 years; people have been photoshopping celebrities onto naked bodies ever since image editing was a thing. i was gonna say "ever since Photoshop was a thing", but i think there was image editing software before Adobe Photoshop. deepfakes are still pretty new, though.
That's scary because it could mess up your whole life.
I mean, as the problem expands the videos won't be taken seriously anymore and the problem will kinda fade away
Kristen Bell is an angel and the fact that someone would do that astounds me
Someone liked angels a little too much
So sad & like disrespectful to people & wrong. Very dangerous.
I'm in the 4%, would rather just make goofy face swaps!
No one should be entitled to sexualize others or use their faces without consent; it is wrong. But the thing is, it's the internet, and unfortunately everything gets sexualized, from objects to people... We have 2 options: we learn to live with this, or we try to censor the internet, which is almost impossible
I don't know what's real anymore
None of us do. A UFO could land in the middle of Times Square and we'd never be able to tell if it really happened.
We need a Black Mirror episode about Deepfakes immediately
The fact that the name originated on Reddit is scary
LOL her husband’s friend was watching deepfakes of her 😂😂😂
Ashton Kutcher was likely trying to see if there were deepfakes of his wife Mila Kunis and came across ones of Kristen Bell.
Her husband's friend should go back in time and stop deepfake researchers.
@@ahblooloo8639 Someone would've stumbled on it eventually. Machine learning is a booming field and someone would've applied it to faces sooner or later.
@@chaosfire321 This guy Joe Rogans
Anyone else think that you just need to be responsible for what you post on the internet? I.e., you don't need to post selfies on instagram.
But it is so scary.
It's horrible that this is happening, but are you really saying that this has a bigger impact than politics?? Come on now....
That's disgusting. Such a wonderful technology and people use it to abuse women.
Missed the impact on the actors who are having their work stolen and their identity erased. There's more than one person hurt along the way here.
I usually don't care too much when celebrities complain/speak out about something that's damaging their image or reputation. But this one I really sympathize with them on. How this isn't illegal is beyond me. I understand that someone might find the individual actress super attractive, but this isn't how you go about it. Just use your imagination, instead of defaming them on the internet where anybody can find it. This is just wrong; I mean, it has to be defined as defamation in some form.
It's sad how naive Kristen Bell sounds at the end of the video. Sad in the sense that something like that is considered naive when she's absolutely right. I, along with countless others, have seen just how immature, cruel and insensitive people can be online, and there's no fixing that without violating privacy, and privacy is needed to protect those most vulnerable. Like with the riots currently going on, you can't quite fix it, you just gotta ride it out, and that's what we've got to try to do online. Ride it out and hope for the best in the end... even though humans don't really work that way.
I'm glad this is getting attention. It will teach people about privacy and critical thinking. You can't take anything you see on the internet at face value.
Face value lol
When you put personal information online, you forever lose control of what people do with that information. Photos count as information. Nobody has any reasonable expectation of privacy online.
Photo sites NEED to start making it very clear what rights users are sublicensing when uploading images to their sites, in a way that's not written in legalese, and end users NEED to start reading the TOS. Most people would never have foreseen that uploading an image of themselves by default sublicenses the company to do whatever it wants with said image.
If we have freedom of speech, privacy is the freedom of not speaking.
Thank you for talking about issues that actually matter, Vox.
...As opposed to what, exactly?
I CAN SEE KIM COMING UP WITH AN EXCUSE
Why does Ashton Kutcher know about Kristen Bell's deepfakes?
From his spank bank
@@backwoodsjunkie08 lol
Me, a Two Minute Papers watcher: *I am 4 parallel universes ahead of you*