The most urgent threat of deepfakes isn't politics

  • Published Nov 7, 2024

Comments • 1.4K

  • @ianlack4417
    @ianlack4417 4 years ago +4208

    Let’s start the countdown timer. Congress should be able to address this in about 100 years or so.

    • @myeyelashflewoff
      @myeyelashflewoff 4 years ago +117

      This is so sad and true that it's actually funny

    • @o_o3142
      @o_o3142 4 years ago +117

      A hundred years? Quite the optimist, aren't you? My money's on a brief chat in 250 and legislation 60 years after that.

    • @tomwaddy9035
      @tomwaddy9035 4 years ago +20

      *Acknowledge, it will take another 200 years or so to actually do anything

    • @BSKX17
      @BSKX17 4 years ago +18

      unless the people there became popular faces all of a sudden

    • @K42023
      @K42023 4 years ago +2

      😂😂

  • @milfordjenkins6113
    @milfordjenkins6113 4 years ago +2330

    Did anyone else think that Kristen Bell at the start was a deepfake

    • @hongvicodes
      @hongvicodes 4 years ago +25

      Glad you told me

    • @Emily-Dong
      @Emily-Dong 4 years ago +6

      Haha yea

    • @Blinkisageek
      @Blinkisageek 4 years ago +28

      That would have made it sooo much better. Like at the end it's revealed it's a deepfake.

    • @evilmokri6217
      @evilmokri6217 4 years ago +1

      Yah

    • @timez9663
      @timez9663 4 years ago

      Yeah D:::

  • @LuxuryPi
    @LuxuryPi 4 years ago +3225

    This is basically the reason why you should care about privacy of your data. You can't imagine what will be done with it.

    • @thatsterroristsbro7855
      @thatsterroristsbro7855 4 years ago +54

      A reason. One reason of many. But it'd happen anyway, this is contingent on a few factors, and it is a particular kind of person being targeted by another kind. So, not as neutral as you perhaps are mistaking it to be.

    • @bubrub5564
      @bubrub5564 4 years ago +34

      @@liang2492 There is still a risk of it happening.

    • @lucidexistance1
      @lucidexistance1 4 years ago +4

      What's the name of your first pet?
      Where were you born?
      What's your mother's maiden name?

    • @gozolino
      @gozolino 4 years ago +24

      @@liang2492 That's what you think.
      Then it happens.

    • @MattPratt
      @MattPratt 4 years ago +6

      In b4 Facebook starts selling all your photos to deepfake makers. Hey a buck is a buck to Zuck.

  • @paulcooper8818
    @paulcooper8818 4 years ago +928

    It is said that some aboriginal peoples, when first introduced to photographs, thought their soul was being stolen.
    In a sense, now that is possible.

    • @ryancappo
      @ryancappo 4 years ago +29

      The Amish have the same feeling towards pictures of their faces.

    • @illDefine1
      @illDefine1 4 years ago +28

      Why is everyone on this site so dramatic? It is not soul thievery, it is a form of identity theft. If you exaggerate everything people tend to take you less seriously after a while ... which is exactly what is happening to people who are considered progressives.

    • @LightningbrotherG
      @LightningbrotherG 4 years ago +30

      It suddenly clicked for me, like stealing someone's face is bad, but stealing someone's soul, manipulating it as you see it, is horrific.

    • @deiandorinha1207
      @deiandorinha1207 4 years ago +3

      @@LightningbrotherG :(

    • @unicornsprinkles3277
      @unicornsprinkles3277 4 years ago +5

      i mean i get what youre going for but this just sounds so pretentious

  • @agrimg
    @agrimg 4 years ago +1145

    It won’t be long until video, photographs, or audio recordings are no longer considered evidence in a court of law.

    • @tasibsharar7357
      @tasibsharar7357 4 years ago +5

      Agrim Gupta you can just limit the internet

    • @agrimg
      @agrimg 4 years ago +77

      @@tasibsharar7357 that is true, but in a lot of places, even security footage is now simply moved to a cloud and not kept in physical drives (near the cameras). Limiting the internet probably won't work, but laws about the digital signatures/prints of video evidence will now have to be rethought.

    • @XavierZara
      @XavierZara 4 years ago +54

      The response would be a stronger focus on Metadata and device identifiers

    • @beepboopbeepp
      @beepboopbeepp 4 years ago +77

      Imagine how easy it will be for authoritarian governments to frame people now with deepfakes as evidence

    • @luziealyssa5677
      @luziealyssa5677 4 years ago +2

      Where I live they are already not considered evidence because they can easily be tampered with

  • @loof470
    @loof470 4 years ago +3060

    Imagine if all the people they had in the video were also deepfakes

    • @samazwe
      @samazwe 4 years ago +92

      That'd be soooo meta

    • @FIGHTwPRIDE
      @FIGHTwPRIDE 4 years ago +25

      Oh you took the Red Pill 👍

    • @MimixLight
      @MimixLight 4 years ago +9

      Duncan Kim Can someone please elaborate the “red pill, blue pill” thing? Thanks

    • @MortyMortyMorty
      @MortyMortyMorty 4 years ago +9

      Tbh if we live in a simulation we may all be deepfakes in a computer. Oh wait I didn't want to go this deep.

    • @anneemull
      @anneemull 4 years ago +3

      MimixLight it's from the story line of the movie "The Matrix"

  • @flacadiabla3193
    @flacadiabla3193 4 years ago +2521

    This is why I've never been a fan of showing off my life online.

    • @Guest-sl2qb
      @Guest-sl2qb 4 years ago +64

      The suspect could've used her face from her movies..

    • @Theblueseaweedsoawesome
      @Theblueseaweedsoawesome 4 years ago +80

      Nobody care about ur life

    • @SR009s
      @SR009s 4 years ago +59

      Who tf is going to deepfake you?

    • @FreakieFan
      @FreakieFan 4 years ago +88

      @@SR009s
      That's probably what Noelle thought too, but she got deepfaked anyway.

    • @RenbroNL
      @RenbroNL 4 years ago +22

      Yeah well that is only going to help people for now. This technology will advance and in the future it won't require a big social media presence, maybe even 1 photo will do.
      And even then. A big social media presence doesn't justify this at all.

  • @bruno3
    @bruno3 4 years ago +664

    Celebrities are the ones most at risk but not the ones who will suffer the most. They have the means to clarify the situation and reach people through the media. Other people don't. They won't have experts analysing the video and popular websites and media outlets clarifying the situation. Most people will most likely believe it's the real deal, or at least consider the possibility of it being real. But it's not that you couldn't do it already to pictures with Photoshop and similar software. It's just that most people didn't know you could now do it to videos as well, but it's a matter of time until they do.

  • @princeshezy007
    @princeshezy007 4 years ago +697

    Sadly the more awareness this topic gets there will be even more traction to those websites. It's a lose lose situation to fight against the internet.

    • @tc2241
      @tc2241 4 years ago +75

      Yup, the more you discuss it and try to ban it, the more it grows. We should use this time to develop tools to identify deepfakes as its explosion is right around the corner.

    • @daniilgrankin7658
      @daniilgrankin7658 4 years ago +30

      Exactly mate, It's time to think about how to live with that problem, not how to prevent it

    • @Plasticityy
      @Plasticityy 4 years ago +3

      I agree

    • @hetmonster2
      @hetmonster2 4 years ago +4

      @@tc2241 Already exists.

    • @tc2241
      @tc2241 4 years ago +3

      hetmonster2 ya, but it’s not in heavy use like detecting counterfeits. It should be built into video players, uploaders, browsers, etc. we’re nowhere near that

  • @starcherry6814
    @starcherry6814 4 years ago +577

    Abusive exes could use this on their victims

    • @penname8441
      @penname8441 4 years ago +69

      That's the terrifying part.

    • @fleshtaffy
      @fleshtaffy 4 years ago +6

      Could be a worthy punishment

    • @fcgHenden
      @fcgHenden 4 years ago +80

      @@fleshtaffy I hope you just misread that. 😬

    • @FoxyNinetails
      @FoxyNinetails 4 years ago +36

      By what I've seen, that's already happening. It should have been made illegal yesterday, but it's not. Allowing deepfakes of people like this will only feed into potential rapists and abusers, making them more likely to commit a serious crime later on with the real deal.

    • @fleshtaffy
      @fleshtaffy 4 years ago +1

      @@fcgHenden Nah. It's quite interesting seeing how white people justify things.

  • @spooderman2616
    @spooderman2616 4 years ago +1775

    I think my Dad is a deepfake, I haven't seen him in 9 years.

  • @vincenttjia
    @vincenttjia 4 years ago +1433

    Tiktok: booming
    Deepfakes: it's free real estate

    • @megaraph5551
      @megaraph5551 4 years ago +3

      🤣

    • @marcya4428
      @marcya4428 4 years ago +1

      @@Lennard222 Still not ok buddy

    • @jonny__b
      @jonny__b 4 years ago +12

      Do people even make original jokes anymore?

    • @cshaps1212
      @cshaps1212 4 years ago

      *social media: Booming*

    • @ccricers
      @ccricers 4 years ago +1

      From social media to solipsism

  • @JoLum_doesnt_eat_cheese_a_lot
    @JoLum_doesnt_eat_cheese_a_lot 4 years ago +528

    imagine victim blaming someone for having a face ☹️ taking a selfie is 'asking for it' now?

    • @hmbs1630
      @hmbs1630 4 years ago +163

      and you can be guaranteed men are currently jumping through the mental hoops required to both justify it and blame women for it.

    • @BaM7Up
      @BaM7Up 4 years ago +4

      T K yes

    • @tostupidforname
      @tostupidforname 4 years ago +17

      It's obviously not the victim's fault, but yes, everybody should be ready for stuff like this when making your data public.

    • @fleshtaffy
      @fleshtaffy 4 years ago +10

      The feminists are rampant in 2020. You guys are like little flies.

    • @crogers3602
      @crogers3602 4 years ago +91

      @@fleshtaffy you're a misogynist, and a poor excuse of a man

  • @goodman25
    @goodman25 4 years ago +596

    Good thing I have no face.

  • @standdbyme
    @standdbyme 4 years ago +833

    This is sick on so many levels.

    • @canti7951
      @canti7951 4 years ago +36

      It's disrespectful but how is it an urgent threat though?

    • @neszt9325
      @neszt9325 4 years ago +27

      welcome to the internet

    • @a2pabmb2
      @a2pabmb2 4 years ago +36

      Someone clearly isn't familiar with Rule#34

    • @gozolino
      @gozolino 4 years ago +35

      @@a2pabmb2 Cartoons aren't the same thing as a deepfake.

    • @desiree7633
      @desiree7633 4 years ago +70

      @@canti7951 these people did not consent for this. that's the problem. it could ruin their lives

  • @lifevest1
    @lifevest1 4 years ago +108

    Man this has the making for an incredible Black Mirror episode. This is some messed up sh*t!

    • @patrickb1439
      @patrickb1439 3 years ago +4

      They've already done it - "Be Right Back"!

  • @louispride7695
    @louispride7695 4 years ago +578

    This is disgusting. Even though I’m not a woman this makes me want to delete or at least minimise the use of my social media.

    • @IsThisRain
      @IsThisRain 4 years ago +25

      I agree. If it makes you feel better though, @iamdeepfaker on Instagram uses the technology pretty responsibly. So this is a pretty double-edged sword, although I believe all of us can live without this technology.

    • @itdc2219
      @itdc2219 4 years ago +16

      or just not post pictures of you. i just post cats lol. do a deepfake of them

    • @hanhong2267
      @hanhong2267 4 years ago +20

      If you're concerned, you don't have to get off social media completely, but deleting any pictures or videos that aren't important off your social media account could minimize the chances of getting deepfaked.

    • @j.j.juggernaut9709
      @j.j.juggernaut9709 4 years ago +6

      It won't be just women in danger because of these deepfakes.

    • @jonny__b
      @jonny__b 4 years ago +3

      Just delete it dude.

  • @Sela2125
    @Sela2125 4 years ago +408

    I think this gets into the question of “just because you can, should you?” Novel technology has allowed for scenarios that we could scarcely imagine just a few years ago. We have seen this technology used by Hollywood to bring back stars that have passed away. The ethics of that are still being debated, but at least the parties involved (to my knowledge) have given consent. So, there are arguably benevolent uses for this particular technology. What mechanisms, be they legal, technological or otherwise, can we use to protect people’s privacy and self-ownership while not abridging the pursuit of knowledge upon which technology is built? How do we get people to take a step back and think, “should I do this?”

    • @thinkabout602
      @thinkabout602 4 years ago +6

      much too late for that

  • @AS-ke6co
    @AS-ke6co 4 years ago +275

    We're providing way too much data on the internet.

    • @luziealyssa5677
      @luziealyssa5677 4 years ago +29

      But that's not the problem here, the problem is that people abuse this data

    • @Sovereignless_Soul
      @Sovereignless_Soul 4 years ago +25

      @@luziealyssa5677 you can't stop people, but you can stop uploading data.

    • @bubrub5564
      @bubrub5564 4 years ago +5

      @@Sovereignless_Soul Which just defeats the point of the internet.

    • @perisaizidanehanapi7931
      @perisaizidanehanapi7931 4 years ago +15

      @@Sovereignless_Soul Interesting point there, but unfortunately some people do their job through social media presence. So, I think to blame the victim is really not the final answer we want to pursue.

    • @leelee-rf6px
      @leelee-rf6px 4 years ago

      @@Sovereignless_Soul The internet won't be all that interesting now would it

  • @paulfernandez8820
    @paulfernandez8820 4 years ago +68

    kristen bell is so calm I like how she was interviewed

  • @MrBristolian
    @MrBristolian 4 years ago +70

    I have huge respect for Kristen and Noelle for talking so calmly and authoritatively about something that must have been so upsetting.

  • @Laurence2000
    @Laurence2000 4 years ago +102

    “I’d be thrilled that someone found me attractive”
    I so badly want to put this claim to the test.

    • @d4vian398
      @d4vian398 3 years ago +34

      The people who said that are probably incels.

    • @stitches768
      @stitches768 3 years ago +7

      @@d4vian398 100%

    • @temtem9255
      @temtem9255 3 years ago +6

      @@d4vian398 or someone with a different opinion?

    • @MA-yu2ss
      @MA-yu2ss 3 years ago +3

      @@temtem9255 who will be degraded more?

    • @bimbo6861
      @bimbo6861 3 years ago

      it's because they're imagining an attractive woman behind that "someone", not an obese, sweaty, 45-year-old man. the latter is our reality.

  • @williamhild1793
    @williamhild1793 4 years ago +30

    I'm old. I Didn't grow up with computers. Never, in a million years, would a 25-year old me believe that many years in the future, something like this would come about. Just beyond belief.

    • @Chan-md2hb
      @Chan-md2hb 1 year ago

      have you never heard of photoshop? XD

  • @timmm8686
    @timmm8686 4 years ago +326

    About time that Somebody talks about it. Happened for years

    • @harrylane4
      @harrylane4 4 years ago +11

      This has been in headlines for, like, three years.

    • @timmm8686
      @timmm8686 4 years ago

      @@harrylane4 never saw it. Only individual cases in which stars talked about it

  • @adityasuri4009
    @adityasuri4009 4 years ago +114

    I feel so bad for this woman. People are so cruel

  • @Yaruh
    @Yaruh 4 years ago +674

    The people who discovered deepfakes, were obviously looking for deepfakes

    • @friggasring
      @friggasring 4 years ago +47

      Actually the same Bill Hader clip they showed in this came up in my YouTube recommendations because I was watching Bill Hader clips. I'm sure I had heard of deepfakes before, but that was the first time I was really exposed to them.

    • @sebastian-benedictflore
      @sebastian-benedictflore 4 years ago +69

      Referring to Ashton Kutcher "stumbling upon" Kristen Bell's deepfake?

    • @ropytube
      @ropytube 4 years ago +19

      Sebastian-Benedict Flore he has an organisation named Thorn, look it up

    • @crimsoncloud6352
      @crimsoncloud6352 4 years ago

      Yeah lol her husband...

    •  4 years ago +32

      @@sebastian-benedictflore Ashton Kutcher has a foundation called Thorn that specializes in tracing kidnapped children on the dark web to save them from child prostitution and trafficking, so it is entirely possible that he knew about Kristen Bell's deepfake because of this

  • @duyguozkann
    @duyguozkann 4 years ago +78

    A sad smile at Kristen Bell's face at the end got me :(

  • @zakki5630
    @zakki5630 4 years ago +206

    The people who made those deepfakes are definitely going to the bad place

    • @bidbux9500
      @bidbux9500 4 years ago +36

      Dude, EVERYONE is going to the "bad place".

    • @MaDKrEvEdKo
      @MaDKrEvEdKo 4 years ago +13

      Yeah man, seeing how 2020 is going, we have all gone to the bad place.

    • @canti7951
      @canti7951 4 years ago +2

      This sounds like satire. I hope it is.

    • @fleshtaffy
      @fleshtaffy 4 years ago

      This is sarcasm. The vast majority of us will be there.

    • @fcgHenden
      @fcgHenden 4 years ago

      @@bidbux9500 He didn't know. 😨

  • @Therealmykag
    @Therealmykag 4 years ago +115

    We were so excited when we got the rainbow barfs and dog ears... and now here we are.

  • @clp480
    @clp480 4 years ago +61

    Imagine if this is used to frame someone for a crime

    • @PHlophe
      @PHlophe 4 years ago +9

      china is making good use of it

    • @nameq
      @nameq 4 years ago +2

      more like imagine evidence no longer being credible when you have just some media recordings
      like... imagine people logically evolving

    • @akzebraminer
      @akzebraminer 3 years ago

      Videos and audio would need to have a certified and trusted source in the future, rather than having the video or audio prove itself on its own.

  • @imo.o
    @imo.o 4 years ago +53

    Of course it’s mostly happening to women. I’m annoyed that I’m not surprised.

    • @deiandorinha1207
      @deiandorinha1207 4 years ago +29

      as always we are the ones being abused by whatever new thing men find to objectify us and use our bodies against our will for their pleasure with no consent as usual... not shocked, sadly 😔

    • @deiandorinha1207
      @deiandorinha1207 4 years ago +4

      @Bugler55 yeah... so an isolated case defines it all doesn't it... think about the majority of the cases and not only about what you choose to see

    • @deiandorinha1207
      @deiandorinha1207 4 years ago +2

      @Bugler55 would you judge a guy the same way if he was doing the same thing? just asking, really

    • @nameq
      @nameq 4 years ago +4

      well it couldn't be a surprise. it's not like with deepfakes being invented there's more evil in the world. it's all same old evil (people) playing with its new toys not thinking about the others.

  • @charlieliang6883
    @charlieliang6883 4 years ago +170

    Scary.

  • @redoktopus3047
    @redoktopus3047 4 years ago +34

    And people made fun of me when I was super concerned about everyone sharing everything online

    • @sfullernj
      @sfullernj 1 year ago

      We should have all listened to you

  • @daviddima6067
    @daviddima6067 4 years ago +108

    I thought staying inside was safe enough for me, turns out I'm wrong

  • @uss_04
    @uss_04 4 years ago +50

    01:20
    “This is my face, this belongs to me”
    Deepfakers:
    “All your face are belong to us”

    • @Yeern101
      @Yeern101 4 years ago +3

      Soviet union vibes

    • @Cratees
      @Cratees 4 years ago +1

      HAHAHAHA

    • @JK-yj7cg
      @JK-yj7cg 3 years ago +1

      *Our Face*

  • @crunchypastries713
    @crunchypastries713 4 years ago +109

    I feel super bad for the victims. Imagine it being viral and toxic people will bully them. It will deeply impact them in a negative way and may bring depression into their lives. And the people who are making these deep fakes are very inhumane. Humanity is messed up smh

    • @deiandorinha1207
      @deiandorinha1207 4 years ago +8

      these people might feel the impact of this for their entire lives... it's simply sad

    • @nameq
      @nameq 4 years ago

      it won't be as impactful once everyone knows that media (previously text, photo, now video) is mostly not credible

  • @m.a.3322
    @m.a.3322 4 years ago +65

    Wow, what has humanity come to? This is dark and unacceptable.

    • @Terjecs
      @Terjecs 4 years ago +22

      this isnt close to the worst people have done lol

    • @m.a.3322
      @m.a.3322 4 years ago +6

      @@Terjecs lol no, this is pretty fkn malicious.

    • @yellovvdoe
      @yellovvdoe 4 years ago +16

      @@Terjecs still, this is horrible. Nobody should be doing this to anyone.

    • @thatsterroristsbro7855
      @thatsterroristsbro7855 4 years ago +4

      No, this is mankind. Again.

    • @Munchausenification
      @Munchausenification 4 years ago +4

      It's actually a historical thing almost. Back in Medieval times you had Pretenders (people who look and sound like heirs to the throne of a country) imitating for personal gain of power and wealth.

  • @NasaRacer
    @NasaRacer 4 years ago +100

    This feels just like the concern and outrage we had when Photoshop became a big thing. Turned out there wasn't much that could be done about it and we all got used to it. Sadly this is most likely what will happen with this technology as well. No putting the genie back in the bottle.

    • @chaseleswift252
      @chaseleswift252 4 years ago +3

      Yeah, right now there are ups and downs to this technology that are emerging, still we have the opportunity to learn from past and control this. I wouldn't be surprised if something big happens again by this point. Hopefully nothing bad will happen. :/

    • @arielhernandez1638
      @arielhernandez1638 2 years ago +1

      Yeah I agree that it's not nice, and I'm sorry that this is happening, but there is really no way to stop this completely, except in ways that would be too extreme. You can discourage it, but not stop it entirely. It's just not worth it to try and make a big deal about this, because the ONLY way to stop this is to have an extremely controlling internet that isn't free anymore. It's sad, but you need to be resilient and just deal with it.

    • @glutenfree7057
      @glutenfree7057 1 year ago +2

      You could put regulations over it. No need to get all "we can't do anything about it", because we can say that about literally everything. "We can't do anything about stealing because people are gonna do it anyway" or whatever are just weak excuses.

  • @skywriting33
    @skywriting33 4 years ago +31

    Another reason to protect your children’s faces from social media.
    Parents plaster their pages with pics of their kids. If they do this with adults why not children? Very scary.

  • @IchabodNyx
    @IchabodNyx 4 years ago +76

    I'm just impressed you got Tom Cruise's consent to having his face digitally edited into this video.

    • @chinguunerdenebadrakh7022
      @chinguunerdenebadrakh7022 4 years ago +1

      Did they really get the consent tho?

    • @blakedake19
      @blakedake19 4 years ago +11

      Exactly, only for women VIPs there is a problem, for everyone else shut up and let the internet flow

    • @EddieMachetti
      @EddieMachetti 4 years ago +7

      But Tom Cruise is a man, so he and how he feels does not matter.

    • @tostupidforname
      @tostupidforname 4 years ago +2

      good point actually.

  • @rinopw4262
    @rinopw4262 4 years ago +81

    Why am I afraid this vid actually gave ideas to some people who haven't thought about this..

    • @canti7951
      @canti7951 4 years ago +3

      You're afraid because you know you're right.

    • @abb4538
      @abb4538 4 years ago +18

      @@2NVS basically make sure there are no videos or multiple candid photographs of you online-- which is almost impossible when social media is so prevalent in our society. It's frustrating that the video's conclusion was essentially "you could be next lol good luck"

    • @tostupidforname
      @tostupidforname 4 years ago +1

      @@2NVS not much is needed.

    • @tostupidforname
      @tostupidforname 4 years ago +3

      Nah deepfakes are well known for everyone even remotely interested in machine learning.

  • @tc2241
    @tc2241 4 years ago +50

    Seeing the comments, I’m actually surprised by the amount of people who haven’t known about this. It’s been talked about and documented for years now. I guess some people have to wait for things to hit their ‘social bubble’.

    • @fleshtaffy
      @fleshtaffy 4 years ago +6

      Most people are only aware of what's being talked about in their lunch room or what's in the headlines. I envy them though. Ignorance truly is bliss.

    • @leilanidru7506
      @leilanidru7506 4 years ago +2

      I’m not surprised by the amount of people who either aren’t taking this seriously or don’t really see it as an issue

    • @tostupidforname
      @tostupidforname 4 years ago +1

      yeah me too

    • @leilanidru7506
      @leilanidru7506 4 years ago

      Bugler55 I’m not surprised by your lack of surprise. The nonchalance to women’s images being sexually exploited and degraded online without their consent is expected.

  • @mmmfuhlendorf
    @mmmfuhlendorf 4 years ago +19

    William Gibson wrote about this 20 years ago, people thought it was exaggerated...

  • @raydhanitra
    @raydhanitra 4 years ago +69

    "I found your deepfake"
    "Errr actually that was real"
    :O

    • @360stab
      @360stab 4 years ago +5

      People don't look at it from this side. It will give everyone a way to deny embarrassing videos.

    • @Toarcade
      @Toarcade 4 years ago +4

      ".....yes.... right... the deepfake..."

  • @allencraig02
    @allencraig02 4 years ago +7

    It's not "the internet" that needs to be kinder or more considerate. It's people. Specific people making specific decisions to do something that's scumbaggy. Many people are scumbags and the internet allows them to be their worst selves anonymously-but it doesn't MAKE them be scumbags, it only gives them a vehicle to have an audience.

  • @pavarottiaardvark3431
    @pavarottiaardvark3431 4 years ago +35

    This raises some serious and concerning questions about the nature of the digital self. Who owns images of us? Do we own the idea of our likeness?
    If so, do people have the right to take our photo in public?
    But if NOT, would we also not have the right to stop deepfakes made from those photos?

    • @deceiver123m
      @deceiver123m 4 years ago +3

      Some celebrities took steps to patent their likeness etc. A early bid to be proactive in this new era.

    • @deceiver123m
      @deceiver123m 4 years ago

      But can a bright up and comer beat a celebrity into patenting their likeness. $$$

    • @RampageG4mer
      @RampageG4mer 4 years ago +5

      You absolutely do not own your likeness

    • @TengkuAmier
      @TengkuAmier 4 years ago

      @@RampageG4mer Religiously Maybe but in scientific terms yes

  • @terrainvictus1210
    @terrainvictus1210 4 years ago +73

    Good thing I only post memes.

  • @wearelegion1163
    @wearelegion1163 4 years ago +28

    This is one of the reasons I don’t ever post pics of myself online. I was even fired from my job of 15 years because I told my new boss I don’t allow my picture online because he wanted to post pics of his staff. Others declined, but I got fired.

    • @thehufflepuffhermione
      @thehufflepuffhermione 4 years ago +28

      You should never be forced to put yourself online.

    • @mooominpapa
      @mooominpapa 4 years ago +10

      I think HR would've been on your side for that

  • @Alexander-xo5ho
    @Alexander-xo5ho 4 years ago +26

    it may be seen as very incriminating now, but I think that as the technology gets more common, people start to distance identities from images, videos and etc. and in some time, nobody will regard any image or video as relevant, regardless of its realism. this will likely be very detrimental to many systems that rely on identifying identities based on images, videos and etc. like the legal system.

    • @sentinel9651
      @sentinel9651 4 years ago +2

      This sounds philosophical.

    • @HiAdrian
      @HiAdrian 4 years ago +1

      @@sentinel9651 Sounds very plausible actually. Once fakes become common, people will start doubting visual material; they have to.
      It would be ideal if people could stop caring (different value system around sexuality), but that might run too much against human nature.

  • @monday8585
    @monday8585 4 years ago +30

    me when I read the title:
    vox: The most urgent threat of deepfakes isn't politics
    me: oh I know vox, I know

  • @ErraticMagics
    @ErraticMagics 4 years ago +15

    The Amish were right

  • @aryant1884
    @aryant1884 4 years ago +7

    You cannot stop this. Just stop posting thousands of photos of yourself online. For a flawless deepfake you need a huge amount of data to train your model with. Celebrity photos can be found easily online; that's why it's easier to produce deepfakes of them.

    • @mended8774
      @mended8774 4 years ago

      Even if it becomes illegal its still going to happen it is just sad

    • @kwirro
      @kwirro 4 years ago

      Deepfaking technology is getting better and better. From my phone, I could deepfake someone with just one photo of them. It's not that good NOW, but 2 years ago it was completely unthinkable.

    • @aryant1884
      @aryant1884 4 years ago

      @@kwirro That's why I added the word Flawless or you can say indistinguishable. Not posting photos is just a prevention. There is no cure. The only possible solution I see is how we tackle viruses with antiviruses to design algorithms to identify deepfakes.

    • @kwirro
      @kwirro 4 years ago

      @@aryant1884 Ok, true.

  • @patrik5123
    @patrik5123 4 years ago +38

    And people wonder why I have a cat as avatar...
    I still think politics is more of an URGENT threat as that could potentially threaten every person in the world, but sure, on a personal level this is incredibly damaging.

  • @Sorrys0rry
    @Sorrys0rry 4 years ago +56

    The world feels like a giant dumpster fire right now, it is scary.

    • @MaryLeighLear
      @MaryLeighLear 4 years ago +2

      My dream is to be off the grid within 10 years.

    • @nameq
      @nameq 4 years ago +1

      you are forgetting that you are living in the best times ever, judging by materialistic standards. before it was much much harder.

  • @Razear
    @Razear 4 years ago +11

    When you give people tools this powerful, it sheds a light on the darkest crevices of humanity.

    • @sfullernj
      @sfullernj 1 year ago

      Crevice 🤣 great choice of wording

  • @likira111
    @likira111 4 years ago +20

    I like Cr1tikal's takes on this, "now if I ever get caught doing anything, I can just say it's a deepfake".

    • @ekaterinavalinakova2643
      @ekaterinavalinakova2643 3 years ago

      I guess there is an upside to most things. But I'm still way more concerned about this being used to frame people for heinous crimes.

  • @davidmelgar1197
    @davidmelgar1197 4 years ago +3

    What's happening to adult women is horrifying but now I'm thinking of all the naive, ignorant parents uploading reams and reams of photos and videos of their young kids... :(

  • @desmonddesjarlais2697
    @desmonddesjarlais2697 4 years ago +9

    This is an unfortunate situation to be in. Having your identity abused like this.

    • @christhelostsoul9927
      @christhelostsoul9927 4 years ago +1

      Unfortunate but unsurprising. It started with Photoshop, and then it was only a matter of time before it moved to video

  • @maytia7
    @maytia7 4 years ago +51

    Your editor deserves a hike!

    • @c4ssiop3ia
      @c4ssiop3ia 4 years ago +2

      His YouTube is Johnny Harris

    • @maytia7
      @maytia7 4 years ago +7

      @@c4ssiop3ia Whoa, he's the same person who did the Borders series!
      One of the best series.

  • @Hdidbi_3049
    @Hdidbi_3049 4 years ago +10

    tiktok is literally a breeding place for deepfake videos, all these teens. sigh.

    • @stitches768
      @stitches768 3 years ago

      Christ, you're right

    • @akzebraminer
      @akzebraminer 3 years ago

      It’s also a national security threat

  • @pyramid_iremide
    @pyramid_iremide 4 years ago +19

    This is nowhere near the same thing but it reminds me of when some people would put on realistic masks of black people and rob banks. Black people were actually getting arrested for crimes they weren't doing

  • @Aero7SVR
    @Aero7SVR 4 years ago +10

    Privacy will become a thing of the past, in our high tech future.

  • @sarveshsawant9564
    @sarveshsawant9564 4 years ago +41

    I thought there was an expert in the thumbnail

    • @commentmachine1457
      @commentmachine1457 4 years ago +9

      she is an expert in acting though

    • @xxDOTH3DEWxx
      @xxDOTH3DEWxx 4 years ago +4

      Why would you expect that from vox

    • @perisaizidanehanapi7931
      @perisaizidanehanapi7931 4 years ago

      @@xxDOTH3DEWxx There is a researcher from Deeptrace in 1:30

    • @xxDOTH3DEWxx
      @xxDOTH3DEWxx 4 years ago +1

      @@perisaizidanehanapi7931 yes but Kristen bell is not

    • @luisuribe5457
      @luisuribe5457 4 years ago

      Actors have a huge ego and they think they're experts on everything... especially politics and climate change.

  • @selmahare
    @selmahare 2 months ago +1

    I just like the fact that people are no longer asking me why I chose not to have any children. It’s just odd that it had to get this bad for some to kind of start getting it.

  • @opalindigo2984
    @opalindigo2984 4 years ago +7

    This is one of the many reasons I quit social media. My privacy is more important than validation.
    I rest my case.

  • @Generouslife153
    @Generouslife153 9 months ago +1

    I admire Kristen’s resilience in how she handled this and remained strong

  • @yangsinful
    @yangsinful 4 years ago +19

    I guess women gonna mostly be impacted by this

  • @abhijitmeti1611
    @abhijitmeti1611 4 years ago +9

    6:15 I wish internet was more responsible and kinder.. 😔

  • @persiadance7605
    @persiadance7605 4 years ago +26

    i wonder how they find these links

    • @derpatel9760
      @derpatel9760 4 years ago +5

      Research, probably. Don't ask what the research was.

  • @crazziii_
    @crazziii_ 4 years ago +3

    There's so much twisted stuff in the world that I just don't know how to live my life anymore.

  • @oli9757
    @oli9757 4 years ago +4

    As a kpop fan, this was brought to many people's attention recently through some channels I watch. It's really something everyone should know about, because it can affect every celebrity and person with their face online, honestly. It's just disgusting that people think they can do it, and I really hope people don't think this is actually okay. Gotta give it to Vox for sharing informed information that many people should know about; the internet's just a dark, scary place.

  • @Luna-fo8np
    @Luna-fo8np 4 years ago +111

    bruh moment am i right guys
    edit: this has nothing to do with the video why did people like this ._.

  • @pmm1767
    @pmm1767 4 years ago +60

    Don't sort by newest first, DON'T SORT BY NEWEST FIRST

    • @strange498
      @strange498 4 years ago +5

      Y

    • @thekosmickollector7748
      @thekosmickollector7748 4 years ago +52

      The amount of people who think they're entitled to sexualize a woman just because they put their face online is honestly sickening

    • @frederickvictor2038
      @frederickvictor2038 4 years ago +1

      Omg ur tempting them

    • @perisaizidanehanapi7931
      @perisaizidanehanapi7931 4 years ago +1

      You made me curious though

    • @farhanari9547
      @farhanari9547 4 years ago

      i did and i regret it. disturbing comment 😔

  • @himynameis3102
    @himynameis3102 4 years ago +10

    I feel like if this was happening to men the issue would be taken way more seriously. But as 99% of the victims are women, it’s just swept under the rug like all of our other issues.

    • @jaydeepbose4501
      @jaydeepbose4501 4 years ago +2

      Of course, now go to the kitchen

    • @tostupidforname
      @tostupidforname 4 years ago

      Why would that be the case?

    • @christhelostsoul9927
      @christhelostsoul9927 4 years ago

      It does happen to men, just in different ways, and honestly you can't really stop deepfakes, unfortunately

  • @delorbb2298
    @delorbb2298 4 years ago +7

    Why is this the first thing some men think of? And don't come for me because I said "men". We all know that's where it started.

    • @theflamethrower867
      @theflamethrower867 4 years ago

      delor b and who says it doesn’t happen to men as well

    • @delorbb2298
      @delorbb2298 4 years ago

      @@theflamethrower867 JEEBUS. Please go back and RE-READ what I posted.

    • @theflamethrower867
      @theflamethrower867 4 years ago +1

      delor b after you read what I said
      Blunt accusations/statements don’t mean anything

  • @Crick1952
    @Crick1952 4 years ago +22

    Rule 34 is always in effect

    • @DawingmanT900
      @DawingmanT900 4 years ago +13

      @jami0070 oh you sweet summer child....

    • @Crick1952
      @Crick1952 4 years ago +9

      @jami0070 I can't destroy something so precious.
      Stay golden Ponyboy, stay golden

    • @christhelostsoul9927
      @christhelostsoul9927 4 years ago +1

      Only drawn content exists in rule34...

  • @kamranakrami3
    @kamranakrami3 4 years ago +5

    Yeah yeah.. we all know your "friend" didn't tell you about those videos of your wife

  • @CH-vr2dl
    @CH-vr2dl 4 years ago +8

    this has been a problem for like 4 years already ..

    • @seanbelkom9094
      @seanbelkom9094 4 years ago +3

      More than 4 years. People have been photoshopping celebrities onto naked bodies ever since image editing was a thing. I was gonna say "ever since Photoshop was a thing", but I think there was image-editing software before Adobe Photoshop. Deepfakes are still pretty new, though

  • @phillywilly4155
    @phillywilly4155 4 years ago +6

    That's scary because it could mess up your whole life.

    • @temtem9255
      @temtem9255 3 years ago +2

      I mean, as the problem expands the videos won't be taken seriously anymore and the problem will kinda fade away

  • @JamesBond-rb1ln
    @JamesBond-rb1ln 4 years ago +12

    Kristen Bell is an angel and the fact that someone would do that astounds me

    • @cyrusthe0ther795
      @cyrusthe0ther795 1 year ago

      Someone liked angels a little too much

  • @lilyvalley5389
    @lilyvalley5389 3 years ago +4

    So sad & like disrespectful to people & wrong. Very dangerous.

  • @DarthDravvid
    @DarthDravvid 4 years ago +17

    I'm in the 4%, would rather just make goofy face swaps!

  • @TugaThings
    @TugaThings 4 years ago +6

    No one should be entitled to sexualize others or use their faces without consent; it is wrong. But the thing is, it's the internet, and unfortunately everything gets sexualized, from objects to people... We have 2 options: we learn to live with this, or we try to censor the internet, which is almost impossible

  • @nutbreaker7482
    @nutbreaker7482 4 years ago +7

    I don't know what's real anymore

    • @RezValla
      @RezValla 4 years ago

      None of us do. A UFO could land in the middle of Times Square and we'd never be able to tell if it really happened.

  • @guenstigvegan
    @guenstigvegan 4 years ago +7

    We need a Black Mirror episode about Deepfakes immediately

  • @firehot9578
    @firehot9578 4 years ago +3

    The fact that it originated as a name on Reddit is scary

  • @harutomishima3114
    @harutomishima3114 4 years ago +17

    LOL her husband’s friend was watching deepfakes of her 😂😂😂

    • @alex30425
      @alex30425 4 years ago

      Ashton Kutcher was likely trying to see if there were deepfakes of his wife Mila Kunis and came across ones of Kristen Bell.

    • @ahblooloo8639
      @ahblooloo8639 4 years ago

      Her husband's friend should go back in time and stop deepfake researchers.

    • @chaosfire321
      @chaosfire321 4 years ago

      @@ahblooloo8639 Someone would've stumbled on it eventually. Machine learning is a booming field and someone would've applied it to faces sooner or later.

    • @fleshtaffy
      @fleshtaffy 4 years ago

      @@chaosfire321 This guy Joe Rogans

  • @stainlessstove4629
    @stainlessstove4629 1 year ago +2

    Anyone else think that you just need to be responsible for what you post on the internet? I.e., you don't need to post selfies on Instagram.
    But it is so scary.

  • @brooklynyte
    @brooklynyte 4 years ago +5

    It's horrible that this is happening, but are you really saying that this is a bigger impact than politics?? Come on now....

  • @ninnikins4768
    @ninnikins4768 4 years ago +2

    That's disgusting. Such a wonderful technology and people use it to abuse women.

  • @gasdive
    @gasdive 4 years ago +3

    Missed the impact on the actors who are having their work stolen and their identity erased. There's more than one hurt along the way here.

  • @brandonmccoy2894
    @brandonmccoy2894 1 year ago +1

    I usually don't care too much when celebrities complain / speak out about something that's damaging their image or reputation. But this one I really sympathize with. How this isn't illegal is beyond me. I understand someone finding an individual actress super attractive, but this isn't how you go about it. Just use your imagination instead of defaming them on the internet where anybody can find it. This is just wrong; I mean, it has to be defined as defamation in some form.

  • @andyc2518
    @andyc2518 4 years ago +3

    It's sad how naive Kristen Bell sounds at the end of the video. Sad in the sense that something like that is considered naive when she's absolutely right. I, along with countless others, have seen just how immature, cruel and insensitive people can be online, and there's no fixing that without violating privacy, and privacy is needed to protect those most vulnerable. Like with the riots currently going on, you can't quite fix it, you just gotta ride it out, and that's what we gotta try to do online. Ride it out and hope for the best in the end... even though humans don't really work that way.

  • @360stab
    @360stab 4 years ago +3

    I'm glad this is getting attention. It will teach people about privacy and critical thinking. You can't take anything you see on the internet at face value.

    • @sfullernj
      @sfullernj 1 year ago

      Face value lol

  • @AJX-2
    @AJX-2 4 years ago +6

    When you put personal information online, you forever lose control of what people do with that information. Photos count as information. Nobody has any reasonable expectation of privacy online.

    • @ekaterinavalinakova2643
      @ekaterinavalinakova2643 3 years ago

      Photo sites NEED to start making it very clear what rights users are sublicensing when uploading images to their sites, in such a way that it's not written in legalese, and end users NEED to start reading the TOS. Most people would never have foreseen that uploading an image of themselves by default sublicenses the company to do whatever they want with said image.

  • @warbler4954
    @warbler4954 4 years ago +1

    If we have freedom of speech, privacy is the freedom of not speaking.

  • @sebastian-benedictflore
    @sebastian-benedictflore 4 years ago +4

    Thank you for talking about issues that actually matter, Vox.

    • @Chameleonred5
      @Chameleonred5 4 years ago

      ...As opposed to what, exactly?

  • @papiii711
    @papiii711 4 years ago +28

    I CAN SEE KIM COMING UP WITH AN EXCUSE

  • @sebastian-benedictflore
    @sebastian-benedictflore 4 years ago +5

    Why does Ashton Kutcher know about Kristen Bell's deepfakes?

  • @NafedalbiFilms
    @NafedalbiFilms 4 years ago +3

    Me, a Two Minute Paper watcher: *I am 4 parallel universes ahead of you*