Deepfake Videos Are Getting Terrifyingly Real

  • Published Nov 5, 2024

Comments • 6K

  • @onyx8231
    @onyx8231 5 ปีที่แล้ว +886

    Steve Buscemi totally rocked that red dress. 🤣

    • @cheezuschrist1102
      @cheezuschrist1102 5 ปีที่แล้ว +6

      Phill Ken 7 Ms. Buscemi does it better

    • @davidsirmons
      @davidsirmons 5 ปีที่แล้ว

      I had flashbacks of one scene with him from the movie "On The Road"... D:

    • @Charlie-zd2dr
      @Charlie-zd2dr 5 ปีที่แล้ว

      Wtf

    • @ogweasel4273
      @ogweasel4273 5 ปีที่แล้ว

      That’s what I was thinking!!!

    • @kjamison5951
      @kjamison5951 5 ปีที่แล้ว +2

      “Hey, why am I Mr Pink?”

  • @abhishekbhatia6092
    @abhishekbhatia6092 5 ปีที่แล้ว +5778

    YES! NOW WE CAN HAVE NICHOLAS CAGE IN EVERY MOVIE.

    • @pabloescoberg
      @pabloescoberg 5 ปีที่แล้ว +119

      He is, you just watch the deepfakes

    • @NickRoman
      @NickRoman 5 ปีที่แล้ว +42

      "Being Nicholas Cage"?

    • @pyroparagon8945
      @pyroparagon8945 5 ปีที่แล้ว +115

      Saving Private Ryan but every American is Nicholas Cage

    • @anttitheinternetguy3213
      @anttitheinternetguy3213 5 ปีที่แล้ว +62

      From now on, EVERY movie should have EVERY character replaced with Nicholas cage

    • @PickledAmericano
      @PickledAmericano 5 ปีที่แล้ว +19

      At least he'll have some decent credits now..

  • @Melusi_Nyathi
    @Melusi_Nyathi 4 ปีที่แล้ว +721

    We are being conditioned to doubt reality when it drops.

    • @southerngirlsrock2799
      @southerngirlsrock2799 3 ปีที่แล้ว +23

      I Absolutely agree with you!

    • @ji7953
      @ji7953 3 ปีที่แล้ว +9

      Yes

    • @Earthdogbonzo3
      @Earthdogbonzo3 3 ปีที่แล้ว +28

      Probably the most important point made here

    • @jenj1221
      @jenj1221 3 ปีที่แล้ว +3

      Well, it's definitely working. I don't trust anything now, except perhaps actual trials. I hope it doesn't seep into a courtroom; there, I trust what they present.

    • @valariebroadway
      @valariebroadway 3 ปีที่แล้ว +2

      My thoughts EXACTLY!🙄

  • @drivethrupoet
    @drivethrupoet 3 ปีที่แล้ว +615

    hmmm... funny how YT blew the dust off this one and tossed it into our recommends at a time when there's talk of some very incriminating videos coming out

    • @lawing75
      @lawing75 3 ปีที่แล้ว +7

      Exactly!!!

    • @jjones9551
      @jjones9551 3 ปีที่แล้ว +4

      Wow

    • @D-Almighty
      @D-Almighty 3 ปีที่แล้ว

      That part

    • @melissaradell8263
      @melissaradell8263 3 ปีที่แล้ว +3

      I was thinking that too!!

    • @melissaradell8263
      @melissaradell8263 3 ปีที่แล้ว +9

      @¿0.O? Putting out doubt before the release of what should be proof of the things we aren't supposed to know (hmm, interesting timing)

  • @steveos111
    @steveos111 5 ปีที่แล้ว +282

    This also works in reverse. People can now do or say something wrong and just claim that any footage of them doing it is a deep fake.

    • @superhazydays
      @superhazydays 5 ปีที่แล้ว +3

      Just another excuse. Honestly they will use it with ways, which ever way benefits them.

    • @superhazydays
      @superhazydays 5 ปีที่แล้ว +4

      I meant both ways

    • @steveos111
      @steveos111 5 ปีที่แล้ว +12

      @BTIsaac I disagree with the way you have just instantly used my point to attack one particular group of people. It's exactly that type of thinking that makes this technology dangerous. If you think the only threat is from the so-called "alt right" then you are nuts. And if you think someone on the left is so morally superior that they wouldn't resort to the same tactic then you are just blatantly stupid.

    • @steveos111
      @steveos111 5 ปีที่แล้ว +10

      @BTIsaac Clearly anyone from either side of the political spectrum could abuse the technology in the way described. My problem with your post is that you have applied your own political bias and used it to attack a particular political movement. Again, if you think that only the right would stoop so low and that the left is above such behaviour, then you must live on a different planet.

    • @brandonhvacants2217
      @brandonhvacants2217 5 ปีที่แล้ว

      Not really....

  • @NimW
    @NimW 5 ปีที่แล้ว +1863

    Thank god there's piano music so I know what I'm supposed to feel.

    • @MajinBuuButtercup
      @MajinBuuButtercup 5 ปีที่แล้ว +5

      Perhaps a little "Thus Spake Zarathustra"?

    • @CuriousPerson2206
      @CuriousPerson2206 5 ปีที่แล้ว +42

      Genius comment

    • @New_Talent
      @New_Talent 5 ปีที่แล้ว +2

      Piano music?? Oh arrgh mine was muted... 😑

    • @DandyDeishu
      @DandyDeishu 5 ปีที่แล้ว +4

      The Italian mafia would like to know your location

    • @nicklane9032
      @nicklane9032 5 ปีที่แล้ว +9

      Verge has started making videos without music, the difference is so fucking refreshing

  • @Dagovernment
    @Dagovernment 5 ปีที่แล้ว +2321

    Deepfakes aren't the problem; the public's lack of skepticism is. Start teaching children critical thought and the weaknesses of the human mind from an early age, and this problem goes away.

    • @username4441
      @username4441 5 ปีที่แล้ว +77

      but k-12 is designed to remove critical thought and instill mental weakness.. the complete overhaul you suggest would be extremely cost prohibitive... and we're already "borrowed to the hilt".

    • @BatmanLampin
      @BatmanLampin 5 ปีที่แล้ว +40

      It's crazy. We as a people have been fed lies for so long that we have essentially and collectively learned (or been conditioned) to reject everything as a starting point. Which is harmless on its own. But when weighing truth vs. fallacy, one needs the capacity to tell the difference between a theory and a fact. That distinction is going extinct.

    • @estebannemo1957
      @estebannemo1957 5 ปีที่แล้ว +18

      The technology isn't a problem now because our intuition (not our reason) tells us something isn't quite right. But what will the technology look like in 10 years?

    • @williampan29
      @williampan29 5 ปีที่แล้ว +66

      "Start teaching children critical thought and the weakness' of the human mind from an early age"
      Very naive thinking.
      You think it's all a matter of conscious thinking? You know how many things humans do that are unconscious?
      We just need more training to deal with technology invented by people with PhDs in neuroscience who understand more about the human mind than you ever will?
      You think you can resist technology?
      Let me use a blunt example: you are using youtube, which is owned by Google, and it is now collecting your personal data, which could be used without your permission.
      Have you managed to stop watching youtube videos? Why don't you stop with your critical thinking skill?

    • @BatmanLampin
      @BatmanLampin 5 ปีที่แล้ว +10

      @@williampan29
      Its not a matter of more training. Its a matter of teaching. This problem is no different than the Third Reich or the Quran or even as far back as the serpent deceiving Eve in the garden of Eden. The overall conversation is about propaganda. In order to train for these things we must teach our youth to ALWAYS account for the main variable which is the INTENT of your SOURCE.

  • @ViciousAlienKlown
    @ViciousAlienKlown 3 ปีที่แล้ว +552

    This is how these swamp creatures will claim innocence.

    • @jimzorn3853
      @jimzorn3853 3 ปีที่แล้ว +7

      Only problem for them is that it's possible to technologically identify what is and what is not a deep fake, or so I've heard...

    • @kavodadonai9290
      @kavodadonai9290 3 ปีที่แล้ว +1

      I was thinking something like that too

    • @momogrand6789
      @momogrand6789 3 ปีที่แล้ว

      Exactly

    • @cliffordgill9052
      @cliffordgill9052 3 ปีที่แล้ว +1

      And frame their enemies

    • @monicawright9045
      @monicawright9045 3 ปีที่แล้ว

      Sure is.

  • @BolinFoto
    @BolinFoto 5 ปีที่แล้ว +819

    This also gives someone the "plausible deniability" card to play.
    "It wasn't me, it was a deepfake..."

    • @sergiotl7378
      @sergiotl7378 4 ปีที่แล้ว +24

      Smart comment that went unnoticed.

    • @wever5077
      @wever5077 4 ปีที่แล้ว +11

      I'm afraid it'll only work for white people.

    • @xdoxxy
      @xdoxxy 4 ปีที่แล้ว +26

      W Ever Not true in every case c’mon we gotta stop doing this... 🤦🏾‍♂️

    • @chloeES42
      @chloeES42 4 ปีที่แล้ว +5

      Right. What are they priming us for.....?

    • @jarnamhar
      @jarnamhar 4 ปีที่แล้ว +5

      In many countries, that excuse is being used by bureaucrats even without deepfake technologies.

  • @jaywalt8418
    @jaywalt8418 5 ปีที่แล้ว +551

    The problem is people believe everything they see in videos and media without questioning in most cases

    • @microbios8586
      @microbios8586 5 ปีที่แล้ว +59

      Ok but the contrary is just as bad. In this modern age, something completely real can be dismissed as fake. We are begining to enter an age where facts don't matter. The truth is subjective. It's terrifying.

    • @CrazyCowboyBuilds
      @CrazyCowboyBuilds 5 ปีที่แล้ว +17

      So true. Majority of people believe what they hear from “they said” without a source. Imagine if they could see a fake message you couldn’t tell was fake. Mind control at it’s best

    • @Holret
      @Holret 5 ปีที่แล้ว +6

      Or they do question it in a way that conforms their confirmation bias.

    • @CrazyCowboyBuilds
      @CrazyCowboyBuilds 5 ปีที่แล้ว +3

      Holret wait ‘they’ said there was collusion. Heard it on the internet it must be true

    • @scribliez
      @scribliez 5 ปีที่แล้ว +1

      Jay Walt there are just as many ppl who are skeptical of things in media as there are believers lol. You have a giant bunch of “people are sheep i don’t believe everything i see ppl are dumb” types of people. And they’re not a minority.

  • @carbonunit
    @carbonunit 5 ปีที่แล้ว +148

    Yes. This is weaponized technology.

  • @dugnice
    @dugnice 3 ปีที่แล้ว +512

    Just one more reason NOT to trust the media.

    • @dustyblue2ify
      @dustyblue2ify 3 ปีที่แล้ว +5

      Indeed but Corporate Owned Media......Puts this out (PBS Too) for saying "Not Them" but "Hackers" ect....see HOW IT CAN BE TWISTED :)

    • @simster45
      @simster45 3 ปีที่แล้ว +9

      Always live by the mantra: Because it is said so,, Does not make it so,,, also: Believe nothing,, question everything ;-) Rik Mayal R.I.P

    • @dwaneyocum1718
      @dwaneyocum1718 3 ปีที่แล้ว +8

      It's not the media you need to worry about. It's the people using it to further their goals..

    • @dugnice
      @dugnice 3 ปีที่แล้ว +3

      @@dwaneyocum1718
      Actually it's really the cops and soldiers that we have to worry about because THEY are the ones who actually enforces the laws and policies of the politicians and corporate executives, most of whom are cowardly old men.

  • @adrc5606
    @adrc5606 5 ปีที่แล้ว +4062

    Things this'll be used for:
    • -Actual presidential messages-
    • Memes
    • Porn

    • @thecentrist9507
      @thecentrist9507 5 ปีที่แล้ว +28

      Propaganda in 3 2... (Mushroom cloud)

    • @pasmomoonde6077
      @pasmomoonde6077 5 ปีที่แล้ว +128

      Walks in store: ahhh yes can I get Ariana Grande face on Mia Khalifa body please,...........its For a friend of course.

    • @davidsirmons
      @davidsirmons 5 ปีที่แล้ว +46

      And a new, glorious age of porn (deepPorn?.....nm) begins

    • @aggnal2892
      @aggnal2892 5 ปีที่แล้ว +13

      @@davidsirmons time to stock up on lotion

    • @360mw5
      @360mw5 5 ปีที่แล้ว +28

      Faking videos of osama bin laden to justify a false flag

  • @lordwaffle4614
    @lordwaffle4614 5 ปีที่แล้ว +896

    Creates technology that could start wars and destroy countries.
    Uses it to make memes.

    • @joncoda365
      @joncoda365 5 ปีที่แล้ว +41

      I see nothing wrong here.

    • @Nina-vv3ev
      @Nina-vv3ev 5 ปีที่แล้ว +1

      Lord Waffle 🤣

    • @TwinShards
      @TwinShards 5 ปีที่แล้ว +16

      @@joncoda365 *A movie with the only face available is Rick Astley*

    • @Nina-vv3ev
      @Nina-vv3ev 5 ปีที่แล้ว +2

      Dante Caballero do you do drugs & watch conspiracy videos late at night? Lol

    • @joncoda365
      @joncoda365 5 ปีที่แล้ว +5

      ​@@TwinShards Damn. I just mentally rick rolled myself.

  • @Blind_Side94
    @Blind_Side94 5 ปีที่แล้ว +616

    Buscemi's face on Lawrence is some high quality art to me

    • @stavrosgeorgios5577
      @stavrosgeorgios5577 5 ปีที่แล้ว +13

      Modern art masterpiece.

    • @Cookie-Dough-Dynamo
      @Cookie-Dough-Dynamo 5 ปีที่แล้ว +13

      I'd... Yeah, I'd probably still hit that.

    • @fredflintstone3979
      @fredflintstone3979 5 ปีที่แล้ว +1

      Madmaxmiami Aaron I guess that's what paper bags are for. Besides, if you can look at Jennifer Lawrences body and not want to £¥€£ it, there's probably something wrong (go see the doc).

    • @Cookie-Dough-Dynamo
      @Cookie-Dough-Dynamo 5 ปีที่แล้ว +5

      Buscemi's face has the added benefit of having never been covered in another man's "Gentleman's Relish". We all saw those leaked photos.

    • @Blind_Side94
      @Blind_Side94 5 ปีที่แล้ว

      @@Cookie-Dough-Dynamo 🤣

  • @themessenger6442
    @themessenger6442 3 ปีที่แล้ว +177

    And people still believe the news LOL. Wake up people

    • @bitterlemons690
      @bitterlemons690 3 ปีที่แล้ว

      th-cam.com/video/dQw4w9WgXcQ/w-d-xo.html this is a good video with sock psychology

    • @thejww96
      @thejww96 3 ปีที่แล้ว +3

      This is the news dude. LOL.

  • @RocksmithPdl
    @RocksmithPdl 5 ปีที่แล้ว +422

    “Seeing is believing” has run its course lol

    • @thenooblvl100
      @thenooblvl100 4 ปีที่แล้ว +16

      Same thing for "Cameras don't lie"

    • @ElasticReality
      @ElasticReality 4 ปีที่แล้ว +6

      It was never in the race. The Universe is not how we see it, and never has been.

    • @Skynet_the_AI
      @Skynet_the_AI 4 ปีที่แล้ว +1

      Clever comment

    • @joerivandeweyer3056
      @joerivandeweyer3056 4 ปีที่แล้ว

      Believe nothing you hear and only half of what you see

    • @garycohen555
      @garycohen555 4 ปีที่แล้ว

      Jen farmer excellent point... I have wondered on many occasions whether my cat lives in a world which is at least visually similar to what I see. For that matter, I wonder whether any two humans see the same world. Could it be that our brain, at birth, is like a slime mold, using ever-changing calculations to solve problems? Just as we are all aware that there is often more than one solution method to a given problem, it seems to me there should exist more than one slime mold that solves the same problems. ...see NOVA's program on the intelligence of slime molds.

  • @leelandglover2731
    @leelandglover2731 5 ปีที่แล้ว +437

    Now the innocent will become the guilty and the guilty will become innocent... Anybody can be framed now.

    • @darken3150
      @darken3150 5 ปีที่แล้ว +9

      If you have a limited digital exposure it would be way more difficult.

    • @iSaigeable
      @iSaigeable 5 ปีที่แล้ว +3

      Darken ya 0 social media

    • @randomguy263
      @randomguy263 5 ปีที่แล้ว

      Well, I better become guilty then!

    • @Pimp-Master
      @Pimp-Master 5 ปีที่แล้ว +5

      Computer face morphing was a huge deal 25 years ago, now it’s quaint. People got hip.

    • @Gee-xb7rt
      @Gee-xb7rt 5 ปีที่แล้ว +1

      wasn't this always happening anyway? oh it used to only be minorities getting framed for things they didn't do. i get it.

  • @NorthSea_1981
    @NorthSea_1981 5 ปีที่แล้ว +123

    This is deeply disturbing stuff and a reason for MASSIVE concern. Sure, the technology itself is impressive, but this opens Pandora's Box for a multitude of possible abuse.

    • @gannonfitzgerald6485
      @gannonfitzgerald6485 4 ปีที่แล้ว +8

      NorthSea 1981 Just by making someone say something could lead to an all out world war for fucks sake

    • @raymondfrye5017
      @raymondfrye5017 4 ปีที่แล้ว +3

      You have no idea. Wait until you see Arnold Schwarznegger's face in Clint Eastwood's Dirty Harry movies.

    • @johnqpublic314
      @johnqpublic314 3 ปีที่แล้ว

      It's taken over the media.

  • @cw4608
    @cw4608 3 ปีที่แล้ว +84

    The world has screwed itself and may not even be aware of it.

    • @larsgsanger3105
      @larsgsanger3105 3 ปีที่แล้ว +2

      🍏🧡❤️

    • @UnseenEternalStudios
      @UnseenEternalStudios 3 ปีที่แล้ว +5

      We didn’t Ask for this. We already know these people are destroying us. How long are we going to let it happen?

    • @preciousmettlex
      @preciousmettlex 3 ปีที่แล้ว +5

      The world is fucked.

    • @cjmacq-vg8um
      @cjmacq-vg8um 3 ปีที่แล้ว

      anyone who downplays the EXTREME dangers this fascist, corporate technology and digitization presents to society is a MAJOR part of this horrendous problem. anyone who believes the corporate lie that this ISN'T a problem needs to open their damned eyes. you need look no further than trump, the GOP and q-anon to WITNESS the absolute danger of fraud and lies being spread around the globe at the speed of light.
      technology has opened the floodgates to dishonesty, misinformation, disinformation, fraud, theft and LIES! reality is being STOLEN from us and we morons set back and EMBRACE the corporate elite and their technocrats who's stealing it from us!
      no one - NONE OF US - know what's real anymore. we have no concept of honesty, integrity and ethics anymore. the ONE THING we do know, without a doubt, is that the global elite have conditioned and brainwashed humanity into accepting their own dependence on, addiction and slavery to these overpriced corporate PRODUCTS.
      the elite has turned us into a SLAVE species completely incapable of independent thought, objective reasoning and seeing the oppressive reality being implemented all around them.

  • @impartialted4905
    @impartialted4905 5 ปีที่แล้ว +410

    The Steve buchemi one just looks like Maculay Culkin

    • @guessdog4871
      @guessdog4871 5 ปีที่แล้ว +5

      You mean Maculay Maculay Culkin Culkin? He changed his middle name to Maculay Culkin you know. Sorry. I thought it was funny, me...

    • @klystron2010
      @klystron2010 5 ปีที่แล้ว +1

      He looks like Macaulay Culkin USED to look.

    • @gursharansingh8042
      @gursharansingh8042 5 ปีที่แล้ว

      i lost it when she said "it surprised me" hahahahahahahaha the eyes were perfect

  • @davidh3823
    @davidh3823 5 ปีที่แล้ว +347

    Gone goes the phrase "pictures/video or it didn't happen"

    • @agentorange8530
      @agentorange8530 5 ปีที่แล้ว +9

      now the only one to prove it's real is sharpie in pooper

    • @MandatoryHandle
      @MandatoryHandle 5 ปีที่แล้ว +4

      @@agentorange8530 Soon enough there'll be programs to realistically place deepfake sharpies in deepfake poopers. What a time to be alive

    • @agentorange8530
      @agentorange8530 5 ปีที่แล้ว +3

      @@MandatoryHandle Scary just thinking about it

    • @KaliTakumi
      @KaliTakumi 5 ปีที่แล้ว

      That phrase became outdated many, many years ago

    • @notyet3dna
      @notyet3dna 5 ปีที่แล้ว

      LOOOL

  • @fjm1061
    @fjm1061 5 ปีที่แล้ว +492

    Thank you. Now I don’t believe anything I see.

    • @thinkman2467
      @thinkman2467 5 ปีที่แล้ว +22

      You were only supposed to believe half of what you see anyway.

    • @SirContent
      @SirContent 5 ปีที่แล้ว

      believe or not, if it is belief then it is half from the truth.

    • @francis_castiglion3
      @francis_castiglion3 5 ปีที่แล้ว +1

      Is it bad that I don't even care about watching the news any more...

    • @markr8604
      @markr8604 5 ปีที่แล้ว +1

      Me too. Actually, I don't even believe this is an actual comment.

    • @derekseven1647
      @derekseven1647 5 ปีที่แล้ว +1

      You shouldn't.

  • @scocassovegetus
    @scocassovegetus 3 ปีที่แล้ว +69

    Imagine what the CIA etc. has. They can do a video call to you pretending to be your friend and you wouldn't be able to tell.

  • @stivcdl
    @stivcdl 5 ปีที่แล้ว +478

    I love how David Schwimmer deep faked as Nicolas Cage barely changes at all.

    • @justmatthere8521
      @justmatthere8521 5 ปีที่แล้ว +7

      Neither did Phoebe.... Lol

    • @HightopDavid
      @HightopDavid 5 ปีที่แล้ว +13

      Lol I didn’t realize they had changed anything at first lol

    • @wjtshane
      @wjtshane 5 ปีที่แล้ว +1

      i was thinking the same !! thank you for this comment

    • @thebasketballhistorian3291
      @thebasketballhistorian3291 5 ปีที่แล้ว +1

      Yeah, would've never thought they looked anything a like but Ross-Cage looks barely changed.

    • @therestorationofdrwho1865
      @therestorationofdrwho1865 5 ปีที่แล้ว +6

      Wait what I didn’t realise they changed his face

  • @ConnectTheDotsProper
    @ConnectTheDotsProper 5 ปีที่แล้ว +123

    At this point, if you're used to seeing deepfakes, or if you just edit all the time, you can still spot it. Very soon though, I have no doubt that it will be undetectable..

    • @touchestoomuch
      @touchestoomuch 5 ปีที่แล้ว

      if it's possible to create an algorithm to do this, then it's also possible to create an algorithm to detect it, and this will always be true

    • @BANG1681
      @BANG1681 5 ปีที่แล้ว +2

      @@touchestoomuch It may require some form of ID embedded in the video format in the future, like encrypted EXIF-style data for video verification. While it would help with authenticity, it wouldn't prevent the spread of false information (see the sketch after this thread).

    • @gaiawillis
      @gaiawillis 5 ปีที่แล้ว

      @@touchestoomuch theres already ai that can spot deepfakes

    • @DEV3N87
      @DEV3N87 5 ปีที่แล้ว

      Of course they will use it to sway elections
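
A minimal sketch of the kind of integrity check @BANG1681 describes above, in Python. It assumes a plain sidecar JSON file standing in for any real in-container metadata standard; the file names are hypothetical.

    import hashlib
    import json

    def fingerprint(path: str) -> str:
        # SHA-256 digest of a file, read in chunks so large videos are fine.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    def write_manifest(video_path: str, manifest_path: str) -> None:
        # Record the digest of the freshly recorded file (stand-in for in-container metadata).
        with open(manifest_path, "w") as f:
            json.dump({"file": video_path, "sha256": fingerprint(video_path)}, f)

    def is_unaltered(video_path: str, manifest_path: str) -> bool:
        # True only if the file still matches the digest recorded at capture time.
        with open(manifest_path) as f:
            manifest = json.load(f)
        return manifest["sha256"] == fingerprint(video_path)

    # Hypothetical usage:
    # write_manifest("clip.mp4", "clip.json")   # at capture time
    # is_unaltered("clip.mp4", "clip.json")     # before trusting the footage

A check like this only proves a copy matches whatever digest was recorded at capture time; it says nothing about whether the recorded original was genuine, which is why the thread still falls back on skepticism.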

  • @chipskylark5086
    @chipskylark5086 5 ปีที่แล้ว +549

    Imagine AI robots using this technology to destabilize and take over the world.

    • @angelofdeath275
      @angelofdeath275 5 ปีที่แล้ว +57

      shinukism shnuck fake intellects be like

    • @kutthroatgaming1925
      @kutthroatgaming1925 5 ปีที่แล้ว +1

      Thanks for the nightmares prick.

    • @Fortunateis4luck
      @Fortunateis4luck 5 ปีที่แล้ว +1

      angelofdeath275 😂😂

    • @swdetroiter313
      @swdetroiter313 5 ปีที่แล้ว +22

      @shinukism shnuck You clearly don't know anything about the brain or science. The brain does not use "switches" and it is not binary.

    • @redrum327
      @redrum327 5 ปีที่แล้ว

      Hier ist John Connor. Wenn ihr das hört, gehört ihr zum Widerstand. [...] die Zukunft ist das, was wir daraus machen.

  • @peanutthepoodle
    @peanutthepoodle 3 ปีที่แล้ว +127

    They’ve been using this with Biden the entire time.

    • @preciousmettlex
      @preciousmettlex 3 ปีที่แล้ว +2

      Trump too lol

    • @kellyedey5952
      @kellyedey5952 3 ปีที่แล้ว +15

      @@preciousmettlex can't get trump out of your head hey.

    • @MD-gz5yw
      @MD-gz5yw 3 ปีที่แล้ว +1

      Agreed

    • @leejames1792
      @leejames1792 3 ปีที่แล้ว +7

      @@kellyedey5952 he can't, he lives in there rent free too.

    • @jeremiahfitzwinkle323
      @jeremiahfitzwinkle323 3 ปีที่แล้ว +1

      certifiabley nuts

  • @asinglefrenchfry
    @asinglefrenchfry 5 ปีที่แล้ว +515

    This technology is honestly really fascinating to me. It’s sad that it’s being used for malicious purposes

    • @tomfisher9089
      @tomfisher9089 5 ปีที่แล้ว +14

      And you're surprised. Pathetic. Go back to your IPA and great shows like Nailed It and wait in comfort on your leather couch until THEY knock on your door and take you away.

    • @mauroylospichiruchis544
      @mauroylospichiruchis544 5 ปีที่แล้ว +15

      @@tomfisher9089 you cracked the code, buddy!

    • @SilisAlin
      @SilisAlin 5 ปีที่แล้ว +15

      Everything in this world is being used for malicious purposes, by someone, somewhere.

    • @blazejames47
      @blazejames47 5 ปีที่แล้ว +7

      The fact that you think this technology is interesting is the whole reason this directly malicious bullshit exists

    • @diggleda2952
      @diggleda2952 5 ปีที่แล้ว

      a single french fry there are more bad uses than good in this industry of changing reality. This also makes me think of all of the bad uses in the VR industry

  • @sorsorscience0787
    @sorsorscience0787 5 ปีที่แล้ว +326

    How do I know all these interviews aren't all deepfakes lmfao I dont like this.

    • @maxpower1337
      @maxpower1337 5 ปีที่แล้ว +4

      Me too.

    • @nuthineatholl6434
      @nuthineatholl6434 5 ปีที่แล้ว +11

      Furthermore ---
      How am I to be sure that you are not an AI bot posting a provocative human-like message?
      How are you to be sure that I am not an AI bot posting a provocative human-like message?
      Exactly -- deepfakery: if it's possible anyplace in the infosphere, it's possible everyplace.
      If it's malignly-used in any particular instance, it could be used thus in every instance.
      After all, what's to prevent such misuse?
      Who can infallibly detect it in any application?
      Who's keeping track of the actual authentic history/reality from which systematic deceptive divergence is being taken?
      The point of the operation is induced human despair and demoralization, for political-control purposes, from the general ontologically-gullible public's feeling of having lost any secure grasp whatsoever of any traditionally-dependable consensus-reality, for -- voila! -- anything can demonstrably be anything!
      The conditioned human mind can not tell the difference, and will accept whatever image supports the narrative favored by its existing systematic preconceptions.
      The subliminal exploitability of this belief-mechanism is what psywar is founded upon.
      The knowledge which provides a counter to this sort of thing lies in domains previously dismissed as "spiritual" or even "mystical", but which nonetheless provide a way to more-usefully ground oneself in a comprehensive actuality, avoiding the prevailing fear-based caetextia.
      www.hgi.org.uk/human-givens/new-insights/caetextia
      www.ishk.net/wp-content/uploads/2016/03/NewWorldNewMind-web.pdf
      www.humanjourney.us/

    • @stevemisfit1
      @stevemisfit1 5 ปีที่แล้ว +2

      That's exactly the whole point don't believe anything

    • @DrSaav-my5ym
      @DrSaav-my5ym 4 ปีที่แล้ว +4

      @Schnbl in 10, 20 years it won't matter how smart you are, you won't be able to discern the difference.

    • @menhera-chan2878
      @menhera-chan2878 4 ปีที่แล้ว

      Nuthine Atholl BRUHH~

  • @iNSANEcOOkieMONstEr
    @iNSANEcOOkieMONstEr 3 ปีที่แล้ว +28

    We are in serious trouble. No... correct that ... we have been in serious trouble for a long time. We just didn't know it.

  • @jameswill9323
    @jameswill9323 3 ปีที่แล้ว +20

    "Believe half of what you see, and none of what you hear"... Best advice my father ever gave.

  • @richystar2001
    @richystar2001 5 ปีที่แล้ว +158

    Your own face will be used to betray your own character... Crazy.

    • @John-qt8km
      @John-qt8km 5 ปีที่แล้ว

      piss pants goo goo ga ga

    • @KRAFTWERK2K6
      @KRAFTWERK2K6 5 ปีที่แล้ว

      And Fantomas had to make actual masks of his victims… or the IMF team members in "Mission Impossible" with their face-immitator devices.

    • @billzco6137
      @billzco6137 5 ปีที่แล้ว

      moral of the story...talk like u have torrets and a skitzo

    • @gradeahonky
      @gradeahonky 5 ปีที่แล้ว

      Until we realize that all videos are unreliable. Even to the point where you could very reasonably deny an actual video of yourself. I think in the end it will be used by people who want to watch porno's involving whoever they are crushing on.

  • @alleonard4423
    @alleonard4423 4 ปีที่แล้ว +59

    Oh I get it, they trying to cover for the frazzeldrip video!!!

    • @Ephesians5-14
      @Ephesians5-14 4 ปีที่แล้ว +12

      Ha! I just commented something about that.. it was my first thought!! Now literally everything that comes out as evidence against these elite pedos will be called a deep fake, how convenient

    • @miagabrielle7628
      @miagabrielle7628 4 ปีที่แล้ว +6

      EXACTLY! Exactly what I was just now thinking. They’re doing preventative damage control.

    • @earnyourimmortality
      @earnyourimmortality 3 ปีที่แล้ว +4

      That sh*t needs to be exposed like yesterday smfh
      People hardly mention it.
      Wtf

    • @breezy3725
      @breezy3725 3 ปีที่แล้ว +2

      I've heard there is an even worse one than that. Disgusting, evil, vile people need to be exposed.

    • @jamesscott9081
      @jamesscott9081 3 ปีที่แล้ว

      I'm sorry, what? Frazzeldrip??

  • @slendertale1186
    @slendertale1186 5 ปีที่แล้ว +611

    This technology should be used to put Stan Lee in every marvel movie

    • @swdetroiter313
      @swdetroiter313 5 ปีที่แล้ว +23

      Oh wow, that is not stupid at all!

    • @SweetDreamsHalo3
      @SweetDreamsHalo3 5 ปีที่แล้ว +18

      Well, this kind of technology had been doing similar things already for a long time. Take Cersei from GOT in the *shame* scene. A body double with her face plastered on top.

    • @broderickbrown5336
      @broderickbrown5336 5 ปีที่แล้ว +2

      @@SweetDreamsHalo3 Similar, yet you'd also have to have Cersei singing The Big Rock Candy Mountain* to really get the gist of what's going on here.
      (*Song choice optional)

    • @bh5485
      @bh5485 5 ปีที่แล้ว +6

      Let him rest in peace!!!

    • @lemonade2473
      @lemonade2473 5 ปีที่แล้ว

      Omg I thought I was the only one who was thinking about this lol

  • @loujcrucreations8358
    @loujcrucreations8358 3 ปีที่แล้ว +69

    They're trying to cover their trails

  • @icecreaminc8013
    @icecreaminc8013 5 ปีที่แล้ว +86

    imagine what someone can do with all those pictures you keep uploading to social media... with your face from every angle... and facebook's love of sharing your data with whom ever they feel like

    • @AH-lw2bj
      @AH-lw2bj 5 ปีที่แล้ว +10

      100% why I got rid of facebook

    • @rod1840
      @rod1840 5 ปีที่แล้ว +5

      @@AH-lw2bj dont you have a google acc? lol

    • @corypeltier9576
      @corypeltier9576 5 ปีที่แล้ว +6

      Apps like that don't need your pictures, government or whomever could easily just hack into your front camera while you're on your phone and get all the footage of your face they would need.

    • @sillysiji5257
      @sillysiji5257 5 ปีที่แล้ว +1

      Except those who only take a single angle pic for every pic 😂😂

    • @AmazingJayB51
      @AmazingJayB51 5 ปีที่แล้ว

      They don't really need FB; your picture is taken everywhere you go.

  • @amberland7491
    @amberland7491 5 ปีที่แล้ว +94

    Deepfakes: exist
    Everyone: POrNn AnD MEmeS

    • @sifter14
      @sifter14 5 ปีที่แล้ว

      This is what comes after SFM porn

  • @aimillios9039
    @aimillios9039 4 ปีที่แล้ว +226

    "Deepfake is dangerous"
    People using deepfake: *Dame Da Ne*

    • @jakob9585
      @jakob9585 4 ปีที่แล้ว +9

      96 percent of deepfake contents is porn

    • @melvinjansen2338
      @melvinjansen2338 4 ปีที่แล้ว

      For those interested..
      I've done 2 Majima covers:
      TH-cam.com/watch?v=SqsRRBm27IE

    • @D_YellowMadness
      @D_YellowMadness 4 ปีที่แล้ว +1

      @epstein was killed Let's all take advice on sanity from someone who thinks no authority figure, celebrity, or media worker has ever lied about anything.

    • @lorrainecoggins7391
      @lorrainecoggins7391 3 ปีที่แล้ว +1

      Silent weapons for quiet wars.

  • @noclu4u384
    @noclu4u384 4 ปีที่แล้ว +20

    Pushing this tells me you're getting ready to deny some very damaging videos coming out soon.

  • @dodieodie498
    @dodieodie498 3 ปีที่แล้ว +41

    As we move into a world of facial recognition security, will we start seeing ourselves doing things that we never did?

    • @waynepalmer319
      @waynepalmer319 3 ปีที่แล้ว +1

      @Joshua What you said made absolutely no sense.

  • @shadixyt
    @shadixyt 5 ปีที่แล้ว +190

    You should be worried about this technology being used against ANYONE, not just the elite!

    • @gaIexy
      @gaIexy 5 ปีที่แล้ว +30

      If anything, the average person is more at risk because there won't be tons of people paying attention to it and ensuring it gets tracked down and debunked. I'm actually terrified.

    • @Drybones898
      @Drybones898 5 ปีที่แล้ว +11

      That’s what I was thinking as well. Someone could easily ruin your chances at a job or college because of this stuff.

    • @andrewwhite1985
      @andrewwhite1985 5 ปีที่แล้ว +9

      @@Drybones898 A job and college are the least of your worries; how about murder? Someone else could film the act and then put you in the video. I'm sure there are ways to figure out that it's been doctored, but give it time and it will be perfected. Plus if that happened to the average Joe there probably wouldn't be much of an investigation. It'd be "oop, you are guilty, cuff him boys."

    • @Andreseme23
      @Andreseme23 5 ปีที่แล้ว +6

      Shadix No. WE should be worried about this being used BY the elite.

    • @onanpeuplu
      @onanpeuplu 5 ปีที่แล้ว

      You don't understand : It IS the elite who used these FAKES AGAINST Us ! for instance, the fake "syrian war" and the fake "refugees"....

  • @emanonymous
    @emanonymous 5 ปีที่แล้ว +63

    if you think this is bad just imagine how advanced the top secret CIA and military faceswapping tech is

    • @abcdefg3214
      @abcdefg3214 5 ปีที่แล้ว +2

      They couldn't even make a video showing bin laden writing with his right hand haha

    • @alexwormo486
      @alexwormo486 5 ปีที่แล้ว

      There isn't any lmfao

  • @BruceNicholas
    @BruceNicholas 3 ปีที่แล้ว +24

    Dear PBS, we know what you're doing.

    • @lulumoon6942
      @lulumoon6942 3 ปีที่แล้ว

      On our dime, no less.

    • @ghz24
      @ghz24 3 ปีที่แล้ว

      And what is it they are doing?

    • @daveidduha
      @daveidduha 3 ปีที่แล้ว

      @@ghz24 nunya

  • @Bierman2121
    @Bierman2121 5 ปีที่แล้ว +66

    Makes me think of George Orwell - 1984. The main character was tasked with censoring and rewriting history, now we can censor and rewrite video. A little scary.

    • @Uncommon_Cents71
      @Uncommon_Cents71 5 ปีที่แล้ว +8

      That's not the only Orwellian issue we face.
      For example. Take the 1984 phrase "first they change the words, then they change the meaning"
      It's scary.

    • @Gee-xb7rt
      @Gee-xb7rt 5 ปีที่แล้ว +1

      um, read Edward Bernays, you are subject to propaganda every minute of the day thanks to 'smart phones' making people dumb.

    • @randallcox2238
      @randallcox2238 5 ปีที่แล้ว

      This is a lot scary. Like most things this will get into the wrong hands and do damage.

    • @vs7548
      @vs7548 5 ปีที่แล้ว

      No people, This is DOWNRIGHT TERRIFYING!! Just think of how badly this WILL be abused.

    • @CheezMonsterCrazy
      @CheezMonsterCrazy 5 ปีที่แล้ว

      @@Gee-xb7rt Smart phones don't make anyone dumb. It still baffles me to think that people believe we weren't subject to manipulation through media since the literal dawn of civilization.

  • @gazbot9000
    @gazbot9000 5 ปีที่แล้ว +19

    0:53 I never before realised that Ross already looks like Nicolas Cage.

  • @EntryLevelLuxury
    @EntryLevelLuxury 5 ปีที่แล้ว +57

    How dare they blaspheme the sacredness of Steve Buscemi's face.

    • @sergiotl7378
      @sergiotl7378 4 ปีที่แล้ว

      How dare you blaspheme the sacredness of Steve Buscemi's name?

    • @cynthiacastaneda6687
      @cynthiacastaneda6687 3 ปีที่แล้ว

      Lol

    • @termikesmike
      @termikesmike 3 ปีที่แล้ว

      Face ! what about his breasts !!

  • @inkdreams4
    @inkdreams4 3 ปีที่แล้ว +54

    ALL LIES WILL BE REVEALED & GOD WHO KNOWS EVERYTHING, IS IN CONTROL! Rest easy tonight ~~

    • @TheRayjane
      @TheRayjane 3 ปีที่แล้ว +4

      Amen 🙏🏼

    • @ausbare140
      @ausbare140 3 ปีที่แล้ว +1

      If god is real look at the history and see god gave up on people when adam and eve at from the tree of knowledge.

    • @donnasheppard7371
      @donnasheppard7371 3 ปีที่แล้ว

      Amen!

    • @wendywiedmeyer1575
      @wendywiedmeyer1575 3 ปีที่แล้ว

      @@2manymorons Beautifully said!!

    • @smartwater6936
      @smartwater6936 3 ปีที่แล้ว +2

      praise satan :)

  • @mrveritas700
    @mrveritas700 5 ปีที่แล้ว +51

    Imagine what's really going on. If "joe blow" in his bedroom can make legitimate fakes... imagine what the "big boys club" is doing.

    • @AntilleanConfederation
      @AntilleanConfederation 5 ปีที่แล้ว +10

      Planes flying into buildings on tv and then bringing them down with explosives? Hmmmmmm. Seems to crazy

    • @joncoda365
      @joncoda365 5 ปีที่แล้ว +2

      Legitimate fakes? We're already so thoroughly under control that video tricks (although quite possibly another tool) aren't at all necessary.
      Who said what when shouldn't matter anywhere near as much as who is doing what to you now.

    • @marklewis4793
      @marklewis4793 5 ปีที่แล้ว

      @Trumble Research ,..not if you ARE the military,

    • @colleenstevens7651
      @colleenstevens7651 5 ปีที่แล้ว

      You have to belive in one thing you.

    • @AntilleanConfederation
      @AntilleanConfederation 5 ปีที่แล้ว +1

      715490 926 people can see holograms. My only complain is why did witness report military planes without windows rather than commercial air liners. And yes people did die, that’s how you use human emotion in your favor and lie to the masses

  • @irfandy8
    @irfandy8 5 ปีที่แล้ว +617

    I know one actor that will stay safe even when he's being attacked by this.
    _Nicholas Cage_

    • @cenkaytekin
      @cenkaytekin 5 ปีที่แล้ว +10

      I knew the answer before revealing it! So true!

    • @vernontauro
      @vernontauro 5 ปีที่แล้ว +8

      Please educate me on this 😂. Why Nicholas Cage?

    • @cenkaytekin
      @cenkaytekin 5 ปีที่แล้ว +42

      @@vernontauro He is a meme phenomenon and his face has been Photoshopped on so many other celebrities. I think a Cage deepfake would be accepted naturally by everyone (at least the meme community). That is how I interpreted OP's comment at least :)

    • @vernontauro
      @vernontauro 5 ปีที่แล้ว +4

      @@cenkaytekin thanks :)

    • @cenkaytekin
      @cenkaytekin 5 ปีที่แล้ว +3

      @@vernontauro I recommend googling it, some are hilarious :)

  • @FlamingRobzilla
    @FlamingRobzilla 5 ปีที่แล้ว +50

    This is scary, and as an animator I want to learn how to do it.

    • @billydoyle7415
      @billydoyle7415 2 ปีที่แล้ว

      CIA, et al are always hiring Hee Hee Hee
      Cheers from Texas,
      Billy 🏴‍☠️🌈💚🇺🇦🖖🏻✌️

  • @mericasshepherd2550
    @mericasshepherd2550 3 ปีที่แล้ว +11

    I was always taught, believe nothing you hear and only half of what you see.

    • @daveidduha
      @daveidduha 3 ปีที่แล้ว

      i hear that.. i mean, i see... i mean.. nvm.

  • @thepumpkinking6370
    @thepumpkinking6370 5 ปีที่แล้ว +23

    They are trying to get ahead of what's about to be released. They cant stop the pain that is coming their way.

    • @anhero2377
      @anhero2377 5 ปีที่แล้ว

      Pretty much. WWG1WGA

  • @flechette3782
    @flechette3782 4 ปีที่แล้ว +37

    Think of this the next time someone says that they have "photographic proof" of a crime.

    • @livewire2k4
      @livewire2k4 3 ปีที่แล้ว +1

      deep fakes do not stand up to forensic investigation.

    • @flechette3782
      @flechette3782 3 ปีที่แล้ว +1

      @@livewire2k4 for now...give it time. Remember, all pictures nowadays are ultimately just ones and zeros.

    • @livewire2k4
      @livewire2k4 3 ปีที่แล้ว

      @@flechette3782 it's already just ones and zeros. It is how they are arranged that matters. (layers).

    • @livewire2k4
      @livewire2k4 3 ปีที่แล้ว

      meta lives matter

    • @DM-wu5hn
      @DM-wu5hn 3 ปีที่แล้ว

      Photos have been manipulated since the beginning of photography.

  • @ComaToast1
    @ComaToast1 5 ปีที่แล้ว +16

    Whenever they say they're "starting to work on" something, it means they've already done it, and probably better

  • @radicalhonesty3628
    @radicalhonesty3628 3 ปีที่แล้ว +13

    impossible to imagine that the people still willfully in a comatose state after a year of this insanity... will ever wake

  • @canibaloxide
    @canibaloxide 5 ปีที่แล้ว +178

    Just the fact that this technology exists will give many a further excuse to dismiss what they see presented to them in the media and recede deeper into their own reality bubbles and conspiracy theories.

    • @davidolmos5411
      @davidolmos5411 5 ปีที่แล้ว

      Dead as Dreams that's why I'm perfecting the id this is our project we getting licence to kill

    • @canibaloxide
      @canibaloxide 5 ปีที่แล้ว +9

      @Ross Yes everything should be questioned but you shouldn't just swap one falsehood for another that suits your own narrative.

    • @taco-vato
      @taco-vato 5 ปีที่แล้ว +14

      "Presented to them in the media"
      what? Is that the only tit we are allowed to suck on? Reality bubbles. You sound like quite the theorist yourself. You sure have some wild generalizations you're blinded by. Keep sucking your teet, everything will be just fine. The media presents....you so funny

    • @sakakisenpai3917
      @sakakisenpai3917 5 ปีที่แล้ว +12

      fun fact: The Term “Conspiracy Theory” Was Invented by the CIA In Order To Prevent Disbelief of Official Government Stories

    • @Farsay
      @Farsay 5 ปีที่แล้ว

      @@sakakisenpai3917 The funnest fact of all.

  • @derpstorm23
    @derpstorm23 5 ปีที่แล้ว +151

    PBS knows some powerful people are about to be outed for some unspeakable crimes and with video evidence nonetheless.

    • @jamesclarke4221
      @jamesclarke4221 5 ปีที่แล้ว +15

      Bingo

    • @omegamale7880
      @omegamale7880 5 ปีที่แล้ว +24

      Exactly. Deepfake has been used already for propaganda purposes but I have a feeling there's some damning video footage coming that the establishment will try to dismiss as "deepfake."

    • @TuckFwitter
      @TuckFwitter 5 ปีที่แล้ว +2

      eye opening

    • @guessdog4871
      @guessdog4871 5 ปีที่แล้ว +2

      What's coming? I'm naive.

    • @guessdog4871
      @guessdog4871 5 ปีที่แล้ว +2

      @@omegamale7880 Bring in the news camera crews right away. You have a feeling.

  • @chopvansuey
    @chopvansuey 5 ปีที่แล้ว +26

    0:42 That is a cursed image that I’ll never be able to fully cleanse from mind

  • @SC-gp7kt
    @SC-gp7kt 3 ปีที่แล้ว +3

    Powers of discernment are more needed than ever.

    • @SC-gp7kt
      @SC-gp7kt 3 ปีที่แล้ว +1

      @@anne902 Amen!

    • @SC-gp7kt
      @SC-gp7kt 3 ปีที่แล้ว

      @@anne902 No I dont think it has. The only weird thing that's happened is sometimes I'll type an original comment or a reply to someone else's comment, and after I hit the green arrow, it records it twice. Then i just go back and delete one. Yes I would dbl check if I were you. Who knows, YT could be creating comments on "behalf" of us! I'll watch mine too. Thanks for letting me know 🙏😊

  • @bigcheesetaste
    @bigcheesetaste 5 ปีที่แล้ว +190

    Make an AI that utilizes deep learning to detect deepfakes, job done!

    • @DuxGalt
      @DuxGalt 5 ปีที่แล้ว +28

      So this is already used with many machine learning algorithms; it's called adversarial machine learning. One AI is really good at knowing a picture is a cat, and the other tries to fool it, only stopping when it does (see the sketch after this thread). All that making tech to find deepfakes will do is make deepfakes more real and convincing.

    • @spid3r951
      @spid3r951 5 ปีที่แล้ว

      Yes, AI generally uses deep learning :P

    • @bigcheesetaste
      @bigcheesetaste 5 ปีที่แล้ว

      @@spid3r951 I'll have you know that most traditional AI is just programmed with a bunch of `if` statements (joke), but seriously I do a fair bit of Unity development in my spare time and still just write AI the old fashioned way without any machine learning

    • @spid3r951
      @spid3r951 5 ปีที่แล้ว +1

      @@bigcheesetaste Are you saying the cutting edge AI of today doesn't use deep learning? How about an AI that doesn't use deep learning to detect deep fakes then? ://

    • @spid3r951
      @spid3r951 5 ปีที่แล้ว

      Btw. Machine learning and deep learning are not the same things.
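
A minimal sketch of the adversarial setup @DuxGalt describes above, using toy fully-connected networks and random tensors in place of real face images; the shapes, batch size, and step count are arbitrary assumptions, not anyone's actual deepfake pipeline.

    import torch
    import torch.nn as nn

    # Toy "forger" (generator) and "detector" (discriminator). Real deepfake models are
    # convolutional and far larger, but the adversarial loop is the same idea.
    forger = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 256))
    detector = nn.Sequential(nn.Linear(256, 64), nn.ReLU(), nn.Linear(64, 1))
    opt_f = torch.optim.Adam(forger.parameters(), lr=1e-3)
    opt_d = torch.optim.Adam(detector.parameters(), lr=1e-3)
    loss_fn = nn.BCEWithLogitsLoss()

    real = torch.randn(128, 256)          # stand-in for a batch of real face crops
    ones, zeros = torch.ones(128, 1), torch.zeros(128, 1)

    for step in range(200):
        # 1) Train the detector to score real samples as 1 and forged samples as 0.
        fake = forger(torch.randn(128, 16)).detach()
        d_loss = loss_fn(detector(real), ones) + loss_fn(detector(fake), zeros)
        opt_d.zero_grad(); d_loss.backward(); opt_d.step()

        # 2) Train the forger to make the detector score its output as 1 ("fool it").
        fake = forger(torch.randn(128, 16))
        f_loss = loss_fn(detector(fake), ones)
        opt_f.zero_grad(); f_loss.backward(); opt_f.step()

Each network only improves by beating the other, which is the point made above: a stronger published detector is, in effect, a stronger training signal for the forger.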

  • @X1Daring2
    @X1Daring2 5 ปีที่แล้ว +33

    Imagine how much fake blackmail there will be from this.....the earth is doomed
    X_x

    • @JosBergervoet
      @JosBergervoet 4 ปีที่แล้ว +2

      OK, was doomed from the start anyway..

  • @PlacidDragon
    @PlacidDragon 5 ปีที่แล้ว +47

    One little thing you forgot to mention in this video... when saying "The Department of Defense is working with experts to identify and prevent malicious deepfakes"... you left out "and they are working damn hard to perfect the system (in the unlikely event that they haven't already) so they can use it themselves on their enemies".

    • @1966-k3z
      @1966-k3z 5 ปีที่แล้ว +2

      By the time we hear of any type of technology they have already weaponized it, and perfected it, if not created it themselves for this sole purpose

    • @mcseedat
      @mcseedat 5 ปีที่แล้ว +7

      Enemy? More like use it on everyone

    • @cjxgraphics
      @cjxgraphics 5 ปีที่แล้ว +1

      Ever seen Running Man?

    • @oldmannabors3198
      @oldmannabors3198 5 ปีที่แล้ว

      And we are their number one enemy....

    • @PlacidDragon
      @PlacidDragon 5 ปีที่แล้ว

      @@mcseedat
      When i say "enemy", i mean everyone who isn't "them" (IE, practically all of us).

  • @JonahBCarpenter
    @JonahBCarpenter 3 ปีที่แล้ว +36

    Will the real Joe please stand up...or could it be that..."He's Shot!"

    • @breezy3725
      @breezy3725 3 ปีที่แล้ว +5

      They'll probably create a "fake" a$$a$ination and blame it on the right. Oldest play in the book. Then Commy Harris can step in.

    • @razback8661
      @razback8661 3 ปีที่แล้ว +2

      @@breezy3725 Commy Harris WILL step in. That was the plan from the beginning. The Demoncrats always wanted a "woman of color" to occupy the White House. She's already making phone calls to foreign leaders. Boy, has the American electorate had the wool pulled over its eyes. I didn't. I didn't nor ever will vote for Sleepy Joe or Kooky Kamala.

    • @Noor-jw2tn
      @Noor-jw2tn 3 ปีที่แล้ว

      th-cam.com/video/EWuAYS8lh7A/w-d-xo.html
      Here are the 2 Joes

  • @THELEXIS1983
    @THELEXIS1983 5 ปีที่แล้ว +52

    By feeding your pictures into facebook/insta/kik/etc., A.I. is getting cleverer and more skilled. I knew this would happen

    • @Constantinesis
      @Constantinesis 5 ปีที่แล้ว +6

      Especially when we add tags, its even easier for it to learn :)

    • @xybrpnk8904
      @xybrpnk8904 5 ปีที่แล้ว

      pause for facial recognition based AI kill drone hives..

    • @MrOiram46
      @MrOiram46 5 ปีที่แล้ว +3

      Only the Amish are safe lol

    • @DJSbros
      @DJSbros 5 ปีที่แล้ว

      This is how the AI overlord will control us.

  • @moscowjade
    @moscowjade 4 ปีที่แล้ว +89

    Why is this so 'SCARY' as opposed to saying 'Hey, look at how cool this new technology is'?
    Answer: because someone is scared of video footage that is about to drop.

    • @rileybaker8914
      @rileybaker8914 4 ปีที่แล้ว +5

      Ped()'s bout to get yeeted!!

    • @rileybaker8914
      @rileybaker8914 4 ปีที่แล้ว +2

      1 day ago?? We must be in the same rabbit hole....

    • @rj8836
      @rj8836 4 ปีที่แล้ว +1

      I’m in the rabbit hole with y’all!!

    • @PrinceCbass
      @PrinceCbass 4 ปีที่แล้ว +5

      been in the hole for years now, just waiting for the hole to cave in on itself. I am ready for all to be revealed and a little light to disinfect the darkness

    • @googlygoose4830
      @googlygoose4830 4 ปีที่แล้ว +2

      Because it could be used for evil doing genius that’s why.

  • @Verydemureverycutesie
    @Verydemureverycutesie 5 ปีที่แล้ว +106

    Snapchat: hey you can put this persons face on your face and make a funny video
    Deepfake: Hold my beer...

    • @jnuggs2323
      @jnuggs2323 4 ปีที่แล้ว +3

      They've been doing it since the '80s

    • @ElasticReality
      @ElasticReality 4 ปีที่แล้ว +3

      I'm actually really happy you caught that. Cheers! We've been inching towards this in AR for many years. I guess the lesson here is, if you want to change the world by filtering reality, just package it in something they won't be able to resist trying on themselves.

  • @judyg2889
    @judyg2889 3 ปีที่แล้ว +8

    We wonder how this is allowed to happen. We do not need to look to far."The whole world is lying in the power of the wicked one ." 1John 5:19

  • @A3Kr0n
    @A3Kr0n 5 ปีที่แล้ว +7

    People who do that for nefarious purposes should be removed from society. They can never be trusted again.

    • @montag4516
      @montag4516 5 ปีที่แล้ว +1

      Unfortunately people who do that and able to do get away with such, unquestioningly, are at the top of the Cabal. The abuse of power starts at the top and works it's way down (trickle- down theory). Micro managing isn't even required.

  • @littlejoesconcrete
    @littlejoesconcrete 5 ปีที่แล้ว +16

    This wouldn't be a problem if people had some patience and did research and looked into everything they see or hear before jumping to conclusions.

    • @WaffleAbuser
      @WaffleAbuser 5 ปีที่แล้ว +3

      Not gonna happen, sadly

    • @akpokemon
      @akpokemon 5 ปีที่แล้ว +2

      lol k
      There will always be 13 year-olds and hillbillies (a.k.a. shallow-thinking reactionaries) in the world

  • @nonh1
    @nonh1 5 ปีที่แล้ว +23

    The Dept. of Defense is working with experts to identify and prevent malicious deepfakes… or to create and use them.

  • @silentecho8329
    @silentecho8329 2 ปีที่แล้ว +2

    Perfect! All politicians, actors, musicians, state leaders, doctors, scientist, occultist, whistleblowers, etc.... Any and all who have been blackmailed or simply want to get out of the demonizing club they joined but have long since regreted... God has just given you a way out. A perfect opportunity and excuse to do it, right here, right now!!!! "Go ahead and show that video, the world knows it's faked." Humanity will stand behind you. We got your back baby!!

  • @Rustycrawler
    @Rustycrawler 5 ปีที่แล้ว +18

    I hope this isn’t being done in anticipation for the release of the Epstein footage? Pbs we have everything!

  • @mathavraj9378
    @mathavraj9378 5 ปีที่แล้ว +7

    Mark my words, this same video will again be trending and recommended in all our YouTube feeds once the US presidential election happens

    • @miagabrielle7628
      @miagabrielle7628 4 ปีที่แล้ว

      Yep. You were 100% correct on that one. It’s starting..

  • @ceratedsign4825
    @ceratedsign4825 4 ปีที่แล้ว +26

    "Your call maybe recorded for *training and quality purposes.* "

    • @miagabrielle7628
      @miagabrielle7628 4 ปีที่แล้ว +3

      Training and quality... that is chilling as f*ck. Training and improving quality of AI. With OUR voices.

  • @Toolgdskli
    @Toolgdskli 4 ปีที่แล้ว +7

    *sigh of relief* , when the videos come out Bill and Andrew can just say “those are deep fake”.

  • @EM-hq7hv
    @EM-hq7hv 5 ปีที่แล้ว +27

    Does anyone remember the movie, the Running Man?

  • @mrchrysler9736
    @mrchrysler9736 5 ปีที่แล้ว +7

    I stopped _just believing what I saw_ in the late 90's, when I watched Celine Dion sing "live" from Las Vegas with Elvis Presley.
    It was far from perfect, but it was easy to see where it was going.
    Now it is undetectable to the naked eye, and the experts claim to be able to detect it all, but how would we know?
    Why would we assume the _experts_ we hear and see are the apex of the tech?
    The very few who want to control it all could feed us anything with this technology; you can't detect it, and you're left relying on "someone" telling you they got all the bad guys.
    Please never say to me, "I saw a picture, I saw a video, on TV, in a movie" and attempt to claim you know it's a fact.
    " *Believe nothing you hear, and only one half that you see.* " _Edgar Allan Poe_

  • @flowerpower111
    @flowerpower111 5 ปีที่แล้ว +8

    I can't wait for: "Lord of the Rings the Fellowship of the Ring - But Everyone is Danny DeVito"!

  • @harmonymomentofbeing5753
    @harmonymomentofbeing5753 3 ปีที่แล้ว +9

    I am so glad that this is being shown .. I’ve been labeled a schizo and all cause I’ve been trying to tell ppl this for so long

    • @rainbowbgood
      @rainbowbgood 3 ปีที่แล้ว +1

      where did you notice it the most?

  • @klyesam4006
    @klyesam4006 5 ปีที่แล้ว +18

    Why couldn't anyone just cryptographically sign any official video, so you could always authenticate its source? (A minimal sketch follows this thread.)

    • @muffinspuffinsEE
      @muffinspuffinsEE 5 ปีที่แล้ว

      Ofc

    • @toebiwankonobijuciysmooyay3400
      @toebiwankonobijuciysmooyay3400 5 ปีที่แล้ว

      Lol you just solved the problem you sir are smart

    • @actontreadway1168
      @actontreadway1168 5 ปีที่แล้ว +4

      just creates another layer to the confusion. they can avoid signing anything controversial in the first place and claim it's fake.

    • @klyesam4006
      @klyesam4006 5 ปีที่แล้ว

      @@actontreadway1168 why would they release anything controversial in the first place. Usually the people releasing controversial footage aren't the people in the footage. So whoever is releasing it can sign it

    • @klyesam4006
      @klyesam4006 5 ปีที่แล้ว

      Or even camera manufacturers could release cameras that would automatically sign any video
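
A minimal sketch of the signing idea proposed in this thread, using Ed25519 keys from the Python cryptography package; the video file names are hypothetical, and the key-management and "just don't sign controversial footage" problems raised above are left out.

    import hashlib
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    def sha256(path: str) -> bytes:
        # Digest of the file contents, read in chunks.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.digest()

    # The publisher (a newsroom, or in principle the camera itself) keeps the private key
    # and distributes the public key out of band.
    private_key = Ed25519PrivateKey.generate()
    public_key = private_key.public_key()

    # Sign the digest of the original footage at publication time ("official.mp4" is hypothetical).
    signature = private_key.sign(sha256("official.mp4"))

    # Anyone holding the public key can later check a copy they received.
    try:
        public_key.verify(signature, sha256("copy_i_downloaded.mp4"))
        print("Matches: this copy is byte-identical to what was signed.")
    except InvalidSignature:
        print("No match: edited, re-encoded, or never signed by this key.")

Note that even harmless re-encoding changes the bytes, so this authenticates exact copies of what the key holder chose to sign, nothing more.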

  • @kenbibi7631
    @kenbibi7631 5 ปีที่แล้ว +19

    James Cameron's Avatar 5mins sequence = 20million
    DF = 20 bucks

    • @shivatecs
      @shivatecs 5 ปีที่แล้ว

      riiight? they're trying to sell us a lie, so that when their heinous crimes are exposed on film, they can say "tHe rUsSiaNs MaDe A deEp fAKe"

  • @selenadansfield1305
    @selenadansfield1305 5 ปีที่แล้ว +6

    Watch the mouth. It gives it away EVERY time (so far). I'm reasonably good at reading lips - the lips/mouth on these deep fakes are very, VERY hard to read. That gives it away for me.

    • @winterrain1947
      @winterrain1947 2 ปีที่แล้ว +1

      yes I saw that as well. But, that'll get cleaned up and even deaf people trained to read lips won't notice.

  • @reyhudson563
    @reyhudson563 3 ปีที่แล้ว +6

    I GOT it! THAT'S how they got him saying complete sentences.

  • @blakewilliams9642
    @blakewilliams9642 5 ปีที่แล้ว +8

    In 10 to 15 years this technology will probably be flawless.

    • @angelohernandez6406
      @angelohernandez6406 5 ปีที่แล้ว

      More like years ago...old news

    • @blakewilliams9642
      @blakewilliams9642 5 ปีที่แล้ว

      Nahhh... Even in these short example videos you can still tell something's off. I'm sure there are some good ones that are hard to tell, but I feel in 10 to 15 you won't be able to.

  • @wmtrader
    @wmtrader 5 ปีที่แล้ว +6

    Soon to be used to anger people so that they take action against a person or group.

    • @ireneuszpyc6684
      @ireneuszpyc6684 5 ปีที่แล้ว

      Soon to be used (by certain governments, around the world)

  • @21EC
    @21EC 5 ปีที่แล้ว +4

    2:55 They need to create a worldwide kind of signature for files that would be marked on original video files, so they would know whether it got edited in some form or not. Freshly recorded videos would automatically get this deep-rooted signature mark in the original file. That's the simplest idea I could come up with to solve this issue.

    • @NorthSea_1981
      @NorthSea_1981 5 ปีที่แล้ว +1

      Sounds like a reasonable idea to me.

    • @mom.left.me.at.michaels9951
      @mom.left.me.at.michaels9951 ปีที่แล้ว

      Isn't there a way to tell if something has been altered in the metadata. Like original file creation and save logs? I have that in some of my art programs. Sure it's not perfect, someone could still take an image of the image and that would count as an unaltered original copy if you were unaware of the true original. But it's hardly a new concept, just needs a bit of updating.

  • @toriwright8306
    @toriwright8306 3 years ago +1

    Eventually video footage won't be allowed as evidence.

  • @ProjectILT
    @ProjectILT 5 years ago +45

    Photoshop right now can create WAY more convincing still images, but you don't see people sweatin' over how that could have terrifying ramifications.

    • @davidt01
      @davidt01 5 years ago +27

      Yeah, but videos where you can make a person say anything are scary.

    • @CriticalRoleHighlights
      @CriticalRoleHighlights 5 years ago +4

      No, they're not. It's not gonna have any negative effect on any significant scale.

    • @ProjectILT
      @ProjectILT 5 years ago +10

      @@davidt01 The whole reason deepfakes are a trending topic is that, unlike Photoshop, they are still new. People with no computer science background immediately connect this technology to its potential for slander and political smearing. But you could try to accomplish essentially the same thing by photoshopping someone getting a bj from a kid or something, yet you don't really see this happen on a daily basis right now. It doesn't happen because there are ways to verify the authenticity of pictures; you might not know how, but there are professionals who do, and the same will go for deepfaked videos. The original source of the footage can be found for comparison, suspicious video encoding can be spotted, etc. Word of it being faked would then hopefully spread before it gains real traction. People in general also need to learn to be smarter and more skeptical, but there will always be gullible people willing to believe pictures/videos at the drop of a hat, as long as the fake stuff confirms their political views/ideals.

    • @mikemhz
      @mikemhz 5 years ago +1

      @@ProjectILT But you said it yourself: hopefully not too many gullible people will believe it before somebody exposes it. That's not a very good assurance, because a viral video can reach millions of people, and sometimes the debunking will never reach them, or will only reach them after a significant delay.

    • @mikemhz
      @mikemhz 5 years ago +1

      "People need to learn to be smarter and more skeptical" is also a deeply flawed expectation. Propaganda is extremely effective because generally people are naive and that's not going to change.

  • @-The-Darkside
    @-The-Darkside 5 years ago +36

    It's good but it's still not quite right. I can see this will be a massive issue soon though.

    • @nalissolus9213
      @nalissolus9213 5 years ago +4

      Yeah, you can still tell something is off. Also, when they turn their head the face doesn't follow perfectly, and the perspective of the face and the head is sometimes off.

    • @spinny2010
      @spinny2010 5 years ago +4

      That's the 'off the peg' version. Much more realistic versions exist. Just not for Joe Public.

    • @username4441
      @username4441 5 years ago +2

      That's because of the examples in the video, and because that pleb speaking about his "professional deepfakery" is a simpleton who cannot curate a dataset beyond the zip files of Cage mugshots he finds in Reddit links, and who has never tinkered with anything larger than 64x64.

  • @johnphilip7466
    @johnphilip7466 5 years ago +14

    SO many stories about this recently. Almost like they are trying to get ahead of something....

    • @volcryndarkstar
      @volcryndarkstar 5 years ago +2

      So many stories about this recently. Almost as if you've clicked on a video about this before so more get recommended to you as they come out.

    • @johnphilip7466
      @johnphilip7466 5 years ago +1

      @@volcryndarkstar Now you're catching on!👍

    • @MsSomeonenew
      @MsSomeonenew 5 years ago

      It was in the damn video: people will abuse it for politics. That is why they made this video, to tell you about it.

    • @johnphilip7466
      @johnphilip7466 5 years ago +5

      @@MsSomeonenew, OR they will use the prospect of this technology to try to explain away real video evidence.

    • @jamesclarke4221
      @jamesclarke4221 5 years ago +1

      @@johnphilip7466 bingo

  • @ramonago3690
    @ramonago3690 3 years ago +2

    All the more reason not to trust social media and to understand how it works. Meeting random people can be devastating. It's scary.

  • @Saad-ih3ys
    @Saad-ih3ys 5 years ago +7

    I like how the deepfake thing is still a new concept and isn't very popular yet, but many channels are already trying to expose it.

    • @boshk4054
      @boshk4054 5 years ago

      Well, it would have taken off, except Pornhub blocks and deletes all the videos using it.

  • @KP11520
    @KP11520 5 years ago +8

    It's a toss-up whether technology will save the world or end it. With ethics and morals on the decline, the latter is the likely trajectory.

    • @evilseedsgrownaturally1588
      @evilseedsgrownaturally1588 5 years ago +3

      KP11520 Morals have always been volatile and thus fluctuated heavily. Yet the world still stands, and has improved greatly on almost all measurable parameters. Chill, bro. Chill.

    • @KP11520
      @KP11520 5 years ago +1

      @@evilseedsgrownaturally1588 Morals may have, but when technology makes those morals exponentially more volatile, it takes fewer mistakes to go beyond the point of no return in the blink of an eye. You can't compare the past to now. It's not even close to the same situation. Eyes wide open, bro, OPEN!

    • @astraldirectrix
      @astraldirectrix 5 years ago +1

      KP11520 Hmm, I'm more inclined to believe in the half-empty glass. From what I've seen of social media, it only wastes people's time, boosts their egos with a false sense of importance, and misinforms them into ignorance. People are absorbed in their smartphones instead of appreciating the real world around them. And Google has amassed so much power and information about everyone that they could singlehandedly deliver us into a utopia if it comes to that.
      Technology seems more likely to condemn us before it could help us. I think we need to remember what is real and true in this world, and come back down from our artificial highs.

  • @ace2fst603
    @ace2fst603 5 years ago +7

    The truth is coming! The reveal of this technology will not save those who must face consequences for what they've done!

  • @danr2652
    @danr2652 3 years ago +4

    Like Shaggy, everyone will soon say "It wasn't me," blaming it on deepfakes.

  • @orenmashko1177
    @orenmashko1177 5 years ago +14

    Why do we need 'safeguards'? Let's just stop believing what we see in videos. It will actually bring back the need to meet, in person, the people you want to hear from.

  • @Sam-be4yy
    @Sam-be4yy 5 years ago +16

    Why would someone release this kind of technology?

    • @davidtaylor857
      @davidtaylor857 5 years ago +2

      Samuel Lee For profit.

    • @miameow4833
      @miameow4833 4 years ago +2

      It can be used for harm and for good... Imagine they have a grown woman who can fake the voice of a child and they deepfake her to look like a kid; then they can catch pedos without a real child being exposed to horrible X-rated conversations. And for bad, it can be used against facial recognition to get access somewhere people should not be. A neutral use would be just for entertainment purposes.

  • @epichaxboi6258
    @epichaxboi6258 4 years ago +16

    “Deepfakes are going to cause chaos!”
    People who use deepfakes: *dame nane*

  • @TerraDactyl-hc9ff
    @TerraDactyl-hc9ff 3 years ago +1

    Reading the comments here, you guys really restored my faith in our American citizens.
    Thank you for not being brainwashed.