What are deepfakes and are they dangerous? | Start Here

  • Published Oct 4, 2024

Comments • 463

  • @excelsior31107
    @excelsior31107 3 years ago +241

    This robust technology is a great way to destroy people’s reputations.

    • @amerlad
      @amerlad 3 years ago +19

      among other things, such as fabricating evidence, disproving evidence, and so many more extremely dangerous uses.
      you don't like a new government policy? oh look! we have a video of you having intercourse.

    • @RiversBliss
      @RiversBliss 3 years ago +3

      It's not new.

    • @dansierrasam79
      @dansierrasam79 3 years ago +5

      @@amerlad Well said! Not only government policy but particularly disagreements pertaining to matters of religion and race. Or if people want to discredit you in some way to suit their agenda. It's very easy. Still, people who are close to you usually know you better, which is why these deepfake videos fail to reflect reality over time.

    • @aminbinsalim1995
      @aminbinsalim1995 3 years ago

      /s

    • @aminbinsalim1995
      @aminbinsalim1995 3 years ago

      @@amerlad :(

  • @salmanramzan2032
    @salmanramzan2032 3 years ago +88

    Welcome to the Age of Deceptions.

    • @samdacosta4676
      @samdacosta4676 3 years ago +4

      I read that first as DECEPTICONS

    • @rogeramezquita5685
      @rogeramezquita5685 3 years ago

      Facts

    • @filhanislamictv8712
      @filhanislamictv8712 3 years ago +1

      @@samdacosta4676 You need to unload movies from your mind.

    • @samdacosta4676
      @samdacosta4676 3 years ago +1

      @@filhanislamictv8712 ha ....ikr

    • @furrycheetah
      @furrycheetah 3 years ago

      @@filhanislamictv8712 it is related

  • @fahdjamy
    @fahdjamy 3 years ago +304

    Appreciate the fact that "Sandra looks and sounds better than J-Lo"

    • @danielwilson9342
      @danielwilson9342 3 years ago +15

      Unimportant and unnecessary take

    • @SubliminalMessagesTV
      @SubliminalMessagesTV 3 years ago

      Ayyyyeeee

    • @SubliminalMessagesTV
      @SubliminalMessagesTV 3 years ago +9

      @@danielwilson9342 u right but shut up

    • @klarag7059
      @klarag7059 3 years ago +2

      Totally agree.

    • @klarag7059
      @klarag7059 3 years ago +4

      @jaep struiksma not from my perspective. The presenter looks more beautiful as she looks more natural than the overly made up “fictitious” image of beauty. The reporter is more real and relatable because of her more natural look.

  • @florence8532
    @florence8532 3 years ago +82

    Informative, yet chilling enough to make people think twice before posting pictures of themselves.

    • @PainfulGrowth
      @PainfulGrowth 2 years ago +4

      celebs are gonna be in danger lol, so easy to make a scandal

    • @Scarshadow666
      @Scarshadow666 1 year ago +2

      @@PainfulGrowth
      Considering how often a lot of people put images of themselves online (or, even if they don't, there's lots of people that will intentionally look for pictures of them to display online), I don't think it's just celebrities that are going to be in danger... 0_0

    • @Scarshadow666
      @Scarshadow666 1 year ago +1

      Considering how well social media like TikTok, YouTube, and Instagram take off due to people posting images of themselves, I doubt it'll hinder people unless they educate themselves about the dangers of deep-fakes. 0_0

  • @hoboryan3455
    @hoboryan3455 3 years ago +112

    LMAO! the beginning actually had me, I was like "WTF" And then I remembered what the topic was xD

    • @TheSenzerx
      @TheSenzerx 3 years ago +4

      Same here 🤪

    • @amenjamal8454
      @amenjamal8454 3 years ago +2

      @@TheSenzerx I thought it was a YouTube ad for some beauty product of Jennifer Lopez

    • @TheSenzerx
      @TheSenzerx 3 years ago +1

      @@amenjamal8454 lol

    • @frfarahrahman
      @frfarahrahman 3 years ago +1

      Same... 😂😂😂

  • @xja85mac
    @xja85mac 3 years ago +42

    Wonder why she took her earrings off? It turns out that earrings or eyeglasses make it harder for the algorithm to isolate your face.
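That detail points at how face-swap tools prepare their training data: every frame is cropped down to just the face before the model ever sees it, and accessories sitting near the crop boundary add pixels the model has to explain away. Below is a minimal sketch of that face-isolation step, assuming Python with OpenCV's stock Haar-cascade detector (real tools use landmark-based alignment instead); the paths and function name are illustrative only.

```python
# Minimal sketch of the "isolate the face" step in a face-swap pipeline.
# Assumes opencv-python is installed; the stock Haar cascade stands in for the
# landmark-based face alignment that real deepfake tools use.
import os
import cv2

def extract_faces(video_path, out_dir, every_n_frames=10):
    os.makedirs(out_dir, exist_ok=True)
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(video_path)
    frame_idx, saved = 0, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if frame_idx % every_n_frames == 0:
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            # Each (x, y, w, h) box is a tight face crop; earrings and glasses
            # sit right at this boundary, which is what makes them awkward.
            for (x, y, w, h) in detector.detectMultiScale(gray, 1.1, 5):
                cv2.imwrite(os.path.join(out_dir, f"face_{saved:05d}.jpg"),
                            frame[y:y + h, x:x + w])
                saved += 1
        frame_idx += 1
    cap.release()
    return saved
```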

  • @tauriqabdullah6130
    @tauriqabdullah6130 3 years ago +19

    I thought the J Lo intro was an ad.

  • @itistrueitisafact5432
    @itistrueitisafact5432 3 years ago +115

    Technology has advantages and disadvantages and this is one of them. May Allah save us from all evil people amen.

    • @y.r5155
      @y.r5155 3 years ago +3

      Amin. But deepfakes can be recognized; there are apps people use that verify whether a video was fabricated

    • @KatariaGujjar
      @KatariaGujjar 3 years ago +3

      @@y.r5155
      So what, are you gonna scan and check every video for possible deep fake?

    • @y.r5155
      @y.r5155 3 years ago +3

      @@KatariaGujjar no, I'm talking about someone like a celebrity or government official or someone well known. I'm a software engineer; I know how to create it and how to tell it's a deepfake.

    • @KatariaGujjar
      @KatariaGujjar 3 years ago

      @@y.r5155
      Celebrities and officials make thousands of videos daily. Who is going to check every single clip?

    • @ADeeSHUPA
      @ADeeSHUPA 3 years ago

      @@KatariaGujjar Are you a Jew or Zoroastrian?

  • @Mazzie2022
    @Mazzie2022 3 years ago +34

    I actually don’t think she looked like Jennifer Lopez. In fact, when I watched it at first the sound was muted, and I just thought she had the same name as Jennifer Lopez. But I will agree this is very dangerous and quite sick, actually.

    • @andikoazri
      @andikoazri 3 years ago

      She only did it as an example... There's someone out there who can make a deepfake looking 100% like the real celebrity!

    • @nusaibahibraheem8183
      @nusaibahibraheem8183 3 years ago +1

      They probably did it very quickly just as an example

    • @andikoazri
      @andikoazri 3 years ago +1

      @@nusaibahibraheem8183
      Exactly...

  • @zx7siovia213
    @zx7siovia213 3 years ago +60

    Fun fact, Sandra looks better than Jlo 😂

    • @blaze4158
      @blaze4158 3 years ago +5

      So what? JLO can sing, dance, act, choreograph, and she's got a better body. A woman's physical appearance shouldn't matter that much to you or anyone else. Stop comparing us like objects. It creates a competitive sense between females and we should no longer allow males to do this to us.

    • @loveshell007
      @loveshell007 3 years ago

      Not a fact, just an opinion

    • @blaze4158
      @blaze4158 3 years ago +1

      @@loveshell007 Who is your comment directed to? You should know enough to be specific about whom you are addressing.

  • @rajinrashid2455
    @rajinrashid2455 3 years ago +14

    this series is actually pretty good

  • @ousman997
    @ousman997 3 years ago +9

    this lady is amazing, your scripts are just spot on.

  • @MuhammadShahAlamSaqibi
    @MuhammadShahAlamSaqibi 3 years ago +7

    That was the best session, in my opinion, post-pandemic. Especially that face-changing Sandra.

  • @badripaudel77
    @badripaudel77 3 years ago +9

    It's about people's moral character. There are always people who will misuse something 🙄

  • @gokulpayyanur1839
    @gokulpayyanur1839 3 years ago +9

    When you also consider the fact that we have small cameras to record things, the world is getting creepier by the minute

    • @EmpressTouch
      @EmpressTouch 3 years ago

      Yes. And this video only highlights visuals. Sound and acoustics are developing very fast too.

  • @sindhujasai1345
    @sindhujasai1345 3 years ago +19

    Before anything horrible happens, I hope a global policy is created to protect those who were used for deep fake stuff. Which could involve cyber police maybe.
    Edit: Honestly it's already seeming to get out of hand but the sooner the better.

    • @KatyYoder-cq1kc
      @KatyYoder-cq1kc 6 months ago

      Too late, it has and is

  • @nickdupreez1843
    @nickdupreez1843 3 years ago +22

    Thank you! This is a very good overview of deepfake technology. The only thing you didn't mention was the fact that realistic deepfakes are trained on huge data sets of images (10,000's+), like the Tom Cruise deepfakes, where they have hours of footage with a huge range of facial expressions, and you need to map the face onto someone with a similar facial structure to achieve realistic results. (A rough sketch of that training setup follows this thread.)

    • @DarkPesco
      @DarkPesco 2 years ago +2

      @B A I agree. The commenter seemed to be trying to make it seem like it's not that big of a threat to anyone other than celebrities with grand amounts of footage, while ignoring the fact that modern phones and social media have driven large segments of the population to create a comparable amount of footage of themselves and post it all online.
      The commenter doesn't use personal social media like FB? Doesn't have friends on SM so he doesn't know?
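The training setup @nickdupreez1843 describes, sketched here as promised above, is the classic shared-encoder, per-identity-decoder autoencoder: each identity gets its own decoder, both are trained on their own large set of aligned face crops, and the "swap" is decoding person A's encoding with person B's decoder. This is a toy sketch assuming PyTorch; the layer sizes and function names are placeholders, not any real tool's architecture.

```python
# Toy sketch of the shared-encoder / per-identity-decoder face-swap setup.
# Assumes PyTorch; 64x64 RGB crops and linear layers are placeholders for the
# convolutional networks a real tool would use.
import torch
import torch.nn as nn

encoder   = nn.Sequential(nn.Flatten(), nn.Linear(64 * 64 * 3, 512), nn.ReLU())
decoder_a = nn.Sequential(nn.Linear(512, 64 * 64 * 3), nn.Sigmoid())  # person A
decoder_b = nn.Sequential(nn.Linear(512, 64 * 64 * 3), nn.Sigmoid())  # person B

params = (list(encoder.parameters())
          + list(decoder_a.parameters())
          + list(decoder_b.parameters()))
opt = torch.optim.Adam(params, lr=1e-4)
loss_fn = nn.MSELoss()

def train_step(faces_a, faces_b):
    """faces_a, faces_b: (batch, 3, 64, 64) tensors of aligned face crops."""
    recon_a = decoder_a(encoder(faces_a))               # reconstruct A as A
    recon_b = decoder_b(encoder(faces_b))               # reconstruct B as B
    loss = (loss_fn(recon_a, faces_a.flatten(1))
            + loss_fn(recon_b, faces_b.flatten(1)))
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

def swap_a_to_b(faces_a):
    """The actual face-swap step: encode person A, decode with B's decoder."""
    with torch.no_grad():
        return decoder_b(encoder(faces_a)).view(-1, 3, 64, 64)
```

The need for tens of thousands of frames and for a roughly similar facial structure falls out of this design: each decoder has to learn its person's face from scratch, and the swapped face only sits convincingly on a head whose geometry it approximately matches.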

  • @MuhammadAbdullah-dy5dn
    @MuhammadAbdullah-dy5dn 3 years ago +14

    That is really interesting to know. Technology has changed everything. I hope a law will be made to spot the culprits behind it.

  • @azzyfreeman
    @azzyfreeman 3 years ago +9

    The same algorithms used to detect deep fakes can be used to train better deep fake networks (see the sketch after this thread)

    • @DarkPesco
      @DarkPesco 2 years ago +1

      A sick cycle...
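@azzyfreeman's point is essentially the adversarial (GAN-style) dynamic, sketched here as promised: whatever signal a detector learns to flag fakes can be turned straight into a training loss for the generator. This is a toy sketch assuming PyTorch; the networks and shapes are made-up placeholders, not a real deepfake or detection model.

```python
# Toy sketch of the detector-vs-generator loop: the detector (discriminator) is
# trained to separate real from generated faces, then the generator is updated
# to fool that same detector. Assumes PyTorch; shapes are placeholders.
import torch
import torch.nn as nn

G = nn.Sequential(nn.Linear(100, 64 * 64), nn.Tanh())    # toy face generator
D = nn.Sequential(nn.Linear(64 * 64, 1))                  # toy fake detector (logits)
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

def train_step(real_faces):
    """real_faces: (batch, 64*64) tensor of flattened real face crops."""
    b = real_faces.size(0)
    # 1) Improve the detector on real vs. generated samples.
    fake = G(torch.randn(b, 100)).detach()
    d_loss = bce(D(real_faces), torch.ones(b, 1)) + bce(D(fake), torch.zeros(b, 1))
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()
    # 2) Improve the generator against the freshly updated detector.
    fake = G(torch.randn(b, 100))
    g_loss = bce(D(fake), torch.ones(b, 1))   # generator wants D to say "real"
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()
    return d_loss.item(), g_loss.item()
```

Any published detector can, in principle, slot into step 2 as an extra loss term, which is why detection on its own is a moving target.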

  • @mahmudulhaidersiyam3186
    @mahmudulhaidersiyam3186 3 years ago +21

    The start was just 🤣🤣🤣🤣

  • @izzatfauzimustafa6535
    @izzatfauzimustafa6535 3 years ago +9

    Marvel is creating deepfakes of Tom Hiddleston using Loki.

  • @williamsjones4139
    @williamsjones4139 3 years ago +39

    Investing in crypto is a more lucrative way of making money

    • @greywoods3412
      @greywoods3412 3 years ago +1

      Absolutely right, I got 70% of my total portfolio in crypto and I have been making good profits

    • @brownjackie9105
      @brownjackie9105 3 years ago

      I wanted to trade crypto but got confused by the fluctuations in price

    • @amandawilson7329
      @amandawilson7329 3 years ago

      I heard that his strategies are really good

    • @andrewmasscot2955
      @andrewmasscot2955 3 years ago

      Yeah
      My first investment with Mr Tony Gallippi made me profits of over $24,320 US dollars and ever since then he has been delivering

    • @andrewpeterson7726
      @andrewpeterson7726 3 years ago

      Wow you know Tony Gallippi

  • @saajidalikhan
    @saajidalikhan 3 years ago +1

    I was trying to find the skip ad button in the beginning, thinking it was an omaze ad or something

  • @michelefortner1190
    @michelefortner1190 3 years ago +9

    Very interesting but terrifying at the same time. I wouldn't want that to ever happen to me or my friends and family. This isn't good; there will be so many problems with this

  • @dorcasnjeri2858
    @dorcasnjeri2858 3 years ago +3

    It's very dangerous because it's destroying people's reputations, dignity, careers, creating depression

  • @Head_of_the_Table2.0
    @Head_of_the_Table2.0 3 years ago +1

    Al Jazeera is definitely a great example of it

  • @mirygalas6508
    @mirygalas6508 3 years ago +1

    Education, education, education. People who can think critically and exercise a healthy level of skepticism are difficult to deceive. New technology, old solutions.

  • @cinto1394
    @cinto1394 3 years ago +1

    Wow thanks for raising concerns!

  • @thisismyloooveeeyy8014
    @thisismyloooveeeyy8014 2 years ago +3

    All we need is knowledge, and to stay informed of what technology can do.

  • @birdsarecool6448
    @birdsarecool6448 3 years ago +1

    This is extremely alarming.

  • @sultanrayder
    @sultanrayder 3 years ago +1

    Sandra! Thank you so much 🧡

  • @paklah245
    @paklah245 3 years ago +76

    Fitnah coming soon

  • @gandhi1945
    @gandhi1945 3 years ago +6

    This tech will help protect the elites

  • @IbrahimAli-vv3df
    @IbrahimAli-vv3df 3 years ago +7

    I was so annoyed that Sandra was substituted. Lol.

  • @rimshakhan9751
    @rimshakhan9751 3 years ago +3

    The intro really got me! 😂

  • @SunnySJamil
    @SunnySJamil 3 years ago +1

    At 2:16, the genius of the software user trumps that of its developer.

  • @ShahbazAli-ji3jq
    @ShahbazAli-ji3jq 3 years ago

    There is a woman in Canada who has a YouTube channel; her name is Jasmine, and she is a doppelganger of Sandra.
    The name of the channel is Jasmine and Dawoud.

  • @shabeebkaringappara2917
    @shabeebkaringappara2917 2 years ago +1

    The start.... Nailed it 🤣

  • @joycejeong-x4b
    @joycejeong-x4b 10 months ago

    Engaging in open discussions about deepfakes is essential for raising awareness and building resilience. By fostering a culture of transparency and accountability, we can collectively navigate the challenges posed by deepfake technology, mitigating its negative impact on individuals and society.

  • @yoeymeme
    @yoeymeme 4 months ago +1

    the first one is obviously not real; the field of view on the face is completely different

  • @NINJANOOB777
    @NINJANOOB777 2 years ago

    yo her last part words about rocks sounded like the hood i love it lol

  • @LivesInReality
    @LivesInReality 3 years ago +1

    *Very*

  • @jamdindali
    @jamdindali 3 years ago +6

    Pro-deepfakes and deepfake apologists are a problem. Mark my words

  • @arvailankara
    @arvailankara 3 years ago

    Sandra has got extraordinary grace and gravitas

  • @moatazgamal34
    @moatazgamal34 3 years ago +3

    Let's just appreciate all the hard work done to make such a video. It seems simple because a lot of effort went into it, for sure

  • @andym6603
    @andym6603 3 years ago

    Top notch investigative journalism

  • @slipknotj2581
    @slipknotj2581 3 years ago

    The only video of Al Jazeera I’ve respected

  • @LivesInReality
    @LivesInReality 3 years ago +3

    *Read the Qur'an Translation*

  • @axethrowing1801
    @axethrowing1801 2 years ago +2

    Nothing good can come from this.

  • @havetrustissue8975
    @havetrustissue8975 3 years ago +4

    This is the technology that CIA uses, I guess.

  • @Recuper8
    @Recuper8 3 years ago

    The audio on this is brutal.

  • @Aslaan1
    @Aslaan1 3 years ago

    Very hard to spot fake ones, so be vigilant & prudent.

  • @letarvisjohnson5337
    @letarvisjohnson5337 2 years ago +1

    They are absolutely dangerous. Bcuz people always believe most of what they see.

  • @MagicMattHawkins
    @MagicMattHawkins 1 year ago +1

    What if we found a way to like “watermark” a video 💦 the future will eventually depend upon markings to prove ❤legitimacy❤ of media
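One concrete version of that watermark idea is content provenance: instead of a visible mark, the publisher distributes a cryptographic signature of the file, and any edit (including a face swap) breaks verification. Standards efforts such as C2PA embed signed metadata in the file itself; below is a much smaller standard-library sketch that pairs a video with an HMAC tag, which assumes the verifying party shares the secret key. The function names are illustrative only.

```python
# Minimal sketch of tagging a video so tampering is detectable. Real provenance
# schemes (e.g. C2PA content credentials) use public-key signatures and embed
# the metadata in the file; this toy version just computes an HMAC-SHA256 tag
# over the file under a shared secret key.
import hashlib
import hmac

def tag_video(path: str, key: bytes) -> str:
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).digest()
    return hmac.new(key, digest, hashlib.sha256).hexdigest()

def verify_video(path: str, key: bytes, expected_tag: str) -> bool:
    # Any re-encode, cut, or face swap changes the hash and fails this check.
    return hmac.compare_digest(tag_video(path, key), expected_tag)
```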

  • @nawazsharif4634
    @nawazsharif4634 3 years ago

    Sandra is my favorite journalist

  • @kristakaufman3593
    @kristakaufman3593 2 years ago

    So what apps did you use?

  • @Gustoking37
    @Gustoking37 2 years ago

    Correction

  • @dumanimjo609
    @dumanimjo609 3 years ago +1

    Funny enough the only people I am afraid of regarding this technology is the government. Who knows what kind of devious plans they're going to be able to pull off because of this tech.

  • @VAUIENLET
    @VAUIENLET 2 years ago

    I suggest these things should be used for video game entertainment purposes and not for harm or conflict.
    And we can make deepfakes in such a way that they look real while we still know they are deepfakes.

  • @Amaaaaan1
    @Amaaaaan1 3 years ago

    Make laws to watermark deep fake videos or face prosecution!

  • @haideralisuterwala9403
    @haideralisuterwala9403 3 years ago +1

    Great informative video, thanks. Keep going.

  • @tzogreekwarrior6
    @tzogreekwarrior6 3 years ago +1

    Soooo
    any good free apps for deepfaking?
    For educational purposes of course :)

  • @danny-li6io
    @danny-li6io 3 years ago

    I’m just curious why the stupid name "deepfake" stuck? "Counter-facing" was a better descriptor.

  • @SubliminalMessagesTV
    @SubliminalMessagesTV 3 years ago +2

    Yet the funny thing about this segment is that this information has been widely known for the last several years and has only improved, and there's a slight possibility that it has already been used, with actual results, in modern media settings

  • @OctavioBecerril1
    @OctavioBecerril1 7 months ago +1

    Presenter ❤

  • @Kingofboys15
    @Kingofboys15 3 years ago

    Could you please do an episode on the current situation in Somalia!!!

    • @LondonHijamaa
      @LondonHijamaa 3 years ago

      What's happening in Somalia???

  • @rodaxel7165
    @rodaxel7165 3 years ago +1

    Is this a trick question?

  • @tjmarx
    @tjmarx 3 years ago +4

    lol @ "if it makes you feel a strong emotion, either really, really good or very mad, take an extra second to check if it's real".
    Yeah, because people experiencing strong emotions are definitely using logic in that moment and will think to check frame by frame for artefacts and ghosting before they join a bandwagon. lol. The rule in the '90s was: don't believe anything you see on the internet; you don't know what is and isn't fake. It continued to be the rule in the early 2000s too. Then suddenly somewhere around 2010 people lost their minds and forgot that the internet is full of misinformation and fakery. If we just went back to the original rule, deep fakes online wouldn't pose a problem to anyone.
    Hopefully deep fakes might also encourage people to care about their data, about who has their voice print and who has access to their photos. Maybe they'll think twice about using that Russian novelty face swap app, or letting a major company/the government just have their voice print for "security purposes".

  • @ishaqueahmed6362
    @ishaqueahmed6362 2 years ago

    Thanks a lot for informative videos and please upload your video on time.

  • @stoneyestevan1513
    @stoneyestevan1513 3 years ago +6

    There's nothing inherently bad about it? Whaaaat?!
    What is the usefulness of a technology like this? When is it ever useful to put someone else's face on your face?
    We live in a sick world. WE NEED GOD. LIFE IS SLOWLY SPIRALLING OUT OF CONTROL. GOD IS THE SOLUTION

    • @TheMaghi85
      @TheMaghi85 3 years ago

      how blind can one be to not see the inherent danger in this technology

  • @Farah_Gojali07
    @Farah_Gojali07 3 years ago +2

    I literally believed she is Jennifer Lopez!!

  • @abdullahbinraghib5983
    @abdullahbinraghib5983 3 years ago

    with advancement ... these blur spots around the ear or this resolution thing will disappear and we won't be able to recognize them.

  • @knowledgeispowerchannel7734
    @knowledgeispowerchannel7734 3 years ago +4

    Matthew 24: Many will come in my name and deceive many - sounds like deepfake

    • @VicpaCS2
      @VicpaCS2 3 years ago

      Check out this thing bro, the Antichrist, or as we call him Dajjal in Islam, is known for being the liar and deceiver, and the Qur'an and Hadith say that in the times when he comes, people will not be able to recognise truth from falsehood and vice versa... Looks like the world is getting ready for his coming...

  • @FarzTurk
    @FarzTurk 3 years ago +2

    Absolutely frightening

  • @CSSWITHRIDAEZAINAB
    @CSSWITHRIDAEZAINAB 3 years ago

    Kindly make the video on Digital currency.

  • @osiasnocum7239
    @osiasnocum7239 3 years ago +1

    Yes, very dangerous stuff..!!

  • @sunny-pe6yt
    @sunny-pe6yt 1 year ago

    Very informative videos, good work 👍

  • @lovely1641
    @lovely1641 3 years ago +6

    Of course they are. No one has self control anymore

  • @emadabuhagag222
    @emadabuhagag222 2 years ago

    Thanks

  • @Farah_Gojali07
    @Farah_Gojali07 3 years ago +5

    Till 3:40 it was fun but the whole scenario changed after that.... It is actually terrifying!!

    • @krateproductions4872
      @krateproductions4872 3 years ago +1

      Rule 34 of the internet: If something exists on the internet, its NSFW version already exists.

    • @fahmidafaiza8207
      @fahmidafaiza8207 3 years ago

      Jin😀

    • @Farah_Gojali07
      @Farah_Gojali07 3 years ago

      @@fahmidafaiza8207 yess!😍😍💜💜

  • @wonderfacts7782
    @wonderfacts7782 3 years ago

    You are rocking Sandra ❤️

  • @SLKFJAD
    @SLKFJAD 3 years ago

    Is water wet?

  • @fivetimesyo
    @fivetimesyo 3 years ago

    Sandra looks good and she knows it

  • @JAREDPLY1
    @JAREDPLY1 3 years ago +3

    Everything on Instagram is a deep fake, but we still click like anyways. BTW Sandra over JLO any day.

  • @halalpolice23
    @halalpolice23 3 years ago

    Audubillah at first I was confused 🤷‍♀️😂

  • @SAWS
    @SAWS 3 years ago +1

    Oh no - you should have picked Jennifer Aniston. She has the same jaw and face structure, which would have made the deepfake perfect!

  • @hotmandead1
    @hotmandead1 3 years ago

    Anyone pulling the free site down?

  • @empmachine
    @empmachine 1 year ago +1

    She's totally got Lopez beat on beauty (especially class)

  • @emotionalimpact4323
    @emotionalimpact4323 3 years ago +1

    Not for now, but in 10 years, when people know how to perfect it and the AI gets smarter, then it will be a big threat

  • @juankitchen1008
    @juankitchen1008 1 year ago

    I subscribed to your YouTube channel after watching this video. Very informative and timely!

  • @hermie0600
    @hermie0600 2 years ago

    The only good application I can see for deepfaking is replacing a stunt double's face with the actor's/actress's in movies and TV shows, instead of doing all sorts of scans and rigging, or hiding the stunt double's face

  • @wh3resmycar
    @wh3resmycar 3 years ago

    Caleb Carr’s book Killing Time predicted this.

  • @watcher5729
    @watcher5729 5 months ago

    Imagine how content can be hijacked, like news that we believe or advice we follow

  • @ahiyanali7231
    @ahiyanali7231 3 years ago

    At 4:14 that music made me think my stomach was rumbling

  • @NegashAbdu
    @NegashAbdu 3 years ago

    Few days!?

  • @lancesay
    @lancesay 2 years ago

    pretty interesting, and they should redo Enter the Dragon, the Bruce Lee movie.

  • @01arthi
    @01arthi 3 years ago

    The last bit was a killer 😀

  • @Avisheknandi12
    @Avisheknandi12 3 years ago

    Do you know reFace app?

  • @rogeramezquita5685
    @rogeramezquita5685 3 years ago

    That freaked me out

  • @BillyHau
    @BillyHau 3 years ago

    I want to say... there is a reason why DeepFaceLab doesn't keep updating the preview image every second: that slows down the training process!
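That trade-off in code: a trainer refreshes its preview only every N iterations because rendering and saving the preview steals time from optimization steps. This is a toy, self-contained sketch; `train_step` and `render_preview` are stand-ins, not DeepFaceLab's actual API.

```python
# Toy sketch of why training previews are refreshed occasionally, not every step:
# the preview work is pure overhead, so it is throttled to once every N iterations.
import time

def train_step():
    time.sleep(0.001)           # stand-in for one optimization step on the model

def render_preview(path):
    with open(path, "w") as f:  # stand-in for compositing and saving a preview image
        f.write("preview placeholder\n")

PREVIEW_EVERY = 500             # refresh the preview every 500 steps

for step in range(2000):
    train_step()
    if step % PREVIEW_EVERY == 0:
        render_preview(f"preview_{step:06d}.txt")
```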