Apple’s Software Chief Explains ‘Misunderstood’ iPhone Child-Protection Features (Exclusive) | WSJ

  • Published May 15, 2024
  • Neural Hashes, Safety Vouchers and More Fun Terms Explained
    Apple’s tools for flagging child pornography and identifying explicit photos in kids’ messages caused backlash and confusion. In an exclusive interview, Apple software chief Craig Federighi sat down with WSJ’s Joanna Stern to defend the technology and explain how it will work. Illustration: Laura Kammermann/The Wall Street Journal
    Personal Technology With Joanna Stern
    Technology is overwhelming and making decisions about what gadget to buy is harder than ever. WSJ personal tech columnist Joanna Stern makes it all a bit easier in her lively and informative videos.
    More from the Wall Street Journal:
    Visit WSJ.com: www.wsj.com
    Visit the WSJ Video Center: wsj.com/video
    On Facebook: / videos
    On Twitter: / wsj
    On Snapchat: on.wsj.com/2ratjSM
    #WSJ #Apple #Privacy
  • Science & Technology
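The "neural hashes" and "safety vouchers" the description mentions can be sketched in miniature. This is a hedged toy, not Apple's actual NeuralHash: a real perceptual hash is robust to resizing and re-encoding, and Apple's published design blinds the hashes and uses threshold secret sharing so the server learns nothing until roughly 30 matches accumulate. Here a plain SHA-256 stands in for the image hash purely to illustrate the match-and-threshold flow; all names and the threshold value are illustrative.

```python
import hashlib

# Apple publicly stated a review threshold of about 30 matches;
# used here only as an illustrative constant.
MATCH_THRESHOLD = 30

def image_hash(data: bytes) -> str:
    """Stand-in for a perceptual hash of an image.

    A real perceptual hash would map visually similar images to the
    same value; SHA-256 only matches byte-identical inputs, which is
    enough to demonstrate the matching logic.
    """
    return hashlib.sha256(data).hexdigest()

def count_matches(photos, known_hashes):
    """Count how many photos hash-match the known-CSAM database."""
    return sum(1 for p in photos if image_hash(p) in known_hashes)

def should_flag_for_review(photos, known_hashes, threshold=MATCH_THRESHOLD):
    """Only above the threshold would an account be surfaced for human review."""
    return count_matches(photos, known_hashes) >= threshold
```

The point of the threshold in the described system is that a single accidental match never surfaces an account; only an accumulation of database matches crosses the line where human review becomes possible.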

Comments • 2.6K

  • @MidNiteR32 • 2 years ago • +1388

    Real title of video: Tim Cook throws Craig to the wolves.

    • @chr_v • 2 years ago • +9

      🤣🤣🤣🤣

    • @cardboardpackage • 2 years ago • +6

      bro what 😂

    • @MidNiteR32 • 2 years ago • +59

      @@cardboardpackage Tim Cook has been hiding while he throws Craig under the bus. I think the CEO of the company should be the one explaining this to its customers and the media, not his VP of Software Engineering. Cook just threw him under the bus and started driving it.

    • @vaneakatok • 2 years ago • +33

      @@MidNiteR32 Nah, I think it was the right move. First of all, Craig is more agreeable, and second, the risk is lower. And to be honest, Craig managed it formidably, imo

    • @vaneakatok • 2 years ago • +4

      @@MidNiteR32 Nonetheless, your comment is quite on point ;)

  • @rijs4303 • 2 years ago • +3782

    Let’s make sure all the members of the Vatican have an iPhone

    • @cobracommander.1958 • 2 years ago • +242

      The pope just bought a Huawei phone

    • @fofoqueiro5524 • 2 years ago • +28

      Or they picked BlackBerry devices.

    • @MG-ll5nw • 2 years ago • +18

      @Highground Trump and Biden be sweating

    • @sebastiangruenfeld141 • 2 years ago • +12

      Use iCloud*

    • @RajJhaveri • 2 years ago • +10

      @Highground We don't have to be selective. But if you want to be, the Republican party is where we should start

  • @ferreiraleo • 2 years ago • +1227

    Loved how he mentions Telegram as the message app.
    No free mentions for you, Zuck.

    • @trowawayacc • 2 years ago • +22

      Damn Facebook. Still don't get how their acquisition of Instagram and WhatsApp went through, oh wait...

    • @geminijemini7976 • 2 years ago • +12

      How is Signal? I saw Elon Musk recommending it.

    • @albinjt1 • 2 years ago • +12

      Yet the Telegram founder hates Apple

    • @technotrack5959 • 2 years ago • +8

      @@albinjt1 So what's the benefit of loving Apple?

    • @bigrich9654 • 2 years ago • +1

      @@albinjt1 Probably because Apple restricts certain channels on Telegram. It’s insane that I have to go to the browser version of Telegram to view those restricted channels.

  • @beku420 • 2 years ago • +802

    Apple: We're not scanning your images, we're just scanning your images

    • @gmancolo • 2 years ago • +74

      We're not scanning your images, we're just scanning OUR images.

    • @spaceforce0 • 2 years ago • +20

      If they are stored on their servers in this day and age, I feel as if it’s your fault for trusting big tech. Either way, we’ll all forget about this in a couple of weeks. We basically already have

    • @henrinaths1 • 2 years ago • +6

      If they can install a program that tells me my battery is at 10% after 10 minutes of use, when a quick hard restart brings it back to 100%, there is no telling what they can install on your phone.
      If Ur f-ingdeau gives Apple a couple hundred million of our tax dollars because we proved that the vax was ineffective and self immunity has an 80% success rate at beating the virus, there’s no telling what those greedy blasters will do.

    • @oniisuki9025 • 2 years ago • +3

      Correct me if I'm wrong, but in order to upload an image to the cloud, you need to scan the image first, right?

    • @xehP • 2 years ago • +1

      There is plenty of information on how they “scan” the photos; it’s even explained in layman's terms in this video.

  • @hyypersonic • 2 years ago • +1156

    3:21 “pornography of any other sort” I’m glad Craig essentially said that Apple knows and understands that people simply just have nudes on their phones

    • @nicolelea615 • 2 years ago • +84

      Yeah, some people are simply degenerate pigs, but not actually pedophiles.

    • @billjamal4764 • 2 years ago • +173

      @@nicolelea615 Yes, and some people are photographers, and part of that is nude photography, not porn.

    • @abubakrakram6208 • 2 years ago • +100

      @@nicolelea615 They might be photos of a spouse or partner. Or photos people took to track weight loss/gain progress.

    • @utubekullanicisi • 2 years ago • +4

      @@billjamal4764 Like you, your dad, your uncle, etc.

    • @yousifwessam180 • 2 years ago • +11

      @@nicolelea615 My friend, there are people out there that are just as bad as mentioned, but there are also people who have private photos of their partners/spouses. Don't put everyone and everything under one group; it's not fair. Hope you understand (:

  • @tobybartlett • 2 years ago • +785

    “It takes 20 years to build a reputation and five minutes to ruin it. If you think about that, you'll do things differently.”
    -Warren Buffett
    Apple is feeling this hard, hence the panicked response to media.

    • @bhagathyennemajalu • 2 years ago • +6

      Okay, then purchase a Chinese phone.

    • @sciencescripture • 2 years ago • +15

      It’s going to be misused like any other tool big tech and the government gets its hands on, period.

    • @alexjasonchandler • 2 years ago • +3

      @Carrot Cruncher I'm an Android guy through and through, but as a network engineer I can tell you programs are much more flawed than people realize; they're doing this for margin of error

    • @The-Heart-Will-Testify • 2 years ago • +12

      Apple fanboys will just bend over and accept everything

    • @gliderman9302 • 2 years ago • +6

      @@The-Heart-Will-Testify They are the ones who are mad at Apple. Think before you comment.

  • @doompod • 2 years ago • +221

    Tim: “Hey Craig….”
    Craig: “NO NO NO NO NO NO!”
    Craig: “Hey everyone…😅”

    • @Jushwa • 2 years ago • +1

      Don’t get it

    • @EZYash5 • 2 years ago • +3

      @@Jushwa It means Tim Cook sent Craig to do the interview

    • @Sphenixco • 2 years ago

      Tim Cook is the CEO; Craig is the software person. He knows what everything does, he made it. Tim Cook does not do software. And anyone that stores things to the cloud doesn't own the servers; all they own is the main device's storage. They don't scan on the device, they scan on the cloud only

  • @bazil4146 • 2 years ago • +498

    Now that this whole news has gotten out, actual petafiles aren’t going to be storing their photos on iPhone anymore. So basically this feature is useless now.

    • @hundvd_7 • 2 years ago • +130

      The people stupid enough to store highly illegal material in cloud storage won't be stopped by this news.
      It was always one of the easiest ways to get caught

    • @diedforurwins • 2 years ago • +11

      @@hundvd_7 he says, sounding a little too informed

    • @hundvd_7 • 2 years ago • +138

      @@diedforurwins Sure, go ahead and call other people pedophiles. That will make you look smart.

    • @Dr.HouseMD • 2 years ago • +13

      “Petafiles” bro?

    • @AugustaChile • 2 years ago • +12

      @@Dr.HouseMD down with those Petafiles!

  • @Sulfen • 2 years ago • +614

    I don't want my images to be scanned even if I don't engage in any illegal activities. It doesn't matter if it's AI or a human looking through my photos; it just makes me feel uncomfortable.

    • @johansm97 • 2 years ago • +113

      They already are. Don’t you see how your Photos app can recognize faces, etc.? I think people pressed about this have things to hide

    • @7billza • 2 years ago • +131

      @@johansm97 lol I don't understand how people think everything in their iPhones isn't already being touched by AI, especially photos. How do you think your photos look so good? Computational photography using AI. How do you think they group faces and show you memories? AI. This is just Apple using AI, in a much more careful way than other companies, to do something. That's all it is, and people are losing their minds

    • @bouzianenadhir8503 • 2 years ago • +35

      @@7billza They actually aren't; facial recognition on iPhone is done on device. Apple doesn't scan anything; it's the only company that believes in privacy

    • @coolbuddyshivam • 2 years ago • +22

      @@bouzianenadhir8503 Then they wouldn't have destroyed end-to-end encryption to Apple servers, aka iCloud. If they can snoop around while a photo is uploading to the cloud, it's not end-to-end encrypted. It's not private. As simple as that.

    • @soniqstateofficial4490 • 2 years ago • +26

      Don’t use iCloud then.

  • @quentinlemaitre2998 • 2 years ago • +1066

    First of all, thank you for covering the issue. I wish you had pressed him on what type of audit he mentions, because to me anyone can force Apple to add a database via the FISA court. I want to know what is done to prevent that from happening instead of taking Apple at its word.

    • @cw4861 • 2 years ago • +31

      THIS!! I was pleased to hear about "auditability" -- but what exactly does he mean? Anyone got a source / more info on that?

    • @joshua01 • 2 years ago • +59

      Before WSJ is allowed the privilege of interviewing Craig, they have to agree to terms and conditions

    • @Jamesytdjv • 2 years ago • +20

      He seems to be very shady in his explanation as to what the company IS going to do.

    • @zonka6598 • 2 years ago • +11

      So basically avoid Apple's iCloud services

    • @joshua01 • 2 years ago • +5

      @@zonka6598 If that's what works for you and gives you a sense of privacy, then by all means, but just note that if you use Google, they're already doing it, and worse, so yeah…

  • @popgyorgybotond4741 • 2 years ago • +405

    “I think the customer owns the phone”
    It’s a yes or no answer

    • @Lousy_Bastard • 2 years ago • +36

      That's a big fat no.

    • @mantasvilcinskas • 2 years ago • +9

      That sounds like a yes to me?

    • @joshgribbon8510 • 2 years ago • +9

      It's definitely a little more complicated. I can "own" a car, but there are a lot of restrictions on what I can do with it or to it, especially if you want to use it on a road. Ownership doesn't really imply full control most of the time; even with land you have tons of laws limiting what you can do with it

    • @Lousy_Bastard • 2 years ago • +3

      @@joshgribbon8510 Exactly. We as consumers don't really own anything anymore, and that's the world over. We don't have any rights, just privileges, until someone decides to take them away.

    • @bernardomejia9882 • 2 years ago • +1

      @@mantasvilcinskas Definitely more complicated than a yes.

  • @animeguy6877 • 2 years ago • +348

    "It's not a backdoor. But it can be manually verified by humans in case our algorithm finds a match."
    Hmmmmmmm 🤔 That sounds suspiciously like a backdoor to me.

    • @ralphy4813 • 2 years ago • +18

      Spying with extra steps

    • @Roshan_420 • 2 years ago • +9

      The files are on their servers

    • @powerplayer75 • 2 years ago • +22

      A backdoor to what? iCloud? Which Apple already controls?

    • @bluebird1954 • 2 years ago • +2

      Yeah... You don't need to upload your photos or use that service.... Or simply don't have CP

    • @joeswansonthesimphunter2612 • 2 years ago • +3

      @@bluebird1954 No, it's the fact that it might be a faulty system. How can it differentiate an image of a child posing in a sexual manner in lingerie from a baby taking a bath? Will it flag both, none, or one of those images? Simple things like that can really impact a person's future

  • @JackZeroZ • 2 years ago • +653

    The same tech can be used to identify political dissidents, protesters, and just about anybody. Imagine matching memes commonly shared by people of those groups to identify people for political persecution.

    • @outofahat9363 • 2 years ago • +33

      Yes. Even if we take them at their word and accept that they can't see other photos, because they can only see the ones the neural network has very tightly matched, they still haven't said anything about the possibility of them searching for other stuff.

    • @stater3 • 2 years ago • +25

      All they need to do is change the hash and AI to look for other photos.

    • @billjamal4764 • 2 years ago • +24

      I'm sure your ISP, phone provider, Google, Facebook (including Instagram), and any other social media or messaging platform do that. If you truly care about privacy, you have to get an open-source operating system and only use open-source apps. There's no way around it

    • @grifinx • 2 years ago • +1

      THANK YOU, was looking for this. This is smoke and mirrors.

    • @fvs666 • 2 years ago • +7

      It’s already on Gmail, Facebook, Instagram, Twitter and YouTube.

  • @richardparker9268 • 2 years ago • +160

    He doesn't seem to understand the fundamental reason people are upset. The hash database is on your phone. The scanning is on your phone. This means that we have no guarantee that our phones will be private in the future.

    • @bhavinbijlani • 2 years ago • +10

      I believe the POSSIBILITY of future changes existed even before this announcement. We only had their word before, and we only have their word now. Why are you in an uproar now? When they first said your phone was private, why didn't you roll your eyes and say "ya, but what about the future?"

    • @agentlatte • 2 years ago • +7

      @@bhavinbijlani They already built the tech to do it. That was their argument against creating a backdoor in 2015. Now it exists, and Apple has no excuse that they “can’t comply.” They’ve already stated that they developed the technology to comply.

    • @carlosgomez-ct6ki • 2 years ago • +16

      It feels like China.

    • @chrismeller6248 • 2 years ago • +8

      People are upset because they don’t understand the underlying technology, the same way that a lack of education about natural forces and science leads people, still, to call someone a witch and persecute them.

    • @Benjamin-lv8zg • 2 years ago • +1

      @@carlosgomez-ct6ki The world has gone the same way. We want to have privacy, but every company/government wants to take it.

  • @lenajk2004 • 2 years ago • +33

    “I think the customer owns the phone”
    Right to repair: no

    • @bazil4146 • 2 years ago

      now you can

  • @SC-RGX7 • 2 years ago • +33

    Tim literally threw the guy to the wolves. Hilarious

    • @_sparrowhawk • 2 years ago • +3

      It's almost as if it's Craig's job to talk about software, 5 days a week. He even makes a few mil a year for doing it.

    • @SC-RGX7 • 2 years ago • +1

      @@_sparrowhawk Talking about software is his job, but this matter was super important, and a word from Tim would have been welcome

    • @00z53 • 2 years ago • +3

      Nice copy and paste

  • @nish6106 • 2 years ago • +366

    Tim checks the laptops of his engineers........
    Apple engineer: I swear it's just for the image classification algorithm.

    • @chefnyc • 2 years ago • +33

      Trying to confuse the rocket detection algorithm with similar images 😏

    • @EzraMerr • 2 years ago

      Bruh 🤣

  • @TomNook. • 2 years ago • +291

    "customers own their phones for sure"
    They can't even repair them without going to Apple!

    • @JackieWelles • 2 years ago • +16

      You own it, until you want to repair it ;)

    • @milantoth6246 • 2 years ago • +3

      I just did it today tho

    • @bilalkhann16 • 2 years ago • +16

      If you repair, you’ll get a warning message in Settings 🥲

    • @zak6093 • 2 years ago • +4

      You don't own an iPhone, you just use it.

    • @costacoffee4life665 • 2 years ago • +6

      I've been repairing Apple products for 2 years, and aside from battery replacements I wouldn’t recommend that people who aren't techie/qualified/confident do other things like replacing screens, Lightning ports, Face ID sensors, etc.

  • @JessieS • 2 years ago • +127

    "A thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor"

    • @Dlawderek • 2 years ago • +3

      But isn’t it better than having the door wide open, as it is on many cloud services? I think this is the best balance they could find between not hosting CSAM on their servers and also protecting customer privacy.

    • @Manish_Kumar_Singh • 2 years ago • +5

      @@Dlawderek no

    • @gabrielnunez3371 • 2 years ago • +9

      @@Dlawderek What about not building the door at all? Law enforcement is NOT the duty of private companies, and there are very good reasons for that.

    • @epampoefmkfkefpeao4291 • 2 years ago • +1

      @@Dlawderek Absolutely not. However good their intentions are, I will never agree to having my private data monitored.
      Something many forget is that one’s privacy is protected by law. Even if police were to illegally obtain legitimate evidence against one (be it through illegal wiretapping or otherwise), that evidence will be rejected as unlawfully obtained. What Apple is doing here is basically rephrasing “we will hack into your storage and check if you have anything illegal” into “we will scan all your photos, and if you don’t agree then we will stop providing service to you even if you paid for it”. Barbarism.

    • @chinogambino9375 • 2 years ago • +1

      @@Dlawderek NO. If you upload to a cloud service, it's not your hardware or a private space. Apple is now saying my hardware is actually theirs too, to do with as they please...

  • @romakrelian • 2 years ago • +68

    Apple cannot call itself the privacy company anymore.

    • @starbutterflygaming8881 • 2 years ago • +1

      How about Google Drive and Dropbox? They also scan for CP

    • @romakrelian • 2 years ago • +13

      @@starbutterflygaming8881 True, but they never really were known for their privacy stance, unlike Apple.

    • @drinkwoter • 2 years ago • +2

      @@romakrelian That's an Apple stan right there

    • @4JUVIE • 2 years ago

      ...do you people not know how to interpret the English language? Why is there still confusion

    • @reydanny6-792 • 2 years ago • +1

      @@drinkwoter Or because nobody is ever safe when buying a phone

  • @kushpatel9911 • 2 years ago • +136

    This was a weak interview. Craig threw out some big fancy words when asked to simply describe the system. No hard questions were asked, and this seemed more like a PR move than an interview

    • @ssud11 • 2 years ago • +19

      Basically a paid interview for PR purposes.

    • @KarstenJohansson • 2 years ago • +6

      "Craig, tell us why it's okay to treat your customers as if they are guilty until proven innocent, and why you want to foist the system resources onto the users instead of your data centers..." That's what should have been asked.

    • @IndexError • 2 years ago

      @@KarstenJohansson Because if it was checked at iCloud servers, people would go "oh no, they are spying on us"

    • @KarstenJohansson • 2 years ago • +1

      @@IndexError They wouldn't say that when the check is done on their personal device?

    • @bgill7475 • 2 years ago

      @@ssud11 Yep, they also pay through future access to things, not just through money.
      So if they cover this in a way that Apple likes, they get future access to news first because they're seen as trusted.

  • @jaredspencer3304 • 2 years ago • +207

    The reason to worry about this photo scanning is that there's no way it doesn't evolve. Currently, it only checks 1) photos being uploaded to iCloud 2) that match a database of known CSAM. Importantly, this doesn't do anything about new CSAM created in the abuse of children. Catching new material is the obvious next step. And there's no way to achieve that with the current hashing architecture. It has to be done by constantly monitoring all media on the phone, probably with some "AI moderator". And there's no way that some government doesn't demand that this monitoring be used to detect something other than CSAM, like political dissent (remember: China is Apple's biggest market). That's the worry. This new tech is only *kinda ok* as long as it doesn't evolve a single step beyond what it is now. And there's virtually no chance of that happening.

    • @RHStevens1986 • 2 years ago • +13

      en.wikipedia.org/wiki/Slippery_slope

    • @hrtlsbstrd • 2 years ago • +17

      @@RHStevens1986 Sure, but also worth considering: en.wikipedia.org/wiki/Foot-in-the-door_technique

    • @dylanxu • 2 years ago • +14

      The typical American ignorance that radiates from this single comment is amazing.

    • @walterwhite1 • 2 years ago • +6

      In iOS 16 they will start scanning your on-device photo library. Mark my words guys 😎

    • @CalienteFrijoles • 2 years ago • +2

      Yeah, this tech should evolve, because this step alone doesn’t solve the problem. Regardless, it was either going to get created to do the right thing or the wrong thing. That’s just how it works. For now its use case is positive.

  • @MatuteG • 2 years ago • +318

    I’m glad that she pushed the “who owns your phone” question and the conclusion. I applaud WSJ for pushing the exec; it felt like something other than a scripted Apple BS interview.
    Now, how do we know the pictures being provided by those associations won’t be manipulated into searching for other stuff? At the end of the day, Apple has no idea what those hashes are. Who knows what the hash provided was.

    • @bobbyright2010 • 2 years ago • +14

      @@Karantkr Multiple photo apps do this..

    • @MrSidneycarton • 2 years ago • +7

      This felt not scripted? The forced laughs, fake "searching for the right words", multiple camera angles, and after all that this felt unscripted?

    • @nixednamode3607 • 2 years ago • +5

      @@MrSidneycarton You expect a trillion-dollar company to shoot an interview with a single camera? 😒🙄 Multiple camera angles are an industry standard

    • @harish8231 • 2 years ago • +3

      Paid interview

    • @MrSidneycarton • 2 years ago • +1

      @@nixednamode3607 Not sure whether that was intended as sarcasm or not, buddy.

  • @stoneroseshero • 2 years ago • +30

    This is painful even for him to sell… my god. This is a problem. iCloud Photos is now turned off for me.

    • @bruhlollmao560 • 2 years ago • +1

      Don't jinx this for me dude, I just switched to iCloud

  • @Mickeysternum245 • 2 years ago • +657

    It seems like Apple still doesn't understand just how strange this has made their most loyal and fervent customers feel. This has the potential to really spiral out of control in PR terms, much like the 'Apple purposely slows down phones' headlines came out of the throttling-due-to-battery-age thing. This loyal base kinda sets the tone for what the sentiment around Apple is, and right now they are seething and the issue isn't going away. I think the way Craig handled this won't do anything to dampen the concerns either: condescendingly dismissing the backdoor concerns and also giving no details on how it will be expanded or just how we can guarantee Apple is limiting it to child porn. I understand Apple has a new head of PR; it's making people question just what Apple has been up to before that their slick PR glossed over. Some kind of line has been crossed here that I've never felt/seen in my 25 years of using and following Apple.

    • @ThinkyParts • 2 years ago • +53

      I feel exactly the same way. I’ve been Apple-only since I was 8… huge fan of the company… they basically bought my house… some line is being crossed here. Like maybe I’m not in love anymore…

    • @masternobody1896 • 2 years ago • +8

      Windows is in the background, nice

    • @Tential1 • 2 years ago • +21

      The Apple loyalists will always fall in line. As a person who used to think "this is surely the last straw for Apple fans," I don't doubt anymore. I buy the stock and get "rich" with the winning team.
      Public backlash needs to be HUGE to stop this. Apple hard-core fans aren't revolting against Apple. I've put my money on that.

    • @thevideoclub8562 • 2 years ago • +2

      We're not, because we don't run our lives with pitchforks and torches.

    • @stevenperea_5405 • 2 years ago • +4

      @@ThinkyParts How old are you now?

  • @vinzniv75 • 2 years ago • +486

    This explanation from Apple is even more worrying. They describe a technical solution where no one will be able to independently evaluate what content triggers the alert. Hash results will be encrypted so that no one will know what content matches what "hash of interest", on the device or on the backend. Any politically sensitive content could be part of the database without anyone ever knowing. Never trust anyone's word to keep you safe from technology abuse.

    • @user-hm7zn6bz4y • 2 years ago • +41

      The explanation is literally a lie. We don't process the images on your phone, here's the misunderstanding: oeighoihzgoiehrg hieogheorighe oriheoirhgoierhg "scanning on your phone, yes, but," eoitgoiehgeihgo
      That could be the TL;DW of the video tbh

    • @honzasedlon3309 • 2 years ago • +41

      @@user-hm7zn6bz4y It's literally not that hard to understand it

    • @tomboss9940 • 2 years ago • +24

      The alternative is, as Google and MS do, to scan the whole cloud content of all users. Apple wants to be in a position of not being able to see our data. And that's the way to protect our privacy while trying to follow the laws of the US and EU, which want more and more supervision.

    • @grimhammer00 • 2 years ago • +12

      Oh, it gets worse. At some point audits (real humans) get involved. At this level, who knows what can happen… and if anything did go afoul at Apple, how would you know? What happens when hackers find a way to inject foul hashes, or FISA requests force Apple to apply this tech for political reasons (under the guise of domestic terrorism)?… In fact, the timing is extraordinarily on point with recent updates on terrorism.

    • @vinzniv75 • 2 years ago • +13

      @@tomboss9940 The alternative is better. You, as a user, can decide whether or not your content undergoes the screening. While your data rests on your computer or phone, it remains yours and under your sole control. What Apple is doing is potentially removing that control from your hands: any data on your phone may be monitored without you even granting that right. The only thing preventing them from doing that is their good will. Technology history taught us that you should never trust anyone's word to prevent technology abuse (be it knowing or not).

  • @yooperlite • 2 years ago • +10

    It doesn’t matter what the steps in between are: if A is uploading a photo and Z is them reviewing/alerting authorities, they “review your private photos” despite the letters in between. Don’t get lost in the steps.

  • @MrTee-de7to
    @MrTee-de7to 2 ปีที่แล้ว +98

    As a longtime Apple customer (1986)I was thrilled with Tim Cook's statement about privacy and your history of resisting law enforcement and government when it comes to privacy. Now you have appointed yourself the law. And now you are going to scan my phone without my permission. At least the government has to get a warrant. Just a month ago I got rid of my Fitbit watch because Google bought the company and bought an Apple watch because of Apple's supposed commitment to privacy. You are not the government so I have no recourse if you abuse my privacy. So you can do whatever you think is right and I have no recourse. There are only two operating systems in the world and we just have to accept that Big Brother Apple is like Big Brother Google who knows what’s best for the unwashed. We have just about as much recourse as people in China.

    • @user-ry3no8mc6z
      @user-ry3no8mc6z 2 ปีที่แล้ว +5

      Or Apple wanted to avoid government parties such as FBI and CIA so long that by doing so (according to past features such as adding a feature to destroy all users data should the phone's password be typed 10times wrong) it could jeopardize the company. Donald Trump single handedly managed to give an executive order to Google to stop providing the official version of Android and it's services to Huawei and Huawei was almost ready to exit the market.
      Now imagine Apple being forced to show all users iCloud data to governments due to child pornography claims even though you do not have any. That would suck for them and the user's privacy. Apple (for now) found an in-between solution that still protects legit users data on iCloud and protects Apple from governments by giving an actual "backdoor" to them after many years (as seems by Kreg's tone).
      The only time this feature will get out of hand is only if it expands for political parties or political correctness such as someone posting an LGBTQ funny image that seems insulting in apple's eyes. Then things will not look good for Apple.

    • @milantoth6246
      @milantoth6246 2 ปีที่แล้ว +11

      An American saying they have to endure anything like the tyranny in China is just ignorant.

    • @MrTee-de7to
      @MrTee-de7to 2 ปีที่แล้ว +3

      @@milantoth6246 It was extreme. My concern is that the internet and media companies are becoming a necessity. Most businesses and utility companies assume you have internet access. The problem is that the tools you need to access the internet come from companies that can make arbitrary decisions that change your access, and you have no recourse. In reality there are only two operating systems, Apple's and Android, both from private companies.

    • @jackwilson5542
      @jackwilson5542 2 ปีที่แล้ว +2

      You can de-google Android phones though, since it is open source. Check out Rob Braxman's channel on how to do it, if privacy is so important to you.

    • @theodiscusgaming3909
      @theodiscusgaming3909 2 ปีที่แล้ว +1

      @@justinberman7386 along with Graphene and Calyx which pretty much only work on Pixels, there is also /e/OS which supports a wider range of phones.

  • @telomnisi9054
    @telomnisi9054 2 ปีที่แล้ว +16

    If this is allowed, what's stopping them from reporting your drug pics to the police? Wake up people

    • @gobi817
      @gobi817 2 ปีที่แล้ว +6

      If drugs are illegal where you live, then why not? People doing illegal activities should be reported.

    • @bradador1
      @bradador1 2 ปีที่แล้ว +1

      DRUGS ARE BAD MQWAYYYY

    • @koloqial
      @koloqial 2 ปีที่แล้ว +2

      @@gobi817 You missed the point entirely. Also, simply having a picture of drugs is not illegal.

    • @DebraJohnson
      @DebraJohnson 2 ปีที่แล้ว +1

      @@gobi817 Because you have a reasonable expectation of privacy on your personal cell phone and companies don't have the right to search and report your content to the police. They shouldn't be looking at your data beyond what is necessary to provide cell phone service. iCloud was marketed as a way to store your data, not a service to scan for and prevent illegal activity.

    • @KP3droflxp
      @KP3droflxp 2 ปีที่แล้ว

      Just don’t upload your photos to Apple then? Also, I don’t think people send well known pictures of drugs to other people. Funnily enough, if Apple has a hash for your drug photo, this proves you didn’t take it yourself.

  • @Maxyy40
    @Maxyy40 2 ปีที่แล้ว +292

    They also announced it on a Friday afternoon because they knew there would be blowback and they just wanted people to forget about it during the weekend. Well that’s not happening.

    • @pixelking_871
      @pixelking_871 2 ปีที่แล้ว +1

      And they were about to lose sales. I thought the whole phone was the cloud; I'm not understanding.

    • @fynkozari9271
      @fynkozari9271 2 ปีที่แล้ว +7

      Remember the iCloud hack in 2014? All the celebrities' pictures leaked. Yeah, Apple has some nice security there. Thank God I don't have an Apple account.

    • @MathieuLLF
      @MathieuLLF 2 ปีที่แล้ว +3

      Apple always releases negative news on a Friday afternoon

    • @thehomiedan6378
      @thehomiedan6378 2 ปีที่แล้ว +4

      @@fynkozari9271 Dude that was 2014 lol Apple has only gotten better with security since then.

    • @HeyBoss-ve9hg
      @HeyBoss-ve9hg 2 ปีที่แล้ว +1

      @Apple Genius that’s actually sad

  • @Feadds
    @Feadds 2 ปีที่แล้ว +32

    Craig practicing his “Good Morning” for Tim Cook’s Replacement 👀😂
    Reference | 1:20

    • @triple7marc
      @triple7marc 2 ปีที่แล้ว +4

      I would be happy if Craig took over for Tim.

    • @Feadds
      @Feadds 2 ปีที่แล้ว +2

      @@triple7marc Same, he's so perfect for the role. Full of life and so enthusiastic.

  • @lost-prototype
    @lost-prototype 2 ปีที่แล้ว +57

    "Who owns this phone?"
    "Well, customers do, but good luck running any software other than ours on it."
    Answers to moot questions keep average consumers misinformed.

    • @Black7308
      @Black7308 2 ปีที่แล้ว +1

      You’ve obviously never jailbroken an iPhone

    • @lost-prototype
      @lost-prototype 2 ปีที่แล้ว

      Yeah, and that's totally intentional.

  • @GJ835
    @GJ835 2 ปีที่แล้ว +50

    Still doesn’t hit on the real concerning issue

    • @mukamuka0
      @mukamuka0 2 ปีที่แล้ว +4

      Wait until China ask them to quietly scan other photo...

  • @JJs_playground
    @JJs_playground 2 ปีที่แล้ว +280

    While I applaud the CSAM implementation, the issue becomes how far reaching will this become? It's a slippery slope.

    • @tiagomaqz
      @tiagomaqz 2 ปีที่แล้ว +22

      This question cannot and should not be asked of Apple directly but of the government and the entities responsible for regulating data security. All companies have similar or identical technology, and unlike Apple they've been using it for decades now.

    • @_fisheater1027
      @_fisheater1027 2 ปีที่แล้ว +9

      Same thought. I think this is what happens when legislation cannot keep up with how fast tech develops.

    • @bhavinbijlani
      @bhavinbijlani 2 ปีที่แล้ว +2

      I believe the POSSIBILITY of future changes existed even before this announcement. We only had their word before and we only have their word now.

    • @halahmilksheikh
      @halahmilksheikh 2 ปีที่แล้ว +6

      This was built for china to spy on dissidents

    • @Tential1
      @Tential1 2 ปีที่แล้ว +2

      @@tiagomaqz other companies scan things on their cloud.
      Apple is scanning on your device AND the cloud

  • @jackoryan292
    @jackoryan292 2 ปีที่แล้ว +44

    I think that Apple has “misunderstood” that I value my privacy more than the convenience their products and services can offer me.

    • @cmtheone
      @cmtheone 2 ปีที่แล้ว +2

      They aren’t looking at your photos. The only people that should be worried about this are child predators… which may be telling of why you care so much.

    • @jackoryan292
      @jackoryan292 2 ปีที่แล้ว +13

      @@cmtheone I've worked with law enforcement to put predators in jail before. It's funny that you're too dim to see how having your privacy tampered with in the name of the greater good is concerning. Then again, you're the ideal complacent sheeple that big companies and governments want us all to be. Enjoy your ignorance, friend.

    • @jopa7696
      @jopa7696 2 ปีที่แล้ว +2

      @@jackoryan292 Switch to Samsung brother 👍 I'd recommend the S21 great phone 👍 I love my iPad but come on man switch to Samsung brother 👍

    • @KP3droflxp
      @KP3droflxp 2 ปีที่แล้ว

      @J0p4 Google has been doing this for ages, as has Microsoft. So if you're going to use them for cloud image storage, it's even worse.

    • @henrinaths1
      @henrinaths1 2 ปีที่แล้ว

      @lol
      What makes you think child predators will store their photos on their phones?
      Same idiocy as using gun registration to stop violent criminals from using guns to rob a bank.

  • @arinchk.9265
    @arinchk.9265 2 ปีที่แล้ว +3

    I appreciated that she ran this fiercely straightforward interviewing session (mostly a kind of interrogation) for the good of every Apple device user. Thank You ✌🏼

  • @ShawnAuth
    @ShawnAuth 2 ปีที่แล้ว +176

    Many of us understood exactly what this was from day one, this "talking down to" by Apple is gross. You don't control what's in the database and a government can change it from just CSAM to anything they want. Creating the backdoor is the problem.

    • @justshad937
      @justshad937 2 ปีที่แล้ว +18

      Exactly. There was never any misunderstanding

    • @bhavinbijlani
      @bhavinbijlani 2 ปีที่แล้ว +6

      I believe the POSSIBILITY of future changes existed even before this announcement. We only had their word before and we only have their word now.

    • @luisgutierrez8047
      @luisgutierrez8047 2 ปีที่แล้ว +3

      And you can just....you know not upload anything to the cloud....

    • @Theninjagecko
      @Theninjagecko 2 ปีที่แล้ว

      Exactly, those who provide the hashes can change it to look for anything.

    • @albertodlh
      @albertodlh 2 ปีที่แล้ว

      I disagree completely with calling this a "backdoor". Apple is not *entering* your phone to do anything, Apple is scanning what *you decide to send to them*. This is more of a bouncer than a backdoor.

  • @shrteng6856
    @shrteng6856 2 ปีที่แล้ว +30

    It sounds like “you are holding it wrong”

    • @JV-tk3nn
      @JV-tk3nn 2 ปีที่แล้ว +1

      I was looking for this comment.

    • @giacomonki
      @giacomonki 2 ปีที่แล้ว +1

      Me too

    • @DMINATOR
      @DMINATOR 2 ปีที่แล้ว

      Someone is old enough to remember ;)

  • @DARK_AMBIGUOUS
    @DARK_AMBIGUOUS 2 ปีที่แล้ว +5

    I don’t want my phone to use AI to scan my photos

  • @crspy1075
    @crspy1075 2 ปีที่แล้ว +26

    "how do you know this is a nude image or a rocketship?" LOL top-tier questions!

    • @baoquoc3710
      @baoquoc3710 2 ปีที่แล้ว +5

      Having a picture of Blue Origin rocket
      Iphone user: *nervous sweating*

  • @kingkang6877
    @kingkang6877 2 ปีที่แล้ว +131

    "I THINK our customers own their phones"
    What a great vote of confidence......

    • @khriskeane6800
      @khriskeane6800 2 ปีที่แล้ว +4

      for sure.

    • @tobybartlett
      @tobybartlett 2 ปีที่แล้ว +9

      I was shocked he used that language. I’m guessing Craig, Tim and anyone else giving media interviews are demanding the questions upfront.
      Then Apple legal, corp comm and marketing can train the two of them with exactly what to say that will answer SOME questions, but not enough to commit to anything that could lead up to being used in a courtroom or in Congress against them.

    • @joshuanolan4581
      @joshuanolan4581 2 ปีที่แล้ว

      Lol exactly

    • @KarstenJohansson
      @KarstenJohansson 2 ปีที่แล้ว +13

      If the customers owned their phones, they'd be able to install software from wherever they wanted to obtain it. They'd also be able to replace the battery themselves, even if it meant buying a special tool for the job.

    • @tophan5146
      @tophan5146 2 ปีที่แล้ว +3

      THINK DIFFERENT

  • @KuroiPK
    @KuroiPK 2 ปีที่แล้ว +224

    They should have done this interview from the start, and the worry about future changes still stands.

    • @Eugenepanels
      @Eugenepanels 2 ปีที่แล้ว +11

      Right? This makes the suspicion grow even more.

    • @KuroiPK
      @KuroiPK 2 ปีที่แล้ว +5

      @@Eugenepanels yeah, it's really strange how they try to underplay this change. They should have done a comprehensive press release from the start, considering how important this change is.

    • @TomorowGames
      @TomorowGames 2 ปีที่แล้ว

      Dude the whole thing was leaked before they could properly present this. That’s why it’s causing problems, because it wasn’t officially presented by Apple.

    • @KuroiPK
      @KuroiPK 2 ปีที่แล้ว

      @@TomorowGames as far as I know it wasn't leaked but released by Apple themselves via their newsroom, but I will check if I'm wrong….

    • @infinitepower6780
      @infinitepower6780 2 ปีที่แล้ว

      @@TomorowGames Yeah, exactly. It was leaked way before the proper launch, and as a result there was a ton of false information and fearmongering.

  • @NielsDutch1906
    @NielsDutch1906 2 ปีที่แล้ว +15

    I really wonder what the testing phase for the algorithm looked like.

  • @jellyd4889
    @jellyd4889 2 ปีที่แล้ว +54

    Sounds convincing. But this is still a backdoor to expand for the govt.

    • @frufrufrufru1999
      @frufrufrufru1999 2 ปีที่แล้ว +1

      Literally no, Google has been doing this for the past 10 years

    • @akhileshjayaranjan5628
      @akhileshjayaranjan5628 2 ปีที่แล้ว +1

      Hashes can be made from photos, but a hash cannot be converted back into the photo. Apple does not see your photos to generate the hash.

    • @frufrufrufru1999
      @frufrufrufru1999 2 ปีที่แล้ว

      @@akhileshjayaranjan5628 exactly

    • @noth1ngnss921
      @noth1ngnss921 2 ปีที่แล้ว +1

      @@akhileshjayaranjan5628 If Apple cannot see your photos then what's even the point of this system? Algorithms are faulty and Apple admits that if this system flags something, there will have to be a human to double-check. And that's the big problem right there: they *can* check your photos. Who is to say that they or a local/US government agency wouldn't just check every photo instead of only the ones that have been flagged?

  • @aibochan1764
    @aibochan1764 2 ปีที่แล้ว +72

    “We’re not scanning your photos, you see, we’re scanning your photos.”

    • @jeycalc6877
      @jeycalc6877 2 ปีที่แล้ว +4

      Actually it's "we aren't scanning your photos on your phone, we are scanning your entire iCloud photo library." It's even worse

    • @ohmyghost88
      @ohmyghost88 2 ปีที่แล้ว +10

      Actually they aren't scanning any files. They are creating an encrypted hash that is checked against their database of CP hashes. Hashes cannot be decoded, and the only way to identify one is to have a matching hash. Thus, the only data that is "revealed" in this process is CP data, which should be banned. This is not to say that I agree with what they are doing, or that I don't recognize what this may become as it relates to privacy, but the fundamental feature doesn't actually breach privacy unless the user uploads CP.

    • @jeycalc6877
      @jeycalc6877 2 ปีที่แล้ว +16

      @@ohmyghost88 that is literally scanning

    • @aibochan1764
      @aibochan1764 2 ปีที่แล้ว +7

      @@ohmyghost88 nice try Craig we know that’s you

    • @ohmyghost88
      @ohmyghost88 2 ปีที่แล้ว +4

      Hashing isn't scanning. The whole point of hashing is to efficiently store and retrieve data without scanning. The hash does not know the contents of the file; it just calculates a number (hash) that is used during transport to check whether errors occurred (the checksum is calculated at the source and the destination) and whether it needs to be sent again. Did you guys take a computer networking class or not?
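For what it's worth, the matching step this thread is arguing about can be sketched in a few lines. This is a minimal sketch with made-up byte strings, using ordinary SHA-256 rather than Apple's NeuralHash (a perceptual hash designed so that visually similar images produce the same value); the point it illustrates is that only the digest, never the photo itself, is compared against the list:

```python
import hashlib

def image_hash(data: bytes) -> str:
    # One-way digest: the photo cannot be reconstructed from this value.
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of known-bad hashes (stand-in values, not real CSAM hashes).
known_hashes = {image_hash(b"known-bad-image-bytes")}

def check_upload(data: bytes) -> bool:
    # Membership test on the digest; the pixels themselves are never compared.
    return image_hash(data) in known_hashes

print(check_upload(b"known-bad-image-bytes"))  # True: digest is in the list
print(check_upload(b"family-photo-bytes"))     # False: unknown digest
```

With a cryptographic hash like this, even a one-byte difference produces a miss, which is exactly why the real system uses a perceptual hash that survives resizing and recompression instead.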

  • @Tential1
    @Tential1 2 ปีที่แล้ว +72

    "journalism" = regurgitating what big companies tell you

    • @krunalcparmar1
      @krunalcparmar1 2 ปีที่แล้ว +2

      Where are Global Human Rights activists ?
      Taliban is more violent than ISIS

    • @bestintentions6089
      @bestintentions6089 2 ปีที่แล้ว +2

      Nothing to see here , move along pleb

    • @jeycalc6877
      @jeycalc6877 2 ปีที่แล้ว

      she is just like some apple activist, protecting apple at all costs

  • @Comisionado
    @Comisionado 2 ปีที่แล้ว +5

    Really good interview; he explained everything well. The point many people are missing is that cloud services already scan all the photographs you upload. Apple is just deploying a method that wouldn't scan all your photographs, instead generating hashes that are checked against the CSAM database.

  • @dukiwave
    @dukiwave 2 ปีที่แล้ว +32

    "I don't understand the backdoor characterization" what a weasel

  • @artjom01
    @artjom01 2 ปีที่แล้ว +72

    5:36 "Human moderators". So basically private iCloud content can be viewed by Apple tech support moderators

    • @harsimranbansal5355
      @harsimranbansal5355 2 ปีที่แล้ว +5

      It will probably be a highly specialized team who can do that, not anyone at apple, let alone tech support.

    • @blackhatson13
      @blackhatson13 2 ปีที่แล้ว +20

      @@harsimranbansal5355 still a violation of privacy

    • @sivamanipatnala5517
      @sivamanipatnala5517 2 ปีที่แล้ว +4

      @@blackhatson13 they are big tech companies… and they follow many rules and regulations; it isn't very simple for every employee over there to come and view our private iCloud photos…

    • @noblesse4728
      @noblesse4728 2 ปีที่แล้ว +5

      Didn't Facebook also do this with its team of "human moderators"? I'm not saying that won't invade my privacy, but without those human moderators we could be seeing terrorism, porn and other nasty things in our messages / chats

    • @tiagopaim3060
      @tiagopaim3060 2 ปีที่แล้ว

      This has always been the case

  • @kadirkaynar
    @kadirkaynar 2 ปีที่แล้ว +139

    we will still continue the ‘Misunderstood’ after this video explanation

    • @samsonsoturian6013
      @samsonsoturian6013 2 ปีที่แล้ว +2

      This is the exact moment "misunderstand" becomes "defame."

    • @TechieBaby
      @TechieBaby 2 ปีที่แล้ว +1

      @@samsonsoturian6013 NO. DON'T TOUCH MY PHONE. DON'T USE MY IPHONE'S COMPUTATIONAL POWER TO DO THE FIRST HALF OF THE WORK. NONE OF MY BUSINESS. I DON'T WANT TO BE INVOLVED.

    • @KaizenAction296
      @KaizenAction296 2 ปีที่แล้ว

      He gave a vague answer. In the future Apple is planning to scan our entire phone. People like you still don't understand and still think that Apple is god, that whatever they do is perfect. I feel bad for you, brother.

    • @samsonsoturian6013
      @samsonsoturian6013 2 ปีที่แล้ว

      @@KaizenAction296 ok, conspiritard

  • @AJsWorld
    @AJsWorld 2 ปีที่แล้ว +15

    "We've been unwilling to deploy a solution that would involve scanning all customer data"
    That's exactly what this is, Craig

    • @iAmCracky
      @iAmCracky 2 ปีที่แล้ว

      While I am against this, that is simply not true. They don't even scan images you took yourself (apparently), only images that came from another source and are then uploaded to the cloud.

  • @0xmg
    @0xmg 2 ปีที่แล้ว +4

    How do they perform the hashing? That's the interesting question here. It can definitely be abused to find other features inside the image. They store those hashes in their servers anyway.

    • @Seeeaaan
      @Seeeaaan ปีที่แล้ว

      A hash is a one-way algorithm that data, in this case the image, is passed through. This creates a unique hash value for the image which is basically impossible to replicate; even a small change in the contents will affect the hash value drastically. This hash value is then compared to the list of hash values in the CSAM database, meaning that to anyone who looks at your stuff it's nothing more than a string of random numbers
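The "small change, drastic difference" property described above is easy to demonstrate with an ordinary cryptographic hash. Note this is a property of cryptographic hashes like SHA-256; Apple's NeuralHash is perceptual and deliberately tolerant of small visual changes, so the sketch below (with hypothetical input strings) only illustrates the one-way, fixed-size nature of hashing:

```python
import hashlib

def digest(data: bytes) -> str:
    # One-way: the input bytes cannot be recovered from the hex digest.
    return hashlib.sha256(data).hexdigest()

a = digest(b"holiday photo, version 1")
b = digest(b"holiday photo, version 2")  # a single character changed

print(a == b)          # False: the two digests are completely different
print(len(a), len(b))  # 64 64: fixed-size output regardless of input size
```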

  • @denisruskin348
    @denisruskin348 2 ปีที่แล้ว +207

    The second part reminds me of that Black Mirror episode. We are getting there.

    • @monsters8730
      @monsters8730 2 ปีที่แล้ว +2

      Which one?

    • @jayseb
      @jayseb 2 ปีที่แล้ว +2

      As a parent and security expert, I get that feature. The first one is the one I'm more curious about...

    • @LuthandoMaqondo
      @LuthandoMaqondo 2 ปีที่แล้ว

      Which episode?

    • @ShubhamKumar-xu2od
      @ShubhamKumar-xu2od 2 ปีที่แล้ว +5

      @@LuthandoMaqondo Arkangel

    • @user-ls2jg7vl2h
      @user-ls2jg7vl2h 2 ปีที่แล้ว +1

      @@ShubhamKumar-xu2od Mm, nice one. It hadn't occurred to me, but agreed. This is the worry with technology: little by little, we're getting to that point

  • @kcocok
    @kcocok 2 ปีที่แล้ว +24

    Apple is like a man who comes up to a lady during her shower, saying he will keep his eyes closed and just scan for security. People just don't believe it and don't buy it. The point is not "a safe way to scan phones". The point is "DON'T scan my phone". Don't means don't

    • @jovimathews
      @jovimathews 2 ปีที่แล้ว +3

      But Apple doesn't scan your phone. I think you meant photos on the iCloud server.

    • @brendenstahl7007
      @brendenstahl7007 2 ปีที่แล้ว +1

      Connection to your phone

    • @JV-tk3nn
      @JV-tk3nn 2 ปีที่แล้ว

      I like your woman-in-the-shower analogy, but you should have elaborated more on that story. It left me wondering what happens next. When can the man open his eyes?

    • @xehP
      @xehP 2 ปีที่แล้ว

      Yeah a MAN, ofc it has to be a MAN

  • @OasisMusicOfficial
    @OasisMusicOfficial 2 ปีที่แล้ว +9

    In a practical sense, Apple at least has good intentions here. It's arguably a good thing that they are planning on tracking down phones that happen to have child abuse images on them.
    I do see why people are mad tho. Apple has a long history of keeping its customers' information secure, and this seems like a slap in the face to those who use iPhone because of its security.

    • @mildmixchintu1717
      @mildmixchintu1717 ปีที่แล้ว

      Because Apple has always been so vocal about privacy and not letting other apps track your data. Also, Apple has started to show ads on its platforms and is hiring people to create the kind of targeted ad network it opposed for so long, pushing out the whole competition. Google scans your data all the time and flags illegal stuff on Google Drive, but it's not a big deal because they never said they wouldn't do it, or that tracking personal data is a bad thing, the way Apple has.

  • @danielpetersen5948
    @danielpetersen5948 2 ปีที่แล้ว +3

    I think that lady is just angry that she can’t keep watching her cp.

  • @konrain7299
    @konrain7299 2 ปีที่แล้ว +65

    Nice of Apple to brush us off as "confused" or "misinformed". We're not "confused" we simply don't want this on iPhone.
    Joanna I wish you would've asked him how easy it would be to simply replace CSAM with other illegal/copyrighted material.
    Even though Apple said they would "deny" if governments asked them to do that, why did Apple feel the need to encrypt iMessage? After all, they can just deny, right?

    • @konrain7299
      @konrain7299 2 ปีที่แล้ว +2

      A major reason for Apple to encrypt iMessage was Privacy, not just security (aka hackers)

    • @e_8074
      @e_8074 2 ปีที่แล้ว

      This is a carefully crafted PR piece wherein Apple explains that they are, indeed, scanning your photos. Nearly all Apple customers use iCloud.

    • @tomboss9940
      @tomboss9940 2 ปีที่แล้ว

      You're right. But then, Apple is not above the law. So the idea here, that Apple hardens iCloud so that they themselves cannot read it or share it, is fine. So is the idea of flagging incriminating content and reviewing it in-house before informing the police.
      The alternative with Google, Facebook, and MS is that the NSA has full access.

  • @avieshek
    @avieshek 2 ปีที่แล้ว +151

    What timing to drop this right after the *Pegasus* affair, which is still unaddressed

    • @Gilgamesh12382
      @Gilgamesh12382 2 ปีที่แล้ว +4

      Yeah they are very similar

    • @tophan5146
      @tophan5146 2 ปีที่แล้ว +2

      What is that? Can you explain?

    • @kevinhernandezarango5005
      @kevinhernandezarango5005 2 ปีที่แล้ว

      What? Wasnt already patched?

    • @user-qi3hf8ko3q
      @user-qi3hf8ko3q 2 ปีที่แล้ว +1

      @@tophan5146 watch rene ritchie’s video about it

    • @CoffeeHead047
      @CoffeeHead047 2 ปีที่แล้ว +1

      @@kevinhernandezarango5005 never gonna be patched

  • @bj0urne
    @bj0urne 2 ปีที่แล้ว +4

    10:50 "I think our customers own their phone for sure"
    *Fights against Right to Repair* 😂

  • @goodtimes333888
    @goodtimes333888 2 ปีที่แล้ว +14

    Apple is the only company telling people this is happening. Thank you for the transparency

    • @leonardog.2491
      @leonardog.2491 2 ปีที่แล้ว +2

      You’ve got a great point man

    • @chawlee
      @chawlee 2 ปีที่แล้ว +1

      Wasn’t it leaked? Then they had to come out and explain...

  • @10defo
    @10defo 2 ปีที่แล้ว +84

    7:05 Should have dug deeper here. The reference hashes belong to child pornography today; tomorrow some state might force Apple to add additional reference hashes, e.g., of Winnie the Pooh pictures. If too many Winnie the Pooh pics get uploaded to the cloud, we have our manual verification prompt, and thus our backdoor.
    One could also try to weaken the hashes so they cover more pictures, prompting manual verification on all kinds of pictures.
    In the end, you still need to trust Apple to only check for the hashes they tell you about. Not quite the advertised "you don't have to trust a single entity".

    • @UnkleRiceYo
      @UnkleRiceYo 2 ปีที่แล้ว +4

      I'm so lost with why people are upset 🤷🏻‍♂️ So what if a manual verification prompt occurs when we have too many Winnie the Pooh pictures? Are you saying Disney could then advertise to us more or something? Like, Apple isn't gonna report you to the police for having Winnie the Pooh on your phone

    • @Chaser-mw1fb
      @Chaser-mw1fb 2 ปีที่แล้ว +19

      @@UnkleRiceYo they will if you're in China. That's the point, slick. In China it's an unwritten law not to have photos referencing their leader as Winnie the Pooh, so they arrest people who do. Apple's software could easily be rolled out to match the picture and report people in China.

    • @Dlawderek
      @Dlawderek 2 ปีที่แล้ว

      @@Chaser-mw1fb So Apple is to blame because of China’s unfair censorship laws? Also, there’s no indication whatsoever that they will be doing anything of the sort.

    • @tobiasmaier4252
      @tobiasmaier4252 2 ปีที่แล้ว +12

      @@Dlawderek I believe you don't understand the issue. A country like China could say "hey Apple, in addition to CSAM, also scan for the following images when uploading to iCloud (e.g. Hong Kong freedom activism photos)". If Apple then goes "nah, we promised our customers not to do that", China could go "do it or you're no longer allowed to sell your products in China" (a huge market that brings in a lot of revenue). It really isn't hard to understand. The problem is not what Apple is doing but the possibility of misuse.

    • @IneffablePanther
      @IneffablePanther 2 ปีที่แล้ว +1

      @@UnkleRiceYo did you bother trying to understand why this is an actual problem

  • @marks4java
    @marks4java 2 ปีที่แล้ว +284

    I appreciate the tone and balance of this interview. Nice job. My biggest problem with these features is that Apple is assuming a moral position. Let me say I am 100% aligned on these behaviors being immoral/heinous. What concerns me is simply that they are taking a moral position. What happens when next month, it’s not child porn but “hate words” in iMessage? Hate defined however Silicon Valley defines it. Applying tech to moral subjects is a very slippery slope. To suggest they can’t or won’t misuse this kind of tech in the future is just ignorant/naive.

    • @davehugstrees
      @davehugstrees 2 ปีที่แล้ว +9

      Legislation or court systems in other countries could easily add requirements to Apple’s scanning database. It’s hard to believe Apple executives could be this short-sighted about a technology. In order to save face Apple can simply say there are problems with the technology and shelve this for the time being.

    • @Dlawderek
      @Dlawderek 2 ปีที่แล้ว +13

      I think CSAM and “hate words” are not even nearly in the same league. CSAM is illegal and demonstrably dangerous. The 1st amendment protects your “hate words” so I find it hard to believe that Apple would scan or flag this content. This is a “slippery slope” logical fallacy.

    • @davehugstrees
      @davehugstrees 2 ปีที่แล้ว +6

      @@Dlawderek “Hateful content” like Nazi imagery is illegal in some European countries. What’s to stop governments from requiring to Apple to include that in the database of images they scan for?

    • @Dlawderek
      @Dlawderek 2 ปีที่แล้ว +5

      @@davehugstrees Maybe they will. If they start censoring political speech by looking through people's images and reporting them, I would be mad. This is not that. If that day comes, we can all turn off our iCloud storage and/or get rid of our Apple products. I don't think outrage is justified in a case where they are taking very cautious steps to curb the storage of CSAM on their servers. It takes 30 instances of hashcodes matching known CSAM before there is an audit. Even if some photos are flagged mistakenly (which I understand to be very rare) it would never reach 30 by mere chance. Even if it did, I would not mind someone at Apple verifying that I have no illegal images in my iCloud. There shouldn't be anything here to worry about.

    • @brkbtjunkie
      @brkbtjunkie 2 ปีที่แล้ว +9

      Anyone who says it will never be misused or increase in scope is lying to themselves.
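The 30-match audit step mentioned in this thread boils down to threshold counting. Here is a minimal sketch of that logic; the function name is hypothetical, and Apple's actual design reportedly uses threshold secret sharing so the server cryptographically learns nothing below the threshold, rather than a plain counter like this:

```python
THRESHOLD = 30  # the figure cited in the discussion above

def review_needed(match_flags):
    # Queue the account for human review only once enough
    # individual hash matches have accumulated.
    return sum(match_flags) >= THRESHOLD

print(review_needed([True] * 29 + [False] * 100))  # False: below threshold
print(review_needed([True] * 30))                  # True: audit triggered
```

The point of the threshold is that isolated false-positive matches, which any perceptual hash produces occasionally, never trigger a review on their own.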

  • @cyb3r1
    @cyb3r1 2 ปีที่แล้ว +1

    Craig Federighi looked highly uncomfortable during the interview and his explanations were a concerning mess. Thumbs up for the questions made by the reporter, they usually go much softer on them.

  • @freindimania11
    @freindimania11 ปีที่แล้ว

    Bro, what did they use to train the models?

  • @modimihir
    @modimihir 2 ปีที่แล้ว +52

    "People have misunderstood" People are not stupid, we understand what you are doing and we have a problem with it

    • @brendenstahl7007
      @brendenstahl7007 2 ปีที่แล้ว +4

      They are making it up to look at our private images 🤬

  • @DeedoDoop
    @DeedoDoop 2 ปีที่แล้ว +70

    And I thought apple cared about my privacy

    • @Michael-te6tb
      @Michael-te6tb 2 ปีที่แล้ว +4

      Lol

    • @TomNook.
      @TomNook. 2 ปีที่แล้ว +6

      And Google did no evil. Times change

    • @letrat7021
      @letrat7021 2 ปีที่แล้ว +11

      If you’re a child then your parents can and should know when you’re about to do something unsafe

    • @JConnel
      @JConnel 2 ปีที่แล้ว +8

      Just stop downloading illegal child images and you won't have anything to worry about.

    • @sivamanipatnala5517
      @sivamanipatnala5517 2 ปีที่แล้ว +5

      They still do; it isn't like Tim Apple is eating popcorn and having fun watching your private iCloud photos…

  • @thinde88
    @thinde88 2 ปีที่แล้ว +4

    I like what you did there with the rocket ship reference

  • @pjdexter168
    @pjdexter168 2 ปีที่แล้ว +3

    It sounds like a blind raid without probable cause or a warrant: they can't see exactly what you have in your house as they rummage around, but they'll check anyway. It's either private or it's not.

  • @ThinkyParts
    @ThinkyParts 2 ปีที่แล้ว +34

    From apple: “Could governments force Apple to add non-CSAM images to the hash list? -> Apple will refuse any such demands” So once again… you’re missing the point. Yes, a government could force this… but trust us. Why should we trust them? Who’s the next leadership team? Apple needs to stop this now. I’m honestly considering breaking up with them for the first time in 30 years.

    • @shahrilamirul4007
      @shahrilamirul4007 2 ปีที่แล้ว +1

      Do it, I know I am

    • @hsing-kaichen5062
      @hsing-kaichen5062 2 ปีที่แล้ว +1

      Didn’t Apple refuse to unlock an iPhone to US federal government once for a crime case?

    • @Tential1
      @Tential1 2 ปีที่แล้ว +4

      @@hsing-kaichen5062 they also gave in to China and put a data center in China for Chinese iCloud. So China just has to walk over to the iCloud data center in China and pull the physical data, and they have Chinese iPhone data. They already caved to China once. China will ask to add their own CSAM database; will you disagree then? And what's in that database? We won't know.

    • @thakillerb
      @thakillerb 2 ปีที่แล้ว

      And go where? Analog? Pick your evil…

    • @sirfabyan
      @sirfabyan 2 ปีที่แล้ว

      if you don’t trust them don’t use icloud photos then and switch to a different cloud photo service 🤷🏼‍♂️

  • @keefyboy
    @keefyboy 2 ปีที่แล้ว +30

    "a degree of analysis done on your device" So, YES, iPhone will be scanned.

    • @infinitepower6780
      @infinitepower6780 2 ปีที่แล้ว

      No, the HASHES (alphanumerical strings) of your photos will be scanned.

    • @keefyboy
      @keefyboy 2 ปีที่แล้ว

      @@infinitepower6780 If they can come into my phone to hash pics, why couldn't the government compel them to hash other files?

    • @infinitepower6780
      @infinitepower6780 2 ปีที่แล้ว +1

      @@keefyboy touché
      I guess it's just trust at this point

    • @KP3droflxp
      @KP3droflxp 2 ปีที่แล้ว

      Only when preparing the files for upload to iCloud.
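The replies above hinge on what matching hashes actually involves. Below is a minimal, hypothetical sketch of checking image hashes against a blocklist. It is not Apple's implementation: the real system uses NeuralHash, a perceptual hash that tolerates resizing and recompression, while this toy uses stdlib SHA-256, and the blocklist entry is a made-up placeholder, purely to show the matching step.

```python
import hashlib

def image_hash(data: bytes) -> str:
    # Stand-in for a real perceptual hash (e.g. Apple's NeuralHash).
    # SHA-256 changes completely if even one byte of the image changes.
    return hashlib.sha256(data).hexdigest()

# Hypothetical blocklist of hashes of known images (placeholder values).
blocklist = {image_hash(b"known-bad-image-bytes")}

def is_flagged(image: bytes) -> bool:
    # Only the hash is compared; the comparer never sees the image itself.
    return image_hash(image) in blocklist

print(is_flagged(b"known-bad-image-bytes"))  # True
print(is_flagged(b"family-vacation-photo"))  # False
```

One reason the thread's "touché" lands: nothing in this mechanism restricts which hashes go into the blocklist. That is a policy decision, not a technical one.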

  • @soundsof...
    @soundsof... 2 years ago +28

    "I think our customers own their phones, huh, for sure."
    Too bad his thinking doesn't reflect what is really happening...
    #righttorepair
    p.s. this is not an interview, merely a communication from Apple...

    • @alespic
      @alespic 2 years ago

      Pretty sure she asked questions; that makes it an interview.
      And he answered those questions. No fuss about it.

  • @MrCoffis
    @MrCoffis 2 years ago +28

    "This is not what is happening," except it is exactly what is happening.

  • @zainratnani
    @zainratnani 2 years ago +73

    7:45 What are the multiple levels of auditability? Will you seriously say “no” to China?

    • @halahmilksheikh
      @halahmilksheikh 2 years ago +20

      They've already said "yes" to China when they gave up their security keys to decrypt Chinese iCloud data. They're just going to fold again.

    • @fofoqueiro5524
      @fofoqueiro5524 2 years ago +2

      And will likely say yes to more governments

    • @mukamuka0
      @mukamuka0 2 years ago +8

      Even more than that, this whole thing probably started with China after Huawei got banned. The CCP lost its surveillance tools and turned to Apple for an answer. What else could force Apple to suddenly launch a program so at odds with its own stance?

    • @sorryi6685
      @sorryi6685 2 years ago

      They don't provide encryption for phones sold in China and Saudi Arabia

    • @NotTubeIm
      @NotTubeIm 2 years ago +3

      @@mukamuka0 lol remember Apple is an American company. If anyone is asking them to do anything it’s the CIA

  • @abhigyanchakraborty5563
    @abhigyanchakraborty5563 2 years ago +33

    This is the first time I've seen Apple so flustered in an interview.
    Smells fishy...

  • @cjc363636
    @cjc363636 2 years ago +2

    This explained the issue better than most videos/articles I've seen. And I'm still not sure where I stand. Crimes against kids need to be fought, but..... I'm still nervous about future uses of this hash-matching thing. Anyway, great interview, WSJ.

  • @mediamass1404
    @mediamass1404 2 years ago

    9:46 How much of a phone's lifespan is lost this way? How hot will their phones get?

  • @jjaramos
    @jjaramos 2 years ago +87

    I wish Joanna would have asked about the future "enhancement and expansion" of this thing, as Apple announced. Dystopian world we are about to live in.

    • @exiles_dot_tv
      @exiles_dot_tv 2 years ago +8

      Meanwhile Google has already been inhabiting that world for *years* now.

    • @jjaramos
      @jjaramos 2 years ago +2

      @@exiles_dot_tv indeed, the rest of big tech are dragging us all to that dark place.

    • @bhavinbijlani
      @bhavinbijlani 2 years ago

      I believe the POSSIBILITY of future changes existed even before this announcement. We only had their word before and we only have their word now.

    • @PedroLopezBeanEater
      @PedroLopezBeanEater 2 years ago +2

      Have you been asleep the last few decades or are you just a Microsoft/Google fan boy?

    • @jjaramos
      @jjaramos 2 years ago +4

      @@PedroLopezBeanEater I haven't and I'm not anyone's fan boy.

  • @64FanNintendo
    @64FanNintendo 2 years ago +88

    "This is not a backdoor" it literally is a backdoor

    • @justshad937
      @justshad937 2 years ago +1

      I burst out laughing when he said that

    • @LordCoeCoe
      @LordCoeCoe 2 years ago +1

      Literally anything is a backdoor. If Google wanted, they could know and see everything about you. They can; they might already do that.

    • @BKAndre
      @BKAndre 2 years ago +3

      It really isn't. And that's OK; criticism can still be made without calling it something it's not.

    • @jaklgs7190
      @jaklgs7190 2 years ago +5

      How tf is comparing a photo’s checksum a backdoor? Apple already does this to verify the integrity of files during the upload process. Comparing this hash to a database to comply with federal law literally means nothing.

    • @twjackson94
      @twjackson94 2 years ago +3

      @@jaklgs7190 They have to have access to the unencrypted data to run the checksum, which means they are accessing and processing your unencrypted data.

  • @parkerrichardson9058
    @parkerrichardson9058 2 years ago

    So, just to check I'm getting this right: Apple isn't scanning our whole iCloud, only the photos we upload now that this update is out?

  • @RickLaBanca
    @RickLaBanca 2 years ago +2

    This is so Apple: the condescending attitude, and they can't even do an ad hoc interview. It's a two-camera shoot with a PowerPoint pseudo-interview.

  • @jimbo-dev
    @jimbo-dev 2 years ago +61

    They actually cut away to the Apple campus over any line Craig said that could be used to make memes! That means there had to be an agreement for this interview. I wonder if it included other limitations as well, since the interviewer didn't pressure Apple that much. This feels more like Apple marketing than journalism.

    • @koloqial
      @koloqial 2 years ago +5

      I thought the same. I noticed how this was cut too... like, why did they have alternate camera angles for a meeting that took place over FaceTime?

    • @JackieWelles
      @JackieWelles 2 years ago +3

      I mean, what did you expect from Apple? They are one of the strictest companies and absolutely love controlling the narrative.

    • @gabrielgarcia7554
      @gabrielgarcia7554 2 years ago +2

      Given that this is an exclusive, this is most likely a way for Apple to take control of the situation. Most companies will only agree to these types of interviews if only certain questions are asked to control the narrative.

    • @roiqk
      @roiqk 2 years ago +3

      Literally everything you see in news these days is just propaganda. Journalism is dead

    • @gabrielnunez3371
      @gabrielnunez3371 2 years ago +1

      Time to make some memes with a huge Apple watermark out of pure spite

  • @posgaming5143
    @posgaming5143 2 years ago +54

    I do like Apple's attempt at blocking all the internet creeps, but it comes at the cost of everyone's privacy. That's a big ask.

    • @manowar2816
      @manowar2816 2 years ago +10

      Hey Siri tell me a joke. Apple is privacy

    • @frappes_
      @frappes_ 2 years ago +11

      Did you watch the video? Or did you just not understand how the system works? Yes, Apple scans iCloud images for child porn, but the on-device scans stay on the device, and Apple doesn't know what photos you have unless the neural hashes match the CSAM database when you upload to iCloud, and even then only after further levels of review.

    • @cisbaovuwel3394
      @cisbaovuwel3394 2 years ago

      @@frappes_ What if the neural network matches photos of social activists and uploads them to the CIA “accidentally”?

    • @cisbaovuwel3394
      @cisbaovuwel3394 2 years ago +1

      @@frappes_ And remember, PRISM was done in the name of anti-terrorism.

    • @venkat2005
      @venkat2005 2 years ago +4

      They could provide an option for these types of features.
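The flow described a few replies up (hashing on device, matching only at iCloud upload time, human review only past a match threshold) can be sketched as below. This is a deliberately simplified model: the real system wraps results in encrypted "safety vouchers" using threshold secret sharing, which is omitted here, the hash is a stdlib stand-in for NeuralHash, and the threshold of roughly 30 is the figure Apple cited publicly.

```python
import hashlib

REVIEW_THRESHOLD = 30  # roughly the figure Apple cited for triggering review

def neural_hash(image: bytes) -> str:
    # Stand-in for Apple's perceptual NeuralHash; SHA-256 is used here
    # only because it is in the standard library.
    return hashlib.sha256(image).hexdigest()

def matches_on_upload(uploads, blocklist) -> int:
    # Hashing happens on-device, but only for photos headed to iCloud.
    return sum(1 for img in uploads if neural_hash(img) in blocklist)

def escalate_for_human_review(uploads, blocklist) -> bool:
    # Below the threshold, matches stay cryptographically unreadable to the
    # server (the safety-voucher scheme, omitted in this sketch).
    return matches_on_upload(uploads, blocklist) >= REVIEW_THRESHOLD
```

The design choice the threshold represents: no single match (or false positive) exposes anything; only an accumulation of matches unlocks review.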

  • @phonepup06
    @phonepup06 2 years ago +2

    Thank you for covering this in a much more educated way than other media does.

  • @DoctorPrepperMD
    @DoctorPrepperMD 2 years ago +1

    So he is saying data you upload to Apple is theirs to scan if they want.

  • @Tential1
    @Tential1 2 years ago +40

    "we wish you would accept us scanning your devices and breaking your privacy"
    Better do some free advertising on major networks by doing an "interview"

    • @frappes_
      @frappes_ 2 years ago +6

      Did you watch the video?

    • @agiverreviga4592
      @agiverreviga4592 2 years ago +5

      @@frappes_ They didn't; most of the comments come from people who haven't understood how this works and how it differs from just scanning your images, which all other cloud companies already do.

    • @frappes_
      @frappes_ 2 years ago +1

      @@agiverreviga4592 People will really dunk on Apple just because they're Apple, huh. Don't get me wrong, Apple does some pretty nasty stuff, but Jesus, this CSAM detection technology is not hard to grasp. And worse, they're fine with the alternatives, who literally scan your images as-is, without the neural hashes!

    • @onlyswedishmeatballs1677
      @onlyswedishmeatballs1677 2 years ago +1

      Yep, someone who clearly knows absolutely nothing about technology.
      I'm sure Google is completely private and the CIA isn't spying on its citizens. OK boomer, you keep using Google.

  • @revtane9
    @revtane9 2 years ago +38

    I am a developer; I agree with Craig on this tech and the specific software implementation, but I am worried about what privacy should really mean.

    • @datahearth1738
      @datahearth1738 2 years ago +7

      I'm also a developer, but I kind of disagree with him. Providing any information that can identify your data (which in turn can identify you through another process) is quite scary. It's not a backdoor YET, but it will surely become one when someone finds access to it (and we aren't even talking about using it). This is very innovative, but also very dangerous (as a developer or a client) because of the potential of this kind of technology. On the bright side, it's awesome tech that could help improve other research fields :)

    • @revtane9
      @revtane9 2 years ago +1

      @@datahearth1738 Apple should really implement true zero trust. The feature is OK, but it's a matter of who they trust. As a developer, I would never touch a field this sensitive when it comes to users' data.

    • @datahearth1738
      @datahearth1738 2 years ago +1

      @@revtane9 Yup, right.

  • @YarGnawh
    @YarGnawh 2 years ago +5

    I wonder which “rocket ship” she was referring to…

  • @matematicasendosminutos6360
    @matematicasendosminutos6360 2 years ago +4

    If the code really respects privacy, why not make it open source for review?

  • @38Unkown
    @38Unkown 2 years ago +13

    Another part of this issue is the question of who owns the content, regardless of where it is stored. If the police need a warrant to search a safe-deposit box at a bank, shouldn't Apple need a warrant before searching photos? It comes down to fiduciary duty and trust. If someone is purposely posting items to a public location, by all means search away. But when photos are privately stored in the cloud, it feels very invasive.

    • @tobybartlett
      @tobybartlett 2 years ago +2

      I’d love to see what the new TOS are for iCloud once Apple implements this.

    • @crusherman2001
      @crusherman2001 2 years ago +2

      You're not privately storing them, though. You're storing them on Apple's iCloud servers, where Apple becomes responsible for any content you have on there.

    • @38Unkown
      @38Unkown 2 years ago +1

      @@crusherman2001 And that is the issue. When I store paper files in a safe-deposit box, (1) the bank can't nose through my stuff and (2) the bank has the responsibility of keeping my files secure. I still own the documents. For all Apple's talk of privacy, this could be manipulated into something very Big Brother...

    • @DebraJohnson
      @DebraJohnson 2 years ago

      @@crusherman2001 If you have a reasonable expectation of privacy, they can't just go through your images to report them to the police. For example, if you pay to store your physical items at a storage place, they can't go through and search your stuff and report it to the police. Now, if they have a reason to think you are doing something illegal (smell of weed coming out, for example), they can report it to the police who will then need reasonable suspicion or a warrant to search your stuff. This proactively searching and reporting people to the authorities is not only a terrible invasion of privacy, but it's one that could create legal issues for innocent users.

    • @tomboss9940
      @tomboss9940 2 years ago

      With Dropbox, Google, and MS, this is happening now. Apple wants to safeguard iCloud; that's why they came up with this (complex) solution so they don't have to look at all your photos. The plan is to encrypt all parts of iCloud in a way that Apple cannot read.
      This solution is a counter-offer to the intrusive US and EU laws in the works for "child protection" (a pretext for sniffing through all our cloud data and communications).

  • @MESHvlogs
    @MESHvlogs 2 years ago +9

    It takes 20 years to build a reputation, yet only a few days to crumble its very foundation. Thanks, Apple. SMH.

    • @baraboolive2611
      @baraboolive2611 2 years ago

      You didn't realize that long ago?

  • @maxmccann5323
    @maxmccann5323 2 years ago +1

    I haven't used FaceTime, but those shots of the stream on the computer look amazing; it doesn't look like there's any lag at all.

    • @gxlorp
      @gxlorp 2 years ago

      I like your comment about cyber security.

    • @maxmccann5323
      @maxmccann5323 2 years ago

      @@gxlorp thanks, don't usually put two comments on a video but here we are I guess.

    • @maxmccann5323
      @maxmccann5323 2 years ago

      @@gxlorp I know you're being sarcastic but I actually have two comments so here's my other comment which is about cyber security:
      "Vatican confirms they have just bought a new installment of untraceable, Linux phones for all members"

  • @vaibhavsingh1888
    @vaibhavsingh1888 2 years ago

    How do I turn on the feature?

  • @prateekyadav9956
    @prateekyadav9956 2 years ago +14

    Privacy was one of the only reasons to use Apple, but not anymore.

  • @knowledgeispower36
    @knowledgeispower36 2 years ago +30

    That's why I like Craig: very clear, very respectful. I'm still using external storage, though.

    • @Tential1
      @Tential1 2 years ago +4

      And that's why companies hire likeable people to do their pr campaigns. And it worked. Hence why I own the stock. Apple could kill your kids, and you'd still buy the phones.

  • @Eyedbythetiger
    @Eyedbythetiger 2 years ago +1

    He says it's not scanning anyone's phone, but it's doing exactly that, and he's using technical speak to obfuscate the message.

  • @markoconnell804
    @markoconnell804 2 years ago

    The on device messaging protection is amazing!

  • @akshaypatel6720
    @akshaypatel6720 2 years ago +17

    So, 7:05: the senior VP of software at Apple really doesn't understand what a backdoor is?

  • @Mickeysternum245
    @Mickeysternum245 2 years ago +52

    Craig is being trotted out like Colin Powell was about the Iraq war

  • @RickLaBanca
    @RickLaBanca 2 years ago +33

    “no no no you dummies don’t understand how this works.”
    We do, which is why we don’t want it.

  • @pontifexinstitute
    @pontifexinstitute 2 years ago +1

    This will not work if you upload to iCloud through Cryptomator first... And of course, you encrypt your DNS and add a VPN...
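The idea behind the Cryptomator suggestion, encrypting files client-side before they ever reach the cloud, can be illustrated with a toy cipher: once the bytes are encrypted under a key only the user holds, any server-side hash comparison sees unrelated data. The XOR keystream below is a deliberately simplistic stand-in, not a real cipher like the AES Cryptomator uses; nothing here is a recommendation to roll your own crypto.

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    # Toy keystream: SHA-256 in counter mode. Illustration only; use a
    # vetted cipher (e.g. AES-GCM) for anything real.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_crypt(key: bytes, data: bytes) -> bytes:
    # XOR is its own inverse, so the same call encrypts and decrypts.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

photo = b"some-photo-bytes"
ciphertext = xor_crypt(b"user-held-secret", photo)

# The ciphertext's hash bears no relation to the photo's hash, so hash
# matching against the uploaded blob cannot recognize the content.
print(hashlib.sha256(photo).digest() == hashlib.sha256(ciphertext).digest())  # False
```

Note the caveat: this only defeats server-side matching of the uploaded blob; whether on-device scanning sees the photo library before encryption is a separate question.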