• kraxyk@lemmy.world · 1 year ago

    All big tech analyzes our data. I'd rather they not analyze anything, but since we'll never see that day, they can at least use their privacy-invading power for good.

    • Kelsenellenelvial@lemmy.ca · 1 year ago

      The thing is that while many companies have access to your data in various services, Apple has designed their systems so that they can't access most user data. It can't be both ways: your data is either private or it isn't, and many would prefer it stay private.

      As I understand it, the actual situation with iCloud and CSAM scanning is that Apple does scan iCloud Photos (the ones users choose to upload to iCloud) when it can. A few years ago they tried to design a privacy-focused version of that scanning that would let them detect and report that kind of content while preserving the user's privacy. It was supposed to happen on device (while most companies only scan photos on their servers), before the photos were uploaded, and use hashes to compare user photos against known CSAM material.

      This seemed odd at the time, but a while later Apple released end-to-end encryption for iCloud Photos, which means they can no longer scan the uploaded photos because they no longer have that access. Some theorize that the big tech companies have regular contact with government/law enforcement/etc. agencies, and that the on-device scanning was negotiated as a response to Apple's plans to add E2E encryption to iCloud Photos, among other previously less secure services.
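      The hash-matching flow described above can be sketched in a few lines. To be clear, this is a toy illustration only: Apple's proposed design used a perceptual hash ("NeuralHash") plus a private set intersection protocol so the device never learns the blocklist, whereas this sketch uses a plain SHA-256 exact match, and the hash database here is entirely hypothetical.

```python
import hashlib

# Hypothetical database of hashes of known material (illustration only).
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

def should_flag_before_upload(image_bytes: bytes) -> bool:
    """Hash the photo on device and compare against the known-hash set.

    The point of the on-device design: the comparison happens before
    upload, so matching does not require the server to see the photo.
    """
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_BAD_HASHES

print(should_flag_before_upload(b"known-bad-image-bytes"))  # True
print(should_flag_before_upload(b"family-vacation-photo"))  # False
```

      Note that an exact hash like SHA-256 breaks on any re-encoding of the image, which is exactly why the real proposal needed a perceptual hash instead.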

      • scurry@lemmy.world
        link
        fedilink
        arrow-up
        7
        ·
        1 year ago

        Some nits: Apple could access many classes of data stored in iCloud by default (including any photos), even now, but you can now make almost every class end-to-end encrypted if you explicitly choose to. Previously, and by default now, it's Apple's policy and internal controls over the keys your data is encrypted with that protect that data, not the encryption itself (though you can opt in to having the encryption itself protect you from Apple). From what I understand, Apple is only known to regularly scan iCloud mailboxes, and the on-device scanning was never implemented. Outside of nits: considering the delay between the proposed scanning and the offering of a wider E2EE program for iCloud, I doubt the two are actually related.

      • WhatAmLemmy@lemmy.world · 1 year ago

        There’s also no way to validate that Apple’s E2EE operates as stated. They could have added a backdoor for themselves or for “intelligence” agencies, and we would have no way of knowing beyond “trust us”. Even if the source code were ever leaked (or a backdoor exploited by hackers), it could be written with plausible deniability, in such a way that it could be interpreted as unintentional (a bug/error).
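        As a concrete (entirely hypothetical, toy) example of plausible deniability: a one-character "off-by-one" in nonce derivation can silently break an otherwise sound-looking stream cipher, and a code reviewer could easily read it as an innocent mistake. The cipher below is a throwaway SHA-256-based construction invented for this sketch, not any real product's code.

```python
import hashlib

def keystream(key: bytes, nonce: int, length: int) -> bytes:
    """Derive a keystream from key + nonce (toy stream cipher, illustration only)."""
    out = b""
    counter = 0
    while len(out) < length:
        block = key + nonce.to_bytes(8, "big") + counter.to_bytes(8, "big")
        out += hashlib.sha256(block).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, nonce: int, plaintext: bytes) -> bytes:
    ks = keystream(key, nonce, len(plaintext))
    return bytes(p ^ k for p, k in zip(plaintext, ks))

# The deniable "bug": modulo 1 instead of some large modulus, so every
# message reuses nonce 0. Looks like a typo; breaks the cipher completely.
def buggy_nonce(message_counter: int) -> int:
    return message_counter % 1  # always 0

key = b"secret-key"
c1 = encrypt(key, buggy_nonce(1), b"attack at dawn!")
c2 = encrypt(key, buggy_nonce(2), b"retreat at dusk")

# With a reused nonce, XORing two ciphertexts cancels the keystream:
# an eavesdropper recovers p1 XOR p2 without ever touching the key.
leak = bytes(a ^ b for a, b in zip(c1, c2))
print(leak == bytes(a ^ b for a, b in zip(b"attack at dawn!", b"retreat at dusk")))  # True
```

        Spotting this in a review requires knowing the cryptographic consequence, not just reading the code, which is what makes such "bugs" deniable.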

        This is why you should never trust closed-source code with your sensitive data; encrypt it yourself using open-source, widely used, audited tools before uploading it to someone else's computer.
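        One way to do this, sketched with the widely available `openssl` CLI (`gpg` or `age` would work just as well; the filenames and passphrase here are placeholders):

```shell
# Encrypt locally before upload: the passphrase never leaves your machine,
# and the cloud provider only ever sees ciphertext.
echo "sensitive data" > secrets.txt
openssl enc -aes-256-cbc -pbkdf2 -salt \
    -in secrets.txt -out secrets.txt.enc -pass pass:correct-horse

# ...upload only secrets.txt.enc; keep secrets.txt and the passphrase local...

# Later, decrypt locally and verify the round trip.
openssl enc -d -aes-256-cbc -pbkdf2 \
    -in secrets.txt.enc -out roundtrip.txt -pass pass:correct-horse
diff secrets.txt roundtrip.txt && echo "roundtrip OK"
```

        In practice you'd use a key file or a prompt rather than `-pass pass:...` on the command line, since arguments can show up in the process list and shell history.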

        • bob_wiley@lemmy.world · 1 year ago

          The FBI has complained publicly about iMessage encryption many times. I view that as a good sign.

      • Zummy@lemmy.world · 1 year ago

        This is exactly what Apple wanted to do, and lots of people (myself included) were against it because it would involve Apple scanning data on your phone. Sure, it only happened at the point where you decided to upload photos to the cloud, but it was still unacceptable to scan our phones for data that hadn't been uploaded yet.

    • bob_wiley@lemmy.world · 1 year ago

      This is what end-to-end encryption is for. A properly designed system shouldn’t allow anyone but the user to see the data, not even those who created the system.
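      That trust boundary can be modeled in a few lines. This is a toy sketch with an invented XOR cipher, purely to show where the key lives; a real E2EE system would use vetted AEAD constructions and real key exchange.

```python
import os
import hashlib

class Server:
    """The provider: stores opaque blobs, is never given a key."""
    def __init__(self):
        self.blobs = {}
    def upload(self, user: str, blob: bytes) -> None:
        self.blobs[user] = blob
    def download(self, user: str) -> bytes:
        return self.blobs[user]

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """Toy symmetric cipher: XOR against a SHA-256-derived keystream."""
    ks = hashlib.sha256(key).digest() * (len(data) // 32 + 1)
    return bytes(d ^ k for d, k in zip(data, ks))

server = Server()
device_key = os.urandom(32)  # generated on the device; never uploaded

# Only ciphertext crosses the wire; the server cannot decrypt it,
# because the key exists solely on the user's device.
server.upload("alice", xor_cipher(device_key, b"private note"))

# Only the device, holding the key, can recover the plaintext.
print(xor_cipher(device_key, server.download("alice")))  # b'private note'
```

      The design point is exactly the one above: even the people who wrote and run the `Server` code hold nothing that lets them read `alice`'s note.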

      In your model you need to question who decides what “good” is. Protecting children is always the way they try to get public support, as it is seen as a universal good. Once the practices have been established, what else is “good” and who decides?

      Suppression of political messages the company doesn't agree with? That's something Verizon has done, and T-Mobile seems to still be doing, with SMS. How about scanning for copyrighted material? I can see it now: a 5th grader getting a summons over a picture of a painting they used in a book report. How about parents getting flagged for taking pictures of their babies? Or a woman who needs an abortion but lives in a state where it's not allowed… someone might think it's "good" to stop that from happening and arrest her for trying.

      Searching someone’s phone or storage is no different than searching through their home. We have protections against random and unwarranted searches for a reason.