• deegeese@sopuli.xyz

    Call me crazy, but I don’t think corporations should be in the business of scanning everyone’s private data on behalf of “the authorities”.

    Too many ways it can go very wrong.

    • dylanmorgan@slrpnk.net

      Agreed. Children have been exploited for much longer than there has been photography, much less iPhones.

    • Astroturfed@lemmy.world

      Also, I feel like there’s probably not much of that on Apple’s servers… Wouldn’t that kind of cloud service be the last place you’d want to put illegal pictures? If I were trying to hide felony pics or data, I wouldn’t trust a large corporation’s cloud services.

  • dingleberry

    Remove locks from all your houses. Locks allow child abuse to go on without being reported.

    • conciselyverbose@kbin.social

      Why stop there? Walls allow kids to be hurt too. Why not mandate that every human, at their own expense, provides 100% video coverage of their property at all times, on penalty of automatic child endangerment charges.

      And obviously monitoring that isn’t free, so we’ll send you the bill shortly.

  • afk_strats@lemmy.blahaj.zone

    This title is misleading clickbait for an article advocating intrusive data scanning, which, by the way, cannot be completely automated.

    Here’s a snippet of the iCloud TOS which specifically forbids CSA on iCloud.

    You agree that you will NOT use the Service to:

    a. upload, download, post, email, transmit, store, share, import or otherwise make available any Content that is unlawful, harassing, threatening, harmful, tortious, defamatory, libelous, abusive, violent, obscene, vulgar, invasive of another’s privacy…

    Further down, the same TOS specifically calls out that such content may be identified or removed by Apple.

    Again, not defending Apple, but I’d rather not have them or an army of underpaid contractors search through people’s pictures as a form of corporate law enforcement, because “think of the children”. This is a systemic problem which can be addressed without invading EVERYONE’s privacy.

  • Zummy@lemmy.world

    No, people don’t want this. To be clear, I AM NOT saying that people don’t want an end to CSAM. I would say most people want that. What I am saying is that people don’t want someone to spy on their data to stop it. Spying on everyone to catch the few sickos out there is not cool. Apple protects privacy, and that’s what I want.

  • blazera@kbin.social

    Illegal drugs are transported on the interstate. The Department of Transportation should be held accountable for not searching every vehicle.

  • Eggyhead@artemis.camp

    Heat Initiative is launching a multi-million dollar campaign in which it presses Apple on this issue.

    I can’t help but wonder which generous benefactors are providing Heat these multi-millions of dollars to pressure Apple into compromising privacy?

    • dingleberry

      Are a few million dollars productive against a trillion dollar company?

      • Eggyhead@artemis.camp

        It’s not a trillion dollar company they’re looking to spend a few million dollars against, just their stance on your privacy.

  • Nogami@lemmy.world

    child safety advocacy group Heat Initiative wants corrupt governments to be able to scan your personal device for: evidence of sexual orientation, firearms ownership, anti-government speech, political affiliation, messages from any sort of sexual affair, oh yes, and CSAM.

    Please think of the children! lol.

    • AwesomeSteve@lemmy.world

      This is a great opportunity to identify the people and organizations that advocate for the government to scan your private data, a violation of your privacy and rights.

      It is very obvious they either have very limited wisdom or are acting solely to serve a certain party’s interest, i.e. the authority that wants your data without any hassle.

      Another possibility is that they are simply clueless. For anyone who’s interested:

      • Christian Brothers Investment
      • Degroof Petercam, a Belgian investment firm

      Christian Brothers Investment Services describes itself as a “Catholic, socially responsible investment management firm.” The proposal is believed to play a role in Heat Initiative’s advertising campaign as well. Simultaneously, the New York Times is also now reporting that Degroof Petercam, a Belgian investment firm, will also back the resolution.

      Maybe it is time to do some research into these two entities.

  • TigrisMorte@kbin.social

    If that is true, provide the evidence to the FBI and they’ll arrest people. That isn’t what is done, because they have no evidence and breaking encryption is the real goal.

  • kraxyk@lemmy.world

    All big tech analyzes our data. I’d rather they not analyze anything, but since we’ll never see that day, they can at least use their privacy-invading power for good.

    • Kelsenellenelvial@lemmy.ca

      The thing is that while many companies have access to your data in various services, Apple has designed their systems such that they can’t access most user data. It can’t be both ways: your data is either private or not, and many would prefer it stay private.

      As I understand it, the actual situation with iCloud and CSAM scanning is that Apple does scan iCloud Photos (the ones that users choose to upload to iCloud) when they can. A few years ago they tried to design a privacy-focused version of that scanning that would let them detect that kind of content for reporting purposes while preserving the user’s privacy. It was supposed to happen on device (most companies only scan photos on their servers), before the photos were uploaded, and use hashes to compare user photos against known CSAM material.

      This seemed odd at the time, but a while later Apple released end-to-end encryption for iCloud Photos, which means they can’t scan the uploaded photos anymore because they no longer have that access. Some have a theory that the big tech companies are in regular contact with various government/law enforcement/etc. agencies, and that the on-device scanning was negotiated as a response to Apple’s plans to add E2E encryption to iCloud Photos, among other previously less secure services.
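
      The hash-matching idea above can be sketched in a few lines. To be clear, this is a simplification: Apple’s proposed design used a perceptual hash (“NeuralHash”) plus cryptographic blinding, not a plain digest, and every value and function name below is made up for illustration.

```python
import hashlib

# Hypothetical database of known-bad image hashes (illustrative values only).
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-image").hexdigest(),
}

def should_flag(photo_bytes: bytes) -> bool:
    """Hash a photo before upload and check it against the known-hash set."""
    digest = hashlib.sha256(photo_bytes).hexdigest()
    return digest in KNOWN_HASHES

# An ordinary photo passes through unflagged; a database match is flagged.
assert should_flag(b"my-vacation-photo") is False
assert should_flag(b"example-known-image") is True
```

      The privacy debate is exactly about where this check runs: on Apple’s servers (they see your photos) or on your device (your phone reports on you).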

      • scurry@lemmy.world

        Some nits: Apple could access many classes of data stored in iCloud by default (including any photos), even now, but you can make almost every class end-to-end encrypted if you explicitly choose to. Previously, and by default now, it’s Apple’s policy and internal controls over the keys your data is encrypted with that protect that data, not the encryption itself (though you can opt in to the encryption itself protecting you from Apple). From what I understand, Apple is only known to regularly scan iCloud mailboxes, and the on-device scanning was never implemented. Outside of nits: considering the delay between the proposed scanning and the offering of a wider E2EE program for iCloud, I doubt the two are actually related myself.

      • WhatAmLemmy@lemmy.world

        There’s also no way to validate that Apple’s E2EE operates as stated. They could have added a backdoor for themselves or “intelligence” agencies, and we have no way of knowing other than “trust us”. Even if the source code is ever leaked (or a backdoor exploited by hackers), it could be written with plausible deniability — in such a way that it could be interpreted as unintentional (a bug/error).

        This is why you should never trust closed source code with your sensitive data, and encrypt it yourself using open source, widespread/trusted, audited tools before uploading it to someone else’s computer.
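
        The “encrypt it yourself before uploading” principle is easy to illustrate. The sketch below uses a toy XOR one-time pad purely for demonstration; in practice use an audited tool (e.g. age, GPG, or a vetted library), never a hand-rolled scheme like this one.

```python
import secrets

def xor_pad(data: bytes, key: bytes) -> bytes:
    """Toy XOR one-time pad: applying it twice with the same key decrypts."""
    assert len(key) >= len(data)  # a one-time pad key must cover the data
    return bytes(d ^ k for d, k in zip(data, key))

plaintext = b"private photo bytes"
key = secrets.token_bytes(len(plaintext))  # the key never leaves your device
ciphertext = xor_pad(plaintext, key)       # this is all the cloud ever sees

# Only the key holder can recover the original data.
assert xor_pad(ciphertext, key) == plaintext
```

        The point is architectural: if encryption happens on your machine with your key, the provider stores only ciphertext and has nothing meaningful to scan or hand over.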

      • Zummy@lemmy.world

        This is exactly what Apple wanted to do, and lots of people (myself included) were against it because it would involve Apple scanning data on your phone. Sure, it only happened at the point of deciding to upload photos to the cloud, but it was still unacceptable to scan our phones for data that hadn’t been uploaded yet.