• rimjob_rainer

    The former would be hilarious; it would mean that iOS explicitly classified those images as nudes.

    • StaySquared@lemmy.world

      Indeed. But Apple does have the tech to analyze images/videos:

      Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups.

        • KillingTimeItself@lemmy.dbzer0.com

          ok so probably not. CSAM detection, specifically the modern kind that MS does (PhotoDNA), is based on image hashes: law enforcement and child-safety orgs collect the known images and build hash sets from them, then distribute those hashes to tech companies, who compare them against the hashes of photos already on their platforms. If a match comes back, ladies and gentlemen, we got 'em.
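          A rough sketch of that hash-set matching flow in Python, just to show the shape of it. The hash set, the `photos` directory, and the use of plain SHA-256 are all made up for illustration; real systems (PhotoDNA, Apple's NeuralHash) use perceptual hashes so re-encoded or slightly altered copies still match, and the hash lists come from NCMEC, not from anything you'd compute yourself.

          ```python
          import hashlib
          from pathlib import Path

          # Hypothetical hash set distributed by a clearinghouse (e.g. NCMEC).
          # Real deployments use perceptual hashes, not cryptographic ones,
          # so near-duplicate images still produce a match.
          KNOWN_HASHES = {
              "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
          }

          def file_hash(path: Path) -> str:
              """Return the SHA-256 hex digest of a file's bytes."""
              h = hashlib.sha256()
              with path.open("rb") as f:
                  for chunk in iter(lambda: f.read(8192), b""):
                      h.update(chunk)
              return h.hexdigest()

          def scan_library(photo_dir: Path) -> list[Path]:
              """Hash every photo and collect the ones found in the known set."""
              return [p for p in photo_dir.glob("*.jpg") if file_hash(p) in KNOWN_HASHES]

          if __name__ == "__main__":
              matches = scan_library(Path("photos"))
              print(f"{len(matches)} matches against the known-hash set")
          ```

          The point is that the provider never "looks at" the picture in any semantic sense; it only checks whether a photo's fingerprint is already on the distributed list, which is why this says nothing about detecting new or unlisted images.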