• dumbass · 5 months ago

    Is it just nudes or is it all old photos?

    • rimjob_rainer · 5 months ago

      The former would be hilarious; it would mean that iOS explicitly classified those images as nudes.

      • StaySquared · 5 months ago

        Indeed. But Apple does have the tech to analyze images/videos:

        “Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups.”

        • answersplease77 · 5 months ago

          Which means they exported this task to some Indians overseas, fuck, which is just worse.

          • KillingTimeItself · 5 months ago

            OK, so probably not. CSAM detection, specifically the modern kind that MS does, is based on image hashes. Law enforcement collects and creates the hash sets for these images and distributes them to tech companies, who can then compare them against hashes of existing photos, and if a match returns, ladies and gentlemen, we got 'em.
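
            A minimal sketch of that hash-matching flow in Python, assuming a hypothetical known_hashes set; real systems use perceptual hashes such as Microsoft's PhotoDNA rather than plain SHA-256, so treat this as an illustration only:

            ```python
            import hashlib
            from pathlib import Path

            # Hypothetical hash set distributed to providers; real deployments
            # use perceptual hashes (e.g. PhotoDNA), not cryptographic digests.
            KNOWN_HASHES = {
                "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
            }

            def sha256_of(path):
                # Stream the file in chunks so large photos aren't loaded into memory.
                h = hashlib.sha256()
                with open(path, "rb") as f:
                    for chunk in iter(lambda: f.read(8192), b""):
                        h.update(chunk)
                return h.hexdigest()

            def scan_library(photo_dir):
                # Report any photo whose digest appears in the distributed set.
                return [p for p in Path(photo_dir).glob("**/*.jpg")
                        if sha256_of(p) in KNOWN_HASHES]
            ```

            A cryptographic hash only matches byte-identical files, which is why real systems use perceptual hashing that survives resizing and recompression.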

        • Dojan · 5 months ago

          It’s using hashes, no?