Comments
  Stratechery · Ben Thompson · 8/9/21 · 13 min
    13 reads · 6 comments
    • jeff · 2 years ago

      Great title and great analysis. This really does seem like a terrible mistake. I don't know how anyone involved with developing and deploying this technology at Apple could feel confident enough to roll this out at the scale they're operating at.

      When the numbers of users, devices, and photos are big enough, there will be false positives, and all it takes is an accusation to ruin someone's life or career. Meanwhile, anyone who's actually producing or consuming CSAM now knows not to use an Apple device with iCloud syncing turned on. This seems like a cover-your-ass move on Apple's part more than anything.

      • thorgalle · 2 years ago

        Yep. If they didn't have some kind of human review to rule out false positives, this could be disastrous. But the article says that they check for "image hashes" of known CSAM images. So I'm hoping that this doesn't proactively scan for anything that might be CSAM, but rather sticks to a smaller set of known, 100%-certain abusive material.
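
        A minimal sketch of what matching "known image hashes" means, assuming a plain cryptographic hash and a placeholder blocklist (Apple's real system uses its own perceptual NeuralHash, not SHA-256):

          import hashlib

          # Hypothetical blocklist of hashes of known abusive images (placeholder value).
          KNOWN_IMAGE_HASHES = {
              "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
          }

          def matches_known_image(path: str) -> bool:
              # Hash the file's raw bytes and check membership in the blocklist.
              with open(path, "rb") as f:
                  digest = hashlib.sha256(f.read()).hexdigest()
              return digest in KNOWN_IMAGE_HASHES

        In a scheme like this, only exact copies of already-catalogued images would match; nothing here classifies new photos as abusive.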

        • justinzealand · 2 years ago

          The hash algorithm is a cat-and-mouse game. By making subtle tweaks, offenders change the hash of a photo, so there are numerous variants of what would look to an end user like the same photo. Some naive offenders may get caught, but sophisticated power users already know how to get around these scanning methods. Still, a good start overall. And even if Apple didn’t nail it, I am still in favor of efforts to thwart this nightmarish world of pedophiles. The Internet has amplified this market hundreds of times over compared to the era before digital sharing. It’s a horrible and growing problem that needs active solutions. Glad to see Apple stepping up.
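
          To make the cat-and-mouse point concrete, here is a toy illustration with SHA-256 over stand-in bytes (not whatever Apple actually uses): flipping a single bit yields a completely unrelated digest, so an exact-hash blocklist no longer recognizes the tweaked copy.

            import hashlib

            original = b"...stand-in for a photo's raw bytes..."
            # A single flipped bit -- the kind of trivial tweak an offender might make.
            tweaked = bytes([original[0] ^ 1]) + original[1:]

            print(hashlib.sha256(original).hexdigest())
            print(hashlib.sha256(tweaked).hexdigest())
            # The two digests share no structure, so the altered file looks brand new
            # to any system that only compares exact hashes.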

          • jeff · 2 years ago

            I don't think it was mentioned in this article, but in Apple's technical summary they state that they are using a form of perceptual hashing in order to match images that have been transcoded or resized. Still, that only changes the cat-and-mouse game. Looser, feature-based matching opens up the possibility of abuse by deliberately crafting non-CSAM images that produce a hash collision and a false positive, which is part of the reason I'm wary of this implementation.
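
            For a sense of how looser, feature-based matching works, here is a toy average-hash sketch using Pillow (file names are placeholders, and Apple's NeuralHash is far more sophisticated): perceptually similar images land within a small Hamming distance of each other, which survives resizing and re-encoding but is also what makes engineered collisions conceivable.

              from PIL import Image  # Pillow

              def average_hash(path: str, size: int = 8) -> int:
                  # Shrink to an 8x8 grayscale thumbnail, then set one bit per pixel
                  # depending on whether it is brighter than the mean.
                  img = Image.open(path).convert("L").resize((size, size))
                  pixels = list(img.getdata())
                  mean = sum(pixels) / len(pixels)
                  bits = 0
                  for p in pixels:
                      bits = (bits << 1) | (1 if p > mean else 0)
                  return bits

              def hamming_distance(a: int, b: int) -> int:
                  return bin(a ^ b).count("1")

              # Two images "match" if their 64-bit hashes differ in only a few bits.
              if hamming_distance(average_hash("photo_a.jpg"), average_hash("photo_b.jpg")) <= 5:
                  print("likely the same underlying image")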

    • thorgalle · 2 years ago

      A decision worth evaluating. I find it hard to believe that this is the first backdoor that was ever installed by Apple in an iPhone. And generally, I'd agree with Ben's suggested approach of "scan the cloud & leave the devices private".

      It’s truly disappointing that Apple got so hung up on its particular vision of privacy that it ended up betraying the fulcrum of user control: being able to trust that your device is truly yours.

      Weird last statement. I can see dimensions where Apple has (had) an advantage here (privacy), but from many other perspectives, Apple stubbornly limiting what you can do with your device (like installing custom software) makes it less yours.

      • Florian · 2 years ago

      Wow this is super interesting. Technical enough to be detailed but not so technical that I wouldn’t get it 😅