• deadcade@lemmy.deadca.de · 9 months ago

    Please use up-to-date sources. (Disclaimer: Apple has announced and cancelled this “feature” enough times that I’m not 100% sure whether it’s currently in iOS, but I’m certain enough not to trust any Apple device with any photos.)

    The hashing algorithm they used (NeuralHash) had manually craftable hash collisions. Apple did state they would switch to a different hashing algorithm, but it likely has similar flaws. Collisions let anyone get your iPhone at least partially flagged and have your photos sent to Apple for “human verification”, and knowing how the algorithm works also lets people circumvent whatever detection Apple uses. A toy sketch of the collision attack is below.
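    To make that concrete: perceptual hashes are deliberately tolerant of small pixel changes, and that same tolerance is what makes collisions craftable. Here’s a minimal Python sketch of the idea. `toy_phash` is a stand-in hash I made up for illustration; it is not NeuralHash or any real Apple code, and the attack loop is the generic idea, not Apple’s specifics.

    ```python
    import numpy as np

    def toy_phash(img: np.ndarray) -> np.ndarray:
        """Toy perceptual hash of a 64x64 grayscale image: one bit per 8x8
        block, set if the block is brighter than mid-gray. Real perceptual
        hashes are fancier, but share the key property that small pixel
        changes barely move the hash."""
        blocks = img.reshape(8, 8, 8, 8).mean(axis=(1, 3))  # 8x8 grid of block means
        return (blocks > 127.5).astype(np.uint8).ravel()    # 64-bit hash

    def craft_collision(source: np.ndarray, target_hash: np.ndarray) -> np.ndarray:
        """Nudge `source` until toy_phash(source) == target_hash: brighten
        or darken each mismatched 8x8 block a little at a time. The result
        still looks essentially like `source`, but hashes like the target."""
        img = source.astype(np.float64).copy()
        for _ in range(10_000):
            mismatched = np.flatnonzero(toy_phash(img) != target_hash)
            if mismatched.size == 0:
                break  # full collision achieved
            b = int(mismatched[0])
            by, bx = divmod(b, 8)
            block = img[by * 8:(by + 1) * 8, bx * 8:(bx + 1) * 8]  # view into img
            block += 2.0 if target_hash[b] == 1 else -2.0
            np.clip(block, 0, 255, out=block)
        return img

    rng = np.random.default_rng(0)
    innocuous = rng.uniform(0, 255, (64, 64))          # "harmless" image
    target = toy_phash(rng.uniform(0, 255, (64, 64)))  # hash of some flagged image
    crafted = craft_collision(innocuous, target)
    print((toy_phash(crafted) == target).all())        # True: hashes collide
    ```

    Send someone an image crafted like this and their device gets flagged for content they never had; run the same loop in reverse (perturb real material until its hash changes) and detection is evaded.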

    No iPhone is going to ship with a list of hashes of every piece of illegal material, which means the hash of every image you view has to be sent to Apple for matching. Even if you trust them not to run any other tracking or telemetry on your iPhone, this alone lets them identify who viewed any given image, as long as they have a copy of that image themselves. That is a very powerful surveillance tool, and it can be used to censor nearly anything.
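    To see why, here’s a deliberately simplified sketch of the server side. Everything here is hypothetical and my own naming; it’s just the data structure that falls out of “clients report the hash of every image they view”. (SHA-256 stands in for a perceptual hash; the tracking logic is identical either way.)

    ```python
    import hashlib
    from collections import defaultdict

    # hash -> set of device IDs that reported viewing an image with that hash
    reports: dict[str, set[str]] = defaultdict(set)

    def device_reports_image(device_id: str, image_bytes: bytes) -> None:
        """What a hypothetical client-side scanner would do for each image viewed."""
        digest = hashlib.sha256(image_bytes).hexdigest()
        reports[digest].add(device_id)

    def who_viewed(image_bytes: bytes) -> set[str]:
        """The surveillance query: needs only a copy of the image itself."""
        return reports[hashlib.sha256(image_bytes).hexdigest()]

    # Example: some image circulates; three devices "view" it and report.
    flyer = b"...whatever image the operator wants to trace..."
    for dev in ("alice-phone", "bob-phone", "carol-phone"):
        device_reports_image(dev, flyer)

    print(who_viewed(flyer))  # {'alice-phone', 'bob-phone', 'carol-phone'}
    ```

    The point is `who_viewed`: the operator never needs your photos, only the hashes you already sent plus their own copy of whatever image they want to trace, whether that’s actual CSAM, a leaked document, or a protest flyer.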

    • soren446@lemmy.world · edited · 9 months ago

      It’s not the same thing. That feature detects whether a photo containing nudity is being sent to or from a minor.

      The second article is out of date; that’s not being implemented. I don’t necessarily trust smartphones in general, but please do tell: which smartphone do you trust with your private information?