Software engineer Vishnu Mohandas decided he would quit Google in more ways than one when he learned that the tech giant had briefly helped the US military develop AI to study drone footage. In 2020 he left his job working on Google Assistant and also stopped backing up all of his images to Google Photos. He feared that his content could be used to train AI systems, even if they weren’t specifically tied to the Pentagon project. “I don’t control any of the future outcomes that this will enable,” Mohandas thought. “So now, shouldn’t I be more responsible?”

The site (TheySeeYourPhotos) returns what Google Vision is able to discern from a photo. You can test it with any image you want, or use one of the sample images provided.
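The site’s exact pipeline isn’t published, but a minimal sketch of the kind of query it is presumably making, assuming the official google-cloud-vision Python client and a hypothetical local file `example.jpg`, would look roughly like this (label and face detection are two of the annotation types the API exposes):

```python
# Sketch: ask Google Cloud Vision what it can discern from a photo.
# Assumes the google-cloud-vision package is installed and credentials
# are available via GOOGLE_APPLICATION_CREDENTIALS. The file path and
# printed fields are illustrative, not the site's actual implementation.
from google.cloud import vision


def describe_photo(path: str) -> None:
    client = vision.ImageAnnotatorClient()
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())

    # Labels: general objects/concepts the model believes are in the picture.
    labels = client.label_detection(image=image).label_annotations
    for label in labels:
        print(f"label: {label.description} ({label.score:.0%})")

    # Faces: detected people, which the API annotates with inferred attributes.
    faces = client.face_detection(image=image).face_annotations
    print(f"faces detected: {len(faces)}")


if __name__ == "__main__":
    describe_photo("example.jpg")
```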

  • Naich@lemmings.world · 20 days ago

    Don’t mind me, I’m just poisoning it with AI shit that it thinks is real.

      • rustyricotta@lemmy.ml · 19 days ago

        I think it’s pretty likely that online LLMs keep user inputs for training of future versions/models. Though it probably gets filtered for obvious stuff like this.

        • General_Effort@lemmy.world · 19 days ago

          They say they don’t. It would be very bad publicity if they did, and possibly a breach of contract or other legal trouble.