shared via https://feddit.de/post/2805371

Each of these reads like an extremely horny and angry man yelling their basest desires at Pornhub’s search function.

  • I see this in a lot of news regarding AI, but these tools don’t generate pictures of anyone. They generate pictures that maybe look like someone, but they can’t undress you.

    If you make an AI generate “Billie Eilish boobs”, you’re not seeing a picture of her real boobs. You’re seeing a reproduction of her face on top of a reproduction of some boobs.

    These tools aren’t x-ray goggles, they’re the automated equivalent of the village creep cutting out celebrity faces to paste them onto pin-ups. We’ve had the same moral panic about Photoshop and I’m sure we’ll see the same thing happening again with whatever image manipulation technology comes next.

    We need to educate everyone, especially the elderly, that deep fakes exist. We can’t stop deep fakes; even trying is futile.

    In terms of blackmail, I think this actually provides an opportunity. Ex leaked some nudes? Deep fake. Hacker broke into your phone? Deep fake. Anyone can generate nudes of anyone else from just a few pictures on their gaming PC, and modern models don’t actually fuck up the hands like people often claim they do. This even works on interactive videos. If we can get that message across, we can pretty much end the effectiveness of sexual blackmail.