A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10 if users send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nude images of celebrities, including images of minors in swimsuits. This user is particularly notable, however, because they plainly and openly demonstrate one of the most severe harms of generative AI tools: easily creating nonconsensual pornography of ordinary people.

  • afraid_of_zombies@lemmy.world
    8 months ago

It’s stuff like this that makes me against copyright laws. To me it is clear and obvious that you own your own image, and it is far less obvious that a company can own an image that someone drew multiple decades ago and that everyone can identify. And yet one is protected and the other isn’t.

    What the hell do you own if not yourself? How come a corporation has more rights than we do?

    • Kairos@lemmy.today
      8 months ago

      This stuff should be defamation, full stop. There would need to be a law specifically saying you can’t sign it away, though.