• photonic_sorcerer@lemmy.dbzer0.com · edited · 1 year ago

    He was found to be extorting young girls with nude pics he generated of them.

    Edit: So I guess he only generated them. In that case, how did they become public? This is the problem with not reading the article, I guess.

    • Missjdub@lemmy.world · 1 year ago

      Earlier this month, police in Spain launched an investigation after images of underage girls were altered with AI to remove their clothing and sent around town. In one case, a boy had tried to extort one of the girls using a manipulated image of her naked, the girl’s mother told the television channel Canal Extremadura.

      That was a different case, in Spain, not the guy in Korea. The person in Korea didn’t distribute the images.

      • Mango@lemmy.world · 1 year ago

        I really gotta wonder what the difference is between prosecuting someone for their thoughts and prosecuting them for jerking it to their own artwork/generative whatever that they kept entirely to themselves. The only harm I see here is someone having their privacy invaded by someone bigger than them and being put on display for it. Sound familiar?

      • Lowlee Kun@feddit.de · 1 year ago

        Because that was a different case. Extortion and blackmail are already illegal (and in this case it would count as production of CP, just as it would if you drew from a real child). In this case we simply don’t have enough information.