• azertyfun@sh.itjust.works
    7 months ago

    There is almost certainly internal communication that basically reads “hey let’s get an actress who sounds as close to ScarJo as possible”. There’s also the CEO tweeting “her” on the day of release.

    Is that legal? IANAL, but OpenAI’s reaction of immediately shutting that shit down leads me to believe they realized it is, in fact, illegal.

    Your comparison is also incorrect. You’re not getting a JEJ soundalike, you’re getting a JEJ soundalike to do a Darth Vader impersonation. Meaningfully different semantics. They don’t just want “white American woman who vaguely sounds like ScarJo, I guess”; they have proven beyond doubt that they want “the AI from the 2013 movie Her starring Joaquin Phoenix and Scarlett Johansson”.


    Also, legality aside, it’s really fucking weird and ethically wrong. I don’t care if it’s legal or not, you shouldn’t be able to make an AI replicate someone’s voice without their consent.

    • Resonosity@lemmy.world
      7 months ago

      OpenAI’s actions could just as easily be explained by them seeking to protect their image as much as possible, knowing that if they let the voice stay, the bad PR would only grow.

      Even if there is no connection to ScarJo in this case, it’s still in OpenAI’s interest to appease the public for the sake of their reputation.

      • azertyfun@sh.itjust.works
        7 months ago

        There is without a doubt a connection to ScarJo. They asked her to voice the AI, they asked her again right before release, and the CEO tweeted “her” on release.

        The only question is whether, backlash aside, they could technically get away with it (which does not make it right).