• Grimy@lemmy.world · 7 months ago

    Just like Scarlett doesn’t own all voices that sound mildly like her, Spike Jonze doesn’t own the concept of an AI companion.

    I’m not really sure what your point is; there’s nothing to rip off. No matter what they make it sound like, there’s going to be similarities with the movie. There’s nothing wrong with leaning into these for advertising purposes.

    • Monument@lemmy.sdf.org · 7 months ago

      No matter what they make it sound like, there’s going to be similarities with the movie.

      I don’t follow.
      They literally disabled the ‘Sky’ voice Sunday night and now users can’t pick a voice that sounds like the character from Her.
      And, mind you, this is not a ‘huh, they sorta sound the same’ situation; this is a ‘they sound very similar, and have the same personality’ situation. On top of that, Sam Altman is on the record talking about being obsessed with the movie Her - which is circumstantial. What isn’t circumstantial is that they literally referenced the movie’s name in their marketing: Sam tweeted a vague hint, and his colleagues confirmed it. It’s not speculative.

      There’s nothing wrong with leaning into these for advertising purposes.

      Actually, intellectual property theft is either morally wrong or merely technically illegal, depending on where you stand on copyright, but it’s wrong either way. Then there’s trying to mislead the public into thinking that GPT-4o was endorsed in some way by those involved in the Her movie. A false endorsement is also illegal. So - wrong there, too.
      I’m sure an actual lawyer could find more wrong with it, but just those two things are actual, literal crimes.

      • Grimy@lemmy.world · 7 months ago

        I’m saying practically any voice with the associated bubbly flirty personality is going to make you think of the movie Her in such a context.

        Sure, they leaned into it for advertising purposes, but a tweet referencing it and showcasing the one voice out of the five that sounds like her isn’t crossing the line imo.

        I think it’s a slippery slope to say any AI assistant that has a similar timbre and personality as an AI in a movie is off limits.

        As long as they don’t infringe by calling it “Scarjo” or saying “From the movie Her”, I don’t see a problem.

        • Monument@lemmy.sdf.org · 7 months ago

          I’m saying practically any voice with the associated bubbly flirty personality is going to make you think of the movie Her in such a context.

          I don’t know about you, but even a flirty Joaquin Phoenix voice would never make me think of Her.

          But if they’d had a “voice actor” do a spot on impression of Paul Bettany, complete with the little pauses and other flourishes of his portrayals of Vision, I’d think they ripped off the character.

          I think you and I differ there.
          As best as I can figure, you’re stuck on the flimsy excuse from Altman that they hired a voice actor. I see a line of events that points to OpenAI/Altman making a conscious effort to glom onto the movie Her, and specifically the Samantha character, to drum up interest, create viral buzz to further enrich themselves (without compensating anyone involved in the movie), and to add a veneer of credibility to a fading trend.

          Slight turn.
          In another life I was a photographer, and one of the things photographers do not mess around with is model releases. Any person who appears in a photo you distribute absolutely must have a legal agreement in place. Using someone’s likeness for commercial purposes without consent and/or compensation will get you fucked in triplicate.

          There’s also the moral part of it. Artists know that you don’t rip off artists. Inspired by, sure. But there’s a line, and you don’t cross it. It’s as simple as that.

          Okay, and finally, this is based less on facts I know, and more feelings I have about the situation -
          It’s fucking creepy, dude.

          Okay, so the movie Her - an entire society becomes obsessed with their AI companions and falls in love with them, causing tremendous grief and trauma. And that’s, like, what they’re going to such lengths to brand-but-not-brand this latest version with? What kind of fucked up things are going on in their heads over there?
          It doesn’t make sense.

          The rundown, again. (Sorry, I like to establish context. Yay neurodivergence.)
          Altman is on record saying that Her is his favorite movie and a major inspiration to him - one of the reasons he got into this field. He spent 9 months trying to convince Johansson to work with him on this and lend her name/voice to this latest iteration of ChatGPT. He was so focused on getting her to lend her name to this that he was still asking her to join in just a few days before the announcement, which was, like, the 13th. And that’s after she’d already turned him down, so he was just ignoring her boundaries and trying to pressure her…
          On the 11th - two days before the announcement - Altman did a Reddit AMA (which he was doing as part of the 4o press junket) and said that he’d like to open up ChatGPT for personal NSFW usage.

          I mean, everyone is focusing on Her, but we probably should also be thinking about the Lucy Liu Futurama episode, because… well, I’m just going to say it. I think he already fucked the robot. The line of events from A to B is transparent and fucking gross.
          Not the whole fucking a robot thing - people got needs - but that the likeness is obviously stolen from a non-consenting actress that I’m beginning to believe he’s obsessed with.

          So… yeah… I have all the problems with this. I view concerns over the usage of her voice as immaterial next to the usage of the character, and I see an inherent difference between an LLM mimicking a random voice that happens to sound like someone, and this situation, where the voice was clearly created to represent the character, and by extension, the actor who played her. I don’t think there’s a slippery slope here. Most judges are fairly smart, and will be able to articulate something I (not a lawyer) took as a given from the outset.