While Meta’s license makes Llama 2 free for many, it’s still a limited license that doesn’t meet all the requirements of the Open Source Initiative (OSI). As outlined in the OSI’s Open Source Definition, open source is more than just sharing some code or research. To be truly open source, a license must allow free redistribution, provide access to the source code, permit modifications, and not be tied to a specific product. Meta’s restrictions include requiring a license fee from any developer with more than 700 million daily users and prohibiting other models from training on Llama. IEEE Spectrum reported that researchers from Radboud University in the Netherlands argued that Meta’s description of Llama 2 as open source “is misleading,” and social media posts have questioned how Meta could claim it as open source.

One of Meta’s biggest open-source initiatives is PyTorch, a machine learning framework used to develop generative AI models. The company released PyTorch to the open source community in 2016, and outside developers have been iterating on it ever since. Pineau hopes to foster the same excitement around Meta’s generative AI models, particularly since PyTorch “has improved so much” since being open-sourced.
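
For readers unfamiliar with what PyTorch actually provides, here is a minimal, illustrative sketch of defining and running a small neural network with it. The module name and layer sizes are arbitrary choices for demonstration, not anything taken from Meta’s models.

    import torch
    from torch import nn

    # A tiny illustrative network: two linear layers with a ReLU in between.
    class TinyNet(nn.Module):
        def __init__(self):
            super().__init__()
            self.layers = nn.Sequential(
                nn.Linear(16, 32),  # sizes are arbitrary, for demonstration only
                nn.ReLU(),
                nn.Linear(32, 4),
            )

        def forward(self, x):
            return self.layers(x)

    model = TinyNet()
    x = torch.randn(8, 16)     # a batch of 8 random 16-dimensional inputs
    print(model(x).shape)      # torch.Size([8, 4])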

The industry’s open source players tend to be smaller developers like Stability AI and EleutherAI, which have found some success in the commercial space. Open source developers regularly release new LLMs on code repositories like Hugging Face and GitHub. Falcon, an open-source LLM from the Abu Dhabi-based Technology Innovation Institute, has also grown in popularity and is rivaling both Llama 2 and GPT-4.
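
In practice, models released this way are usually pulled straight from Hugging Face with the transformers library. The snippet below is a rough sketch of that workflow; it assumes the Falcon checkpoint is published under the repository id "tiiuae/falcon-7b" and that the machine has enough memory to load a model of that size.

    # Sketch: loading an openly released LLM from Hugging Face.
    # Assumes the repository id "tiiuae/falcon-7b" and sufficient memory;
    # depending on the transformers version, the Falcon repo may also
    # require trust_remote_code=True.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "tiiuae/falcon-7b"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)

    prompt = "Open source licensing for LLMs is"
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=20)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))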

Pineau says current licensing schemes were not built to work with software that takes in vast amounts of outside data, as many generative AI services do. Most licenses, both open source and proprietary, give limited liability to users and developers and very limited indemnity against copyright infringement claims. But Pineau says AI models like Llama 2 are built on far more training data and expose users to potentially greater liability if they produce something considered infringing. The current crop of software licenses does not cover that eventuality.

  • 48954246@lemmy.world

Facebook doesn’t do anything that doesn’t make Facebook money, so excuse my skepticism.

    • folkrav@lemmy.ca

React? I do seem to remember they tried to change the licensing at one point, however…

      • duncesplayed@lemmy.one

        They didn’t “try”: they did change the licence. From BSD+Patents to MIT. Hardly scandalous.

        • folkrav@lemmy.ca

We were both kind of right, actually. The initial 2013 release was Apache 2.0, they moved to BSD+Patents by 2014, then relicensed to MIT in 2017.

    • duncesplayed@lemmy.one

Facebook is a top 10 contributor to Linux. They are major developers for Btrfs and BPF and have contributed to a number of other kernel subsystems, too. Jens Axboe alone is a huge force in Linux.

      Outside of Linux, they’ve created some pretty big open source projects, like React and Go Ent.

      Honestly, they’ve open sourced almost everything they’ve ever done except for Facebook itself, and are one of the largest open source companies in the world.

  • RobotToaster@mander.xyz

The article also mistakenly claims Stability AI is open source, when its moralistic licence violates the OSD.

  • AutoTL;DR@lemmings.world

    This is the best summary I could come up with:


    In July, Meta’s Fundamental AI Research (FAIR) center released its large language model Llama 2 relatively openly and for free, a stark contrast to its biggest competitors.

    People in the industry have begun looking at the limitations of some open-source licenses for LLMs in the commercial space, while some are arguing that pure and true open source is a philosophical debate at best and something developers don’t care about as much.

    Stefano Maffulli, executive director of OSI, tells The Verge that the group understands that current OSI-approved licenses may fall short of certain needs of AI models.

    “We definitely have to rethink licenses in a way that addresses the real limitations of copyright and permissions in AI models while keeping many of the tenets of the open source community,” Maffulli says.

    A recent report from Stanford, for instance, showed none of the top companies with AI models talk enough about the potential risks and how reliably accountable they are if something goes wrong.

    Acknowledging potential risks and providing avenues for feedback isn’t necessarily a standard part of open source discussions — but it should be a norm for anyone creating an AI model.


    The original article contains 1,002 words, the summary contains 193 words. Saved 81%. I’m a bot and I’m open source!