Meta made its Llama 2 AI model open-source because ‘Zuck has balls,’ a former top Facebook engineer says

Meta CEO Mark Zuckerberg took a big risk by making its powerful AI model Llama 2 mostly open source, according to Replit CEO Amjad Masad.

  • Womble@lemmy.world · 1 year ago

    It probably has more to do with the fuck-up of accidentally letting the original model’s weights leak as a torrent, allowing them to spread across the internet. It doesn’t really take “balls” to open source it after that and make it look intentional. Still, it’s good that they did, rather than trying to use legal intimidation against anyone who used the leaked models.

    • ijeff@lemdro.id · 1 year ago

      It was already available for non-commercial use. The difference was that you had to submit a form and it was a slow roll-out.

      • Womble@lemmy.world · 1 year ago

        IIRC you could apply as a credentialed academic researcher but not as a member of the general public, but I could be wrong about that.

        • ijeff@lemdro.id · 1 year ago

          They didn’t check credentials, but it was indeed for research purposes. Folks were getting access if they said they were a student or researcher.

    • just_another_person@lemmy.world · 1 year ago

      The model, weights, and pre-trained data sets are. The training tools are not. You could argue that it’s not “truly FOSS” without the tools to create that data, but technically, the article is correct.

      • BeefPiano@lemmy.world · 1 year ago

        The whole point of open source is to be able to recreate it yourself so you can make changes. This is freeware. Free-as-in-beer, not free-as-in-speech. Hell, with freeware I can at least use it for commercial purposes; this isn’t even as free as that.

        • just_another_person@lemmy.world · 1 year ago

          In the AI world it’s a bit different. You can do whatever you want with the model and weights data which will net you the functional part of the resulting product. Train, retrain, dissect, segment…etc. They’re just not giving out the source for the actual engine. The people working with such things really only care about the data, and in most cases, would probably convert it to a different engine anyway.
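A toy sketch of that point (purely illustrative: a one-parameter numpy “model,” not Meta’s actual code or the Llama architecture). You can keep training released weights on your own data without ever seeing the original training set or training tools:

```python
import numpy as np

# You receive only the trained weights, not the original training
# data or pipeline. Pretend this came from someone else's run of a
# linear model y = w * x.
released_w = np.array([2.0])

# Your own new data, unrelated to whatever the model was trained on.
# The true relation in this data is y = 3x.
x_new = np.array([1.0, 2.0, 3.0])
y_new = np.array([3.0, 6.0, 9.0])

def fine_tune(w, x, y, lr=0.05, steps=200):
    """Continue training with plain gradient descent on squared error."""
    w = w.copy()
    for _ in range(steps):
        pred = w * x
        grad = 2 * np.mean((pred - y) * x)  # d/dw of mean squared error
        w -= lr * grad
    return w

tuned_w = fine_tune(released_w, x_new, y_new)
print(round(float(tuned_w[0]), 2))  # converges toward 3.0
```

The released weights are just a starting point; continuing training, dissecting, or converting them needs no access to the original training code.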

          • BeefPiano@lemmy.world · 1 year ago

            Can I remake the model only including Creative Commons sourced training material?

            • just_another_person@lemmy.world · 1 year ago

              You can reuse the data however you want, yes. You just can’t do it with their proprietary model. So, again, the ENGINE is not open source (the thing that drives their released version), but you can do whatever you want with the model and data as released.

                • just_another_person@lemmy.world · 1 year ago

                  Nope. Free for educational, research, or commercial use. I’m sure their license has some restrictions on what that actually means once you become competitive with the original as a product, but otherwise it’s free unless you start a massive enterprise based on it, at which point you probably wouldn’t use it anyway. It’s just an LLM; it’s not doing anything super special like folding proteins for drug development or curing cancer.

        • smileyhead · 1 year ago

          Calling ML models “open source” is already confused, because they are not programs but data formats; they don’t come with a 1:1 source.

          You can obtain a model and train it further, similar to how you can get a JPEG file with a permissive license, edit it, and share it. Having the GIMP/Photoshop project the image was created from is helpful but not necessary.

          • Dym Sohin@lemmy.world · 1 year ago

            Here’s the core difference: the nature of AI models is generative, but all layers in a .PSD file are inherently static.

            A better analogy would be the rendering of a fractal: a limited subset of infinite possibilities, but to explore the rest of them you need both the rules and the data.
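To make the fractal analogy concrete (a minimal, hypothetical sketch, not anything from the article): a rendered image is one fixed view, but the generating rule plus any coordinate of your choosing can produce views that no single static render contains.

```python
def mandelbrot_escape(c, max_iter=50):
    """Return iterations before |z| exceeds 2 under z -> z*z + c."""
    z = 0j
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return n
    return max_iter

# The "rule" (the function above) plus new "data" (any point in the
# complex plane) lets you explore any region on demand.
inside = mandelbrot_escape(0 + 0j)   # 0 is in the set: never escapes
outside = mandelbrot_escape(2 + 2j)  # far outside: escapes immediately
print(inside, outside)
```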

    • bionicjoey@lemmy.ca · 1 year ago

      That’s 1 of the 2 lies in the headline. The other appears in quotation marks.

      • Treczoks@lemmy.world · 1 year ago

        On the other hand, this could mean “this software is totally crap, and the users will shoot us down from all sides, and he has the balls to publish it anyway, even if we get sued into kingdom come.”

  • ratzki · 1 year ago

    It didn’t work out too badly for Google to make Android open source.

  • BetaDoggo_@lemmy.world · 1 year ago

    It was probably just to get all of the free development work the community has done. There are multiple engines designed around optimizing llama models specifically such as llama.cpp and exllama, and many other projects built around the architecture.

    Facebook’s research division also has a pretty consistent track record of releasing things to the public rather than letting their research models rot.

  • AutoTL;DR@lemmings.world (bot) · 1 year ago

    This is the best summary I could come up with:


    The AI community has embraced the opportunity, giving Meta CEO Mark Zuckerberg his next potentially huge platform.

    This wouldn’t have happened unless Zuckerberg was willing to take a big risk on Llama 2 possibly being used for nefarious purposes, according to a former top Facebook engineer.

    “It takes a certain amount of guts to release an open-source language model, especially with political heat that Meta’s getting from that,” said Amjad Masad during a recent episode of the No Priors podcast.

    Before that, he spent almost 3 years at Facebook where he helped create React Native and other popular software development tools.

    During the No Priors podcast, Masad said he’s been surprised that Meta is the only major tech company so far to go the open-source route for AI models.

    He compared this to Facebook’s Open Compute project, which designed data center hardware and made that available for anyone to use and contribute to.


    The original article contains 512 words, the summary contains 153 words. Saved 70%. I’m a bot and I’m open source!

  • scarabic@lemmy.world · 1 year ago

    If the main risk is the model being used for something nefarious, like teaching terrorists how to make weapons, then can we PLEASE stop calling Zuck “bold” for doing it?

    Not caring about moral consequences is not bold. It’s reckless and uncaring. Sure, the jackass who built his startup on it now lives up Zuck’s ass in thanks, but the rest of us should call it what it is.

    • Echo Dot@feddit.uk · 1 year ago

      I always dislike this argument. AI isn’t magically going to give people capabilities they otherwise wouldn’t have. Everything it can do is just automating tasks that humans could already do.

      If you want to know how to make bombs, there are literally articles online for it; you don’t need an AI. If you needed AI for that, there wouldn’t already be terrorists.