• knotthatone@lemmy.one
    1 year ago

    So if someone builds an atom-perfect artificial brain from scratch, sticks it in a body, and shows it around the world, should we expect the creator to pay licensing fees to the owners of everything it looks at?

    That’s unrelated to an LLM. An LLM is not a synthetic human brain. It’s a computer program that uses statistical parameters derived from large amounts of training data to generate outputs from prompts.

    If we get real general-purpose AI some day in the future, then we’ll need to answer those sorts of questions. But that’s not what we have today.

    • teawrecks@sopuli.xyz
      1 year ago

      The discussion is about law surrounding AI, not LLMs specifically. No we don’t have an AGI today (that we know of), but assuming we will, we will probably still have the laws we write today. So regardless of when it happens, we should be discussing and writing laws today under the assumption it will eventually happen.

        • teawrecks@sopuli.xyz
          1 year ago

          I’m saying the discussion is about AI law. You can’t responsibly have a discussion about laws governing LLMs without considering how those laws would affect future, sufficiently advanced AI.