• nothacking
    1 year ago

    The issue with that is that LLMs tend to lie when they don't know something. The best tools for that are Stack Overflow, Lemmy, Matrix, etc.

    • CoderKat@lemm.ee
      1 year ago

      Yeah, and they don’t just lie. They lie extremely convincingly. They’re very confident. If you ask them to write code, they can make up nonexistent libraries.

      In theory, it may even be possible to use this as an attack vector. You could repeatedly ask an AI to generate code and, whenever it hallucinates a package name, claim that name for yourself on the package registry with a malicious package. Then you just wait for some future victim to get the same hallucination and install it.
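
      One cheap defense against the attack described above: before installing anything an LLM suggests, check whether the imports in its generated code actually resolve. This is a minimal sketch in Python (the function name and the fake package name are made up for illustration); it only checks the local environment, so a package that exists on the registry but is malicious would still pass — it just catches the obvious hallucinations.

      ```python
      import importlib.util
      import re

      def find_unresolvable_imports(code: str) -> list[str]:
          """Return top-level module names imported in `code` that cannot
          be resolved in the current environment -- a cheap red flag for
          hallucinated packages before you pip-install anything."""
          names = set()
          for line in code.splitlines():
              # Match "import foo" and "from foo import bar" forms.
              m = re.match(r"\s*import\s+([\w.]+)", line)
              if not m:
                  m = re.match(r"\s*from\s+([\w.]+)\s+import\b", line)
              if m:
                  # Keep only the top-level package name ("foo.bar" -> "foo").
                  names.add(m.group(1).split(".")[0])
          # find_spec returns None when the module is not installed.
          return sorted(n for n in names if importlib.util.find_spec(n) is None)

      snippet = "import json\nimport totally_made_up_pkg\nfrom os import path\n"
      print(find_unresolvable_imports(snippet))  # ['totally_made_up_pkg']
      ```

      Anything this flags is either a hallucination or a package you haven't installed yet — and in the second case, that's exactly the moment to verify the name on the registry by hand instead of blindly installing it.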