• SkaveRat
    1 year ago

    While it’s technically true that an LLM “just predicts the next word”, that’s a very misleading argument to make.

    Computers are also “just some basic logic gates”, and yet we can do complex things with them.

    Complex behaviour can result from simple things.
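    As a toy illustration of that point (my own sketch, not something from the comment itself): a single primitive like NAND is enough to build working arithmetic. Every gate below is defined purely in terms of `nand`, and yet the end result adds multi-bit numbers.

    ```python
    # Complex behaviour from a simple thing: a ripple-carry adder
    # where every gate is built from NAND alone.

    def nand(a, b):
        return 1 - (a & b)

    def xor(a, b):
        # XOR built from four NANDs
        n = nand(a, b)
        return nand(nand(a, n), nand(b, n))

    def and_(a, b):
        n = nand(a, b)
        return nand(n, n)

    def or_(a, b):
        return nand(nand(a, a), nand(b, b))

    def full_adder(a, b, carry_in):
        # Adds three bits, returns (sum bit, carry-out bit)
        s1 = xor(a, b)
        total = xor(s1, carry_in)
        carry_out = or_(and_(a, b), and_(s1, carry_in))
        return total, carry_out

    def add(x, y, bits=8):
        # Chain full adders bit by bit (ripple carry)
        result, carry = 0, 0
        for i in range(bits):
            s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
            result |= s << i
        return result

    print(add(57, 85))  # 142, computed entirely out of NAND gates
    ```

    Nothing in `nand` “knows” arithmetic, but stack enough of them and you get addition. The same composition argument is why “it just predicts the next word” doesn’t by itself tell you what the system can or can’t do.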

    I’m not defending the bullshit that LLMs generate; I’m just pointing out that you have to be careful with your arguments.