• KomfortablesKissen · 1 day ago

    Complexity, for one. A cramped foot has an influence on the brain, as do, apparently, the gut bacteria. Focusing on the brain is a starting point, and we don’t even understand that all that well.

    If someone perfectly simulated your entire brain, would that digital brain be sentient?

    I don’t know. It could be. For now I don’t think so. Are you comparing that to an LLM? That would be like comparing the paths of snail slime to a comic. One would be comparing story lines and art styles to something that just isn’t there. And never will be.

    What is sentience?

    Sentience is the ability to experience feelings and sensations (wiki). It’s a word not based on a clear understanding, but rather an attempt to categorize. Nonetheless, an LLM doesn’t experience anything. It uses pattern recognition and human-provided categorization to try to create new stuff, all within the confines of what it has recognized.

    I think it’s strange to say that AI will never be sentient.

    That’s why it’s important to distinguish between “AI” and “LLM”. AI, in the sense of an AGI, is something we might be able to build one day. LLMs might be a step on the way there. But not the way they are now.

    • 0laura@lemmy.dbzer0.com · 19 hours ago

      You have a point with most of the things you said; it’s mostly a matter of perspective and how you define stuff. The only thing I really fundamentally disagree with is equating AI with AGI.

      • KomfortablesKissen · 16 hours ago

        Why do you disagree with that? No, that’s a stupid question. How do you disagree with that? Can you elaborate on your point?