ChatGPT has meltdown and starts sending alarming messages to users::AI system has started speaking nonsense, talking Spanglish without prompting, and worrying users by suggesting it is in the room with them

  • Coreidan@lemmy.world · 4 months ago

    How is this different from the many existing techniques and compositional models that are used practically everywhere in tech?

    It’s not. An LLM is just a statistical model. Nothing special about it, and nothing different from what we’ve already been doing for a while. This only validates my statement that we call just about anything “AI” these days.

    We don’t even know what true intelligence is, yet we are quick to claim that this is “AI”. There is no consciousness here. There is no self-awareness. No emotion. No ability to reason or deduce. Anyone who thinks otherwise is just fooling themselves.

    It’s a buzzword to get people riled up. It’s completely disingenuous.
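    To make the “just a statistical model” point concrete, here is a minimal sketch of the same core idea at toy scale: a bigram model that counts which word follows which and samples the next word from those counts. (This is an illustration of the principle only; real LLMs learn vastly richer representations rather than raw counts.)

    ```python
    import random
    from collections import Counter, defaultdict

    # Toy corpus; an LLM does the same next-token prediction
    # at enormous scale with learned weights instead of counts.
    corpus = "the cat sat on the mat the cat ate the rat".split()

    # Count which word follows which (a bigram model).
    following = defaultdict(Counter)
    for prev, nxt in zip(corpus, corpus[1:]):
        following[prev][nxt] += 1

    def next_word(word):
        """Sample the next word in proportion to how often it followed `word`."""
        words, weights = zip(*following[word].items())
        return random.choices(words, weights=weights)[0]

    # Generate a short continuation from a seed word.
    word, out = "the", ["the"]
    for _ in range(5):
        if not following[word]:  # dead end: word never had a successor
            break
        word = next_word(word)
        out.append(word)
    print(" ".join(out))
    ```

    Whether stacking this idea up to billions of parameters produces something qualitatively different is exactly what the thread is arguing about.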

    • sailingbythelee@lemmy.world · 4 months ago

      I think the point of the Turing test is to avoid thorny questions about the definition of intelligence. We can’t precisely define intelligence, but we know that normally functioning humans are intelligent. Therefore, if we talk to a computer and it is indistinguishable from a human in conversation, then it is intelligent by definition.

    • EnderMB@lemmy.world · 4 months ago

      So, by your definition, no AI is AI, and we don’t know what AI is, since we don’t know what the I is?

      While I hate that AI is just a buzzword for scam artists and tech influencers nowadays, dismissing the term entirely seems like overkill, especially when academics and scholars don’t seem particularly bothered by it.

    • QuaternionsRock@lemmy.world · 4 months ago

      There is no consciousness here. There is no self-awareness. No emotion. No ability to reason or deduce.

      Of all these qualities, only the last, the ability to reason or deduce, is a widely accepted prerequisite for intelligence.

      I would also argue that contemporary LLMs demonstrate the ability to reason by correctly deriving mathematical proofs that do not appear in their training data. How could you accomplish such a feat without some degree of reasoning?