• cbarrick@lemmy.world
    9 months ago

    impending AGI

    AGI is far from impending.

    LLMs are generative models. They’ve learned a distribution that models conversation, and they let you sample from that distribution. They aren’t “thinking” about what they say. They haven’t crossed the syntax–semantics barrier. There is no “general intelligence” there.

    They just feel impressive because humans are language-centric.
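    A toy sketch of what “sampling from a learned distribution” means here. This is a made-up bigram table, not a real LLM, but generation works the same way in both: draw the next token from P(next | context), append it, repeat. Nothing in the loop reasons about meaning.

```python
import random

# Hypothetical toy "language model": a table of conditional distributions
# P(next token | previous token). The probabilities are invented for
# illustration; a real LLM learns a vastly richer distribution, but text is
# produced by the same autoregressive sampling loop shown below.
MODEL = {
    "<s>": {"the": 0.6, "a": 0.4},
    "the": {"cat": 0.5, "dog": 0.5},
    "a":   {"cat": 0.5, "dog": 0.5},
    "cat": {"sat": 0.7, "</s>": 0.3},
    "dog": {"ran": 0.7, "</s>": 0.3},
    "sat": {"</s>": 1.0},
    "ran": {"</s>": 1.0},
}

def sample_next(token, rng):
    """Draw one token from P(next | token) by inverse CDF sampling."""
    cum = 0.0
    r = rng.random()
    for tok, p in MODEL[token].items():
        cum += p
        if r < cum:
            return tok
    return tok  # fallback for floating-point round-off

def generate(rng=None, max_len=10):
    """Autoregressive generation: sample, append, condition, repeat."""
    rng = rng or random.Random()
    tokens, cur = [], "<s>"
    for _ in range(max_len):
        cur = sample_next(cur, rng)
        if cur == "</s>":
            break
        tokens.append(cur)
    return " ".join(tokens)

print(generate(random.Random(0)))
```

    The output is fluent within the model’s tiny grammar, which is the point: fluency falls out of the distribution, not out of any understanding.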

    • MinusPi@yiffit.net
      OP
      9 months ago

      AGI doesn’t have to think; it has to be able to perform any task it’s given. The models available today are far more capable than anyone predicted. With plugins (which, by the way, no one expected them to be able to use), they can perform tasks beyond text generation. All the data points toward this capability only improving. Maybe “impending” was too strong a word, but I stand by the idea that it’s coming sooner than we expected.
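      A hedged sketch of what a “plugin” amounts to mechanically, under the assumption (consistent with how public tool-use APIs work) that the model still only emits text: a surrounding harness parses structured output and executes it. The tool names, the call format, and `fake_model` are all made up for illustration.

```python
import json

# Hypothetical tool registry the harness exposes to the model.
TOOLS = {
    "calculator": lambda expr: str(eval(expr, {"__builtins__": {}})),
    "uppercase": lambda s: s.upper(),
}

def fake_model(prompt):
    """Stand-in for an LLM: pretend it chose to emit a structured tool call."""
    return json.dumps({"tool": "calculator", "args": "2 + 3 * 4"})

def run_with_plugins(prompt):
    reply = fake_model(prompt)    # the model's output is still just text...
    call = json.loads(reply)      # ...the harness parses it...
    tool = TOOLS[call["tool"]]    # ...and gives it real-world effects.
    return tool(call["args"])

print(run_with_plugins("What is 2 + 3 * 4?"))  # prints 14
```

      Whether that loop counts as “performing tasks other than generation” or as generation plus scaffolding is exactly the disagreement in this thread.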