Sjmarf@sh.itjust.works to Lemmy Shitpost@lemmy.world · 28 days ago
"How to clean a rescued pigeon" (image) · 139 comments · 1.27K upvotes / 12 downvotes
Rivalarrival@lemmy.today · 28 days ago:

You say this like human "figuring" isn't some "autocomplete bullshit".

Reply:

Here we go…

HighlyRegardedArtist@lemmy.world · 27 days ago:

You can play with words all you like, but that's not going to change the fact that LLMs fail at reasoning. See this Wired article, for example.

Rivalarrival@lemmy.today · 27 days ago (edited):

My point wasn't that LLMs are capable of reasoning. My point was that the human capacity for reasoning is grossly overrated.

The core of human reasoning is simple pattern matching: regurgitating what we have previously observed. That's what LLMs do well.

LLMs are basically at the toddler stage of development, but with an extraordinary vocabulary.
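The "autocomplete as pattern matching" framing can be made concrete with a toy sketch. This is purely illustrative (a bigram lookup table, nothing like how a real LLM is built): it regurgitates whichever word most often followed the current word in previously observed text.

```python
from collections import Counter, defaultdict

# Toy "autocomplete": pure pattern matching over observed text.
# Count which word follows which in a tiny corpus, then always
# regurgitate the most frequently observed continuation.
corpus = "the pigeon ate the bread and the pigeon flew away".split()

following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def autocomplete(word, steps=3):
    out = [word]
    for _ in range(steps):
        if word not in following:
            break  # never saw this word mid-sentence; nothing to regurgitate
        word = following[word].most_common(1)[0][0]
        out.append(word)
    return " ".join(out)

print(autocomplete("the"))  # completes with the most-observed continuations
```

No reasoning happens anywhere in this sketch, which is the crux of the disagreement above: whether scaling this kind of pattern matching up ever amounts to more than that.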