
I kinda get the average person overestimating the capabilities of an LLM, but technically minded people falling for it did surprise me.
Of course, I really shouldn’t be surprised; the ELIZA effect is old and well documented. I suppose it’s more worrying that it can affect anybody.









To me, lying implies an intent to deceive, and LLMs can’t do that: they have no intentions or understanding of the output they produce.
It’s not lying, because it’s not telling the truth either; it’s just statistically weighted noise.