• 0 Posts
  • 3 Comments
Joined 1 year ago
Cake day: June 15th, 2023


  • That doesn’t prove that GPT is reasoning; its model predicts that those responses are the most likely ones given the messages you’re sending it. It’s read thousands of actual conversations in which someone states something incorrect, has it explained to them, and then comes around and admits they were wrong. (A toy sketch of this next-word prediction loop is at the end of this comment.)

    I’ve seen other cases where the AI is wrong about something, and when the error is explained, it just doubles down, because humans do that type of thing too, refusing to admit they’re wrong.

    The way it’s designed means it cannot reason the way humans experience reasoning. What it can do is simulate the conversation that someone who could reason would likely have.
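
    To make “predicting the most likely response” concrete, here’s a minimal, self-contained sketch of that next-word loop. The probability table is invented for illustration; a real LLM computes the distribution with a neural network conditioned on the entire conversation, not just the previous word, but the generation loop is the same idea: pick a probable continuation, no reasoning required.

    ```python
    import random

    # Hypothetical next-word frequencies, standing in for what a model
    # distills from thousands of real conversations. (This table is made
    # up for illustration; real models use huge vocabularies and condition
    # on the whole context, not just the previous word.)
    NEXT_WORD = {
        "<start>": {"I": 0.7, "You": 0.3},
        "I":       {"was": 0.6, "am": 0.4},
        "was":     {"wrong": 0.8, "right": 0.2},
        "am":      {"sorry": 1.0},
        "You":     {"are": 1.0},
        "are":     {"right": 1.0},
        "wrong":   {"<end>": 1.0},
        "right":   {"<end>": 1.0},
        "sorry":   {"<end>": 1.0},
    }

    def generate(seed: int = 0) -> str:
        """Build a reply one word at a time from the learned statistics."""
        rng = random.Random(seed)
        word, reply = "<start>", []
        while True:
            options = NEXT_WORD[word]
            # Choose the next word in proportion to how often it followed
            # the current one in training data: statistics, not reasoning.
            word = rng.choices(list(options), weights=list(options.values()))[0]
            if word == "<end>":
                return " ".join(reply)
            reply.append(word)

    print(generate())  # e.g. "I was wrong"
    ```

    Replies like “I was wrong” come out often simply because admitting fault is a common pattern in the training text, which is exactly the point: plausible output, no understanding behind it.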