- cross-posted to:
- technology@lemmit.online
That doesn’t really surprise me. Your average ChatGPT or Gemini isn’t really trained on medical data; it’s designed to give you quick, generalized answers.
You shouldn’t expect expert advice from it.
Thank God for this. It’s probably not syphilis after all. I’m safe.
And yet we’re using it to make Medicare decisions, denying coverage or deciding whether someone gets to live.