mozz@mbin.grits.dev to Technology@beehaw.org · 7 months ago
Someone got Gab's AI chatbot to show its instructions (mbin.grits.dev)
MachineFab812 · 7 months ago
It works because the AI finds and exploits the flaws in the prompt, as it has been trained to do.