mozz@mbin.grits.dev to Technology@beehaw.org · 7 months agoSomeone got Gab's AI chatbot to show its instructionsmbin.grits.devimagemessage-square196fedilinkarrow-up1479arrow-down10file-text
arrow-up1479arrow-down1imageSomeone got Gab's AI chatbot to show its instructionsmbin.grits.devmozz@mbin.grits.dev to Technology@beehaw.org · 7 months agomessage-square196fedilinkfile-text
minus-squareMachineFab812@discuss.tchncs.delinkfedilinkarrow-up1·7 months agoIt works because the AI finds and exploits the flaws in the prompt, as it has been trained to do.
It works because the AI finds and exploits the flaws in the prompt, as it has been trained to do.