ugjka@lemmy.world to Technology@lemmy.world · English · 2 years ago
Somebody managed to coax the Gab AI chatbot to reveal its prompt (infosec.exchange)
boredtortoise@lemm.ee · 2 years ago
Maybe giving contradictory instructions causes contradictory results.