ugjka@lemmy.world to Technology@lemmy.worldEnglish · 8 months agoSomebody managed to coax the Gab AI chatbot to reveal its promptinfosec.exchangeexternal-linkmessage-square267fedilinkarrow-up1994arrow-down115
arrow-up1979arrow-down1external-linkSomebody managed to coax the Gab AI chatbot to reveal its promptinfosec.exchangeugjka@lemmy.world to Technology@lemmy.worldEnglish · 8 months agomessage-square267fedilink
minus-squareboredtortoise@lemm.eelinkfedilinkEnglisharrow-up7·8 months agoMaybe giving contradictory instructions causes contradictory results
Maybe giving contradictory instructions causes contradictory results