ugjka@lemmy.world to Technology@lemmy.world · English · 7 months ago
Somebody managed to coax the Gab AI chatbot to reveal its prompt (infosec.exchange)
XeroxCool@lemmy.world · 7 months ago
"however" lol specifically what it was told not to say
towerful@programming.dev · 7 months ago
It was also told, on multiple occasions, not to repeat its instructions