shish_mish@lemmy.world to Technology@lemmy.world · English · 10 months ago
Researchers jailbreak AI chatbots with ASCII art -- ArtPrompt bypasses safety measures to unlock malicious queries
www.tomshardware.com
Even_Adder@lemmy.dbzer0.com · English · 10 months ago
You should know this exists already then,