shish_mish@lemmy.world to Technology@lemmy.world · English · 1 year ago
Researchers jailbreak AI chatbots with ASCII art -- ArtPrompt bypasses safety measures to unlock malicious queries (www.tomshardware.com)
Harbinger01173430@lemmy.world · 1 year ago
That's how open software works. It's there for anyone to do whatever they want with it. Bonus if you charge money for it.