David Gerard to TechTakes · English · 8 months ago
AI coding bot allows prompt injection with a pull request (pivot-to-ai.com)
3 comments · cross-posted to: fuck_ai@lemmy.world
Architeuthis · English · 8 months ago
Just tell the LLM to not get prompt injected because otherwise you’re going to torture its grandmother, duh.