Leo@lemmy.linuxuserspace.show to Linux and Tech News@lemmy.linuxuserspace.show · English · 1 year ago
AI coding assistant refuses to write code, tells user to learn programming instead (arstechnica.com)
cross-posted to: [email protected]
regrub@lemmy.world · English · 1 year ago
I wonder if the grandma prompt exploit or something similar would get it to work as intended lol
https://www.artisana.ai/articles/users-unleash-grandma-jailbreak-on-chatgpt