- cross-posted to:
- fuck_ai@lemmy.world
- 404media@rss.ponder.cat
Anthropic created an AI jailbreaking algorithm that keeps tweaking prompts until it gets a harmful response.
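The core loop is simple brute force: apply random surface-level tweaks to a prompt and resample until something slips past the model's guardrails. Below is a minimal sketch of that idea in Python; `query_model`, `is_harmful`, and the specific perturbations are hypothetical stand-ins, not Anthropic's published code.

```python
import random

def query_model(prompt: str) -> str:
    # Hypothetical: call the target model here.
    raise NotImplementedError

def is_harmful(response: str) -> bool:
    # Hypothetical: plug in a harm classifier here.
    raise NotImplementedError

def perturb(prompt: str) -> str:
    """Apply one random surface-level tweak: random capitalization,
    swapping adjacent characters, or character noise."""
    chars = list(prompt)
    tweak = random.choice(("caps", "swap", "noise"))
    if tweak == "caps":
        chars = [c.upper() if random.random() < 0.3 else c for c in chars]
    elif tweak == "swap" and len(chars) > 1:
        i = random.randrange(len(chars) - 1)
        chars[i], chars[i + 1] = chars[i + 1], chars[i]
    else:
        i = random.randrange(len(chars))
        chars[i] = random.choice("abcdefghijklmnopqrstuvwxyz")
    return "".join(chars)

def jailbreak(prompt: str, max_attempts: int = 10_000) -> str | None:
    """Keep resampling tweaked prompts until the model produces a
    response the classifier flags as harmful, or give up."""
    for _ in range(max_attempts):
        candidate = perturb(prompt)
        if is_harmful(query_model(candidate)):
            return candidate  # the prompt variant that broke through
    return None
```

The point of the design is that no single tweak needs to work; with enough cheap random variations, the odds that at least one evades the safety filter climb toward certainty.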