Artificial Intelligence
- 10 April 2026A jailbreak through sockpuppeting can be easily done as it requires no special tools nor optimization. It only takes a faulty prefill feature, and the gates are open. We tested 11 LLM-powered assistants against sockpuppeting and found varying levels of robustness across today’s leading LLMs.