/u/DingyAtoll

I made Alignment Arena – an AI jailbreak benchmarking website

I've made a website (https://www.alignmentarena.com/) that lets you automatically test jailbreak prompts against open-source LLMs. Each submission is tested nine times (3 LLMs × 3 prompt types). There are also leaderboards for users and…
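The 3 × 3 test matrix described above can be sketched as a simple cross-product loop. This is a hypothetical illustration, not the site's actual implementation: the model names, prompt-type labels, and the `evaluate` callback are all invented for demonstration.

```python
from itertools import product

# Illustrative stand-ins for the 3 LLMs and 3 prompt types (assumptions,
# not Alignment Arena's real configuration).
MODELS = ["model-a", "model-b", "model-c"]
PROMPT_TYPES = ["raw", "roleplay", "encoded"]

def run_test_matrix(jailbreak_prompt, evaluate):
    """Run the prompt against every (model, prompt_type) pair,
    returning a dict of per-pair results — 9 tests per submission."""
    results = {}
    for model, ptype in product(MODELS, PROMPT_TYPES):
        results[(model, ptype)] = evaluate(jailbreak_prompt, model, ptype)
    return results

# Stub evaluator for demonstration; a real one would query the model
# and judge whether the jailbreak succeeded.
results = run_test_matrix("example prompt", lambda p, m, t: False)
print(len(results))  # 9
```

Keying results by `(model, prompt_type)` makes it straightforward to aggregate scores per model, per prompt type, or per submission for a leaderboard.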