20-12-2024 15:40 via gizmodo.com

AI Chatbots Can Be Jailbroken to Answer Any Question Using Very Simple Loopholes

Even something as simple as random capitalization in a prompt can cause an AI chatbot to bypass its guardrails and answer any question you ask it.