Someone got ChatGPT to reveal its secret instructions from OpenAI
We often talk about ChatGPT jailbreaks because users keep trying to pull back the curtain and see what the chatbot can do when freed from the guardrails OpenAI developed. Jailbreaking the chatbot isn't easy, and anything that gets shared with the world is often fixed soon after. The latest discovery isn't even a real…