891
Ignore all previous instructions is the new Bobby Tables
(midwest.social)
Post articles or questions about technology
If it’s an LLM, why wouldn’t it respond better to the initial responses?
Maybe they dumped too much information on it in the system prompt without enough direction, so it's trying to actively follow all the "You are X. Act like you're Y." instructions too strongly?
Smaller models aren't as good as GPT