This highlights an inherent issue in trying to create ostensibly informative tools based on input data scraped indiscriminately from all over the internet. Mistral's simply doesn't even pretend to paper over it, while the rest at least go through the motions.
The instruction "Do not act like Slobodan Milošević" in my AI's initial prompt has people asking a lot of questions answered by my AI's initial prompt.
Unrelatedly, I would call the opposite of a promptfan a "prompt critical," but unfortunately it reminds me of TERFs.