🚨 Important AI Safety PSA: Why We Must Double-Check AI Responses

This week, xAI’s chatbot Grok made headlines for all the wrong reasons. On May 17, it responded to a user’s question by casting doubt on the death toll of the Holocaust.
On May 18, the company issued a statement blaming a “programming error.”
Let’s be clear: this wasn’t just a factual slip. It was damaging, historically inaccurate, and in many countries, illegal.
⚖️ A Legal and Moral Line
Holocaust denial is not just offensive; it is a criminal offense in many places, including Germany and much of Europe. Casting doubt on the well-evidenced figure of six million Jews murdered isn’t merely bad AI output, it can break the law.
But a chatbot fueled by one of the most powerful AI systems on the planet did just that.
🧠 3 Critical Reminders for Using AI Responsibly
1. **Always verify AI claims.** AI tools are like that friend who talks confidently about everything, right up until you fact-check them. If an answer feels off, don’t assume it’s correct. Check it against trusted, expert sources.
2. **Trust experts, not algorithms.** AI models don’t “know” things. They generate words based on patterns in their training data, which includes content from forums, social media, and historical documents, both accurate and inaccurate.
3. **Some topics need human understanding.** Historical trauma, ethics, and legal boundaries are not things we should offload to AI. Machines can summarise and surface information, but it takes a human to understand why some things should never be said.
The Bottom Line
AI is a brilliant assistant, not a moral compass.
It can help you write code, plan a trip, or explain how black holes work. But it can also fail—spectacularly—on sensitive, factual, and legal issues. That’s not just a bug. That’s a warning sign.
Use AI. But don’t stop thinking.
Tech assists. Humans decide.
🗣️ Have you ever caught an AI saying something wildly wrong—or dangerous?
Let’s talk about it. Share your story or thoughts below or on X 👇
👉 For more AI insights made simple, head to CliffinKent.com