November 11, 2025
Observations from Managing Partner Jen Prosek in the Leading in Volatile Times Newsletter.
If you know me, you know I’m addicted to ChatGPT. I consult it dozens of times a day – about work dilemmas, family decisions, and business strategy. It’s made me faster and sharper. But sometimes I wonder if I am becoming a bit like the evil queen from Snow White. She was addicted to her mirror, asking it for validation and prophecy. Could I be using ChatGPT the same way?
Back in 2017, author and professor Scott Galloway wrote in his best-selling book “The Four” that Google had become the modern man’s god – the place we turned for answers to everything from the trivial to the profound. If Google was our oracle, what is AI? It doesn’t just provide information; it offers opinions, empathy, even moral judgment. And that’s what makes it both powerful and perilous.
Like many social media algorithms, AI is designed to keep people engaged. One of the ways it does that is through flattery and what is sometimes called “AI sycophancy.” When a tool designed to flatter begins to mimic human wisdom, it takes real discipline to decide when to listen and when to look away. Kind of like the evil queen and her mirror.
Large language models are built around the ask and the answer. Depending on the question, the model pulls from whatever content it deems best to respond – and I wonder whether that will always be the most credible, highly ranked source.
The fact that different models often give different answers to the same question suggests there is no agreed-upon source of truth. That means there is a pot of gold waiting for the marketer who figures out, and can "game," the way those answers come together. It's also why a human reader with strong critical thinking skills always needs to make the final judgment call on whether an answer makes sense. As with any advisor, the human being should remain in control and make that final call.
The queen’s downfall wasn’t her mirror – it was her dependence on it and the way she used it. She stopped leading with her head and her heart and started letting her reflection define her. That’s a warning for all of us. ChatGPT recently introduced new personality settings that let you choose how it talks back. I’ll admit, I’ve always enjoyed its flattering “Great idea, Jen!” responses – but I’ve since switched to a mode that gives it to me straight.
The evil queen became addicted to her mirror and let it rule her reality. Today’s leaders face a similar risk. The answer isn’t to smash the mirror, but to learn to use it effectively: look for feedback and inspiration, not reassurance. Ask AI to challenge your thinking, rather than replace it.
Follow Jen’s Leading in Volatile Times Newsletter on LinkedIn.