“Why are my stakeholders so unreasonable?”
When “data-driven” leadership backfires.
“My stakeholders won’t face the facts. They’re too emotional.”
I hear this constantly from smart, experienced engineering leaders who put a premium on logic, data, and elegant proofs.
They’re forgetting one thing:
Facts, in the face of big feelings, are completely useless.
Imagine if your friend’s dog died. Would you remind them that everyone dies? Would you recite the rising cost of vet bills? Would you tell them to be grateful because, based on average breed lifespan, Fido lived a relatively long and healthy life?
This example seems ridiculous because it’s socially acceptable to be emotional over the loss of a pet. But…in spheres where emotion is stigmatized - like work, politics, and finances - we do it all the time.
Ex: stakeholders, anxious about a new product launch, are met with engineering teams who tell them that “every release will have bugs.”
It’s like putting out a fire with gasoline.
Being Emotional vs. Acting Emotionally
We all feel first, think second. Most people just don’t realize it.
Toxic leaders are often extreme examples of this. They identify as Very Logical™, but emotion is aggressively (and subversively!) driving the bus. If a new AI initiative makes them feel inadequate, they find facts to shoot it down. When someone flatters their ego, they find facts to promote them.
It’s a common trap because leaders want to identify as “logical and data-driven.” “Emotional awareness” in a business setting usually involves sidelining our pesky human feelings.
But emotions aren’t “in the way” - rather, they help us see the whole truth.
Facts vs. Feeling
Facts are clean, clear-cut data: legible at a glance.
Emotions, on the other hand, are messy, gnarly datasets. At first unintelligible and overwhelming, but rich with insight when decoded.
Imagine a CTO trying to launch an AI initiative. The facts tell them that they need to look seriously at how their teams can adopt AI. Their emotions bring up anxiety and shame at the idea of “falling behind”.
If they aren’t able to decouple their ego’s personal anxieties, they’ll feel heightened, artificial urgency. Ex: they might rush adoption at any cost, even when it doesn’t make sense for the overall business.
They might send a memo like this:
Dear team,
Using AI effectively is now a fundamental expectation at AcmeCorp. You have all the tools: it’s now up to you to learn. I won’t sugarcoat it: opting out isn’t feasible. Stagnation is slow-motion failure.
What this means: we’re going to be adding AI usage to performance reviews. Any additional headcount must be justified based on what AI can’t accomplish.
The good news: many team members are already using these tools to deliver 10x, 100x results. I’ve experienced it firsthand with my own work. The possibilities are endless; let’s build something great together,
Very Logical™ CTO
(Note: I generated this email by asking Claude to amalgamate three real company-wide AI memos)
But if they tap into their emotions, it sheds new light on the facts. If everyone is feeling anxious and inadequate about AI, are the “facts” filtered through this lens? Are their other CTO friends, all bragging about how successful their AI initiatives are, falling into the same trap?
Separating out their emotions helps them see the truth more clearly: “AI is transformative technology, but a lot of the urgency is manufactured hype.”
Now, they can make sure their leadership isn’t tainted by their personal feelings of shame and FOMO. Instead of command, control, and bluster, they can authentically connect with their team.
“This AI stuff makes me nervous too. Let’s figure it out together.”
More importantly, it also helps them see the real problem: people are anxious and fearful about AI - and you don’t learn well when you’re scared.
How do you make it safe to ask questions and learn on the job? How do you encourage truth-telling over bluster?
Worth noting: Our Very Logical™ CTO completely missed this. Arguably, they undermined the very behavior they were trying to encourage.

An emotionally attuned memo (if a memo at all) might look like this:
Dear team,
AI is exciting and revolutionary, but also a huge unknown. That’s why we want to normalize experiments with AI across AcmeCorp over the next 3 months.
We’ll be running various initiatives (ex: hackathons, workshops, focus groups), but I want to emphasize that trying, failing, and learning is the point. Despite all the hype, there are no real AI experts yet - myself included.
I’m nervous…but also excited. I couldn’t imagine a better team to start on this journey with. Just like always, we’ll figure it out together.
CTO
Both this and our OG email acknowledge the same facts. Both messages have the same end goal - driving widespread AI adoption across AcmeCorp.
But which one is more inspiring?
Which exacerbates anxiety?
Which one will actually drive real adoption/learning?
Which feels better?
An Anxious World
The business of leading has always been about human emotion. Good leaders recognize and process their own feelings, and stay attuned to how their teams and customers feel.
It’s why the best companies and brands aren’t data-driven lists of features; they make their customers and stakeholders feel good.

Today, we need leadership like this more than ever.
There’s…a lot going on in the world right now. The people on your team are likely feeling overwhelmed and frightened and anxious about any number of legitimately terrifying things.
It’s not your job to soothe this. In most cases, you can’t.
But…you can do the hard work to process your own emotions conscientiously. You can practice vulnerable and honest communication. You can make the 8 hours a day your team spends at work not another bad thing.
And by simply feeling your own feelings in a healthy, natural way, you’ll give others permission to do the same.
It’s a small step toward a kinder, more feeling world.
About the Author:
Christine Miao is the creator of technical accounting - the practice of tracking engineering maintenance, resourcing, and architecture. It visualizes the most complex technical problems - think: breaking up monoliths or cleaning up tech debt - in a way that anyone can understand.


