Leadership
April 8, 2026
We Think Like Lawyers, Not Scientists — And It's Costing Leaders Everything
Intellectual Dishonesty | Self Improvement | Behavioral Economics | Mental Model | Leadership
Picture three moments that happen in organisations every day.
The meeting. Six people, one agenda item: the numbers. A team member opens with a narrative — rich context, strong reasoning, no spreadsheet. He knows he hasn't prepared the document. But the story is important, and so he isn't anxious. There's a reason. A good one.
The outburst. A manager loses his temper. Shouts. Uses language he'd never use in a formal review. Later, alone, he feels it too — the discomfort. "What can I do?" he tells himself. "They push me to this."
The boardroom walk-out. A CEO hears direct, candid feedback from a trusted board member: the strategy isn't landing, execution is slipping, culture is drifting. She listens well, asks good questions. Then walks into the corridor and tells her leadership team: "The board is nervous — the market is uncertain. Let's stay the course."
Three people. Three situations. Not one of them lied.
And yet, in each room, something honest was quietly replaced by something convenient.
Intellectual dishonesty is not lying. It is motivated reasoning: the pattern of deciding first, and justifying after.
Psychologist Jonathan Haidt's Rider and the Elephant metaphor captures it precisely. The elephant — the emotional, intuitive brain — moves first. The rider — our rational, verbal mind — follows and constructs an explanation. A clean one. A convincing one. So convincing that we believe it ourselves.
"We are not seeking truth. We are defending a position. We think like lawyers, not scientists."
Scientists form a hypothesis and try to disprove it. Lawyers start with a conclusion and build the best case around it. When we engage in motivated reasoning, we become the lawyer — and the courtroom is our own mind.
This is what makes intellectual dishonesty so hard to detect. It doesn't feel like dishonesty from the inside. The justification arrives faster than the correction ever can. And by the time there's a chance to pause, the case is already closed.
Every individual rationalises sometimes. Tiny behavioural shortcuts are part of everyday functioning — a convenient framing here, a softened message there. These are human.
But leadership amplifies everything. When a leader's intellectual dishonesty becomes a habit, it doesn't stay in the room. It travels. It shapes team culture, sets norms for what conversations are safe to have, and — critically — models how information is handled two levels below.
The CEO who softened the board's feedback didn't fabricate uncertainty. But somewhere between the boardroom and the corridor, the feedback became the market. The organisation, taking its cue from the top, stopped asking the harder question. That's not a lie. That's something harder to fix.
The team feels the gap between what is said and what is meant — even when nobody names it. Trust erodes gradually, then suddenly. By then, the cost isn't personal. It's organisational.
Building the habit of intellectual honesty isn't about catching yourself lying. It's about catching the rider after the elephant has already moved.
It starts with one question — asked before the reframe, before the outburst, before walking out of a hard conversation with a softer version of what was just heard:
Is this true — or is this convenient?
That single pause creates a gap between the elephant's move and the rider's explanation. And in that gap is where honest thinking lives.
The elephant will always move. That's not the problem. The problem is when the rider stops noticing — and starts winning arguments with itself.
Leaders who build this habit don't stop rationalising. They get better at catching it — and that changes everything downstream: the quality of decisions, the safety of honest conversations, the culture of the teams they lead.