You cannot change someone's mind
All persuasion is self-persuasion. How Minds Change, by David McRaney
Masters in Public Affairs goes back to the foundational books in this field and extracts the principles that the best practitioners return to again and again. One book at a time.
“There is no superior argument, no piece of information that we can offer that is going to change their mind. The only way they are going to change their mind is by changing their own mind, by talking themselves through their own thinking, by processing things they’ve never thought about before, things from their own life that are going to help them see things differently.”
— Steve Deline, deep canvasser
That quote comes from a man who has had more than 15,000 recorded conversations trying to change people’s minds on contentious social issues. He’s an activist with a clipboard, not a psychologist or a political scientist. He and his team at the Leadership Lab in Los Angeles stumbled onto a persuasion technique that independent researchers later measured as 102 times more effective than traditional canvassing, television ads, radio, direct mail, and phone banking combined.
And his core insight, the one that everything else in their method hangs from, is that persuasion is something you help someone do to themselves.
This is the idea that sets up David McRaney’s entire book, How Minds Change: The Surprising Science of Belief, Opinion, and Persuasion. McRaney is a science journalist who spent years telling people there was no point in trying to change minds — that motivated reasoning, confirmation bias, and tribal psychology made it essentially impossible. Then the shift in public opinion on same-sex marriage broke his framework. In 2012, the majority of Americans opposed it. The very next year, the majority supported it. If minds can't be changed, how do you explain that?
The book is his attempt to answer that question. And the answer he found, across deep canvassers, street epistemologists, cognitive scientists, and conflict negotiators, is remarkably consistent: the techniques that actually work all share a common structure. They create conditions where people re-examine their own reasoning and discover its weaknesses themselves.
Blaise Pascal figured this out four hundred years ago: "People are generally better persuaded by the reasons which they have themselves discovered than by those which have come into the mind of others." Four centuries, and we're still designing campaigns around the information deficit model: the assumption that people disagree because they don't have enough facts, and that providing those facts will bring them around.
It won’t. And McRaney shows why.
Why this book matters
If your job involves shaping opinion, building coalitions, or moving decision-makers from one position to another — this book challenges the operating assumption underneath most of what you do.
The assumption is that better information produces better outcomes. That if we can just get the right facts in front of the right people, presented clearly enough, they’ll come around. This is the information deficit model, and the evidence against it is overwhelming.
Political scientists Donald Green and Alan Gerber reviewed more than 100 published studies on voter persuasion. Canvassing, TV ads, direct mail, phone banking. None of them produced lasting attitude change. Zero.
McRaney documents techniques that do work and explains the cognitive mechanisms behind them, giving practitioners something to build on.
The core reframe: this is a post-trust crisis, not a post-truth crisis. That distinction matters enormously. If the problem is post-truth, you solve it with better facts. If the problem is post-trust, you solve it with better relationships and a fundamentally different posture toward the people you’re trying to reach.
What we cover in the episode
This is the latest episode of Masters in Public Affairs, where we go deep on the foundational books in public affairs and extract the mental models that hold up over time. In this episode, we cover:
The information deficit model and why it fails. McRaney traces the assumption that facts change minds through centuries of well-intentioned failure — from 19th-century rationalist philosophers to Benjamin Franklin to Timothy Leary to the modern “post-truth” panic. Each generation believed the next information technology would resolve disagreement. Each was wrong.
The Redlawsk experiment. Subjects exposed to a moderate dose of negative information about their preferred political candidate became more supportive — not less. Below a certain threshold, counter-evidence strengthens the position you’re trying to change. Above it, people accommodate. There is no gentle middle path. Half-measures inoculate.
SURFPAD and the construction of reality. Why reasonable people looking at the same information reach opposite conclusions — and why neither side experiences themselves as having made a choice. This connects directly to Lippmann’s pseudo-environment from Episode 1, and McRaney gives us the neurological mechanism underneath it.
Tribal psychology and the cost of changing your mind. Brooke Harrington’s line — “Social death is more frightening than physical death” — explains why people cling to beliefs that outside observers find absurd. They're clinging to the group, and the belief is just the badge.
Three field-tested persuasion methods. Deep canvassing, street epistemology, and the Smart Politics method. Three independently developed techniques that converge on the same principle: questions over arguments, stories over facts, the other person's reasoning over your own.
Network percolation. How opinion change scales through populations, and why you don't need thought leaders or elites to start a cascade. The key variable is the susceptibility of the network.
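The percolation idea in that last topic can be made concrete with a toy threshold-cascade simulation in the spirit of the Watts model. Everything here — the network size, degree, thresholds, and seed count — is an illustrative assumption, not a figure from the book; the point is only that when nodes are susceptible enough, a handful of ordinary seeds (no elites required) can tip the whole network.

```python
import random

def cascade(n=1000, k=6, threshold=0.18, seeds=5, rng=None):
    """Toy threshold cascade: a node adopts the new opinion once the
    fraction of its neighbors who have adopted reaches its threshold.
    All parameters are illustrative assumptions, not book figures."""
    rng = rng or random.Random(42)
    # Build a rough random network: each node links to ~k random others.
    neighbors = [set() for _ in range(n)]
    for i in range(n):
        while len(neighbors[i]) < k:
            j = rng.randrange(n)
            if j != i:
                neighbors[i].add(j)
                neighbors[j].add(i)
    # Seed with a few random, unremarkable nodes — no "thought leaders".
    adopted = set(rng.sample(range(n), seeds))
    changed = True
    while changed:
        changed = False
        for i in range(n):
            if i in adopted:
                continue
            frac = sum(1 for j in neighbors[i] if j in adopted) / len(neighbors[i])
            if frac >= threshold:  # susceptibility, not seed status, decides
                adopted.add(i)
                changed = True
    return len(adopted) / n  # fraction of the population that flipped

# A small drop in the adoption threshold (more susceptible nodes) is the
# difference between a cascade that dies at the seeds and one that
# percolates through nearly the whole network.
print(cascade(threshold=0.15))
print(cascade(threshold=0.40))
```

Running both calls shows the key variable is the network's susceptibility, not who starts the cascade: the same five random seeds either stall or sweep the population depending on the threshold.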
Listen and subscribe here:
Bonus: Ideas from the book that didn’t make the episode
There’s more in How Minds Change than fits in a single episode. Here are a few ideas from my highlights that are worth knowing, even if I didn’t have time to develop them fully.



