# Chip Heath — On the Architecture of Stickiness
## The Problem That Made This Work Necessary
There is a version of the world in which good ideas win. In that world, the better argument prevails, the clearer explanation spreads, and the more rational proposal gets adopted. We do not live in that world, and Chip Heath has spent his career documenting the gap between that fantasy and observable reality.
Heath arrived at Stanford’s Graduate School of Business carrying questions that organizational psychology had largely left unresolved: why do some ideas propagate through populations like viruses while equally valid — sometimes superior — ideas die in the room where they were first proposed? This is not merely a marketing question, though the marketing industry would eventually consume his findings with enthusiasm. It is a deeper epistemic and cognitive problem. The channel through which an idea travels is not neutral. The format, the emotional register, the narrative structure of an idea — these are not decoration layered on top of content. They are constitutive of whether the content survives contact with a human mind.
The intellectual context that made his work urgent was the collision of two streams: cognitive psychology’s growing understanding of how memory and attention actually function, and the practical chaos of organizations that were hemorrhaging institutional knowledge, making systematically poor decisions, and failing to execute even well-reasoned change initiatives. The academy had the theory; the boardrooms had the mess. Heath positioned himself at that intersection.
## Made to Stick: Reverse Engineering Contagious Ideas
The 2007 book Made to Stick, co-authored with his brother Dan Heath, is organized around a framework they call SUCCES — an acronym covering Simplicity, Unexpectedness, Concreteness, Credibility, Emotions, and Stories. Read superficially, this looks like a mnemonic for a corporate workshop. Read carefully, it is a distillation of decades of cognitive and social psychology research into why some ideas survive in memory and others do not.
The most intellectually interesting claim in the book is not any single element of the framework but the prior diagnosis: the Curse of Knowledge. Heath identifies this as the core communication failure of experts. Once you know something deeply, you cannot reliably reconstruct the experience of not knowing it. You compress, you abbreviate, you skip scaffolding steps, and you mistake your own fluency for the listener’s comprehension. The result is that experts are systematically worse communicators to novices than moderately informed peers are. This is not a new observation — it appears in scattered forms across education research and cognitive science — but Heath gave it a name, anchored it in concrete examples, and made it actionable in a way that the academic literature never quite had.
The concreteness principle deserves particular attention because it runs against a deep intuition in professional culture, which is that abstraction signals sophistication. Heath’s research suggests the opposite: concrete language is processed faster, remembered longer, and more reliably shared across different people with different background knowledge. The example he returns to is Feynman’s dictum about understanding physics well enough to explain it to a freshman. Abstraction is a compression tool; it is useful only among people who share the same decompression key. Concrete images are more universally parseable.
The role of story is similarly underappreciated in formal organizational settings. Stories, as Heath frames them, are not entertainment — they are simulation. When a narrative is well-constructed, the listener runs a mental model of the events, testing their own judgments and emotions against the scenario. This is why a single case study can outperform a statistical summary in changing attitudes: the story generates vicarious experience, and experience updates beliefs in ways that abstract data rarely does.
## Switch and Decisive: The Change and Decision Problems
Heath’s subsequent work extended the intellectual framework into adjacent territories. Switch (2010) tackled organizational change through Jonathan Haidt’s elephant-and-rider metaphor — the rational deliberative mind as a small rider on a vast emotional elephant — and added the concept of shaping the path, the idea that changing behavior often requires changing the environment more than changing the mind. This connects directly to behavioral economics and Thaler and Sunstein’s nudge theory, and Heath was careful to situate the work within that conversation rather than pretending it arrived from nowhere.
Decisive (2013) is perhaps his most technically ambitious book. It takes on the cognitive bias literature — Kahneman and Tversky’s decades of work on heuristics and errors — and attempts to synthesize it into a practical decision process. Heath’s contribution here is the WRAP framework: Widen your options, Reality-test your assumptions, Attain distance before deciding, Prepare to be wrong. What distinguishes this from pop-psych self-help is the mechanism: each step is traceable to a specific, well-documented failure mode. Narrow framing creates false binary choices. Confirmation bias corrupts assumption-testing. Short-term emotion distorts priority. Overconfidence suppresses contingency planning. WRAP is not a magic solution; it is a checklist designed to interrupt predictable failure modes at the moments they typically occur.
## Where the Work Lands Today
Heath’s ideas have been absorbed so thoroughly into organizational practice and communication design that they have become nearly invisible — which is, ironically, what he would predict happens to genuinely sticky ideas. The Curse of Knowledge is now a standard concept in UX research and product design. Concrete user stories in agile development methodology carry the fingerprints of exactly the argument he was making. The behavioral science consulting industry that has bloomed in the last decade operates largely on foundations he helped popularize.
The more interesting question is what remains unresolved. Heath has always been more practitioner-oriented than theoretically ambitious, and critics might fairly note that the SUCCES framework, for all its usefulness, does not fully explain why certain ideas stick and others fail to — it describes correlates of stickiness rather than a mechanistic theory. The gap between identifying properties of sticky ideas and generating them reliably is still large. There is also a darker implication in the work that Heath tends to leave underexplored: if you can reverse-engineer stickiness, you can engineer manipulation. The same principles that help a nurse communicate a drug interaction clearly can help a demagogue make a false narrative unforgettable.
## Why It Matters
What makes Heath’s project genuinely interesting to a technically minded generalist is that it treats communication and decision-making as engineering problems with diagnosable failure modes — not as arts that some people are born good at. There is something quietly radical about that framing. It implies that the gap between an idea that changes an institution and an idea that dies in a PowerPoint presentation is often not a gap in the quality of the underlying thinking but a gap in the design of the delivery.
That realization is uncomfortable for anyone trained to believe that rigor and truth should be self-evidently persuasive. Heath’s work is a patient, empirically grounded argument that they are not — and that the ethical response to that fact is not to abandon rigor but to take the cognitive reality of your audience as seriously as the logical structure of your argument. The idea does not matter if it does not land. The landing is also part of the work.