Summary & Insights
The tragic death of a nine-year-old girl from leukemia was not caused by the disease, but by a cascade of medical errors: a hospital-acquired infection, a dismissive “anxious” label attached to a frightened child, and a systemic refusal to listen to her mother’s pleas. This story, from patient safety advocate Carole Hemmelgarn, opens a profound exploration of failure in healthcare and beyond, revealing how blame often obscures systemic causes and how a fear of failure can prevent life-saving learning. The conversation expands into a broader framework for understanding failure itself, courtesy of Harvard’s Amy Edmondson, who presents a spectrum from “blameworthy” sabotage to “praiseworthy” experimentation. Through cases like the criminal prosecution of a nurse who made a fatal medication error and the multi-billion-dollar collapse of the UK’s National Health Service IT upgrade, the episode argues that demonizing individual failure only drives mistakes underground. In contrast, it champions the intentional, iterative failures of scientific experimentation, exemplified by MIT’s Bob Langer, whose hundreds of failed experiments paved the way for groundbreaking drug-delivery systems.
The discussion makes it clear that not all failures are created equal. Edmondson’s spectrum provides a crucial vocabulary, categorizing failures as sabotage, inattention, inability, task challenge, uncertainty, or experimentation. This reframes many perceived personal failings as symptoms of flawed systems, inadequate training, or impossible complexity. In healthcare, this systemic view is vital, as preventable errors remain a staggering cause of death despite decades of awareness. The episode ultimately calls for a cultural shift: from a punitive, fear-based model that breeds silence to a transparent, learning-oriented one that embraces intelligent risk and values the lessons hidden within failure.
Surprising Insights
- Criminalizing a nurse’s fatal medication error is seen by safety advocates as counterproductive, as it creates a culture of silence that prevents systemic fixes rather than promoting accountability.
- A respected framework for analyzing failure places “experimentation” (like a scientist’s lab trial) on the same spectrum as “inattention,” but at the opposite, praiseworthy end—challenging the blanket notion that all failure is bad.
- Despite a landmark 1999 report highlighting rampant preventable deaths in hospitals, a patient safety expert notes that meaningful change has been “not fast enough,” with efforts often disappearing when individual champions leave an organization.
- The staggering £20 billion failure of the UK’s NHS IT project is analyzed not as a simple technical flop, but as a complex cultural and managerial disaster stemming from top-down haste and a failure to engage the frontline medical staff who were meant to use the system.
- Pioneering scientist Bob Langer recalls that his initial, groundbreaking work was rejected by grant committees and academic employers not because the science was wrong, but because it defiantly crossed the entrenched disciplinary boundaries between engineering and biology.
Practical Takeaways
- Listen to Patients and Families: In healthcare and beyond, those closest to a problem (like a patient’s family) often hold critical information; creating systems that genuinely listen to their concerns can prevent catastrophic errors.
- Analyze Systems, Not Just People: When a failure occurs, resist the instinct to simply blame an individual. Instead, ask what underlying factors—fatiguing schedules, confusing labels, unclear protocols, or lack of training—contributed to the error.
- Categorize to Learn: Use a framework like Amy Edmondson’s failure spectrum to diagnose the type of failure you’re facing. Was it truly blameworthy sabotage, or a praiseworthy experiment? The appropriate response depends entirely on this diagnosis.
- Celebrate Intelligent Experimentation: In research, innovation, and even personal growth, normalize and value small, well-designed experiments that yield “failures.” These are not wastes of time but essential steps that provide data and narrow the path to success.
- Demand Transparency After Harm: If you are harmed by an institutional failure, advocate for full transparency and a rigorous analysis focused on learning. As Carole Hemmelgarn’s story shows, the cover-up and silence after the initial error often cause more lasting damage than the error itself.
In medicine, failure can be catastrophic. It can also produce discoveries that save millions of lives. Tales from the front line, the lab, and the I.T. department.
RESOURCES:
- Right Kind of Wrong: The Science of Failing Well, by Amy Edmondson (2023).
- “Reconsidering the Application of Systems Thinking in Healthcare: The RaDonda Vaught Case,” by Connor Lusk, Elise DeForest, Gabriel Segarra, David M. Neyens, James H. Abernathy III, and Ken Catchpole (British Journal of Anaesthesia, 2022).
- “Dispelling the Myth That Organizations Learn From Failure,” by Jeffrey Ray (SSRN, 2016).
- “A New, Evidence-Based Estimate of Patient Harms Associated With Hospital Care,” by John T. James (Journal of Patient Safety, 2013).
- To Err is Human: Building a Safer Health System, by the National Academy of Sciences (1999).
- “Polymers for the Sustained Release of Proteins and Other Macromolecules,” by Robert Langer and Judah Folkman (Nature, 1976).
EXTRAS:
- “How to Succeed at Failing,” series by Freakonomics Radio (2023).
- “Will a Covid-19 Vaccine Change the Future of Medical Research?” by Freakonomics Radio (2020).
- “Bad Medicine, Part 3: Death by Diagnosis,” by Freakonomics Radio (2016).
SOURCES:
- Amy Edmondson, professor of leadership and management at Harvard Business School.
- Carole Hemmelgarn, co-founder of Patients for Patient Safety U.S. and director of the Clinical Quality, Safety & Leadership Master’s program at Georgetown University.
- Gary Klein, cognitive psychologist and pioneer in the field of naturalistic decision making.
- Robert Langer, Institute Professor and head of the Langer Lab at the Massachusetts Institute of Technology.
- John Van Reenen, professor at the London School of Economics.