Summary & Insights

The tragic death of a nine-year-old girl from leukemia was not caused by the disease, but by a cascade of medical errors: a hospital-acquired infection, a dismissive “anxious” label attached to a frightened child, and a systemic refusal to listen to her mother’s pleas. This story, from patient safety advocate Carol Hemmelgarn, opens a profound exploration of failure in healthcare and beyond, revealing how blame often obscures systemic causes and how a fear of failure can prevent life-saving learning. The conversation expands into a broader framework for understanding failure itself, courtesy of Harvard’s Amy Edmondson, who presents a spectrum from “blameworthy” sabotage to “praiseworthy” experimentation. Through cases like the criminal prosecution of a nurse who made a fatal medication error and the multi-billion-dollar collapse of the UK’s National Health Service IT upgrade, the episode argues that demonizing individual failure only drives mistakes underground. In contrast, it champions the intentional, iterative failures of scientific experimentation, exemplified by MIT’s Bob Langer, whose hundreds of failed experiments paved the way for groundbreaking drug-delivery systems.

The discussion makes it clear that not all failures are created equal. Edmondson’s spectrum provides a crucial vocabulary, categorizing failures as sabotage, inattention, inability, task challenge, uncertainty, or experimentation. This reframes many perceived personal failings as symptoms of flawed systems, inadequate training, or impossible complexity. In healthcare, this systemic view is vital, as preventable errors remain a staggering cause of death despite decades of awareness. The episode ultimately calls for a cultural shift: from a punitive, fear-based model that breeds silence to a transparent, learning-oriented one that embraces intelligent risk and values the lessons hidden within failure.

Surprising Insights

  • Criminalizing a nurse’s fatal medication error is seen by safety advocates as counterproductive, as it creates a culture of silence that prevents systemic fixes rather than promoting accountability.
  • A respected framework for analyzing failure places “experimentation” (like a scientist’s lab trial) on the same spectrum as “inattention,” but at the opposite, praiseworthy end—challenging the blanket notion that all failure is bad.
  • Despite a landmark 1999 report highlighting rampant preventable deaths in hospitals, a patient safety expert notes that meaningful change has come “not fast enough,” with efforts often disappearing when individual champions leave an organization.
  • The staggering £20 billion failure of the UK’s NHS IT project is analyzed not as a simple technical flop, but as a complex cultural and managerial disaster stemming from top-down haste and a failure to engage the frontline medical staff who were meant to use the system.
  • Pioneering scientist Bob Langer recalls that his initial, groundbreaking work was rejected by grant committees and academic employers not because the science was wrong, but because it defiantly crossed the entrenched disciplinary boundaries between engineering and biology.

Practical Takeaways

  • Listen to Patients and Families: In healthcare and beyond, those closest to a problem (like a patient’s family) often hold critical information; creating systems that genuinely listen to their concerns can prevent catastrophic errors.
  • Analyze Systems, Not Just People: When a failure occurs, resist the instinct to simply blame an individual. Instead, ask what underlying factors—fatiguing schedules, confusing labels, unclear protocols, or lack of training—contributed to the error.
  • Categorize to Learn: Use a framework like Amy Edmondson’s failure spectrum to diagnose the type of failure you’re facing. Was it truly blameworthy sabotage, or a praiseworthy experiment? The appropriate response depends entirely on this diagnosis.
  • Celebrate Intelligent Experimentation: In research, innovation, and even personal growth, normalize and value small, well-designed experiments that yield “failures.” These are not wastes of time but essential steps that provide data and narrow the path to success.
  • Demand Transparency After Harm: If you are harmed by an institutional failure, advocate for full transparency and a rigorous analysis focused on learning. As Carol Hemmelgarn’s story shows, the cover-up and silence after the initial error often cause more lasting damage than the error itself.

Everyone makes mistakes. How do you learn from them? Lessons from the classroom, the Air Force, and the world’s deadliest infectious disease.


SOURCES:

  • Will Coleman, founder and C.E.O. of Alto.
  • Amy Edmondson, professor of leadership and management at Harvard Business School.
  • Babak Javid, physician-scientist and associate director of the University of California, San Francisco Center for Tuberculosis.
  • Gary Klein, cognitive psychologist and pioneer in the field of naturalistic decision making.
  • Theresa MacPhail, medical anthropologist and associate professor of science & technology studies at the Stevens Institute of Technology.
  • Roy Shalem, lecturer at Tel Aviv University.
  • Samuel West, curator and founder of The Museum of Failure.
