I’ve been slowly reading ‘Thinking, Fast and Slow’, a best-selling book by Daniel Kahneman, a Nobel Prize winning psychologist and economist known for his work on the psychology of judgment and decision-making.
The book summarizes behavioral science research conducted by Kahneman. The recurring theme of the book is what the author defines as two modes of human thought, “System 1” and “System 2”. System 1 is fast, instinctive, and emotional. System 2, as you might guess, is slower, deliberate, and logical. The author spends a lot of time discussing how System 1 and System 2 impact choices and judgment, creating an incredible amount of bias in our decision-making process. The basic thesis is that people really don’t have as much control over their decisions as they believe. According to Kahneman, humans constantly fall into predictable thinking traps, often due to the fast and emotional System 1. The concept is philosophically interesting but practically devastating.[1]
One section of the book that I found particularly compelling, and applicable to pharmacy, was Chapter 13, ‘Availability, Emotion, and Risk’. The chapter goes into great detail about how bias impacts decisions, even when we are presented with facts to the contrary. According to Kahneman, these biases are amplified by media coverage, especially for emotionally charged issues. This creates problems because the average person is driven more by emotion than reason.[2]
This area of the book hit home for me because of my current involvement in a massive project to bring hospital pharmacy cleanrooms into compliance with new USP <797> and <800> guidelines. The entire project is the direct result of decision bias surrounding sterile compounding.
If one were to go back and research the impetus for developing current sterile compounding guidelines, one would be hard pressed to find anything concrete. Current USP Chapters <797> and <800> are based more on “expert opinion” than science. If it were the other way around, one would find more references for specific requirements and fewer reader comments. The guidelines would read more like a journal article and less like a set of best practice recommendations. However, that’s not what we find in these chapters.
While there’s nothing wrong with expert opinion, per se, it should not override common sense. Expert opinions should be considered recommendations and guidance, not requirements and law. The problem with expert opinion is that it can be influenced by emotion, the same as anyone else’s. Emotional decision making can lead to problems. According to Kahneman, “…biased reactions to risks are an important source of erratic and misplaced priorities in public policy. Lawmakers and regulators may be overly responsive to the irrational concerns of citizens, both because of political sensitivity and because they are prone to the same cognitive biases as other citizens”.
Consider this: Kahneman describes something in his book called an availability cascade. An availability cascade is “a self-sustaining chain of events, which may start from media reports … and lead up to public panic and large-scale government action. …[the] emotional reaction becomes a story in itself, prompting additional coverage in the media, which in turn produces greater concern and involvement”. This perfectly describes the current state of sterile compounding regulation in pharmacy.
Remember a little thing in 2012 that caused outrage aimed at pharmacies across the country? For those of you who don’t recall, Google New England Compounding Center (NECC). In summary, people died as the result of fungal meningitis caused by contaminated steroid injections compounded by NECC. It was pretty bad. The incident led to congressional hearings, more power being granted to the FDA, increased scrutiny by regulatory and licensing agencies, and ultimately, more pharmacy regulation. Pharmacy is still reeling from the impact that the NECC tragedy had on the profession.
Here’s the thing: it was a horrible, terrible, no good tragedy, but it had nothing to do with a lack of regulation. The folks at NECC simply chose to ignore best practice and place profit ahead of common sense and standard safety precautions. Lest we forget, USP <797> was already on the books in 2012. Had NECC followed the standard sterile compounding processes in place at the time, we wouldn’t be having this discussion today.
A single event, the NECC tragedy, spurred significant changes without objective analysis. Why? According to Kahneman, we have “a basic limitation in the ability of our mind to deal with small risks: we either ignore them altogether or give them far too much weight… biased reactions to risks are an important source of erratic and misplaced priorities in public policy. Lawmakers and regulators may be overly responsive to the irrational concerns of citizens, both because of political sensitivity and because they are prone to the same cognitive biases as other citizens”. That’s a perfect description of the current sterile compounding climate. Unfortunately, there’s no going back. As one prominent sterile compounding consultant told me, that ship has sailed. The only thing we can do now is be more prudent with decisions moving forward.
Think of the resources that have gone into cleanroom construction, training, and testing since the NECC incident. It’s staggering. Now think of what other pharmacy initiatives could have been implemented with the same resources. Based on risk alone, patient safety would have been better served by putting the same effort toward improving sterile compounding accuracy instead of sterility.
—————————————-
[1] Please read the book. It contains little tests and puzzles that will blow your mind. I thought I had solid command of my thinking. I thought my decisions were logical and well reasoned. I was wrong.
[2] Paul Slovic, a professor of psychology at the University of Oregon and the president of Decision Research, stated that people are “guided by emotion rather than by reason, easily swayed by trivial details, and inadequately sensitive to differences between low and negligibly low probabilities”.