Just a pair of tweets that, if we look at them with an appropriately skewed glance, offer a lot of hope for controlling costs while improving quality in healthcare.
— Chad Terhune (@chadterhune) June 1, 2016
Attitudes in two different industries pic.twitter.com/fwcVLmuLGK
— James (@jameswhatling) May 17, 2016
We all fuck up.
Well-designed systems of learning and error minimization acknowledge that we fuck up and build a culture and processes that minimize common fuck-ups. Aviation has that culture; medicine may have it in isolated pockets, but it is not widespread.
Vanity Fair had a good article on the loss of Air France 447 over the Atlantic in 2009, and it raised a couple of interesting points:
It all depended on the captains. A few were natural team leaders—and their crews acquitted themselves well. Most, however, were Clipper Skippers, whose crews fell into disarray under pressure and made dangerous mistakes. Ruffell Smith published the results in January 1979, in a seminal paper, “NASA Technical Memorandum 78482.” The gist of it was that teamwork matters far more than individual piloting skill. This ran counter to long tradition in aviation but corresponded closely with the findings of another NASA group, which made a careful study of recent accidents and concluded that in almost all cases poor communication in the cockpit was to blame.
The airlines proved receptive to the research. In 1979, NASA held a workshop on the subject in San Francisco, attended by the heads of training departments from around the world. To describe the new approach, Lauber coined a term that caught on. He called it Cockpit Resource Management, or C.R.M., an abbreviation since widened to stand for Crew Resource Management. The idea was to nurture a less authoritarian cockpit culture—one that included a command hierarchy but encouraged a collaborative approach to flying, in which co-pilots (now “first officers”) routinely handled the airplanes and were expected to express their opinions and question their captains if they saw mistakes being made. For their part, the captains were expected to admit to fallibility, seek advice, delegate roles, and fully communicate their plans and thoughts. Part of the package was a new approach to the use of simulators, with less effort spent in honing piloting skills and more emphasis placed on teamwork.
How does this apply to the medical field, and where do we find hope?
The scope example is a common and preventable fuck-up with significant probable harm. If those are the types of errors that we are still failing miserably to avoid, then the medical system's current state is amazingly far from the possibility frontier of perfect execution, or at least airline-level execution. Significant gains can be made fairly easily, since most quality improvement is a Pareto exercise: 80% of the gains can be achieved with 20% of the work needed to reach the possibility frontier.
The chart about teamwork and quality improvement for airline pilots shows a culture that acknowledges its humanity. The medical side of the chart shows a culture of the authoritarian hero. Culture change is often harder to accomplish than technological change, but it tends to be cheaper over the long run. There is an opportunity here for improvement, and that has to be a source of hope.