The science of safety – and your local hospital

by Carolyn Thomas    @HeartSisters

Have you ever walked up to a big door and pulled when you should have pushed? Have you done this despite the sign on the door telling you specifically how it works? Have you even pulled repeatedly on the same door when it won’t open?  I sure have. . .

Dr. Terry Fairbanks tells this story of some door-watching he did at his local bagel shop while he sat at a table waiting for his wife.

  “I watched person after person pulling on the shop door despite the PUSH sign. But if this were healthcare, we’d put a policy in place, make a policy binder, and put it on the nurses’ shelf!

“But it’s not about policy, it’s about changing the door!”

The reason we pull when we are clearly being told to push, he explains, is that we’re in skills-based automatic mode. This is when most errors happen. We’re not consciously thinking.

“What’s happening is that the brain takes unconscious visual cues from the eyes. We can see a pull handle, but in automatic mode, our brain uses past experience, so we PUSH on that pull handle.”

Dr. Fairbanks is an Emergency physician, Professor of Emergency Medicine at Georgetown University and Founding Director of the National Center for Human Factors in Healthcare.

In fact, he started his professional career in the field of human factors engineering and systems safety – long before he went to medical school.

This means that, unlike his physician colleagues (and essentially all hospital administration decision-makers, too), Dr. Fairbanks was first known as an expert in human performance, and specifically in how to change systems to make that performance safer by addressing human error.

But health care, he warns, does something that other complex high-risk industries don’t do anymore when it comes to errors:

   “We keep focusing on individual performance. We need to change our focus to reducing harm, not just reducing error.” 

Reducing harm by addressing misdiagnosis is a personal interest of mine, ever since I was misdiagnosed with acid reflux in mid-heart attack.

Despite textbook Hollywood Heart Attack symptoms of central chest pain, nausea, sweating and pain down my left arm, I was sent home from the Emergency Department with instructions to try antacid meds. I believed that doctor who had so confidently misdiagnosed me, so when my symptoms returned (which of course they did!) I was too embarrassed to return to Emergency – for just a little indigestion. It was only when the symptoms became truly unbearable that I finally forced myself to return to the same hospital, this time to a different doctor, a correct diagnosis and immediate emergency procedures.

But I’d bet my next squirt of nitro spray that at no point did anybody in that hospital report my initial misdiagnosis once it was confirmed during that second admission – and that means nothing was learned from my experience.

It was not discussed at rounds, by hospital administrators, or in medical school classrooms. Future patients like me were not protected from being misdiagnosed themselves, because nothing was learned from my case. That’s because the medical profession does not require mandatory reporting of diagnostic error, and that’s because of a systemic fear of aggressive, punitive consequences for clinicians. No wonder physicians holler “Doctor bashing!!” every time I write about yet another study suggesting that female heart patients are more likely to be misdiagnosed than our male counterparts.

But how can we address a patient safety issue that we don’t track – and are not even allowed to talk about?

Dr. Fairbanks told a South Carolina Patient Safety Symposium audience:

“The first lesson in life we learn from our parents was, ‘Everybody makes mistakes.’

“They didn’t say, ‘Everybody except healthcare providers makes mistakes.’

“Take well-trained good people (which is almost all of us) and understand that they are still going to make mistakes, over and over again, throughout their careers.

“But hospitals keep relying on policies to try to get the individuals who work in them to stop making mistakes, despite what their own parents taught them.

“If we really want health care to become safer, we need to stop trying to reduce error and actually embrace it.

“We keep beating our heads against the wall thinking that if we can just fix that one doctor, or that one nurse, or make that one group compliant and change their behaviour, then we can become a safe organization – which is completely inconsistent with safety science.

“Safety science is not a religion. It’s a very basic foundation by which other high risk industries are becoming safer.”

But healthcare, he warns, has not yet embraced safety science. 

Human factors engineering is not about redesigning humans, or telling the operator to read the operating manual. It means designing for the reality of the workplace environment.

One example of a hospital environment that is designed to cause errors is the case of a nurse who gives medications to patients earlier or later than ordered:

“This happens because, in the 11 a.m. to 1 p.m. window allowed for giving noon meds, for example, the nurse cannot possibly get them all done on time. From an industrial design perspective, the nurse is being forced to find a shortcut.”

During his presentation, Dr. Fairbanks demonstrated accidentally hitting the power button on his slide projector, which he says is a predictable and common user error, even when you didn’t mean to shut off the power.

But in the consumer electronics industry, power buttons have built-in protection against this predictable error (e.g. that pop-up message: “Are you sure you want to shut down now?”).

Why, he asks his audience, doesn’t healthcare do this too?

“Instead, the focus when you’re blaming individuals instead of bad design is to train all the potential users, or put a label on the unit, or do an in-service training for all staff.

“In healthcare, we think we’re going to fix the problem by training people, instead of fixing the problem by redesigning the system.

“Our basic premise is that the answer to safety is having people never make errors – which is why we’re not given slide projectors!”

He points to drug labels as an area of potentially serious medical error that could be addressed through basic safety science:

    “Why is the same vial that looks the same/feels the same/is the same being used for drugs that can be pushed through the IV when drawn up AND also for drugs that will kill somebody if they’re pushed through the IV and should only be dripped?

“Instead, we have a culture of accountability so we tell the nurses not to give the wrong drug.”

Dr. Fairbanks points to an example from the aviation industry.

Now, even as I type that word “aviation”, I can picture the mass eye-rolling going on out there if you’re a doctor.

Many physicians don’t like having problems in their profession compared to problems in other professions. Dr. Fairbanks, a human factors engineer-turned-physician, is well aware of this understandable response:

“In the 1970s, when we were trying to reduce human error in aviation and we were using the nuclear industry as our model, aviation experts told us, ‘It’s different – we’re not the same, you can’t apply this stuff to us!’”

Well, it’s turned out that both industries learned from each other’s experience in reducing harm.

Dr. Fairbanks recounts to his audience a story from his early career in safety science, citing a study that found pilots and air traffic controllers make an astonishing average of two errors per hour on a typical shift:

  “Yet flying on a major airline’s plane is just about the safest thing you can do.

“If you want to stay safe in the next week, get into a commercial airliner and just keep flying back and forth. Think about this next time you’re feeling nervous on a flight, one of 30,000 flights a day carrying 809 million passengers a year.”

What is the most dangerous part of a flight? Some might guess takeoff and landing. Dr. Fairbanks, however, recalls overhearing a fellow passenger on a recent flight who called home just as their plane arrived at the terminal.

“I heard him tell his wife: ‘I’m safe, honey – I’ve landed.’

“I wanted to shake him and say, ‘You still have your drive home! That’s the most dangerous part of your trip!'”

Since everybody makes mistakes (like those pilots and air traffic controllers, for example), how can flying possibly be as safe as it actually is? Isn’t the goal to eliminate human error?

No, says Dr. Fairbanks.

 “Human error cannot be eliminated. Trying to do this causes a culture of blame and secrecy (or, the ‘name, blame, shame and train’ mentality).”

Trying to eliminate human error is what Dr. Fairbanks calls “a futile goal that misdirects focus and resources”. The goal, instead, is to reduce harm.

First, here’s a warning about what does NOT work in achieving that goal. In the words of physician, patient safety activist and Harvard professor Dr. Lucian Leape:

“The single greatest impediment to error prevention in medicine is that we punish people for making mistakes.”

And Dr. Fairbanks adds:

“I can’t tell you how many times a hospital nurse has already been walked down to the HR office and suspended by the time we arrive at the scene of an adverse event to review the incident.

“Neither ‘normal error’ (mostly skills-based errors) nor ‘at risk’ behaviours deserve punishment. Only reckless, intentional behaviour should be addressed with sanctions.”

Watch Dr. Fairbanks’ conference presentation (about 50 entertaining minutes) on lessons we can learn from safety science and human factors engineering.

NOTE FROM CAROLYN: I wrote more about medical errors in my book A Woman’s Guide to Living with Heart Disease (Johns Hopkins University Press). You can ask for this book at your local bookshop, or order it online (paperback, hardcover or e-book) at Amazon  – or order it directly from Johns Hopkins University Press (use their code HTWN to save 30% off the list price when you order).

Q: Do you agree with Dr. Fairbanks that human error cannot be eliminated?

See also:

Ten medication mistakes that can kill

Cardiac gender bias: we need less TALK and more WALK

Unconscious bias: why women don’t get the same care men do

When you fear being labelled a “difficult” patient

The sad reality of women’s heart disease hits home

How can we get heart patients past the E.R. gatekeepers?

Mandatory reporting of diagnostic errors: “Not the right time?”

9 thoughts on “The science of safety – and your local hospital”

  1. Hi Carolyn, I have first-hand experience of a ‘mistake’ in a hospital surgery, and a big one.

     During open heart surgery to repair an aortic aneurysm, a catheter was sewn to my heart wall. It was discovered when, post-surgery, they began to take out all the catheters and one would not move. They sent me straight down for a CT scan to check it out, and I had to be opened up again to remove it.

     The comment from OR staff was that it had never happened before – a one-in-a-million chance (how I wish I could have run down to purchase a lottery ticket!). I am sure it would be very hard to forgive if the worst had happened, but I remember saying at the time, ‘Mistakes happen, let’s get it fixed.’

     I make mistakes every day, often the same ones – we all do, in every possible situation. You will never eliminate human error. But as Dr. Fairbanks knows, if you design better systems you can lower the odds. I make lists and check them twice 🙂


    1. Hello Lauren – that’s quite the mistake! I can just imagine the consternation of the hospital staff tugging away at a catheter that’s sewn into place by mistake! Ooops…

      That sounds like it could be in that “skills-based automatic mode” mistake category that Dr. Fairbanks mentions, the mistakes we make while we’re on autopilot (like putting stitches in…)

      It’s also what he describes in his presentation (see video link in this post) as a “near miss” mistake: the kind that is never reported, even when it could have been very serious had things not worked out. I’m glad that your experience had a good outcome.


  2. I believe in what Dr. Fairbanks says… that human error cannot be totally eliminated; however, it should be a goal to reduce it to as low a level as humanly possible.

    That goes hand in hand with systems changes, such as changes in the shapes and colors of various medication vials. What I embrace as a nurse and educator is the concept that “punishment” is not the answer in many, if not most, cases of error.

    Nurses and doctors do not go into their profession with the goal of causing harm. But it does happen. What I find to be the greatest harm…. which Carolyn speaks about a lot… is the harm of ignorance and the harm of having an ego so large that one is not amenable to listening, learning and admitting mistakes. These qualities are qualities of character and can’t be legislated but could be taught.

    How is character, listening to patients, valuing every race, sex, creed and age taught in medical school? How is the role of self confidence differentiated from ego or narcissism? How do so many doctors show up in the clinic and hospital unwilling to admit they don’t know everything? Or are many health professionals these days from schools and families that never emphasized character above financial success?

    I have watched a distinct change in the Nursing Profession…. In the US it is one of the few professions you can go into with a 2 year associate degree and make very good money. I have met and taught many students that are going into nursing JUST for that reason. I am appalled … to me Nursing was a calling, a sacred trust, advocating for patients who are ill and can’t advocate for themselves.

    The condition of too few nurses being available has led to shorter schooling and higher pay… to “attract” men and women into the profession. I would imagine that the emphasis on less schooling and more pay might attract more people who are okay with shortcuts and make more errors… just speculating, I don’t have any research.

    Honesty, Integrity, admitting we are fallible, offering education and guided follow up to those who need it but also realizing there is no room for laziness, sloppiness, ego, or short cuts when human life is at stake….

    It’s not like we are baking a pie and if it turns out wrong we can do it over!


    1. Hi Jill – no we are definitely not baking a pie!!

      You raise an interesting point about character: I’m not sure if character can be “taught” in med school like anatomy or other subjects are taught. Maybe better to screen for character during the med school admissions process in the first place…

      I’ve recently been asked to serve as a reference for a young friend who’s applying to med school and it’s been a real eye opener to read the kinds of questions that universities are now asking referees like me about the applicant. They are all about the “character” of the person applying!

      It used to be that medical schools simply picked the top brainiacs in the pool, those in the highest percentile of applicants – with little thought to personality traits like compassion or kindness or even a sense of humour!

      That was before somebody in some ivory tower somewhere figured out that brainiacs don’t necessarily make the best doctors… You also raise some good points about nursing training (although I’m sure nobody wants to go backwards to those ‘good old’ days of longer training and lower pay!) I spent many years working in hospice palliative care alongside my nursing colleagues, and I often used to say (to anybody who would listen!!): “I don’t care how much we’re paying these people – it’s NOT ENOUGH!!”


      1. I have studied and worked with a Spiritual teacher in India who has as one of his missions building schools, elementary through college that graduate students with character.

        By the time a person gets to med school you are right, it can be too late but I do believe in the infinite power of human beings to change no matter what age. I am glad that some med schools may be recognizing other traits besides IQ when screening applicants – YAY!


        1. This is a relatively recent development for med schools in terms of acknowledging that, while it’s important to be smart enough to be a doctor, there are many other traits that are equally important. Personally, over the years I’ve seen a number of older docs (all specialists) who may have been the smartest kids accepted by their medical school class in their day, but seemed to have zero bedside manner…

          The days of looking only at those applicants in the 97th percentile of their class are gone, I hope.


  3. This is a very interesting and informative piece on human error.

    He got me from the start with the push/pull door that I have tried to enter many times. Hopefully next time I’ll read the signs. I need to read more of his work, but I find his understanding of human nature heartwarming. I just wish others were as enlightened and aware of this subject.

    Keep up the good work Carolyn.

