Have you ever walked up to a big door and pulled when you should have pushed? Have you done this despite the sign on the door telling you specifically how it works? Have you even pulled repeatedly on the same door when it won’t open? I sure have…
Dr. Terry Fairbanks tells this story of some door-watching he did at his local bagel shop while he sat at a table waiting for his wife.
“I watched person after person pulling on the shop door despite the PUSH sign. But if this were healthcare, we’d put a policy in place, make a policy binder, and put it on the nurses’ shelf!
“But it’s not about policy, it’s about changing the door!”
The reason we pull when we are clearly being told to push, he explains, is that we’re in skills-based automatic mode. This is when most errors happen. We’re not consciously thinking.
“What’s happening is that the brain takes unconscious visual cues from the eyes. We can see a pull handle, but in automatic mode, our brain uses past experience, so we PUSH on that pull handle.”
Dr. Fairbanks is an Emergency physician, Professor of Emergency Medicine at Georgetown University and Founding Director of the National Center for Human Factors in Healthcare.
In fact, he started his professional career in the field of human factors engineering and systems safety – long before he went to medical school.
This means that, unlike his physician colleagues (and essentially all hospital administration decision-makers, too), Dr. Fairbanks was first known as an expert in human performance, and specifically in how to change systems to make that performance safer by addressing human error.
But health care, he warns, does something that other complex high-risk industries don’t do anymore when it comes to errors:
“We keep focusing on individual performance. We need to change our focus to reducing harm, not just reducing error.”
Reducing harm by addressing misdiagnosis is a personal interest of mine, ever since I was misdiagnosed with acid reflux in mid-heart attack.
Despite textbook Hollywood Heart Attack symptoms of central chest pain, nausea, sweating and pain down my left arm, I was sent home from the Emergency Department with instructions to try antacid meds. I believed that doctor who had so confidently misdiagnosed me, so when my symptoms returned (which of course they did!) I was too embarrassed to return to Emergency – for just a little indigestion. It was only when the symptoms became truly unbearable that I finally forced myself to return to the same hospital, this time to a different doctor, a correct diagnosis and immediate emergency procedures.
But I’d bet my next squirt of nitro spray that at no point did anybody in that hospital report my initial misdiagnosis once it was confirmed during that second admission – and that means nothing was learned from my experience.
It was not discussed at rounds, by hospital administrators, in medical school classrooms. Future patients like me were not protected from being misdiagnosed themselves, because nothing was learned from my case. That’s because the medical profession does not require mandatory reporting of diagnostic error, and that’s because of a systemic fear of aggressive, punitive consequences for clinicians. No wonder physicians holler “Doctor bashing!!” every time I write about yet another study suggesting that female heart patients are more likely to be misdiagnosed compared to our male counterparts.
But how can we address a patient safety issue that we don’t track – and are not even allowed to talk about?
Dr. Fairbanks told a South Carolina Patient Safety Symposium audience:
“The first lesson in life we learn from our parents was, ‘Everybody makes mistakes’.
“They didn’t say, ‘Everybody except healthcare providers makes mistakes.’
“Take well-trained good people (which is almost all of us) and understand that they are still going to make mistakes, over and over again, throughout their careers.
“But hospitals keep churning out policies to try to get individuals in their hospitals to stop making mistakes, despite what their own parents taught them.
“If we really want health care to become safer, we need to stop trying to reduce error and actually embrace it.
“We keep beating our heads against the wall thinking that if we can just fix that one doctor, or that one nurse, or make that one group compliant and change their behaviour, then we can become a safe organization – which is completely inconsistent with safety science.
“Safety science is not a religion. It’s a very basic foundation by which other high risk industries are becoming safer.”
But healthcare, he warns, has not yet embraced safety science.
Human factors engineering is not about redesigning humans, or telling the operator to read the operating manual. It means designing for the reality of the workplace environment.
One example of a hospital environment that is designed to cause errors is the case of a nurse who gives medications to patients earlier or later than ordered:
“This happens because in the 11 a.m.-1 p.m. time frame required to give noon meds, for example, the nurse cannot possibly do it. From an industrial design perspective, the nurse is being forced to find a shortcut.”
During his presentation, Dr. Fairbanks demonstrated accidentally hitting the power button on his slide projector – which, he says, is a predictable and common user error, even when you didn’t mean to shut off the power.
But in the consumer electronics industry, power buttons have a built-in protection against predictable error (e.g. the pop-up message, “Are you sure you want to shut down now?”).
Why, he asks his audience, doesn’t healthcare do this too?
“Instead, the focus when you’re blaming individuals instead of bad design is to train all the potential users, or put a label on the unit, or do an in-service training for all staff.
“In healthcare, we think we’re going to fix the problem by training people, instead of fixing the problem by redesigning the system.
“Our basic premise is that the answer to safety is having people never make errors – which is why we’re not given slide projectors!”
He points to drug labels as an area of potentially serious medical error that could be addressed through basic safety science:
“Why is the same vial that looks the same/feels the same/is the same being used for drugs that can be pushed through the IV when drawn up AND also for drugs that will kill somebody if they’re pushed through the IV and should only be dripped?
“Instead, we have a culture of accountability so we tell the nurses not to give the wrong drug.”
Dr. Fairbanks points to an example from the aviation industry.
Now, even as I type that word “aviation”, I can picture the mass eye-rolling going on out there if you’re a doctor.
Many physicians don’t like having problems in their profession compared to problems in other professions. Dr. Fairbanks, a human factors engineer-turned-physician, is well aware of this understandable response:
“In the 1970s, when we were trying to reduce human error in aviation and we were using the nuclear industry as our model, aviation experts told us, ‘It’s different – we’re not the same, you can’t apply this stuff to us!’”
Well, it’s turned out that both industries learned from each other’s experience in reducing harm.
Dr. Fairbanks recounts a story to his audience about his early career in safety science, specifically citing a study which found that pilots and air traffic controllers make an astonishing average of two errors per hour on a typical shift:
“Yet flying on a major airline’s plane is just about the safest thing you can do.
“If you want to stay safe in the next week, get into a commercial airliner and just keep flying back and forth. Think about this next time you’re feeling nervous on a flight, one of 30,000 flights a day carrying 809 million passengers a year.”
What is the most dangerous part of a flight? Some might guess takeoff and landing. Dr. Fairbanks, however, recalls overhearing a fellow passenger on a recent flight who called home just as their plane arrived at the terminal.
“I heard him tell his wife: ‘I’m safe, honey – I’ve landed.’
“I wanted to shake him and say, ‘You still have your drive home! That’s the most dangerous part of your trip!'”
Since everybody makes mistakes (like those pilots and air traffic controllers, for example), how can flying possibly be as safe as it actually is? Isn’t the goal to eliminate human error?
No, says Dr. Fairbanks.
“Human error cannot be eliminated. Trying to do this causes a culture of blame and secrecy (or, the ‘name, blame, shame and train’ mentality).”
Trying to eliminate human error is what Dr. Fairbanks calls “a futile goal that misdirects focus and resources”. The goal is to reduce harm.
First, here’s a warning about what does NOT work in achieving that goal. In the words of physician, patient safety activist and Harvard professor Dr. Lucian Leape:
“The single greatest impediment to error prevention in medicine is that we punish people for making mistakes.”
And Dr. Fairbanks adds:
“I can’t tell you how many times I’ve arrived at the scene of an adverse event, and a hospital nurse has already been walked down to the HR office and been suspended by the time we arrive to review the incident.
“Neither ‘normal error’ (mostly skills-based errors) nor ‘at risk’ behaviours deserve punishment. Only reckless, intentional behaviour should be addressed with sanctions.”
Watch Dr. Fairbanks’ conference presentation (about 50 entertaining minutes) on lessons we can learn from safety science and human factors engineering.
NOTE FROM CAROLYN: I wrote more about medical errors in my book A Woman’s Guide to Living with Heart Disease (Johns Hopkins University Press, 2017). You can ask for this book at your local bookshop, or order it online (paperback, hardcover or e-book) at Amazon – or order it directly from Johns Hopkins University Press (use their code HTWN to save 20% off the list price when you order).
Q: Do you agree with Dr. Fairbanks that human error cannot be eliminated?