A patient safety audience hears how process improvement techniques from outside industries might fix a process breakdown before, rather than after, a wrong-drug error.

Guest Commentator 

  • Steven Spear, DBA, MS, MS; MIT; Cambridge, MA

Transcript

The following case abstract is based on a true story. Names and some details were changed to protect identities.

Mrs. Grant had successful cardiac bypass surgery and was recovering in the intensive care unit. At 8:15 a.m. two days post-op, the day nurse, just starting her rounds, discovered Mrs. Grant suffering a full-body seizure. A code was called, and Mrs. Grant was taken to Radiology to rule out a neurological source for the seizure. The radiological examination proved negative, but a blood test drawn during the code revealed an undetectable serum glucose level. Efforts to raise her blood sugar were unsuccessful. Mrs. Grant fell into a coma and died following withdrawal of life support.

This case was related at a patient safety conference in Palo Alto, California, co-sponsored by Stanford University Medical Center and Harvard's CRICO/RMF. To help analyze the case, the speaker applied process improvement techniques from non-healthcare industries.

Steven Spear is Senior Lecturer at MIT, and a Senior Fellow at the Institute for Healthcare Improvement. Spear has written extensively about how exceptional organizations create competitive advantage through the strength of their internal operations.

After describing the case, Spear explained that a traditional root cause analysis was done at the hospital. The investigation concluded that the night nurse, responding to an alarm, accidentally gave Mrs. Grant insulin instead of heparin: the two drugs were in similar-looking vials stored next to each other on the cart.

According to Spear, an effective process breakdown analysis would go beyond blaming the nurse or the pharmacy. In fact, an organization structured to fix problems before they hurt patients would have turned near misses into process improvements that might have prevented Mrs. Grant's death.

[Spear]
Well, what really happened here is that the nurse was set up to fail. He was put in a situation where, at some point, tired and in a rush, with all these human factors, environmental factors, and design factors at play (similar-looking things in the same location, used in dim light by a tired person), it can be easy to make a mistake.

So you start going through, trying to point the finger of blame, and it's going around. It's going around. Who killed Mrs. Grant? Well, at the first level, literally, the nurse did. He killed the patient. Let me ask you this question: are you satisfied with that answer? Someone raise their hand if they are dissatisfied with that answer. All right, you're not satisfied, all right... So for those who couldn't hear in the back: you can't look at the last step in a whole chain of events, when the whole chain is corrupt, and blame the last step. And we do that. We do that not just in health care; we do it societally, and it's not fair to do. So now the finger of blame is going around, and you say, well, what are the other links in the chain? One of the links, of course, is the presentation of the medication.

Go through this. Wait a second, the nurse did the right thing, right? He was in the correct location, responded to the alarm appropriately, and thought he was treating the patient as merited. Pharmacy did the right thing in terms of packaging and presentation. Yet we have a dead patient. The nurse did the right thing, the pharmacy did the right thing, and we have a dead patient. How do you reconcile those two?

Here's the wrinkle. This gets down to one of the first reasons why we have such an extraordinary gap between the promise of health care, of medical science, technology, training, and employment, and the actual delivery on that promise. Pharmacy did its work relative to the standards of pharmacy. Nursing did its work relative to the standards of nursing. The problem was, pharmacy didn't do its work relative to the needs of nursing.

Said more generally, a common failure mode in very complex systems (and this is not even a very complex one) is that people do their work organized within a function, adhering to the standards of that function, discipline, or specialty, without very clear insight into the needs of the function, discipline, specialty, or service they are serving.

A common failure mode in organizations responsible for pulling together the deep, deep knowledge of people across many disciplines in order to create value (this is true in health care, and it is true outside of health care) is that they manage the pieces not in service to the process, not in service to the service line, but within their own isolated domains. That's one failure mode. It's a solvable failure mode. It's not necessarily easy, but it is solvable.

Let's say this hospital wasn't organized in terms of a department of pharmacy and a department of nursing. Instead, there was a process called medication administration, and there was actually somebody who owned that process: from when a doctor first examined a patient and wrote a script, through the transmission of that script to the pharmacy, the verification by a pharmacist that the script was appropriate, the dispensing of meds by a tech, the checking by a pharmacist, the delivery by a delivery tech back to the nursing unit, and the administration of that medication by a nurse. Let's say someone actually owned the process start to finish.

The inherent problem any time you try to design anything complex is that there are things you're going to get wrong. It's just the nature of the beast. When you have the work of many people across many disciplines, and in the case of health care across many shifts, no matter what system or process you design, there will be things wrong with it. And so it gets to the question: what happens when people discover things wrong?

This gets to the second failure mode. So, we have a process called medication administration, and we are actually managing people in service to the process, not just within their function. What happens then? A colleague of mine, Anita Tucker, tracked nurses through 300 hours of observation. What she found was that nurses ran into problems 5 or 10 times a shift; over a 12-hour shift, that's essentially every hour or two, every few tasks.

So what does a problem look like? A problem looks like a nurse who goes to give a medication and can't find the medication. A nurse who goes to adjust the anesthesia in one of the patient-controlled pumps and doesn't have the key, or hasn't been given the day's code, to get access. Who goes into a room with contact precautions and doesn't have gowns or gloves.

The question is, what does the nurse do? …There's all this warning that the system is not operating properly, and nurses are constantly encountering that warning. Well, here's the problem. Anita Tucker, who did these observations, tallied these operational failures, as she called them (you don't have the right thing in the right place at the right time to do your work successfully), and what happened when the nurses encountered them. Ninety percent of the time, when they had an operational failure, they found a way to work around it. If they have to give a medication they don't have, they call the pharmacy. If they need a test result they don't have, they call the lab. If they have to observe contact precautions and there are no gloves or gowns, either they say, son of a gun, no gloves, no gowns again, and go find them, or they do hand hygiene as best as possible and then lean over to avoid brushing the patient.

You ask yourself the question: why would a nurse work around a problem like that and not do the “right thing” and call attention to it? Well, this gets down to behavior characteristic of broken systems: people who find themselves at the moment of operational failure face a dilemma. And I want to be absolutely clear about this. They are not lazy, they are not uncaring, they are not stupid, they are not uneducated, they are not shirking. They face a dilemma, and the nature of a dilemma is really quite stark: I can do the right thing A, or I can do the right thing B, but I can't do both A and B. Ninety percent of the time the nurses work around the problem on their own. Ten percent of the time they ask for help to work around the problem.

Here's the fundamental problem with that: the system in which they are working, the processes for which they are responsible, are broken. They obviously are, because the right thing is not in the right place at the right time for the person who needs it to be successful in his or her work.

What they're doing is suppressing that signal, and it is a very, very important signal: something is broken here, something that is fixable. But instead of elevating that signal, recognizing that signal, respecting that signal as an opportunity to do better, it gets ignored. It gets suppressed. It gets squashed. And so people continue to do their work.

Coming back to Mrs. Grant's case, it's not as if, by putting down the vial of insulin and picking up the heparin (or putting down the vial of heparin and picking up the insulin, or whatever else you needed), the conditions that made it easy to confuse the two disappeared. The conditions are still there. The factors are still there. The only thing is, you didn't step on the land mine; you left it there for somebody else.

I had a student who had done a tour of duty in Iraq. You know, we're trying to impress on students the importance of calling out the little stuff, because the little stuff is an indication of vulnerability, an indication of potential harm down the road. And someone was really skeptical about this: well, who has the time to do all this?

This guy volunteers and says, well, wait a second. I used to lead combat patrols up and down the roads outside of Baghdad, and the problem is, we might be going up and down on a patrol and, depending on the time of day or the urgency of the mission we're going to or coming from, we might actually see an insurgent planting one of these improvised explosive devices by the roadside. If we have time, we are going to stop, stop the guy, and disarm the device. But sometimes we can't, because, again, of this dilemma problem.

Then he went through it. He said, well, what am I supposed to do when I get back to base? And of course every single student in the room raised their hand and said, of course you need to tell your commander, and someone else has to go out and disarm the device, because the cost of disarming the device is going to be far less than if it goes off as it is designed to. All right, that's the right answer. Then he gets to the rhetorical question: wouldn't it just be easier to tell everyone to be careful? Wouldn't it just be easier to pretend it's not there? Of course not. Of course it wouldn't be.

So the ethos within health care is exactly the right thing: see a problem, contain a problem, solve a problem so the problem doesn't recur. Yet the behavior within health care around systems problems is exactly the opposite of that. So at this point we have talked a lot about this huge gap and about how systems fail: a structural problem, and a learning and dynamic problem.

I promised to end on a very optimistic note: it is a solvable problem. There is enormous potential in the health care system, and there is great disappointment in how it actually operates. That disappointment is entirely unnecessary. What these folks did, what some of the hospitals I'm familiar with in Boston, Seattle, New York, and elsewhere have done, and what I'm sure many of you are doing, is closing the gap one missing gown at a time.
