Going back to the system to discover the cause of errors
Feb 20, 2014, by Laura Eggertson
When a patient is inadvertently harmed, finding someone to blame is a common response. Rarely, however, is an adverse event the result of just one person’s mistake. Health-care professionals operate within systems that include the physical setting, the organization, other team members, and equipment and technology.
Studying the way humans interact within these systems, and how this interaction affects health and safety, is a discipline known as human factors. The discipline rejects the notion that humans are primarily at fault when they make errors using equipment and other technologies. “They are invented by humans and should be easy for humans to use,” explains Joseph Cafazzo, an associate professor in the University of Toronto’s faculty of medicine who conducts human factors research. “If the user makes an error, it’s usually the result of a design flaw.”
An estimated 185,000 adverse events occur in Canadian hospitals every year. The study of human factors is important to our understanding of why adverse events occur and how we can prevent them. “Nurses may have heard the term, but they may not understand how human factors affect their work and the safety of their patients,” says RN Kathy Momtahan, the corporate nursing research lead at the Ottawa Hospital and a human factors professional. She gives the example of an overdose of medication delivered to a patient through an infusion pump. “From the original prescription to delivery of the medication, there are many potential opportunities for system failures, including the design of the infusion pumps and the computerized systems that interact with them.”
Mitigating interruptions reduces medication errors
After observing oncology nurses as they administered critical chemotherapy medication, Toronto’s Healthcare Human Factors group designed a series of interventions to mitigate the effects of interruptions. “Do not interrupt” signs were posted on equipment and in key areas where medication was administered, and signage was attached to infusion pumps to remind nurses to check connections, clamps and programming. Verification booths were installed to isolate nurses from interruptions and noise while they performed critical medication checks. A standardized process for verifying drugs before administration was implemented to help ensure accuracy. Nurses conducting timed tasks, such as delivering intravenous push medications, were given a timer that displayed elapsed time to help them stay on track. When tested in a high-fidelity simulation lab, these interventions were found to significantly reduce errors in pump programming, intravenous drug push rates, push volumes and the volume settings of ambulatory pumps.
Consider the following scenario: A patient with cancer is scheduled to receive radiation therapy. When the medical physicist programs the patient’s treatment plan into the linear accelerator, the machine delivering the radiation, an error message displays on the screen; however, the message is not informative, and the physicist believes the entire treatment plan has been saved — in fact, the machine has saved only part of the plan. The radiation therapist administering the treatment has been trained to focus on the patient and does not see, despite on-screen indicators, that the accelerator is delivering many times the intended dose of radiation. As a result, the patient is severely burned and later dies. The hospital finds the radiation therapist at fault.
“Classically, when we focus on the mistakes people make, we immediately jump to culpability — whether or not they are competent at what they do,” says Cafazzo. “Examining the error from a human factors perspective means we go back to the system to determine the cause.” In the case of the improperly programmed linear accelerator, responsibility would be apportioned not to the radiation therapist or the medical physicist but to the technology that produced the unclear error message and failed to communicate the consequences of the programming error to the therapist during the treatment.
“The human factors lens assumes that people are fallible and that we are going to make mistakes, no matter how well-qualified and trained we are,” says Cafazzo. “We work within systems that can elicit human errors. They are not necessarily an indication of incompetency. The question is, how do you minimize them?”
Finding ways to minimize human error within our systems is the focus of human factors professionals, who examine the way health-care professionals interact with their colleagues, their organization, and the tools, devices, interfaces and processes that are part of their job. These professionals work closely with users and incorporate the psychology of human behaviour and group dynamics to improve technologies, devices and processes so they are intuitive and easy to use.
Cafazzo is executive director of Healthcare Human Factors (HHF), a group of close to 40 engineers, designers, psychologists, researchers and computer scientists who test and evaluate new and existing medical devices, software and information technologies, from infusion pumps to CT scanners and syringes. Located at Toronto General Hospital within Toronto’s University Health Network (UHN), the group works both for UHN’s four hospitals and for external clients in the public and private sectors, including other hospitals. Team members observe the way health-care professionals communicate and collaborate in real-life settings such as operating rooms and radiation suites, and then find the design flaws, engineering errors and workflow inefficiencies that can lead to patient harm as surely as a missed blood clot or an incorrect diagnosis.
How safe is your work environment for your patients?
Researcher Kathy Momtahan suggests that nurses consider the following questions to assess, from a human factors perspective, whether their environment promotes patient safety:
- Are you working with infusion pumps and other equipment made by different manufacturers? Having to learn and remember different operating instructions for the same type of machine can lead to confusion and increased risk of error.
- Do you have to open and close multiple screens or applications to complete your task or use many steps to enter or retrieve information? Each individual piece of software may be well designed, but lack of consistency in the overall system design may lead to errors.
- Do you have smart computer systems, infusion pumps or other equipment that warn of problems or flag inaccuracies?
- Do you have to multi-task all the time?
- Are you frequently interrupted in your tasks?
There is growing appreciation within health-care organizations of the important role human factors play in ensuring patient safety, says RN Theresa Fillatre, senior policy advisor for the Canadian Patient Safety Institute (CPSI). “There has been a gradual shift from the culture of blame to a culture of learning from patient safety events where a health-care intervention has gone wrong. People are more open to reporting near misses and errors and are more likely to engage in processes to explore root causes, and to use that learning to make system improvements.” Asking what the patient issue is, listening to what team members say, and then talking about solutions are key to patient safety, says Fillatre, citing the “Ask. Listen. Talk.” mantra that underpins the work of CPSI.
HHF has organized workshops for nurses featuring scenarios in which extremely competent individuals make what Cafazzo calls “seemingly egregious errors in medication administration.” The team then demonstrates that the error is not the nurses’ fault but rather the result, for example, of a badly designed piece of equipment or a poorly worded medical label. The group trains nurses to look objectively at how their workflow, the technology they use, or other factors in their work environment led to the error.
Cafazzo says that fatigue, the demands of multi-tasking, multiple interruptions and poor lighting conditions impair the ability of health-care professionals to do their jobs. “Designers need to assume that the people who are programming the machines may be tired and distracted and that they may have been trained on the equipment only once, six months previously, before they’re ever expected to use it.”
Few other organizations have the resources to devote as much attention to human factors as UHN, which Cafazzo says has the biggest team in the world dedicated specifically to human factors in health care. But there are steps individual health-care professionals and their organizations can take to improve safety.
One step is to recognize how important patients are in the process, explains Deborah Chan, an engineer with HHF. Mobile applications for patients are among the software projects that the group designs and evaluates to ensure these systems are intuitive to use. “Empowering patients by giving them information to manage their own health is a change in the way the health-care industry operates,” says Chan.
Resources for further reading
The Human Factor: Revolutionizing the Way We Live with Technology, by Kim Vicente. Available from amazon.ca
“Applying Engineering Principles to Medication Safety.” Available from the Institute for Safe Medication Practices Canada
Canadian Incident Analysis Framework. Available from the Canadian Patient Safety Institute
Human Factors in Patient Safety: Review of Topics and Tools. Available from the World Health Organization
Another step is to encourage the use of incident reporting systems to capture hazards as well as adverse events. Within UHN, for example, nurses working in blood banks identified poor lighting that made it difficult to read labels as a potential hazard that could have resulted in a patient getting the wrong blood type. On another occasion, nurses in Mount Sinai Hospital’s neonatal intensive care unit raised concerns that recently procured heparinized syringes for arterial blood gas (ABG) sampling looked too similar to standard syringes, after an incident in which an ABG syringe was nearly prepared for an injection by mistake. A simulation study conducted by HHF to test staff members’ ability to distinguish between the syringes confirmed the problem. The manufacturer of the heparin-coated syringe was alerted, the ABG syringes were removed from the unit, and a potentially fatal error was averted.
Cafazzo points out that institutions should make equipment and technology procurement decisions based not just on cost but also on the need to avoid errors, and should incorporate the input of those who use the equipment. User input is also needed for workflow and process design, he adds. “It doesn’t take a lot of resources to redesign processes and forms and labels.”
Laura Eggertson is a freelance journalist in Ottawa, Ont.