At first blush, the health-care and nuclear-power industries don't appear to have much in common. But in a unique, two-day workshop in July 2012, leaders from these two industries met to discuss their similarities and differences, including technologies and human factors that affect risk and reliability. The result is a 120-page monograph, "Risk and Reliability in Healthcare and Nuclear Power: Learning from Each Other," recently released by the Association for the Advancement of Medical Instrumentation.
One of the participants in this unusual collaboration was David Gaba, MD, professor of anesthesia and the associate dean for immersive and simulation-based learning at the School of Medicine. He also co-directs the simulation center at the Veterans Affairs Palo Alto Health Care System. Gaba, who is interested in a variety of topics that affect patient safety, spoke with Inside Stanford Medicine writer Ruthann Richter about the work with his colleagues in the nuclear industry.
Q: Nuclear power and health care don't seem to have a natural affinity. What common factors do they share?
Gaba: Both of these industries have human beings operating powerful systems and technologies that ordinarily provide great benefit to humankind but, if they go awry, can harm people. For nuclear power, an accident can, in principle, harm thousands of people and leave a power plant and its surroundings unusable for decades. In health care, accidents, although rare, are more common than in nuclear power, but they typically affect only one patient at a time. Nonetheless, the commonality of operating dangerous systems is substantial. It revolves around how to build organizational and technological systems in which skilled, but fallible, human beings do this work as individuals and teams in ways that achieve very high levels of safety while also achieving high throughput of work. The commonality is particularly strong in comparing the control room of a nuclear plant with arenas like the operating room, intensive care unit or emergency department. In such settings, the "operators" depend on electronic displays from sensors to tell them how things are going, and they may have to respond to adverse events that play out in seconds, minutes and hours rather than in days, weeks, months or years.
Q: In the monograph, you address the issue of human factors and how this can influence outcomes. Can you talk more about this issue?
Gaba: Designing instrumentation and control systems to be simple and intuitive for human beings to use is a challenge in all arenas—aviation, space flight, nuclear power and health care to name a few. We can only attend to so many streams of information at once. Our memory is fallible, especially what is called "prospective memory"—remembering to do something in the future. We cannot always think straight when stress mounts. Human factors—a branch of engineering or applied psychology—is the field that studies such issues and is concerned with how best to provide information or the controls that operators can use to act on the information.
Nowadays human factors is also concerned with the organizational underpinning of work. People don't exist in a vacuum. They are driven by incentives or disincentives in their organization. Their ability to communicate with each other, or to work as a team, is influenced by the organizational safety culture and by structural aspects of their work environment and team structure. Many of these factors have similar challenges in both nuclear power and health care.
Q: What are some of the key things that health-care providers can learn from their colleagues in the nuclear industry?
Gaba: One big one is the need for standard operating procedures, where possible, that also retain flexibility as needed. A major spinoff of this principle, used extensively in nuclear power, is to provide graphically enhanced written protocols for emergency situations. It has long been recognized that nuclear power operators cannot remember everything they need to know in managing an adverse event in a nuclear plant; memory is too fallible. Thus, the use of written procedures is a mainstay in this setting. Health care has long depended largely on the individual skill and memory of physicians and nurses. Protocols, checklists and emergency manuals were decried as cheat sheets or cribs. We now know that the best people use these kinds of supports, not because they are stupid but because that is the best way to get the best results in tough situations. My lab and other colleagues at Stanford have been working for some time on written cognitive aids and emergency manuals for anesthesia professionals. These have now been disseminated to all the anesthetizing locations in Stanford's hospitals and those of its close affiliates. This lesson has clearly come from the nuclear industry and from others such as aviation.
Another lesson from the nuclear industry is the importance of the safety culture in an organization. When the organization favors throughput so heavily that people cut corners on safety, or when personnel are afraid to speak up when they see something unsafe, the risk climbs.
Something near and dear to my heart is the utility of simulation for training skilled professionals. My lab's development of simulators and simulation-based curricula in health care was triggered by knowing a little bit about how they are used in aviation and other industries like nuclear power. But I actually had no idea, until this workshop, just how much simulation is required for nuclear power operators. They spend six weeks doing their usual shifts in the control room, and then the seventh week is spent in training simulations. That cycle continues all year round, no matter how much prior experience they have. Health care is just scratching the surface in simulation compared to that, but at least we have started our way down a similar road.
Q: How might patients ultimately benefit from this collaboration?
Gaba: Nuclear power has been incredibly safe in the United States. To my knowledge, no non-worker has ever been killed by the direct effects of nuclear power production. Sadly, there are still on the order of 100,000 possibly preventable deaths per year nationwide in health care due to errors or the sub-optimal response to adverse events. Granted, given the tens of millions of surgical operations and hospital visits—and an even larger ambulatory care endeavor—such catastrophic events are, fortunately, rare. And patients aren't nuclear plants. We can't build into patients the sensors and controls that we would like to. We have to take human beings as they are. No one would ever say that caring for patients is the same as running a nuclear plant. But the nuclear industry in the United States has found a way to do intrinsically dangerous work with an enviable record of safety. As health care strives to get closer and closer to a vision of zero accidents and zero preventable negative outcomes, there are lessons we can learn from the nuclear industry, and others, that we can adapt to our own uses in patient care. We don't have to be "like" nuclear power; we only have to learn useful things from their experience.
Incidentally, the nuclear power participants found plenty of things in health care that they thought were worth considering and adapting into their own safety practices. It's not just a one-way street.
Q: What is your next step?
Gaba: Well, I'm fortunate to have cemented some professional connections with the nuclear power experts and made some new ones. Collectively, I think we know there are areas that still need a lot of work in health care to improve our safety record. The biggest challenges now are really in finding ways to implement the changes that we know should be made, or at least in doing very serious studies of the cost-benefit of some of these changes. It has taken decades for other industries to reach the levels of safety they can now boast about, so it should not be surprising that health care, as incredibly complex an endeavor as it is, will take some time to absorb and implement the lessons it has learned and will learn from our colleagues in other "industries of intrinsic hazard," as I like to call them. Certainly, we have a long road on human factors, safety culture and simulation to achieve the visions that we have for health care. But at least we know where we should be headed.