Mistake-Proofing the Design of Health Care Processes

Chapter 1 (continued)

Attributes of Mistake-Proofing

Several attributes of mistake-proofing are presented below. Although this book extols its benefits, mistake-proofing carries liabilities as well as benefits. Knowing what mistake-proofing cannot do, and which liabilities need to be addressed, is just as important as knowing what it can do to reduce errors.

Mistake-Proofing Is Inexpensive

The cost of mistake-proofing devices is often limited to the fixed cost of the initial installation plus minor ongoing calibration and maintenance costs. Shingo's book contains 112 examples.4 He provides the cost (in 1986 U.S. dollars) of each one; their distribution is shown in Table 1.6. The median cost of a device is approximately $100, and 90 percent of the devices cost $1,000 or less. Others26,27 implementing mistake-proofing report similar outcomes. A device's cost per use can be zero, as it is with the 3.5-inch diskette drive. The cost per use can even be negative when the device enables the process to proceed more rapidly than before.

Table 1.6. Implementation cost for Shingo's mistake-proofing examples4

Cost (1986 U.S. Dollars)    Probability    Cumulative Probability
Cost < $25                  25.5%          25.5%
$25 < Cost < $100           29.1%          54.6%
$100 < Cost < $250          23.6%          78.2%
$250 < Cost < $1,000        13.6%          91.8%
Cost > $1,000                8.2%          100.0%
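
To see how the median and percentile figures quoted above follow from the table, it is enough to accumulate the bucket probabilities. The short Python sketch below is illustrative only; the bucket labels are paraphrased from Table 1.6, and the probabilities are the ones listed there.

    # Accumulate Table 1.6's bucket probabilities to locate the median cost bucket.
    buckets = [
        ("under $25", 0.255),
        ("$25 to $100", 0.291),
        ("$100 to $250", 0.236),
        ("$250 to $1,000", 0.136),
        ("over $1,000", 0.082),
    ]

    cumulative = 0.0
    median_bucket = None
    for label, probability in buckets:
        cumulative += probability
        if median_bucket is None and cumulative >= 0.5:
            median_bucket = label  # first bucket where the cumulative share passes 50 percent
        print(f"{label}: cumulative {cumulative:.1%}")

    print(f"The median device cost falls in the '{median_bucket}' bucket")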

The costs of implementing mistake-proofing in health care may be greater than the associated costs in manufacturing. More caution will be required to assess all possible risks of implementation. In some cases, clinical trials will be needed to ensure the efficacy of the device. In others, regulatory approval will be needed. All these steps will add to the cost.

At this writing, many health care providers are implementing bar coding, computerized physician order entry (CPOE), and robotic pharmacies (Figure 1.13). These are technologically sophisticated examples of mistake-proofing, which are effective responses to human error but are very complex and expensive to implement. They are not typical of the majority of mistake-proofing approaches, which are based on simplicity and ingenuity.

Bar coding and CPOE are technologically sophisticated examples of mistake-proofing.

In manufacturing, where data are available, mistake-proofing has been shown to be very effective. Although manufacturers have many management tools and techniques available to them, many remain unaware of mistake-proofing.

The TRW Company reduced its defect rate from 288 parts per million (ppm) defective to 2 ppm.29 Federal Mogul achieved 99.6 percent fewer customer defects than its nearest competitor and a 60 percent productivity increase by systematically thinking through the details of its operations and implementing mistake-proofing.30

DE-STA-CO Manufacturing reduced omitted parts from 800 ppm to 10 ppm and, across all defect modes, reduced defects from 40,000 ppm to 200 ppm; once again, productivity increased as a result.31 These are very good results for manufacturing. They would be phenomenal results in health care. Patients should be the recipients of processes that are more reliable than those in manufacturing. Regrettably, this is not yet the case.1

Mistake-Proofing Can Result in Substantial Returns on Investment

Even in manufacturing industries, however, there is a low level of awareness of mistake-proofing as a concept. In an article published in 1997, Bhote32 stated that returns of 10 to 1, 100 to 1, and even 1,000 to 1 are possible, but he also noted that awareness of mistake-proofing was as low as 10 percent and that implementation was "dismal" at 1 percent or less.

Exceedingly high rates of return may seem impossible to realize, yet Whited33 cites numerous examples. The Dana Corporation reported that a single device eliminated a defect mode that had cost $500,000 a year. The device, which was conceived, designed, and fabricated by a production worker in his garage at home, cost $6.00. That is an 83,333 to 1 rate of return for the first year, and the savings recur each year that the process and the device remain in place.

A worker at Johnson & Johnson's Ortho-Clinical Diagnostics Division found a way to use Post-It® Notes to reduce defects and save time valued at $75,000 per year. If the Post-It® Notes cost $100 per year, then the return on investment would be 750 to 1. These are examples of savings from a single device.

Lucent Technologies' Power System Division implemented 3,300 devices over 3 years. Each of these devices contributed a net savings of approximately $2,545 to the company's bottom line. The median cost of each device was approximately $100. The economics in medicine are likely to be at least as compelling. A substantial amount of mistake-proofing can be done for the cost of settling a few malpractice suits out of court.
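
The return figures above are simple ratios of annual savings to device cost. As a rough check, the Python sketch below recomputes them; it is illustrative only, the $100 Post-It® cost is the text's hypothetical figure, and the Lucent per-device ratio is derived here from the cited net savings and median device cost rather than stated in the source.

    # First-year return ratios for the examples cited above: annual savings / device cost.
    examples = [
        ("Dana Corporation device", 6.00, 500_000),        # reported as roughly 83,333 to 1
        ("Ortho-Clinical Post-It Notes", 100.00, 75_000),  # 750 to 1, assuming the $100-per-year cost
        ("Lucent device (median cost)", 100.00, 2_545),    # derived: roughly 25 to 1 per device
    ]

    for description, cost, annual_savings in examples:
        ratio = annual_savings / cost
        print(f"{description}: about {ratio:,.0f} to 1 in the first year")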

Mistake-Proofing Is Not a Stand-Alone Technique

It will not obviate the need for other responses to error. Chapter 2 includes a discussion of how mistake-proofing relates to other common patient safety initiatives.

Mistake-Proofing Is Not Rocket Science

It is detail-oriented and requires cleverness and careful thought, but once implementation has been completed, hindsight bias will render the solution obvious. Chapter 3 presents tools and techniques that help to create mistake-proofing devices and analyze their impact on the process.

Mistake-Proofing Is Not a Panacea

It cannot eliminate all errors and failures from a process. Perrow34 points out that no scheme can succeed in preventing every event in complex, tightly-linked systems. He argues that multiple failures in complex, tightly-linked systems will lead to unexpected and often incomprehensible events. Observers of these events might comment in hindsight, "Who would have ever thought that those failures could combine to lead to this?" Perrow's findings apply to mistake-proofing as they do to any other technique. Mistake-proofing will not work to block events that cannot be anticipated. Usually, a good understanding of the cause-and-effect relationship is required in order to design effective mistake-proofing devices. Therefore, the unanticipated events that arise from complex, tightly-linked systems cannot be mitigated using mistake-proofing.

Although health care is a complex, tightly-linked system, many potential adverse events can be anticipated. In fact, some of the more common errors occur in hospitals daily or hourly. When a patient is misidentified, a specimen is mislabeled, or a wrong-site operation occurs, people familiar with patient safety will not say, "Wow. Who would ever have believed that could happen?" It is in this domain of anticipated events that mistake-proofing is beneficial.

Mistake-Proofing Is Not New

It has been practiced throughout history and is based on simplicity and ingenuity. Mistake-proofing solutions are often viewed post hoc as "common sense." Senders and Senders35 provide an example of mistake-proofing from the dispensing of medications in the mid-1800s (Figure 1.14).

Bottles of poison were variously identified by their rectangular shape, blue-colored glass, or the addition of small spikes intended to make an impression on inattentive pharmacists. Most organizations will find that examples of mistake-proofing already exist in their processes. The implementation of mistake-proofing, then, is not entirely new but represents a refocusing of attention on certain design issues in the process.

Creating Simplicity Is Not Simple

In hindsight, mistake-proofing devices seem simple and obvious. A good device will lead you to wonder why no one thought of it before. However, creating simple, effective, mistake-proofing devices is a very challenging task. Significant effort should be devoted to the design process. Organizations should seek out and find multiple approaches to the problem before proceeding with the implementation of a solution.

This book is intended to help organizations design mistake-proofing devices. Its goal is to provide a process and a vocabulary for thinking about patient safety and error reduction. It is hoped that this book will also help reduce the amount of creativity needed to devise novel approaches to eliminating problems and reducing risk.

Each organization's mistake-proofing needs may differ, depending on differences in its processes. Consequently, some mistake-proofing solutions will require new, custom-made devices designed specifically for a given application; others can be off-the-shelf solutions. Even off-the-shelf devices will need careful analysis, grounded in substantial process understanding, in light of the often subtly idiosyncratic nature of each organization's processes.

Chapter 2 reviews current patient safety tools and proposes a flowchart view of how existing tools inform the process of mistake-proofing device design or selection. Existing tools provide the foundation of process understanding that enables us to make sense of events and errors, which is vital to effective mistake-proofing. Mistake-proofing cannot be effective without a sound understanding of what happens in the process and why.

Chapter 3 proposes a new use for an existing tool and combines it with other tools to facilitate mistake-proofing efforts. Chapter 4 is devoted to discussing important design issues, caveats, and limitations of mistake-proofing. Chapters 5, 6, 7, and 8 provide examples of mistake-proofing in health care. Chapter 9 describes a path forward and suggests resources to help make mistake-proofing successful.

Implementing Mistake-Proofing in Health Care

Implementing mistake-proofing in medical environments will probably be more challenging and difficult than implementing the same techniques in manufacturing. An unranked list of opportunities and difficulties is provided in Table 1.7. The difficulties are not provided as excuses or reasons why mistake-proofing should not be implemented but rather as guides to what can be expected as implementation progresses. The impact of these concerns can be mitigated by early acknowledgment of their effects on the process.

Table 1.7. Comparison of medical mistake-proofing applications with those in other industries

Difficulties:
1. Legal liability and discoverability (need for anonymity?)
2. Lack of shared examples
3. Careful assessment of down-side risk
4. Culture of depending on individuals, not on systems
5. Processes that depend on individuals, not on systems: lack of consistent process
6. Resource shortages
7. Medical applications that focus more on information counter-measures

Opportunities:
8. Low barriers to diffusion
9. Substantial buying power

Legal Liability and Discoverability

Telling quality improvement stories requires great care. Claiming great improvements could implicitly reveal previous shortcomings. In claiming the "after," one must own up to the "before." This is not a significant concern in manufacturing applications because the problems are rarely safety related. Disgruntled customers simply get their money back or receive a replacement product. Remedies for poor quality in medicine are not as easily attained.

Mistake-proofing devices are physical evidence that actions have been taken to ensure patient safety. Although this book contains only examples of good practices, many of its contributors prefer to remain anonymous. Risk managers and medical system lawyers differ widely in their levels of concern about disclosing mistake-proofing devices. For this book, contributors ranged from individuals who were proud of their efforts and willing to receive full credit to those who required significant assurances of anonymity.

Concerns with the litigious environment surrounding health care will remain an impediment to mistake-proofing implementation for some time to come. In addition to existing channels like the Agency for Healthcare Research and Quality's (AHRQ) Web M&M (Morbidity and Mortality) and the National Patient Safety Foundation's (NPSF) LISTSERV™, more options for "safe" (perhaps anonymous) dialogue and information sharing should be sought.

Lack of Shared Examples

Manufacturing benefits from a set of four resources4,9-11 containing 702 published examples. These examples provide a large body of existing solutions and approaches that can stimulate thinking about additional ones. Until now, there has been no comparable set of examples of mistake-proofing in medicine. This book provides a starting place for sharing medical examples. It is not comprehensive; there are many more examples to collect for this ongoing effort.

Add to the body of knowledge in medical mistake-proofing: Submit any examples that you know of that do not appear in this book. Submissions can be made (anonymously, if desired) at http://www.mistake-proofing.com/medical.

Careful Assessment of Down-Side Risk

In manufacturing, one can afford to be more cavalier about trying new things. In fact, Hirano36 proposes the following heuristic: if a device is found to have greater than a 50 percent chance of success, it should be tried immediately; the parts can be discarded if the experiment does not work. In health care, this approach is often unacceptable, given the requirement of a careful assessment of the patient safety risk of each new mistake-proofing device. Where there is a risk of patient harm, careful analysis and clinical trials may be needed. However, experimenting with a device theoreticallyb will not make the patient "worse off" in cases in which current controls depend entirely on human attentiveness for their accuracy. If the mistake-proofing device is also a medical device, it must adhere to the same rigorous regulatory approval process required for any other device.


b. Practically, experimenting with a device could reduce patient safety if users depend on the device instead of exercising normal levels of care and attentiveness. Go to the discussion of Risk Homeostasis in Chapter 4 for more information.


A Culture of Depending on People, Not on Systems

The traditional approach within medicine has been to stress the responsibility of the individual and to encourage the belief that the way to eliminate adverse events is to get individual clinicians to perfect their practices. This simplistic approach not only fails to address the important and complex system factors that contribute to the occurrence of adverse events but also perpetuates a myth of infallibility that is a disservice to clinicians and their patients.37

The reasons are found in the culture of medical practice... Physicians are socialized in medical school and residency to strive for error-free practice... Physicians are expected to function without error, an expectation that physicians translate into the need to be infallible.38

The medical culture makes implementing mistake-proofing more difficult because health care professionals are accustomed to looking for solutions involving "knowledge in the head." Getting them to consider "knowledge in the world" can be very challenging. Some aspects of the implementation may challenge long-held assumptions, beliefs, and values associated with behavior and accountability.

Barry, Murcko, and Brubaker39 discuss this issue regarding medical software interfaces.

The complex displays allow the specialists to apply their mastery and preserve the special knowledge they have acquired. The experts do not see a need for any help, and they do not want any help. In fields other than health care, giving experts help, even if they do not want it, is found to reduce error rates.

Perhaps this is the case in the health care field, too: even though experts do not want help, maybe they could use a little anyway, for the good of the cause. Computer displays should make doing the right thing easier than doing the wrong thing... They should make it obvious, immediately, when the wrong thing has been done... All these ideas are not only common sense, they are poka-yoke.

Processes that Depend on People, Not on Systems

In medicine, the dependence on the individual is cultural, and it is exhibited in processes managed within that culture. As a result, medical processes are often customized by each practitioner of the art. Noted doctors and patient safety advocates have questioned whether there are processes in medicine at all. The lack of consistent processes will make implementation of mistake-proofing more difficult in medicine than in manufacturing.

Inconsistent processes are more difficult to mistake-proof because there are fewer predictable elements that can be used to check the process.

Resource Shortages

Ideally, changes will liberate additional resources, but adequate resources are required to make those changes possible in the first place. Adequate resources and staffing levels enable process improvement; a shortage of nurses, other staff, or resources will generally make mistake-proofing more difficult. Go to the section of Chapter 4 titled "Spending Too Much or Not Enough" for additional information about the ironic situations that can prevent organizations from allocating resources to process improvement efforts.

More Focus on Information Enhancement Devices

Mistake-proofing in manufacturing has focused primarily on physical, sequencing, and grouping-and-counting devices. More information-enhancement devices should be anticipated as mistake-proofing is more widely implemented in service industries. Health care services involve numerous "matching" tasks: tasks in which specific medications, medical devices, processes, and procedures are matched to specific, individual patients during specific time intervals. These tasks require the availability of substantial amounts of usable, accurate information, and mistake-proofing devices for information enhancement are needed in these circumstances. Increasing the proportion of these devices will be challenging. There is a need for the invention and documentation of newer and better examples of such devices.
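
To make the idea of a "matching" task concrete, the minimal sketch below shows the kind of check an information-enhancement device might perform before a dose is administered: the scanned wristband identifier and the scan time are compared against the medication order. The function, field names, and identifiers are hypothetical illustrations, not a description of any existing system.

    from datetime import datetime

    def administration_allowed(order, scanned_patient_id, scan_time):
        """Hypothetical matching check: the scanned wristband ID must match the order,
        and the scan must fall inside the ordered administration window."""
        if scanned_patient_id != order["patient_id"]:
            return False, "patient identifier does not match the order"
        if not (order["window_start"] <= scan_time <= order["window_end"]):
            return False, "outside the ordered administration window"
        return True, "match confirmed"

    # Example: a dose scanned against the wrong patient is blocked.
    order = {
        "patient_id": "MRN-0042",  # hypothetical identifier
        "medication": "example medication",
        "window_start": datetime(2007, 1, 5, 8, 0),
        "window_end": datetime(2007, 1, 5, 10, 0),
    }
    allowed, reason = administration_allowed(order, "MRN-0024", datetime(2007, 1, 5, 8, 30))
    print(allowed, reason)  # False: the patient identifier does not match the order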

Low Barriers to Diffusion

Health care enjoys an advantage over manufacturing. Because much of health care competition is geographically based, new devices can be shared with less impact on competitive advantage than there would be in the manufacturing sector. Consequently, mistake-proofing devices that would be cloaked in secrecy to foster a competitive advantage in manufacturing are more likely to be shared in the health care environment.

Substantial Buying Power

Vendors have already begun to use patient safety improvements as a marketing tool. Hospital systems and large payers possess the buying power to specify safer designs and to seek out vendors willing to provide them. Health care providers should employ a practice that Leenders and Blenkhorn40 called "reverse-marketing." Their concept is to reverse the roles of supplier and buyer so that buyers are marketing ideas to their suppliers.

Traditionally, the supplier tries to persuade the buyer to buy; in reverse marketing, the buyer tries to persuade the supplier to supply. That is, the buyer exerts influence on suppliers to encourage them to produce what the buyer wants. Leenders and Blenkhorn40 argue that purchasing managers are mistaken in their impression that they hold most of the power in the transaction because they make the final purchasing decision. The authors make a convincing case that suppliers, by controlling which product configurations are offered, have much more power to shape transactions, and that purchasers should attempt to take some of that power back by trying to influence what is offered.

The British National Health Service is making an effort to shape the product offerings that affect them by seeking improved labeling of medications using this type of proactive approach (Go to Chapter 8, Example 8.29). Larger medical systems, for-profit hospital chains, government-run hospital systems, and payer groups could be very persuasive in convincing suppliers to change the designs of equipment, devices, and supplies.

Conclusion

Mistake-proofing involves designing changes into the physical aspects of health care processes. Design changes can prevent mistakes by simplifying or clarifying the work environment, making mistakes less likely. Mistakes can also be prevented by inspecting the source or cause of errors so that the effect cannot occur. When this is not possible, mistakes should be detected rapidly, before they cause harm or while remediation is still relatively easy.

If the mistake itself cannot be prevented or effectively detected, then preventing the influence of mistakes (the harm) may be warranted. These mistake-proofing techniques should be applied to actions taken by patients and their loved ones as well as to the actions of health care professionals.

Changing the design of health care processes and creating mistake-proofing devices is not a simple task. Careful deliberation and analysis will be required, and in some ways implementing mistake-proofing will be more difficult in health care environments than in other industries. Efforts to lay the foundation for successful implementation are already underway, however. Chapter 2 describes these efforts.

References

1. Kohn LT, Corrigan JM, Donaldson MS, eds. Institute of Medicine, To err is human: building a safer health system. Washington, DC: National Academies Press; 2000.

2. Joint Commission on Accreditation of Healthcare Organizations. http://www.jointcommission.org. Search entry: FMECA (Failure Mode, Effects, and Criticality Analysis). Accessed October 2004.

3. Croteau RJ, Schyve PM. Proactively error-proofing health care processes. In: Spath PL, ed. Error reduction in health care. Chicago: AHA Press; 2000.

4. Shingo S. Zero quality control: source inspection and the poka-yoke system. trans. by Dillion AP. New York: Productivity Press; 1986.

5. Spear SJ. Fixing health care from the inside, today. Harvard Business Review 2005 Sep;83:78-91.

6. Winter D. Get health-care suppliers lean. Ward's Auto World 2004 July 1: http://waw.wardsauto.com/ar/auto_healthcare_suppliers_lean_2/. Accessed: October 2006.

7. Connolly C. Toyota assembly line inspires improvements at hospital. Washington Post 2005 June 3: http://www.washingtonpost.com/wp-dyn/content/article/2005/06/02/AR2005060201944_pf.html. Accessed: October 2006.

8. Weber DO. Toyota-style management drives Virginia Mason. Physician Executive 2006 Jan-Feb. Available at: http://www.findarticles.com/p/articles/mi_m0843/is_1_32/ai_n16016072. Accessed: October 2006.

9. Hinckley CM. Make no mistake. Portland, OR: Productivity Press; 2001.

10. Nikkan Kogyo Shimbun/Factory Magazine, ed. Poka-yoke: improving product quality by preventing defects. Portland, OR: Productivity Press; 1988.

11. Confederation of Indian Industry. Poka-yoke book on mistake-proofing. New Delhi: Confederation of Indian Industry; 2001.

12. Petroski H. To engineer is human: the role of failure in successful design. New York: Vintage Books; 1992.

13. Norman DA. The design of everyday things. New York: Doubleday; 1989.

14. Rasmussen J. Skills, rules, and knowledge: signals, signs, and symbols, and other distinctions in human performance models. IEEE Transactions on Systems, Man, and Cybernetics 1983;13(3):257-67.

15. Reason J. Human Error. New York: Cambridge University Press; 1990.

16. Nakajo T, Kume H. The principles of foolproofing and their application in manufacturing. Reports of Statistical Application Research. Union of Japanese Scientists and Engineers. 1985;32(2):10-29.

17. Tsuda Y. Implications of foolproofing in the manufacturing process. In: Quality through engineering design. Kuo W, ed. New York: Elsevier; 1993.

18. Chase RB, Stewart DM. Mistake-proofing: designing errors out. Portland, OR: Productivity Press; 1995.

19. Department of Health and the Design Council. Design for patient safety. London: Department of Health and Design Council; 2003.

20. Galsworth GD. Visual workplace: visual thinking. Presentation at 16th annual Shingo Prize Conference. Lexington, KY; May 2004.

21. Galsworth GD. Visual systems: harnessing the power of a visual workplace. New York: American Management Association; 1997.

22. Mistake-proof it! workbook. Tolland, CT: Resource Engineering, Inc.; 2000.

23. What you need to know about natural gas detectors. National Institute on Deafness and Other Communication Disorders, National Institutes of Health. http://www.nidcd.nih.gov/health/smelltaste/gasdtctr.asp. Accessed: October 2006.

24. What is mercaptan? Columbia Gas of Virginia. http://www.columbiagasva.com/safety_info/mercaptan.htm. Accessed: July 30, 2004.

25. Stewart DM, Grout JR. The human side of mistake-proofing. Production and Operations Management 2001;10(4):440-59.

26. Chase RB, Stewart DM. Make your service fail-safe. Sloan Management Review 1994 Spring; 35-44.

27. Marchwinski C, ed. Mistake-proofing. Productivity 1995; 17(3):1-6.

28. Grout J. Personal interview with Jean Jacques Poux. Weber Aircraft LP. Gainesville, TX; 1995.

29. Raub III GR. Personal communication from manager of quality systems, inflatable restraints for TRW. 1995.

30. Skalbeck JP. Poka-yoke as a continuous improvement tool. Presentation at the Mistake-Proofing Forum. Oshkosh, WI; August 1998.

31. Frye RH. Personal communication from the product manager for De-STA-CO. April 1997.

32. Bhote KR. A powerful new tool kit for the 21st century. National Productivity Review 1997 Autumn; 16(4):29.

33. Whited HM. Poka-yoke varieties. The power of mistake-proofing forum. Moline, IL: Productivity, Inc.; 1997 August 6.

34. Perrow C. Normal accidents: living with high-risk technologies. Princeton, NJ: Princeton University Press; 1999.

35. Senders JW, Senders SJ. Failure mode and effects analysis in medicine. In: Medication errors: causes, prevention, and risk management. Cohen MR, ed. Sudbury, MA: Jones and Bartlett; 1999.

36. Hirano H. Overview. In: Nikkan K. Shimbun/Factory Magazine, ed. Poka-yoke: improving product quality by preventing defects. Portland, OR: Productivity Press; 1988.

37. Small SD, Barach P. Patient safety and health policy: a history and review. Hematol Oncol Clin N Am 2002;16:1463-82, 1478.

38. Leape LL. Error in medicine. JAMA 1994;272:1851-7.

39. Barry R, Murcko AC, Brubaker CE. The six sigma book for healthcare. Chicago: Health Administration Press; 2002.

40. Leenders MR, Blenkhorn DL. Reverse Marketing. New York: Free Press; 1988.
