Tablets & Capsules, July 2014

…and processes from them. In fact, not erecting barriers is, to paraphrase quality guru Shigeo Shingo, a management failure. Consider, for example, a hypothetical group of emergency room physicians who don't have ready access to nitrile gloves, sleeves, and other protective gear and who become infected by the blood of a patient. Is it the physicians' fault that their skin had pores and fissures that made infection possible? Of course not. In the same way, when managers don't institute adequate measures to reduce the probability that human mistakes will contaminate our systems, they contribute to the propagation of errors. These "opportunities" for errors to occur shouldn't be deemed "human errors," but failures to mistake-proof systems and processes.

Look at the three errors shown in Figure 1. Setting up a machine incorrectly doesn't necessarily implicate humans as the root cause. Rather, the machine was allowed to be set up incorrectly.

[Figure 1. Examples of error: machine setup error; record filled out wrong; process line incorrectly cleaned.]

Have you been to the airport lately? If so, you probably passed through a full-body scanner. Even if it was your first time through, you probably knew exactly how to stand because there was an outline of shoes painted on the floor. That allows travelers to understand quickly how to stand without further instruction. Compare the simplicity of that approach with the other inspection systems at airports, especially the conveyor belt that transports carry-on luggage and personal items through a scanner. Almost without fail, a bottleneck forms as people are given verbal instructions about removing shoes and belts and placing liquids and laptop computers into the plastic bins.

[Photo captions: "The outline of shoes in full-body scanners shows you where to stand." "Travelers new to airport security checks will likely have trouble sorting their belongings correctly into bins."]

Likewise in the second case: the error in record-keeping likely stemmed from a poorly designed document or user interface, one that didn't apply the principles of human recognition sciences. If it was an electronic interface, it could have been programmed to disallow progress until an incorrect value was corrected, as in the sketch below.
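The article doesn't show such an interface, so the following is only a minimal sketch of the idea in Python; the field name, units, and acceptance limits are hypothetical examples chosen for illustration.

```python
# Sketch of "disallow progress until an incorrect value is corrected."
# The field name, units, and limits below are hypothetical, not from the article.

def prompt_until_valid(label: str, low: float, high: float) -> float:
    """Re-prompt until the entry parses as a number and falls within [low, high].

    The operator cannot advance past this field with a bad entry: the
    electronic equivalent of a mistake-proofed paper form.
    """
    while True:
        raw = input(f"{label} ({low}-{high}): ")
        try:
            value = float(raw)
        except ValueError:
            print("Not a number; please re-enter.")
            continue
        if low <= value <= high:
            return value
        print(f"Out of range ({low}-{high}); please re-enter.")


if __name__ == "__main__":
    # Hypothetical batch-record field.
    hardness = prompt_until_valid("Tablet hardness, kp", 4.0, 8.0)
    print(f"Recorded: {hardness} kp")
```

The same pattern applies to any record-keeping field: validate at the point of entry and block progress there, rather than catching the error later during batch-record review.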
The last error likely occurred because there wasn't a system or standard for cleaning the line: how it must be done and measured before the next batch setup. Cardiac surgeons, for example, have a checklist of what to have ready in order to reduce complications and/or the probability of a failure. I advised a hospital on this topic, and much of the system was adopted directly from the aviation industry's use of a pre-flight checklist [3].

[Photo caption: "To prevent errors, cardiac surgeons adopted the idea of a preflight checklist."]

Quality control tools

Kaoru Ishikawa, the inventor of the "fishbone" diagram for problem-solving, said that the skillful use of seven quality control tools will resolve 95 percent of workplace problems [4]. Deming's favorites among these tools were cause-and-effect (C&E) diagrams, Pareto charts, flow charts, histograms, run charts, and control charts. Here's a quick review.

C&E diagrams. Root-cause analysis is really just a set of steps that demonstrate causality. In a C&E diagram, the problem is the head of the fish and the causal factors form the ribs (Figure 2a). For more detail, you can add probabilities to the ribs and thus begin to attribute causality to each element (Figure 2b); a sketch of that idea appears below. But look at everything carefully. In this case, under "machine" there is the causal…
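To make the probability-weighted fishbone concrete, here is a minimal sketch assuming hypothetical causal factors and probability estimates; none of these values come from the article.

```python
# Sketch of a C&E ("fishbone") structure with probabilities on the ribs.
# The problem statement, factors, and probabilities are hypothetical.

from dataclasses import dataclass, field


@dataclass
class Rib:
    """One causal factor (rib) with an estimated probability of being causal."""
    factor: str
    probability: float  # analyst's estimate that this factor caused the problem
    sub_causes: list[str] = field(default_factory=list)


@dataclass
class FishboneDiagram:
    problem: str     # the "head" of the fish
    ribs: list[Rib]  # the causal factors

    def ranked_ribs(self) -> list[Rib]:
        """Ribs sorted by probability, highest first, to focus the investigation."""
        return sorted(self.ribs, key=lambda r: r.probability, reverse=True)


diagram = FishboneDiagram(
    problem="Machine setup error",
    ribs=[
        Rib("Machine", 0.40, ["no setup checklist", "worn change parts"]),
        Rib("Method", 0.30, ["SOP ambiguous about settings"]),
        Rib("Man", 0.15, ["new operator, no sign-off step"]),
        Rib("Material", 0.15, ["lot-to-lot variation"]),
    ],
)

for rib in diagram.ranked_ribs():
    print(f"{rib.factor}: p={rib.probability:.2f} -> {', '.join(rib.sub_causes)}")
```

Ranking the ribs by estimated probability is what turns the diagram from a brainstorming picture into a prioritized investigation plan.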
