December 26, 2024
BANGOR DAILY NEWS (BANGOR, MAINE)

Brain 'wiring' can lead to errors

For physicians, pilots, policemen and presidents, errors are nothing new. Professionals in every field, including engineers, lawyers, generals, ministers, priests and writers, have their share of errors. Lately, though, medical errors committed in hospitals have been front-page news: a Boston newspaper writer died of an overdose of cyclophosphamide at Dana-Farber in Boston, one of the top cancer research hospitals; a man in Miami had the wrong leg amputated; and a woman with cancer in Grand Rapids, Mich., had the wrong breast removed.

Why do we commit mistakes, errors, and slips? In a nutshell, because our brain is not perfectly wired and our environment is not perfectly designed.

To understand human errors, we must first understand human awareness and perception. Most of what we do is automatic, fast, easy, and smooth. When we wake up in the morning, we brush our teeth, dress, put on our shoes, read the newspaper, and start the car without much conscious thought or effort. This is possible because, in one corner of our brain, there is a set of mental wiring, or neurons, that is programmed to perform a group of survival activities unconsciously. Psychologists call these unconscious mental models. And we have many sets of these mental models that we use throughout the day.

One Monday morning, when we are about to get into our car, we notice that the left front tire is flat. We know that we have a problem. This is different from the early-morning mental model that leads to starting the car and heading for the office. In this instance, a different set of mental models takes over: the conscious, or attentional-control, model. This mental model is slow and deliberative, and it requires more effort. However, the problem is still relatively easy to solve. Either we call a taxi or we change the tire.

On arrival at the office, our boss is waiting for us with a new project: to design a system that will reduce the errors in our organization. Now we have to use yet another mental model, the higher analytic set. This is a much slower process because many more neurons must be activated and connected. We need to dig deeper into our mental database. We sometimes need to tap into a colleague's mental model, into our computer's database, or even into the Internet. Now that we know the three different mental models, it is easier to think about how we commit errors.

Errors made while we are in the unconscious or automatic mental models are called "slips." These happen when there is a break in the routine and our attention is temporarily distracted. In the morning, we occasionally put shaving cream on the toothbrush instead of toothpaste. We might pour orange juice into the coffee mug instead of milk, or answer the telephone when it is the doorbell that rings. We do this because we are momentarily distracted by the television or an irritating newspaper article. Fortunately, most slips do not lead to serious consequences.

The next type of error occurs when we are solving a problem and apply the wrong rule or solution. This frequently happens when clinicians give antibiotics to children with a viral infection. Many children with "chronic bronchitis" are given antibiotics when, in fact, they have asthma and need anti-inflammatory medication or bronchodilators.

The third type of error happens when an individual meets a new situation for which he or she has no prior experience or preprogrammed solution. This can easily occur when an inexperienced first-year resident manages an asthmatic patient in the emergency room and sends the patient home without measuring peak flow rate and oxygen saturation. In a teaching hospital, many errors are committed in July, the starting month of the residency program.

The fourth, and probably the most common, complex, and serious, type of error occurs when there is a systems deficiency. These are errors that are "accidents waiting to happen." An example is a trucker who drives longer than the legal limit, or a train engineer who takes drugs while on duty. Another instance of systems deficiency is a textbook that prints the wrong information. Not too long ago, a popular reference for pediatricians printed the wrong dose of theophylline, which resulted in overdoses.

How do we correct or prevent these human and environmental errors?

First, we should all recognize that human error is part of life and that we can reduce, but not eliminate, mistakes. All of us, however, can work toward error reduction.

Second, we should not rely heavily on our memory. Our work and activities should be designed to reduce the use of short-term memory and prolonged concentration. Checklists, protocols, and computerized decision models should be used in all of our critical activities. In a hospital or physician's office, every task that involves intravenous medications should have a protocol that triple-checks the names of the drugs, the doses, and the frequencies of administration.

Third, we should develop a new and improved system for gaining access to information. Adapting the Japanese management principle of "just-in-time delivery," all professionals should use "just-in-time information." In the management of asthma, whether in the hospital ward, emergency room, office, or ambulance, the guidelines developed by the National Institutes of Health should be available at the touch of a button or otherwise displayed in front of those who will take care of the patient.

Fourth, all critical activities should be error-proofed, or screened, before they are started. They should be checked by at least three people familiar with the procedure to be done. Three individuals, for example, should confirm the identity of the patient and verify that it is the correct foot to be amputated.

Fifth, continued training and error prevention education should be part of certification and recertification of all professionals, especially pilots, physicians, pharmacists, and policemen. All organizations should have an error prevention and education department, whose head should report directly to the chief operating officer.

Sixth, our working environment should be designed in such a way that errors, mistakes, and malfunctions are reduced. A faulty pressure-gauge design at Three Mile Island gave a low reading both when pressure was low and when the gauge was not working. The design of the Challenger's O-rings did not take freezing temperatures into account.

Seventh, all organizations, and our society as a whole, should change the way we look at those who commit and report errors. They should not be put down or embarrassed. They should be trained, accepted, and perhaps even rewarded for reporting mistakes. Organizations should encourage their personnel to report errors so the errors can be prevented.

We probably can't eliminate errors; the best we can hope for is to reduce them. The way our brain is wired is not perfect. We are born with many defective genes. Human physiology has its own limits. Interactions between two individuals are frequently out of sync. Our language is far from perfect, and so is our working environment.

Leonardo Leonidas, M.D., is a Bangor physician.

