Listen to the Clinical Chemistry Podcast



Article

Geza Bodor. What Can a Clinical Chemist Learn from Aviation? Clin Chem 2016;62:1165-1166.

Guest

Dr. Geza Bodor is a professor of pathology at the University of Colorado, Denver, and Medical Director of Clinical Chemistry and Toxicology at the VA Eastern Colorado Health Care System in Denver. He is also an instrument rated pilot with over 2,200 flight hours.



Transcript


Bob Barrett:
This is a podcast from Clinical Chemistry, sponsored by the Department of Laboratory Medicine at Boston Children’s Hospital. I am Bob Barrett.

Though seemingly different, flying an airplane and practicing clinical chemistry may be more related than one might think. The August 2016 issue of Clinical Chemistry highlights the similarities between aviation and laboratory medicine. In an “Unveiling the Right Side” feature, Dr. Geza Bodor, who is a medical doctor, a clinical chemist, and a pilot, offers his thoughts on what clinical chemists can learn from the field of aviation. Dr. Bodor joins us in this podcast. He is a professor of pathology at the University of Colorado, Denver, and is the Medical Director of Clinical Chemistry and Toxicology at the VA Eastern Colorado Health Care System in Denver. He is also an instrument rated pilot with over 2,200 flight hours.

So Dr. Bodor, you are a flying clinical chemist. How long have you been doing these things?

Geza Bodor:
For a long time. I’ve been flying for over 24 years, and I’ve been a clinical chemist for over 25, but I also had clinical medicine experience before I got into clinical chemistry.

Bob Barrett:
And when did you first notice the similarities between flying and practicing clinical pathology? When did you realize that physicians could learn from aviation?

Geza Bodor:
That was a slow process. During my years as a clinical chemist, I was interested in management, quality assurance, and quality improvement, and I noticed some things in clinical medicine and clinical pathology that were not being done properly. The point when I started paying the closest attention to the similarities was probably after a trip I was able to take to the Raytheon Beech Aircraft Factory in Wichita in 2007. At that time, the factory was open to us pilots and we could tour it, and I was able to see how the company was assembling military airplanes.

They were also working on civilian airplanes, but what really impressed me was the military airplane production, where they were following very good Six Sigma rules and principles. I was very impressed by that because the hospital I was working at at the time was trying to introduce Six Sigma procedures, but they didn’t do well with it; they bogged down on unimportant things and did not really capture the essence of Six Sigma.

So, that visit opened my eyes, and I started looking into this: if aviation does Six Sigma so well, what else might aviation be doing well? I started specifically looking at different procedures, how we do them in aviation versus how we do them in clinical chemistry, and I ended up making these mental notes and comparing the two fields to each other.

The other factor that opened my eyes was also related to Six Sigma. You know that Six Sigma procedures came from the Toyota factory, and it was very popular to use Six Sigma processes in the early 2000s -- actually, probably from the 1990s onward. Then in 2009, Toyota had to recall three million vehicles because of accelerator pedal issues, and a few years later, I think in 2014, they recalled another seven million vehicles because of a fire danger. That was the other side of looking at Six Sigma usage in aviation and outside aviation: how overdoing something and mechanically following a good principle can lead to a disaster, as it did in the Toyota case. So, that’s how I started.

Bob Barrett:
Medicine has been striving to improve outcomes and eliminate errors for decades. Do you feel that these efforts haven’t been sufficient and we need to turn to the methods used by aviation and these other industries for ideas?

Geza Bodor:
Yes, I do. Why do I think this? Because if you look at what happened in aviation, back in the 1970s the fatal accident rate was about 30 fatal accidents per year in commercial aviation. By the mid-2000s, this was cut down to five fatal accidents a year. I’m only talking about commercial aviation.

I’m not talking about general aviation and home-built aircraft and such, just commercial aviation. From another standpoint, the number of fatalities involved in these accidents also decreased. More than 2,000 people died in aviation-related accidents in the 1970s; that number is now less than 500 across the whole aviation field. And this happened while the total volume of flights and the size of the airplanes increased. We are now at the point where about four fatal accidents are happening per million flight hours for the airlines, and a little more for the commuter and charter airlines, 11 or more fatal accidents per million flight hours annually. So, this is a huge improvement in 30 to 40 years.

On the other hand, many of us are aware of the 1999 report by the Institute of Medicine entitled "To Err is Human." That was the publication that turned the medical community’s attention toward the magnitude of errors and showed that the error rate we have in medicine is unacceptable. In that report, the Institute of Medicine estimated 98,000 deaths per year attributable to medical errors.

Current statistics estimate between 200,000 and 440,000 medical errors per year, not just fatal errors but all types, and numerous organizations, including the Institute of Medicine and the American Hospital Association, claim that error events run as high as 21% for all errors and as high as 1.4% for lethal errors in medicine. This is very, very high and it is unacceptable, so it’s not just my assessment but also the assessment of institutions and other professionals who look into this information.

The reason for this, I think, is the complexity that we have to deal with in medicine, the culture that we still have in medicine, and our understanding of what needs to be done. These factors are the same for medicine as they were for the airlines, so I think that by observing somebody who is 30 years ahead of us, we can learn and maybe we can improve.

Bob Barrett:
So, since aviation has demonstrated these great improvements in error prevention, you are suggesting that we should copy their techniques and apply them to medical practice and maybe achieve the same improvement?

Geza Bodor:
I am not advocating mechanical copying. There is no direct copying between the two; there are similarities between the fields of aviation and medicine, but they are not exactly alike. What I’m advocating is looking at the principles and techniques that aviation uses and trying to apply those to the medical field. For example, the environment is different. Commercial aviation is very much an industrial, cookie-cutter operation. The airlines have the same airplanes, their pilots have the same education and experience, and they have a big infrastructure supporting the flying operations.

Medicine is different. Medicine is more individualized. So, some things that aviation tried and applied cannot be directly applied in medicine. My thinking is that where we discover similarities, we can apply similar methods of solution, but where we recognize differences between aviation and medicine, we can look at what aviation did well or didn’t do well, and we can adjust and innovate for our own purposes. In other words, we can learn not just from aviation’s successes, but also from the mistakes aviation made even while it was trying to improve its error rate and performance.

Bob Barrett:
Doctor, in your article, you mentioned crew resource management, or CRM, and you stated that CRM has not been widely applied in medicine. Could you explain this in more detail?

Geza Bodor:
Yes, this is a very important point for me, because I see the differences, and I feel that we need to learn CRM, or crew resource management, in our practice of medicine.

So, to start with, you have to understand that all pilots who fly commercial airplanes are fully licensed, and many of them have hundreds or even thousands of hours of flying time before they can get into commercial aviation, before they can carry passengers. However, even with this background, they can have differences in experience. One can have 1,500 hours of flying time, and another can have 40,000 hours. In spite of this, they keep alternating their roles. In commercial aviation, there is a crew of two pilots; one of them is the flying pilot, the other is the non-flying pilot, and between different segments of the flight they alternate these roles. The roles are well defined, and they alternate them regardless of who is more or less experienced. This way, they can learn from each other, and at the same time they can also observe each other and correct each other.

There are other aspects of CRM. For example, there is communication. If the two pilots are equal in their responsibilities, they can communicate and openly tell each other, "Oops, you did something wrong," or ask, "Did you do this?" even if nothing was wrong. They check on each other. This helps conflict resolution, and they have certain rules that they have to follow. For example, the Sterile Cockpit Rule means that in certain phases of the flight they cannot do anything but concentrate on flying. Through CRM, they can also unlearn authoritative management styles, because very strong authoritative management stifles cooperative problem solving. An example of CRM not being practiced properly, where there was no communication, is the Air France flight that had an accident some years ago over the Atlantic. The pilots did not recognize that there was a problem with the flight management system, and even when one of them recognized it, they did not communicate properly, and the two flying pilots were introducing contradictory inputs into the flight management system.

Another example would be the Colgan Air accident, where the pilots were not following the Sterile Cockpit Rule and crashed the airplane in icing conditions while they were musing, literally saying, "Oh, I’ve never seen this much icing before," instead of taking evasive action and getting out of it.

In medicine, there is potential to have this kind of authoritative environment and/or not to observe the Sterile Cockpit Rule, because in medicine we can be distracted or we can give in to a colleague in a higher position of authority, and those situations can lead to errors similar to the ones we have seen in aviation. This is why some elements of crew resource management have been introduced in medicine, especially in surgery, to prevent this kind of problem, but they are not widely used across all fields or widely used in practice.

As a matter of fact, in some cases, if a subordinate expresses an opinion that disagrees with his or her boss, even if the opinion is correct, it is considered insubordination or criticism of the boss, and many times management will consider it disobedience on the part of the person who may have recognized the problem. So, what we really need to do is unlearn these practices on both sides.

The person in the lower position has to learn to speak up openly and point out potential or existing problems, and management in the higher position has to learn to listen and accept that. Aviation does this very well.

Bob Barrett:
Well, finally, doctor, you’ve expressed concern that the use of computerization and automation could cause harm, as it did in aviation. Do you think we should be scaling back or even ending our heavy reliance on computers in medicine?

Geza Bodor:
No. Computers are here to stay, and they should be. They have very vital roles. They contribute to a lot of error prevention practices. They are very helpful. They can transfer information from the laboratory to the clinician, from the clinician to the pharmacy, and from the pharmacy back to the lab, among other things. So, I’m not advocating doing away with them. However, overreliance on computers and instruments can lead to problems. At one place where I worked, there was a blood gas machine outside of the laboratory, and that machine had not been serviced for years because it was in the possession of the clinical service that was using it. When we found out about it in the lab, we looked at it and called for service to finally examine it. It turned out that the blood gas machine could only report an oxygen saturation of 98%. Nobody in the clinical department ever questioned that, because the instrument gave a digital readout of 98, a normal result, and they were happy with it; nobody questioned it.

Another example would be shotgun ordering, when a clinician orders a large number of otherwise unconnected tests that have nothing to do with the patient’s presentation or diagnosis, hoping that among the many numbers the laboratory returns, there will be a few that are abnormal and will pinpoint what the problem is with the patient. Obviously, this will not work, because not every test gives an absolute yes-or-no answer.

An additional example is mechanical dependence on normal ranges. I get phone calls almost every day from clinicians about this. The latest one was about TSH: the low end of the normal TSH range in our laboratory is 1.0, the patient had a result of 0.99, and the clinician was trying to find out what’s wrong with the patient. This is overreliance on computers and computer numbers, and there are others I could list in this area.

From another standpoint, there is complete reliance on medical records. Laboratory computer downtime and hospital information system downtime can prevent or slow down the availability of results. We still have to teach our doctors that they need to think in context and that they may have to be able to treat without the immediate availability of test results and laboratory results.

So, this is what I’m saying about computers: we need to be very selective about what we use and how we use them. And we also need to be able to keep ourselves focused on medicine even when the numbers from the computers are not available.

Bob Barrett:
Dr. Geza Bodor is a professor of pathology at the University of Colorado, Denver, and is the Medical Director of Clinical Chemistry and Toxicology at the VA Eastern Colorado Health Care System in Denver. He is also an instrument rated pilot with over 2,200 flight hours. He’s been our guest in this podcast from Clinical Chemistry. I’m Bob Barrett. Thanks for listening!