Larry Fennigkoh: Lessons from the Hawaiian Missile Scare

The horrific false ballistic missile attack alarm that occurred in Hawaii last week triggered a flood of emotion, as well as unpleasant parallel memories. While not on the same statewide scope as the incident in Hawaii, I’m reminded of situations when dedicated, well-meaning, and educated healthcare providers have also made mistakes, and patients were injured—or worse, accidentally killed.

The double and obscene insult in such cases is that the immediate or end-equipment user is then blamed, reprimanded, fired, or dragged into a lawsuit, and any additional inquiry or investigation into how the incident may have really happened—trying to find the true root cause—tends to just stop. After all, the reason appears immediately “obvious”: the person just screwed up. As a result, we fail to truly learn and identify the latent defects and weaknesses within our devices, systems, or workflow patterns that invite or encourage device users to make such errors, let alone errors from which they cannot back out or recover.

Equally disturbing is the inherent nature of our legal system—as good as it is in so many ways—that prevents or severely limits the results of any engineering or scientific inquiry that may have taken place from being shared or publicly disseminated. Imagine where aviation safety would be today if the findings and root cause of every airplane mishap and crash had remained sealed and were never made public or shared with equipment designers and the scientific community. One of the reasons air travel is so safe today is all of the incredibly precious knowledge that we have gained from such prior tragedies, failures, and loss of life.

What elevates such tragedies to the obscene—that double insult—is when we learn nothing about why they happened in the first place. The unfortunate victims then, indeed, have died in vain. If the pilots and crews in the long history of aviation crashes hadn’t died—and had they worked in healthcare—they, too, would probably have been reassigned, fired, or sued.

Granted, we have made, and continue to make, considerable progress. But as long as our first response is to blame the person instead of the device or system (or its designers), we still have a long way to go, and we should be ashamed.

Larry Fennigkoh, PhD, is professor of biomedical engineering at the Milwaukee School of Engineering and a member of AAMI’s BI&T Editorial Board.

2 thoughts on “Larry Fennigkoh: Lessons from the Hawaiian Missile Scare”

  1. The first response should be to blame no one, but instead to try to find out what really happened and why. Automatically blaming the device is no more correct than automatically blaming the user. (And designers are people too.)
