AAMI has been trying to collect examples of real-life systems issues involving healthcare technology to help our community better address the increasingly complex challenges of modern healthcare. Collecting such examples has proven incredibly difficult, far more difficult than we ever imagined. At first, we thought it was because subject experts were just too busy. Then we thought no one wanted to share their own systems problems. And a few subject experts have told us we are too optimistic in thinking that the healthcare community is actually using systems thinking and systems engineering tools to address complex technology challenges; perhaps, then, there are no examples to share.
When I read a recent book review of Overcomplicated: Technology at the Limits of Comprehension, it hit me that systems issues bombard us every day in our lives beyond healthcare and yet we still don’t think of them as systems issues. So, it was rather naïve of us at AAMI to think that it would be any easier to identify, discuss, and collectively work on systems issues in healthcare.
In his book review, Amir Alexander notes that author Samuel Arbesman uses a single day in recent history—July 8, 2015—to highlight some of the significant systems issues that are becoming commonplace. On that day, the New York Stock Exchange halted trading when its computer system went haywire, United Airlines had to ground its flights because of a computer glitch, and The Wall Street Journal’s website went dark for reasons that could not be explained at the time. Arbesman’s main point, according to the review, is that “our systems have grown too complex to handle. . . . From the electrical grid to Internet dating sites, the systems we live by have become ‘kluges’—overly complicated, inelegant, cobbled together messes. Even experts can no longer fully understand or control them.”
As I thought about kluges beyond healthcare and even beyond technology itself, I wondered whether our entire world is suffering from a gigantic systems overload, taxing our ability to even comprehend a problem—let alone realize that systems thinking is a first step to finding a clear path forward. Let’s take the Brexit vote in the United Kingdom, for example. I don’t have a personal or professional opinion about Brexit, but I have found it fascinating to see the enormously complex systems implications of Brexit continue to unfold, reflecting a much deeper complexity than its proponents or opponents realized when the referendum was being discussed or the votes were counted. While Arbesman’s book is about technology, similar systems challenges and crises pop up every day because our world has become so much more connected in ways that are beyond our individual capacity to comprehend. Alexander states it well in his description of Arbesman’s premise: “Over time, systems are connected to other systems, and each learns to deal with an increasing number of ‘edge cases’—rare occurrences that barely affect the overall performance of the system but nevertheless have to be accounted for. And so, bit by bit, a kluge is born.”
Arbesman’s answer for solving the kluges of the world is that we first must realize there is no grand, elegant, or powerful answer. Instead, we need to approach our man-made systems the way biologists approach the natural world of complex ecosystems: with a humble scientist’s admission of ignorance, incrementally advancing our knowledge as we experiment, observe, test, reject, and retest our theories and potential grand solutions.
I have to believe that the one thing we should not be doing is burying our heads in the sand and just accepting the systems failures that are expanding exponentially—and, even worse, closing ranks and trying to keep the world from seeing our systems vulnerabilities, as if our own kluges were an admission of failure to be exploited by others.
Pulling my thinking back to healthcare technology, we need a stronger, scientifically focused commitment to a systems approach in order to understand, and then solve, our own kluges. We need to do so before our cobbled-together solutions, developed with the very best of intentions within our silos and comfort zones of expertise, lead to major disasters. We certainly have enough warning signs to realize we need to work across departmental, organizational, and discipline-specific boundaries to solve these growing technology-oriented challenges. The dearth of visible experimentation, observation, and examples of what’s working (and what’s not) makes me believe that we have a long way to go before we are ready to heed Arbesman’s words of wisdom. I hope it won’t take a disaster of greater proportions than a cybersecurity attack on a single health system’s IT network or a glitch in medical device software to prompt that commitment to addressing technology challenges in a systems-oriented way, with the humility of a curious and ignorant biologist who dedicates his or her life to expanding our knowledge of natural ecosystems.
I continue to be hopeful that medical technology companies, healthcare delivery organizations, researchers, and independent experts will send to AAMI their worst—and best—examples of humble ignorance, experimentation, observation, and success. Without such sharing, we will never build a body of knowledge that can confidently address the kluges of tomorrow that we can’t even imagine today.
Mary Logan, JD, CAE, is president and CEO of AAMI.