I was working in my office when the phone rang from our help desk intake line. A physician needed assistance with an ultrasound device that was intermittent. This was the beginning of an hours-long situation that took me from my normal routine into a “twilight zone” and back again.
"Put him through," I said, and the doctor introduced himself.
“I have an ultrasound device, and every time I move the power cord, I get an error message and the device freezes,” the doctor said. “I’m with a patient now who is enrolled in a research protocol. If you come to the room I can show you what I’m talking about.”
I asked him what type of imaging he was conducting and the manufacturer and model of the device. He quickly read off the name and model number, which, I must admit, I didn't recognize.
“I’ll be there in 10 minutes. But I need to let you know we don’t support repairs to research equipment per our policy,” I said. “However, if all you need is a power cord, I think we can assist you with that.”
I invited my supervisor to join me and we made our way to the patient's room. Upon arrival, we were surprised to find the ultrasound in question was not a "full sized" imaging system made by one of the "big four" manufacturers. Instead, it was a device the size of a smartphone. The application was for cardiology. The device performed an entire suite of measurements of cardiodynamic function, including color Doppler and wall-motion parameters.
Wow! In addition to the stethoscope draped around the doctor's shoulders, he held a fully functioning diagnostic system that a few years ago would have been the size of a dorm-room refrigerator and drawn 15 amps of power. If a picture is worth a thousand words, how much is a fully functional ultrasound that can be held in your hand worth?
And then I truly realized what I was looking at. My attention focused on the platform, the data, and data security. My heart rate jumped, and I went to DEFCON 3 ("increased readiness"). If that is a cellphone, how can we secure the data? Where is it stored? How vulnerable is it to being lost, stolen, or shared? Alarms sounded in my head. This cardiologist was going to give me a heart attack!
I kept my cool. The doctor demonstrated that when he wiggled the small wire between the low-voltage power supply and his tiny ultrasound, the device froze and gave an error message.
“Can you fix it?” he asked with a hopeful, boyish grin.
“I’ll need to do some checking first,” I replied, hesitant to make a commitment. I took down some notes and returned to my office for research.
It’s amazing what you can find with Google’s assistance. Online, I found the device and a complete operator manual with data, application notes, and references. What I saw raised my alarm to DEFCON 2 (“high readiness”). A section of the manual described adding patient name and medical record number along with annotations and comments. That’s going to give our IT security team a heart attack, I thought while flipping the PDF pages.
I called the customer service number for the device and was eventually connected with a service agent. The conversation revealed many of the capabilities of the imaging device. I asked about the availability of an MDS2 (Manufacturer Disclosure Statement for Medical Device Security) form, which was not listed in the technical reference documents available from the online library. The agent emailed me a copy, and I hungrily read through the five pages of claims and notes. The first question on the MDS2 reads, "Can this device display, transmit or maintain private data (including Protected Health Information [ePHI])?"
The answer? No!
Without missing a beat, I got back on the phone to the manufacturer’s service support line. I was now at DEFCON 1—maximum readiness.
“Excuse me, but how can you state that ‘there is no PHI’ when section 14 of the manual refers to ‘adding patient name and medical record identifiers?’” I challenged, feeling secure that I had found the unlocked path to cybersecurity purgatory.
"Hmm," came a cool and considered reply over the line. "I think I may have only sent you the MDS2 for the scanner. There is a second form for the accessory computer app, which links the data to a laptop computer. Let me send you the second document." After a brief wait, the email arrived and I eagerly opened the form. Yes, this one clearly stated that the answer to "contains PHI?" was "yes."
I should have known that the engineers and designers of the device were cognizant of the PHI risk and cleverly designed a scanner that merely increments a counter with each new image or clip—that means no PHI. The application is a different story. It allows the savvy user to annotate, document, and manage images. Whew!
My blood pressure returned to the normal range, at least for now. I am still kept up at night by innovators and developers who might implement this same type of useful device as a smartphone app under the guise of a "sports and fitness monitor," producing cool images while crossing the boundary of a quasi-medical device and placing PHI at risk.
I wrote up my notes and findings along with the information on repairing the intermittent power cable for the doctor. I settled back to DEFCON 5—normal readiness.
Eben Kermit is a biomedical engineering supervisor at Stanford Health Care in Palo Alto, CA.