The regulatory environment in the health care industry is based on the premise that any change risks patient safety. Changing a single line of CSS can take six months to test, validate, document, and get approved, so everyone's afraid to touch anything. You can't automate anything because the current process survived 7 audits and regulatory is afraid changing it might raise an alarm. You'd be stunned at the number of hospitals still running Windows XP. Most systems use a plain-text messaging protocol designed in the '80s -- no encryption or authentication anywhere to be seen, and half of them write messages to disk because "it's safer".
If ever there was an example of well-intentioned regulation gone horribly wrong, this is it. The whole industry is a cybersecurity nightmare waiting to happen.
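The protocol described above sounds like HL7 v2: pipe-delimited plain text with no encryption or authentication in the wire format itself. A toy sketch of what parsing one of these messages looks like (message contents invented, field handling simplified):

```python
# Toy HL7 v2 parser -- illustrates that the wire format is plain
# delimited text with no encryption or authentication layer.
# Message content is made up for illustration.

RAW = (
    "MSH|^~\\&|REGADT|HOSP|EMR|HOSP|202401011230||ADT^A01|MSG0001|P|2.3\r"
    "PID|1||123456^^^MRN||DOE^JANE||19800101|F\r"
)

def parse_hl7(raw: str) -> dict:
    """Split an HL7 v2 message into {segment_id: [fields, ...]}."""
    segments = {}
    for line in raw.strip().split("\r"):
        fields = line.split("|")
        segments[fields[0]] = fields
    return segments

msg = parse_hl7(RAW)
# PID-5 is the patient name -- readable by anyone who can see the wire
# or the "safer" copy spooled to disk.
print(msg["PID"][5])  # -> DOE^JANE
```

Anything that can sniff the connection or read the spooled files gets the patient demographics for free; there is nothing cryptographic to break.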
A lot of medical stuff seems to suffer from this problem of caution paralyzing the behavior of professionals, not just in IT.
That being said, most commercial software seems to be way worse. There was an article the other month about a Windows 10 machine automatically updating while a patient was being operated on, forcing them to be kept under for an extra few hours.
In my opinion, "patient risk" is often used as an excuse by vendors (and some hospitals) to slow-walk patching and testing. I can understand the motivation: it saves them money, and they can wait until there are more patches to test and do them all at once.
I think the obvious solution would be to invest in new open-source end-to-end infrastructure that could be thoroughly audited and then implemented by hospitals everywhere.
Of course, that would need a sizeable investment of both money and time, but it would almost definitely be more efficient than updating one component at a time.
My armchair analysis of the obvious solution is to air-gap all these systems. Perhaps this would require some new infrastructure in hospitals, but it would add a very difficult-to-penetrate layer.
That's one part of it, but the real innovation in remote care is RPM (Remote Patient Monitoring) devices. These can be anything from blood-glucose sensors to dialysis machines to blood-pressure monitors -- devices with an internet connection that send data live to a physician or nurse.
The struggle with these devices is that they're often cheap embedded systems that never receive firmware updates, so they do present a security concern. However, they're also immensely useful and have without a doubt saved lives.
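To make the concern concrete: even when a cheap embedded device can't carry a full TLS stack, it could at least authenticate its readings. A minimal sketch using an HMAC over the payload with a per-device key (all names and values here are invented, and real key provisioning is the hard part):

```python
# Toy sketch: an RPM device authenticates its readings with an HMAC
# over the payload, so the receiving server can reject tampered or
# spoofed data. Key handling is simplified; names are invented.
import hashlib
import hmac
import json

DEVICE_KEY = b"per-device-secret-provisioned-at-manufacture"  # hypothetical

def sign_reading(reading: dict, key: bytes) -> dict:
    """Serialize a reading deterministically and attach an HMAC tag."""
    payload = json.dumps(reading, sort_keys=True).encode()
    tag = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return {"payload": payload.decode(), "mac": tag}

def verify_reading(envelope: dict, key: bytes) -> bool:
    """Recompute the tag; constant-time compare avoids timing leaks."""
    expected = hmac.new(key, envelope["payload"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, envelope["mac"])

env = sign_reading({"device": "bg-007", "glucose_mg_dl": 104}, DEVICE_KEY)
print(verify_reading(env, DEVICE_KEY))   # True
env["payload"] = env["payload"].replace("104", "400")
print(verify_reading(env, DEVICE_KEY))   # False -- tampering detected
```

HMAC-SHA256 fits comfortably on even small microcontrollers, which is why it's a plausible floor for devices that will never see a firmware update.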
Yea so that won't happen.
Hospitals don't audit anything for real. The hospital admin just hires their buddy to rubber stamp junk and gets a kickback.
The actual software and hardware solutions too are based on who gives the best kickbacks to hospital admins and doctors.
That's it. That's the American healthcare field and why it's a complete shitshow.
IT staff are made to deal with decisions they have no say or power in, and turnover is quite high.
My hospital was offline for a whole week because they got hit by a ransomware attack, and they use Epic. I asked someone I knew at Epic what she knew about it, and she confirmed that my hospital was up to date on the latest version of their software and following most of their security protocols. My initial thought was that they had weak IT security; now I'm not so sure.
Doesn’t matter if their Epic servers are up to date if the attacker got a domain admin account somewhere else and can just log in normally to run the ransomware.
Yup, just spearphish one of the employees with a password-reset email. People, including educated developers and MDs, are in general very lax about security. But you also have Windows 7 legacy systems running specialized equipment that has been validated for that OS and software version number. There is really no way around this. If a country wanted to kill Americans right now, IMO the most effective approach would be to disable Epic servers in ND/SD/WI/MT -- that would cause way too much chaos, and people would die.
But also, what are we doing running life-critical software on a Microsoft-made OS? This is idiotic; it's great for gaming and Excel, but not for hospitals. Microsoft could make another OS based on Linux or BSD, and it could not be hot garbage. But that would eat into profits and take... effort. Linux or ChromeOS plus 2FA would be much better, although not perfect.
Epic is an on-premises dumpster fire, so I'm not surprised. Plus there are many attack vectors besides the EHR. I assume they probably had access to services cut off rather than having patient data held for ransom.
If there's a zero-day, there's not a lot you can do. The NHS got hit so badly because they were running very old Windows versions. A lot of embedded systems have no upgrade path (MRIs running embedded XP should probably not be on the network at all).
Hospitals need full backup machines, and with health care costs already through the roof, that will just add more. Even if you have all your order-entry machines set up to make no external internet connections except to update servers, one bad email getting through and you could be in trouble.
You're gonna need your MRIs on the network because they transmit the actual PHI via PACS.
No way is the operator manually copying a 5 GB+ DICOM file into the patient's record in your EMR.
You NEED to have the patient name added via modality worklists to reduce errors (i.e., add the patient to the MRI software before the scan, and send the scan to the EMR once it's taken).
The worst thing is, this protocol (DICOM) is old and insecure, and hospitals just don't have the IT chops to handle it.
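For readers unfamiliar with modality worklists, here's a toy plain-Python stand-in for the idea: the scanner pulls demographics from the scheduling system by accession number rather than having an operator retype them. Real systems do this over DICOM (a C-FIND query against a Modality Worklist provider); everything below is invented to illustrate only the workflow:

```python
# Toy stand-in for a DICOM Modality Worklist (MWL) lookup: the scanner
# queries scheduled procedures by accession number instead of having
# the operator retype the patient name. Data is invented.

SCHEDULED = {  # what the RIS/EMR would serve via a worklist query
    "ACC1001": {"PatientName": "DOE^JANE", "PatientID": "123456",
                "Procedure": "MRI BRAIN W/O CONTRAST"},
}

def worklist_query(accession: str) -> dict:
    """Return demographics for a scheduled study, or raise if unknown."""
    try:
        return SCHEDULED[accession]
    except KeyError:
        raise LookupError(f"no scheduled procedure for {accession}")

# The scanner stamps the study with worklist data, not operator typing,
# so the name already matches the EMR when the images land in PACS.
study = {"Accession": "ACC1001", **worklist_query("ACC1001")}
print(study["PatientName"])  # -> DOE^JANE
```

The point of the design is error reduction: a typo at the console would create a mismatched study, so the demographics flow one way from the scheduling system to the scanner to the archive.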
Zero-days may get the headlines, but attackers are finding a lot of value in leveraging old vulnerabilities. CISA, the FBI, and the NSA have issued several advisories over the last month with an overarching theme: advanced persistent threat groups are targeting unpatched vulnerabilities.
In general, regulated entities are required to regularly prove that their change-management processes are sufficiently heavy as to make regular patching a non-starter.
Ha! Sorry, my political views are non-binary so I see how you're confused. Allow me to clarify: the regulations in the health care industry are structured poorly and have strong disincentives to even the simplest and most obvious improvements (e.g. updating software to receive security updates). They should not be removed, but they need to be rethought so people in the industry aren't afraid to make changes.
HIPAA contains a security and privacy rule, but its original aim was to spur patient record portability between providers and insurers. That lineage of regulation, which also includes HITECH, ARRA, and provisions tied to Medicare expansion, established the carrots and sticks thought necessary to modernize the health industry's records -- to get them off paper and into bits. All this modernization eventually happened, but it's hard to say whether the regulation was the primary driver or if these companies would have done it anyway. Having worked in the industry, I lean toward the regulations being the primary driver. Low risk tolerance was already a characteristic of health organizations before HIPAA (and I think patient safety was the main reason). When HIPAA was signed in 1996, most US industries were heavily computerized, but health organizations lagged far behind. The lack of competition in the markets where most providers and insurers operated meant there was little commercial incentive for them to spend money to be able to exchange files with organizations in other states. Digitalization just wasn't coming together in health care as rapidly as in other industries, although I didn't work in health at the time, so I don't feel like I personally know all the reasons.
It's been a long time since 1996, but most of the IT messes inside health organizations are self-inflicted. HIPAA and friends don't mandate which operating systems you use, specify approved encryption algorithms, or tell you when and how to update your computer systems. These are all choices left to the implementation teams, and they chose to work with vendors who aligned their solutions to information architectures that just don't change very fast. I think if you compared this IT situation to, say, large-scale manufacturing in the US, you'd find similar problems of outdated platforms supporting expensive and hard-to-change niche software. And it's probably market forces, not government regulation, that's responsible for this similarity.
Likewise, I love FISMA, but I don't think hospitals would cease operations just because their systems couldn't get an ATO. What kind of accountability would motivate them to complete POAMs with any urgency? I don't think there's an effective way to incentivize a proactive approach -- financial penalties would simply be passed on to customers.
As hospitals around the country race to the bottom, I'm not sure where a qualified IT team to manage these systems is going to come from. I don't think hospitals can afford them anymore.
I worked in hospital IT and it was a tough environment: it seemed like we had at least one big system rollout (EMR, radiology, lab, etc.) every year. It was difficult to manage when the hospital was paying a little below the median for the area; now they are way below it where I live (western MA).
Really, it's the regulatory environment. It treats any change as potentially life-threatening. Imagine having to prove that none of your changes could possibly risk patient safety to people who think automated tests can't be trusted because they could be written to simply print "PASS" all the time.
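The fear isn't entirely baseless -- a "test" that only prints PASS does prove nothing. A toy contrast (dose_ml is an invented example function):

```python
# The regulator's fear, made concrete: a fake "test" that always
# reports success versus a real test whose assertions actually
# exercise the code. dose_ml() is an invented example function.

def dose_ml(weight_kg: float) -> float:
    return round(weight_kg * 0.05, 2)  # pretend clinical formula

def fake_test():
    print("PASS")  # never even calls dose_ml -- proves nothing

def real_test():
    assert dose_ml(80) == 4.0   # fails loudly if the formula regresses
    assert dose_ml(0) == 0.0
    print("PASS")

fake_test()
real_test()
```

The answer isn't to distrust automation wholesale; it's to audit the tests themselves -- review them, measure coverage, mutate the code and confirm the tests actually fail.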
If there is one thing I’ve learned from HN commenters, it’s that software engineers are never, ever individually responsible for the ethical or moral consequences of the software they write. It’s one of the most consistently and quickly downvoted topics here. It’s always the company’s fault.
It's such a strange dichotomy. On one hand, software engineers command healthy salaries, have massive power to decide where they work, and are in high demand everywhere. They get perks up the wazoo. On the other hand, when it comes to agency over what they work on, all of a sudden they claim their power is totally gone. "Whelp, if the boss tells me to write malware or cheat at a benchmark, I guess I just have to put my head down and do it! Poor me, nothin' I can do about it. Don't blame me, not my fault, everyone!"
Perhaps it’s time for hospitals to regularly report their OS versions and patch levels to our local health departments.