
FDA Calls On Medical Devicemakers To Focus On Cybersecurity

Soulskill posted about a year ago | from the i-don't-need-to-tweet-from-my-pacemaker dept.

Security 40

alphadogg writes "Medical device makers should take new steps to protect their products from malware and cyberattacks or face the possibility that the U.S. Food and Drug Administration won't approve their devices for use, the FDA said. The FDA issued new cybersecurity recommendations for medical devices on Thursday, following reports that some devices have been compromised. Recent vulnerabilities involving Philips fetal monitors and Oracle software used in body-fluid analysis machines are among the incidents that prompted the FDA to issue the recommendations."

40 comments

Not the American Way (TM) (1)

AmiMoJo (196126) | about a year ago | (#44011705)

Isn't the normal solution to this kind of problem for those affected (or their families, since they will probably be dead) to sue the manufacturer?

FFS (1)

Anonymous Coward | about a year ago | (#44011707)

Like we need more guvmint BS pushing up prices. Look at the cost of an insulin pump, greater than $5K for what is basically a re-purposed pager shell with a syringe pump.

Re:FFS (1)

sjames (1099) | about a year ago | (#44011877)

So it's about time they earned some of that $5K. It's not the government pushing the prices up, it's doctors who don't even bother to know the cost of the things they prescribe.

Government regulation does facilitate the gouging by limiting the number of competing vendors, and it should do something to prevent that side effect, but we really don't want people dropping like flies when their pumps malfunction.

Re:FFS (0)

Anonymous Coward | about a year ago | (#44011887)

Worth every penny (paid cash) to my wife. It's not a "pager shell with a syringe pump" - it's a complete change of lifestyle. I'll not defend the price of the thing further, but don't trivialize the fundamental worth of the machine.

Air gap the damned networks.... (1)

ducomputergeek (595742) | about a year ago | (#44011751)

Run an internal network with no access to the internet. Limit the internet to certain devices or terminals.

Doesn't help much. (0)

Anonymous Coward | about a year ago | (#44011815)

People will plug in any old system that may already be infected...

Re:Air gap the damned networks.... (1)

Anonymous Coward | about a year ago | (#44011831)

Which is fine - except a lot of manufacturers want their devices to have Internet access to automatically download patches and data, and to be remotely accessible for maintenance. And they need to communicate with the rest of the hospital network to pass information.

Re:Air gap the damned networks.... (0)

Anonymous Coward | about a year ago | (#44011885)

then fuck the manufacturers and let the lawsuits begin

Re:Air gap the damned networks.... (1)

Doug Otto (2821601) | about a year ago | (#44011919)

Not without at least dinner and drinks.

Re:Air gap the damned networks.... (0)

Anonymous Coward | about a year ago | (#44011973)

Insulin pumps and glucometers, in particular, can be far more useful if the usage is recorded for parents, spouses, or medical personnel.

Re:Air gap the damned networks.... (1)

cfsops (2922481) | about a year ago | (#44012067)

Insulin pumps and glucometers, in particular, can be ...

Not just those, but cardiac implants as well, which go beyond the "pacemaker" of yesterday and include defibrillator and cardiac resynchronization therapy (CRT) functionality. Recording heart function for review is very useful. One problem with this sort of vulnerability is that some might choose to forego the device, and therefore forego a better quality of life, for fear of someone fucking with the device just for the lulz.

Re:Air gap the damned networks.... (3, Interesting)

Relic of the Future (118669) | about a year ago | (#44012273)

Since I helped write a system that pulled live data from medical devices (during surgery) to update patient records on the fly, and since those records eventually have to be sent to someone else (over the internet): No. You can't just run an internal network with no access to the internet.

Build layers of security. Don't use hard-wired passwords. I.e., Stop being stupid about security. But no, you can't just air gap.

Re:Air gap the damned networks.... (1)

saleenS281 (859657) | about a year ago | (#44012677)

You can air gap, it just requires more work - having a human manually transfer the data using known clean media.

Doesn't work. (0)

Anonymous Coward | about a year ago | (#44014177)

You don't seem to understand the GBs of data that would have to be transferred.

Manually transferring the data would mean the device would have to store LOTS of data (not possible for internal devices), or a person would have to be assigned to do nothing but data transfer continuously... with a corresponding tripling of the cost per device.

It would also introduce possible transcription errors - the wrong data for the patient...

Re:Doesn't work. (2)

saleenS281 (859657) | about a year ago | (#44015009)

This EXACT scenario occurs today in many organizations and it works just fine. You act as though this were an impossible feat prior to the invention of networking and that's just not true.

Re:Air gap the damned networks.... (1)

ChumpusRex2003 (726306) | about a year ago | (#44014251)

This adds a number of significant additional risks:
It adds a delay.
It adds the risk that the human will mix records, or will fail to do the job without reporting back.
It generates confidential waste that needs to be managed.

I work at a specialist hospital, which gets patients from a wide region, including neighbouring states. The normal way of transferring X-ray/MRI/CT records is by file transfer from one hospital's server to the other. However, for hospitals which are not common "feeders", and which haven't gone to the expense of setting up the particular VPN connections required to connect into our site, a different approach was required.

So, when a patient is transferred to have their brain haemorrhage removed, the scanning hospital must first prepare a CD using a proprietary encryption tool, to meet local regulations regarding confidentiality. (A standard encryption format for medical image files, including public-key encryption to simplify key management, has finally been introduced in the 2013 update to the specification, but it is useless due to zero support in existing devices and a typical device replacement period of 8-15 years.) The CD has to be labelled, sent with the patient, and taken to an admin office; the password has to be obtained by phone call; the proprietary encryption decrypted; the clear files burned to a new CD; and the clear CD loaded into the server - which, as a specification-conforming medical device, is not permitted to load files except from a specification-conforming medium, i.e. an unencrypted CD or single-layer DVD-R with the files recorded in the clear in a specific directory structure.

This adds substantial time, and frequently goes wrong. I've had blank (unrecorded) CDs sent with patients; CDs for the wrong patient; CDs labelled correctly but with some other patient's images on them; cases where the password has been lost and a new disc had to be burned and couriered over; episodes where the technologist on a 3 am shift doesn't know how to burn a CD, or doesn't know how to work the new proprietary encryption package they're seeing for the first time; and problems with permissions, where the on-call technologist cannot burn a clear CD because group policy has blocked CD burning under their user profile, etc. I'm aware of a number of cases where patients have gone for emergency brain surgery and the only scan the surgeon has to guide the surgery is a photo of a computer monitor taken with a cameraphone and sent by MMS (let's not even start on the privacy aspects of that).

Of course, with care, this procedure works, and we use it during network downtime (planned and unplanned). Similarly, we have backup plans for when our CT scanner can't connect to the regional patient registry to verify identities, etc. However, in audits of data quality problems and data mix-up incidents, pretty much 100% can be traced to the use of a manual intervention.

Re:Air gap the damned networks.... (1)

saleenS281 (859657) | about a year ago | (#44015025)

Then your procedure is broken. There are many industries which accomplish this without issue. If you hire an untrained person who can't handle verifying the data is there before moving from one location to another, OF COURSE you're going to have issues.

Re:Air gap the damned networks.... (0)

Anonymous Coward | about a year ago | (#44013729)

aren't medical implants using wireless communication? There is already an air gap.

Re:Air gap the damned networks.... (0)

Anonymous Coward | about a year ago | (#44012643)

Any security provided by a limited network is a defense-in-depth measure; it can't replace the level of device security that would be necessary to operate on the internet at large.

Patient homes don't have the luxury of a cable drop to an isolated network.

Communications also extend beyond standard IP: the big security embarrassment in the medical field was a weakness in the RF protocol used to communicate with a system embedded in a patient's body.

Re:Air gap the damned networks.... (0)

Anonymous Coward | about a year ago | (#44013873)

Run an internal network with no access to the internet. Limit the internet to certain devices or terminals.

But how will I live in a cyberpunk future where people are brain hacked, if we mandate such common sense?

Weak Tea (0)

Anonymous Coward | about a year ago | (#44012003)

Hey, I'm on a project building such a device, as a sub-subcontractor for a company acquired by a megacorporation that will be selling a medical device. The security is abysmal, and there was basically no thought given to the security implications of decisions like using a very old version of the JVM, running an outdated and insecure HTTP server on the device, running on Windows and requiring the user to be an admin on the system, hard-coding a default password, and storing the username and password for the hospital's records database in plaintext (I managed to get that last one changed by shaming the developer involved).

This announcement means pretty much nothing, and we are still unlikely to perform a cybersecurity audit on the device. That is because it is cheaper to CYA with some paperwork that says you considered things and the risk is "low" than it is to actually do anything that might threaten a ship date or cost more money. If the FDA doesn't actually start auditing devices before they ship (gee, I'm sure they can't afford to hire a few security geeks), nothing will happen. This is just more of the best government money can buy.

Simple standard? (3, Interesting)

Okian Warrior (537106) | about a year ago | (#44012013)

Network security is an add-on, largely viewed as an externality by corporations.

I think it's largely because of this (and that is mostly Microsoft's doing) that people don't use good security features.

Suppose the socket layer had a function to generate a key pair, and a function call to set the key used for encoding and decoding (possibly with a bit in the protocol to indicate whether a message is encrypted). If it were that simple, most products would use it - certainly safety-certified products would.

(There's Transport Layer Security [wikipedia.org], but it's not really simple to use.)

Since there is no simple universal way to use good security, everyone ends up having to implement their own version, which costs time and money.

Simple secure communications should be an OS feature.
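
For comparison, here is a minimal sketch of what securing a plain TCP connection looks like today with Python's standard-library ssl module. The hostname and port are placeholders, and a real deployment still has to solve certificate provisioning (which is where most of the hidden complexity lives), but the call pattern is already close to the "generate a key, set the key" API described above.

    import socket
    import ssl

    # Client-side TLS context with sane defaults
    # (certificate verification and hostname checking enabled).
    context = ssl.create_default_context()

    # Wrap an ordinary TCP socket; everything sent afterwards is encrypted.
    with socket.create_connection(("telemetry.example.org", 443)) as raw_sock:
        with context.wrap_socket(raw_sock, server_hostname="telemetry.example.org") as tls_sock:
            tls_sock.sendall(b"HELLO")
            reply = tls_sock.recv(1024)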

Re:Simple standard? (3, Informative)

Darinbob (1142669) | about a year ago | (#44012383)

I suspect most of these devices have either minimal operating systems, home-grown operating systems, or no operating system at all. Even if security is in the network stack, it doesn't fix things. I.e., do you require your hospital to run IPsec everywhere for every device? Having top-of-the-line IPsec-enabled networking doesn't prevent hacking if the device has bugs that can be triggered by injecting packets of the right type (i.e., the attacker isn't breaking through security to read data, but is crashing the machine or corrupting data).

The other thing is that when these machines are hacked it is very often due to reverse engineering the machines. These don't run Windows or Linux, and there's no pre-built hacker kit available; the attackers have access to actual machines and have cracked them open, read the flash or monitored the bus to figure out what the software is doing or what style of OS it has, scanned through to find out if there's a recognizable file system type, etc. When you're up against sophisticated attacks like that, your built-in OS security isn't going to be much defense.

I suspect most of these successful attacks are happening on machines that use Windows internally; i.e., an app on a turnkey system, or Windows bolted onto the side of a device to provide a front end. But Windows already has a built-in secure communication feature.

Re:Simple standard? (2)

Okian Warrior (537106) | about a year ago | (#44012705)

You're correct in that the programs should be tolerant of bad data, and much of the safety certification process addresses this issue. For example, as part of the certification process you need to show that buffer overflows cannot happen, that all cases of input data are covered (bad data is handled gracefully), and so on.
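
As a small illustration of the "all cases of input data are covered" requirement, here is a sketch in Python (chosen for brevity) of parsing a hypothetical length-prefixed telemetry record with explicit bounds checks, so malformed input is rejected rather than trusted; the frame format and size limit are made up for the example.

    import struct

    MAX_PAYLOAD = 512  # hypothetical device limit

    def parse_record(frame: bytes) -> bytes:
        """Parse a length-prefixed record; reject anything malformed."""
        if len(frame) < 2:
            raise ValueError("frame too short for length header")
        (length,) = struct.unpack_from(">H", frame, 0)
        if length > MAX_PAYLOAD:
            raise ValueError("declared length exceeds device limit")
        if len(frame) < 2 + length:
            raise ValueError("frame truncated")
        return frame[2:2 + length]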

I believe the original article was referring to data transfer and firmware upgrades. These would be conveniently handled over the internet, if only we could guarantee the security and integrity of the data. This means that no one can snoop the data or synthesize false data.

The other thing is that when these machines are hacked it is very often due to reverse engineering the machines. These don't run windows or linux, there's no pre-built hacker kit available, the attackers have access to actual machines and have cracked them open, read the flash or monitored the bus to figure out what the software is doing or what style of OS it has, scanned through to find out if there's a recognizable file system type, etc.

Symmetric encryption handles this situation. If the private key is held by the company, the device can refuse firmware upgrade requests not correctly decrypted by the public key held in the device NVRAM. It does the attacker no good to discover the public key - without the private key, they still cannot form a correctly-encrypted upgrade command.

A similar situation exists for the device/recorder interface. If there were a symmetric key pair for each monitor/recorder, the hackers could only reverse-engineer the keys for each device they take apart. You could even use the "Sears Garage Door Opener" model where a monitor is brought near the logger, press the "learn" button on both machines, key exchange happens, and now the logger and monitor are linked and using secure communications.

(Some may balk at having a per-device key, but note that medical devices often have a per-device stored serial number.)
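
A rough sketch of that "learn button" pairing model, using an X25519 key exchange from the third-party Python cryptography package; in a real product the public keys would be swapped over a short-range channel while both devices are in pairing mode, and the derived key would then protect the monitor/logger link. The device names here are illustrative only.

    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    # Each side generates an ephemeral key pair when its "learn" button is pressed.
    monitor_priv = X25519PrivateKey.generate()
    logger_priv = X25519PrivateKey.generate()

    # Public keys are exchanged over the short-range pairing channel;
    # both sides then compute the same shared secret.
    monitor_shared = monitor_priv.exchange(logger_priv.public_key())
    logger_shared = logger_priv.exchange(monitor_priv.public_key())
    assert monitor_shared == logger_shared

    # Derive the link key used to secure subsequent monitor/logger traffic.
    link_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                    info=b"monitor-logger pairing").derive(monitor_shared)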

True security, and privacy, and the solution to a lot of the ills of the world we're seeing right now, is actually straightforward. We only lack the will to implement it.

For example, SMTP has "experimental" protocol headers (X-something). Way back before Google mail existed, the Mozilla mail reader was popular. If the designers had implemented a checkbox "keep message private if possible" which would handle key discovery and key logging between senders and recipients, and a green checked circle "this message is private" when a key exists between recipients, it would have been popular, causing other mail systems to implement it in order to be competitive. (No need to actually *store* mail encrypted, just encrypting the channel would bring an enormous boost to privacy.)

As it happened the designers didn't take that step, Google mail is now the popular model, and the government (and Google) reads all our mail.

(Compare with the current Mozilla policy on "do not track" (we'll implement it, but leave it off by default), or ask them whether the features of "HTTPS Everywhere" or "Ghostery" should be bundled with the system.)

Re:Simple standard? (2)

Darinbob (1142669) | about a year ago | (#44012947)

The devices I worked on in the past had protection in firmware and such. The goal, however, was to protect against competitors and unauthorized resellers, not random hackers. I.e., trying to crack down on the second-hand market, where they try to clone the firmware and resell old machines as new, or to sell license features they haven't paid for. Firmware wasn't encrypted in this case, but it was definitely signed. Encrypting doesn't help much if the attacker has access to the bus.

Re:Simple standard? (1)

ewanm89 (1052822) | about a year ago | (#44014055)

Don't you mean asymmetric encryption? Symmetric encryption means the same key is used on both sides, not a public/private key pair. And you are using it for integrity checking, not confidentiality: you don't mind who can read the firmware binary, you just want to make sure it is not modified in transit. Therefore it is a digital signature you want, not encryption.
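
A minimal sketch of the sign-then-verify flow being described, using Ed25519 from the third-party Python cryptography package; the filename is a placeholder, and in a real device the public key would be baked into NVRAM at manufacture rather than generated alongside the private key as it is here.

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # Manufacturer side: generate a key pair once; keep the private key offline.
    private_key = Ed25519PrivateKey.generate()
    public_key = private_key.public_key()

    firmware = open("firmware.bin", "rb").read()  # placeholder firmware image
    signature = private_key.sign(firmware)        # shipped alongside the image

    # Device side: only the public key is stored on the device.
    try:
        public_key.verify(signature, firmware)
        print("signature OK - apply update")
    except InvalidSignature:
        print("rejecting unsigned or modified firmware")

Reverse-engineering the device only reveals the public key, which is not enough to forge a valid signature - which is the property the grandparent post was after.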

Obvious NSA joke (1)

sir_eccles (1235902) | about a year ago | (#44012107)

In Soviet Russia pace makes you?

Only NSA backdoors allowed (-1)

Anonymous Coward | about a year ago | (#44012147)

Translation- all medical equipment companies must liaise with the NSA and implement backdoors into the items they manufacture. Murder by drone is so old hat now. Imagine how much better it would be for Team Obama to be able to hack into medical equipment across the planet, and murder people like Snowden or the leader of Syria that way.

Although most of you are far too ignorant to know this, the spy game has long included attempts to get at targets via their doctors, their medication, or on the operating table. Not every target is 'considerate' enough to make themselves vulnerable this way, but more and more do so as they get older.

Obama is a ruthless murdering psychopath, and his replacements will be no better. He is currently implementing a program to murder millions of people like you and I, simply because they happen to be civilians living in Syria. Any act that extends the ability of his people to target and exterminate people anywhere on this globe will be pursued with ruthless efficiency. Ensuring that powerful people fear that if they are on the wrong side of Team Obama, they had better fear anything done to their body in the name of medicine, is a very useful control tool.

The other side of this propaganda is the old "Chinese hackers are out to get you" garbage that Slashdot loves to promote - even when you have been informed in absolute terms that the criminals hacking civilian targets around the world are actually in the employ of the US government and are considered a part of the US armed forces. Your own computers are at risk, because backdoors built into Windows, for instance, by the NSA eventually become known to the wider criminal fraternity, at which time the NSA backdoors are used for commonplace criminal attacks on all of our machines. Isn't that nice.

Worse, even if you prevent 'infection', millions of man-hours are lost when computers have updates forced on them to close old NSA holes and open new NSA holes.

BTW - why does no-one demand new amendments to the US Constitution to address such abuses? The Constitution could only concern itself with issues understood at the time it was written. However, the principle of the Constitution clearly DEMANDS that additions be made to ban abusive actions by the US government when changes in society make such abuses likely. There should be an amendment to the Constitution stating that it is ILLEGAL for the government to interfere with or 'bug' any item that is to be purchased or used by an ordinary citizen. Pre-emptive back-doors should be illegal in all forms. The government should be obliged to first reasonably identify a potential criminal target, and only then deploy special methods to pursue that target.

The NSA-designed Xbox One, for instance, is a literal crime against humanity, regardless of how many chumps declare themselves happy to have the NSA spy on them in their homes. After all, the right to free speech does NOT mean that everyone is obliged to use that right, but the fact that plenty of chumps state that free speech is a 'bad' thing does not mean the rest of us should lose that right.

So, again, why do you Yanks not demand new amendments to your Constitution to prevent these abuses by the NSA? Do you really have so little respect for why you have a written constitution in the first place? If "we trust our President" was a good position to hold, you wouldn't have a constitution in the first place- surely you have enough intelligence to realise this.

At this time in history, Obama can do anything with no oversight, and you Yanks are bending over and accepting this because he reminds you of a young OJ. Dear lord, you Yanks are now getting blockbuster movies where an Obama-look-alike 'president' is the super-hero action figure. Yet you morons have the damned cheek to mock the cult of personality found in nations like North Korea.

What about... (1)

Anonymous Coward | about a year ago | (#44012183)

if I want to overclock my pacemaker? Will it stop me from installing Linux?

Patch cycles (1)

manu0601 (2221348) | about a year ago | (#44013015)

If the FDA really wants secure devices, that means we will have patch cycles for medical devices. This is not a very desirable prospect: what happens if your pacemaker is down after a patch? Will you need a doctor to apply the patch?

Re:Patch cycles (1)

cusco (717999) | about a year ago | (#44013413)

Of course there's the alternative, manufacturers could spend the time and money to actually make a secure system, but that's just crazy talk. It might lower short term profits and damage some executive's chance at receiving another quarterly bonus, so we can't have that.

Re:Patch cycles (1)

Zumbs (1241138) | about a year ago | (#44013885)

Many device manufacturers use Linux or Windows as the underlying OS, as both have a lot of support from development tools and are widely used. This means that a lot of work has been poured into the OS to make it a lot more secure than some home-brew system. On the other hand, it also means that all vulnerabilities in the OS also apply to the device. As long as the device was sealed, this was no problem, but once users want to get data from the device, or even get it to interact with various internet-based services (e.g. journaling systems), the cat is out of the bag. As many OS vulnerabilities are reported after the initial release of the device, manufacturers will have to either patch the OS on the device or ignore the issue. And then there are all the other 3rd-party components on a medical device.

As I understand the current FDA guidelines (I haven't read TFA yet, so this may have changed), patching the device is rather expensive, as the manufacturer has to be able to document that each patch will not adversely affect the functioning of the device (to avoid the issue with the pacemaker that manu0601 considered). This means that OS patching is often tied to general updates of the device, which will only be produced for as long as the manufacturer supports it. Patches may also cost money (or at the very least downtime), so users may choose to skip them.

Re:Patch cycles (2)

ChumpusRex2003 (726306) | about a year ago | (#44014421)

The problem with implantable devices is that they are severely power constrained, as typically a battery life of less than 5 years is considered unacceptable, with 10 years wanted for something like a cardiac pacemaker.

This leaves very little power for CPU/communications/encryption functions. Any kind of crypto hardware, or any kind of unnecessary complexity in the firmware (e.g. duplicated bound checking, etc.) is likely to increase energy consumption and shorten battery life.

This is becoming less of a problem with modern silicon, which is more power-efficient, and the use of NFC and induction coils can supply the energy required for communication; so there is less and less excuse for not including some form of well-designed security on the device.

I have managed to reboot an implanted nerve stimulator once, by scanning the patient it was implanted in with a top-end 3 Tesla MRI scanner. Interestingly, everything other than program code was stored in RAM rather than flash (including things like the serial number, the electronically readable model number(!), and the treatment parameters). After the device rebooted, all these settings were lost. The manufacturer had anticipated this, and the MRI instructions for the device specifically said that the settings must be read out of the device and a hard copy made beforehand, with instructions on how to reprogram the device if it did reboot.

There are different constraints with non-implanted devices (e.g. laboratory equipment, scanners, servers, etc.). Traditionally, all the specifications for these devices were written at a time when they would be connected to a clean, isolated network. As a result, security has been a very, very late arrival to these specifications. TLS support was ratified into the DICOM specification (storage and transmission of X-ray/CT/MRI, etc.) a few years ago, but I've never come across a DICOM TLS installation in the field. So little installed software supports it, and the replacement cycle is so long (many hospitals are signing 10-year contracts for a particular version of the software), that it is, at present, completely useless. Even basic network security is made difficult by certain aspects of the protocol - e.g. DICOM network connections cannot traverse NAT (due to a classic-FTP-like protocol for initiating file transfers, and due to the fact that both client and server nodes must be on pre-configured static IPs), and the protocol has enough tricks up its sleeve to catch out unwary net admins when they try to configure firewall permissions, or unwary sysadmins who try to set up clustered servers.

Re:Patch cycles (1)

manu0601 (2221348) | about a year ago | (#44018667)

I appreciate your concern about crypto, but note that secure communications are useless if the device has remotely exploitable vulnerabilities, and crypto increases code complexity and therefore the odds of introducing such a vulnerability. I am more comfortable administering network switches over a serial line rather than over SSH, but I am not sure we can find a similar approach for implanted medical devices.

What does grampa have to hide, anyway? (1)

Impy the Impiuos Imp (442658) | about a year ago | (#44014325)

"Seal the holes!" screamed the FDA. "Just not this one, and that one, and that one, which the FBI, CIA, and NSA use."

Remote Heartattack device (0)

Anonymous Coward | about a year ago | (#44014745)

Don't forget about the pacemaker that, over WI-FI, can be instructed to kill a person.

Treat them like a DMZ (1)

onyxruby (118189) | about a year ago | (#44015073)

I did a bunch of work a number of years back where we had critical (financial services, not medical) computers that we absolutely were not allowed to patch. The solution I implemented was to treat any computer that can't be patched as a mini-DMZ.

The computer is firewalled from the rest of the network, put on a locked down VLAN and given only specific destinations, ports and so on as required in order to function. The concept of least privilege can and should be used for computers like this just as you would use it for a user.

You can use this concept for medical devices and it would work just as well. There is work involved and it is a pain in the ass to do. That being said, this balances the risk of systems that you are highly unlikely to be able to patch against the need to secure your environment. Once completed, the system is allowed to work, and the risk of having that system on your network in the first place is mitigated.
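
As a toy illustration of the default-deny idea (not a real firewall - the VLAN name, hosts, and ports below are made up), each unpatchable device gets an explicit allowlist of destination/port pairs and everything else is refused:

    # Each locked-down VLAN gets an explicit allowlist of (destination, port)
    # pairs; any flow not listed is denied by default.
    ALLOWED_FLOWS = {
        "infusion-pump-vlan": {
            ("records.hospital.local", 443),
            ("ntp.hospital.local", 123),
        },
    }

    def is_permitted(source_vlan: str, destination: str, port: int) -> bool:
        """Least privilege: only explicitly listed flows are allowed."""
        return (destination, port) in ALLOWED_FLOWS.get(source_vlan, set())

    assert is_permitted("infusion-pump-vlan", "records.hospital.local", 443)
    assert not is_permitted("infusion-pump-vlan", "updates.example.com", 80)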

More Inefficiency (1)

zmaragdus (1686342) | about a year ago | (#44015455)

I worked in the medical device field for a while. The level of paperwork and documentation required for validation activities is staggering, and the medical field in general doesn't have as good a handle on fulfilling government requirements as, say, the aviation industry does. The path to take a device from concept to validated, sellable product is a long one. Adding cybersecurity (while a worthy endeavor) will only exacerbate the arduous and hair-tearing experience of developing a product.

And the solution is .. (1)

dgharmon (2564621) | about a year ago | (#44016625)

Don't connect your medical devices to the Internet, and don't use computers that are so easily compromised by connecting them to the Internet.