[HN Gopher] Hundreds of patient data breaches are left unpunished
___________________________________________________________________
 
   Hundreds of patient data breaches are left unpunished
 
   Author : taubek
   Score  : 59 points
   Date   : 2022-05-14 18:44 UTC (4 hours ago)
 
 (HTM) web link (www.bmj.com)
 (TXT) w3m dump (www.bmj.com)
 
 | acchow wrote:
 | This is the perfect area for a vigilante to regulate the market.
 
 | karaterobot wrote:
 | > If an organisation is non-compliant with their agreement, we
 | work with them to address any problems and conduct follow-up
 | audits to ensure they are fully resolved.
 |
 | This feels like the right response to me. In most of these cases,
 | we're talking about a data provider with reasonable governance
 | controls in place that grants access to a requester who says
 | they'll use the data responsibly, then simply does not.
 |
 | If the requester is part of a large research university, it
 | doesn't make sense to say "researchers in Study A violated the
 | data use agreement, therefore hundreds of other researchers in
 | studies B-Z must now erase the data they've already downloaded,
 | and never again apply for access to more data from the largest
 | research data provider in the country." Those other studies had
 | nothing to do with the violation, so they shouldn't be punished.
 |
 | The institution should punish the offending individuals, and the
 | data provider should blacklist those individuals, as well as
 | carefully audit both the institution (for its education and
 | oversight of its research teams) and the principal investigators
 | of the offending study for some length of time.
 
 | downrightmike wrote:
 | The government has put in all kinds of laws, but others keep
 | finding ways around them. It should be written into law that
 | the spirit is what counts: if you fuck up, you pay. Orgs need
 | to keep data safe like life or death depended on it. In the
 | end, it does. Data encrypted in flight, at rest, and only kept
 | around as long as needed.
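The "only kept around as long as needed" part of that comment amounts to a retention policy. A minimal sketch of how such a policy could work, in Python: records carry a timestamp and are purged once they exceed a time-to-live. All names here (`RecordStore`, `ttl`, the record IDs) are illustrative and not any real system's API; real data would also be encrypted at rest, which this sketch only gestures at with placeholder blobs.

```python
# Hypothetical sketch of a data-retention policy: each record is
# stamped with when it was stored, and purge_expired() deletes
# anything older than a fixed time-to-live.
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

@dataclass
class RecordStore:
    ttl: timedelta  # maximum retention period for any record
    _records: dict = field(default_factory=dict)  # id -> (stored_at, data)

    def put(self, record_id: str, data: bytes, now: datetime) -> None:
        """Store a record along with the time it was received."""
        self._records[record_id] = (now, data)

    def purge_expired(self, now: datetime) -> int:
        """Delete every record older than the TTL; return how many."""
        expired = [rid for rid, (stored_at, _) in self._records.items()
                   if now - stored_at > self.ttl]
        for rid in expired:
            del self._records[rid]
        return len(expired)

store = RecordStore(ttl=timedelta(days=30))
t0 = datetime(2022, 5, 14, tzinfo=timezone.utc)
store.put("patient-123", b"<encrypted blob>", now=t0)
store.put("patient-456", b"<encrypted blob>", now=t0 + timedelta(days=20))

# 31 days after t0: the first record has outlived the 30-day TTL,
# the second (11 days old) has not.
purged = store.purge_expired(now=t0 + timedelta(days=31))
print(purged)  # -> 1
```

The purge would typically run on a schedule (e.g. a nightly job); the point of the sketch is only that retention limits are enforceable mechanically rather than by trusting each researcher to delete data by hand.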
 | JumpCrisscross wrote:
 | > _Orgs need to keep data safe like life or death depended on
 | it. In the end, it does._
 |
 | Then the injured parties can bring claims with the actual
 | damages in hand. If the courts get clogged with such cases,
 | we'll have the evidence with which to legislate.
 |
 | Jumping the shark by assuming hypothetical harms are real is
 | how we supercharge needless bureaucracy.
 
 | Ferrotin wrote:
 | This is evil shit you're saying (I'm being a bit dramatic,
 | sure). Healthcare is made more expensive by all these rules
 | about IT. Life or death depends on keeping costs down and
 | making it seamless for doctors to share information with one
 | another. It isn't some catastrophe if a breach happens.
 | They've happened before, yet people have not been getting
 | their private health information published in the local
 | newspaper.
 
 | karaterobot wrote:
 | At the same time, I wouldn't want anyone to be able to
 | enforce certain parts of that. For example, to make sure that
 | data was only kept around as long as needed, you'd need to be
 | able to monitor the contents of every computer that held that
 | data. This creates problems of its own, much larger than the
 | original one. To a certain extent, we just have to trust
 | researchers with sensitive data, and severely punish gross
 | violations of that trust.
 |
 | To be honest, I've heard of many more examples of
 | organizations that put _too strict_ controls on their data.
 | This is due to researchers trying to walk a line between a
 | requirement that they share their data and their
 | (understandable) desire to keep their work to themselves as
 | long as possible, so that competing researchers can't
 | publish on it first. A bad data governance committee errs
 | much more _often_ by allowing data contributors to be too
 | strict with their data, though I agree that a data breach
 | is the worse outcome, and avoiding it should be the highest
 | priority.
___________________________________________________________________
 (page generated 2022-05-14 23:00 UTC)