To me, if we can criminalize something like a major oil spill such as BP/Deepwater Horizon, how is this much different? It's not like they caused the oil spill on purpose, but they still needed to face consequences for the risks they were taking. Software companies, especially larger ones like Facebook, should face the same kind of consequences for the risks of software bugs that cause these kinds of privacy breaches.
Also, as someone else below pointed out to a commenter with phrasing similar to your "criminalize software bugs": "intentionally obscuring the debate. Gross negligence is an entirely different standard than just software bugs."
The government does a good job in this area of forgiving innocuous violations, as long as all parties disclose them immediately and follow procedure.
The problem is that we're all giving our data away to these "free" platforms. That makes it difficult for a user to argue that they've "lost" something of value when there's a breach. But of course the user has lost something of value. Facebook has built their entire company around the value of our information but we let them have it both ways. It's valuable when they're selling it but worthless when they fail to protect it. Statutory damages for data breaches would deter negligence and (partially) compensate users who have been victims of data breaches.
Come to think of it, does anyone know of good auth resources for a MEAN stack that aren't copy-paste blog posts? I'm trying the Udacity auth course as a starting point (it uses OAuth2).
The relevant RFCs and drafts, perhaps? https://tools.ietf.org/wg/oauth/
Also check out OWASP: https://www.owasp.org/
If you're implementing OpenID Connect, use a certified library: https://openid.net/developers/certified/
Dex by CoreOS, Open Policy Agent, and the Kubernetes docs & code are all good examples; lots of frameworks have docs/code too.
Case in point: look at the quality of medical software today. Hospitals still use Windows XP and other completely insecure and outdated software, because absolutely nobody wants to deal with the nightmare that is HIPAA.
"But HIPAA" has never, in my experience, been employed except by people who find the idea of doing the right thing inconvenient or inconveniently expensive. (It is virtually never that hard and its benefits are clear.)
There are reasons for not modernizing tech stacks in the medical space. HIPAA is, in every case I've ever observed, not a meaningful one.
Thank you for directly attacking my character without even addressing my actual argument.
I'm not arguing against HIPAA; I'm arguing against such regulations in spaces that don't require that kind of sensitivity. I think medical data absolutely requires the protections it has. But those protections have had the unintended consequence of making current medical data more insecure and stifling innovation in the space. Most doctors don't even follow HIPAA, sending patient medical records over email.
I would estimate that 40% of doctors today are not compliant with HIPAA, sending X-rays and other patient information over email through providers they haven't signed BAAs with.
>There are reasons for not modernizing tech stacks in the medical space. HIPAA is, in every case I've ever observed, not a meaningful one.
Then please enlighten us. Up until a few years ago (maybe even just a year), you couldn't use AWS to host medical data. Today you can't use Google Cloud to host medical data unless you're a large enough business to get in contact with one of their sales reps. Can you even sign a business associate agreement with DigitalOcean? So up until a year ago you could not even have a small healthcare startup hosted in the cloud. Please explain to me how this hasn't stifled medical software innovation.
If it isn't HIPAA it's some other outdated regulation.
https://slate.com/technology/2018/06/why-doctors-offices-sti...
So you're fine with financial losses, loss of privacy, and the material harm that goes along with both? Disregarding the impact of data breaches is just naive.
> Case in point look at the quality of medical software today. Hospitals still use windows xp and other completely insecure and outdated software. Because absolutely nobody wants to deal with the nightmare that is HIPAA.
I wrote medical device software for more than a decade. HIPAA has nothing to do with it. Many systems run on outdated platforms because the cost of replacing them is deemed to outweigh the benefits. That determination is debatable on a case by case basis, but in practice we see a hell of a lot more damage being caused by breaches of companies running on modern technology than we do e.g. hospital systems or LIMS.
Please, if there is provable material harm, they can take it to civil court.
And your "nightmare" scenario of (civil) liability flowing from programming bugs already exists in the investment world, and it hasn't come apart at the seams. Google AXA Rosenberg: a coding error in their trading algorithm went undiscovered for two years. Negligent, for sure, but that's not why the SEC went after them. The problem was that they didn't promptly disclose the error to investors and didn't promptly correct it. Algorithmic trading firms should have mechanisms to catch errors, correct them, and disclose them to investors. And after seeing AXA Rosenberg's $250 million fine and Rosenberg's lifetime ban from the industry, guess what they all implemented?
This is false.
Source: Works for a company that has mandatory HIPAA training for every employee every six months.