From the news article (I don't have time to review the source leak independently) there doesn't seem to be anything really concerning here. The closest thing to an indication of wrongdoing is that someone raised an issue in an internal project meeting about the risk of improper employee use of data and the need for training around it, and has not received a formal, specific response on that issue from corporate leadership. Having spent a long time in HIPAA-related work, that seems... pretty typical: issues like that get raised on new projects all the time, and the fact that one was raised is merely one of many inputs into a policy-generating process that makes general adjustments weighing a wide range of concerns, legal parameters, and other issues, rather than producing a specific direct response. And HIPAA does not require notification or opt-in (or even an opt-out opportunity) for data sharing between a covered entity and a Business Associate, as BAs are (while under HITECH independently subject to HIPAA privacy and security rules) basically considered institutional agents of the covered entity, to which the covered entity's authority to hold and use data is delegated under the Business Associate agreement.
I don't know whether there is really nothing of concern in the dump, or whether the journalists covering it lack the domain understanding to distinguish things that would indicate a problem. But what it looks like from the news article is a "whistleblower" making accusations and dumping docs, with nothing substantial and concrete in the docs supporting the thrust of the "whistleblower's" accusations of wrongdoing.
Boundaries and distributions should be clearly and specifically specified, with any non-essential distribution requiring specific assent, defaulting to none. If there are consequences to sharing, those can be made known. We've been drawn into a circumstance that has long been untenable.
However, their little snafu with SureScripts and Remy Health just got them banned from accessing healthcare history, and there are pending FBI and FTC investigations regarding their mismanagement of healthcare data. Worst case, the digital pharmacy Amazon just bought will be barred from sending or receiving digital prescriptions, and their HIPAA accreditation will be voided for three years (with a fine).
Seems like a great way to waste a couple hundred million dollars.
> The data is being transferred with full personal details including name and medical history and can be accessed by Google staff
I think a reaction of horror is quite appropriate. Your comment is "whataboutism". Let's discuss this leak without invoking bogeymen.
I'm not going to comment on this specific case but I do have almost a decade of previous non-Google experience working in clinical documentation technology.
As others have said, entering into a BAA with a covered entity, as HIPAA defines it, shouldn't be seen as a controversial action.
There are numerous problems in healthcare that are too complex for individual health systems to tackle. For example:
* Population Health: are there emergent changes in the regional population? What do you do about it?
* Continuity of Care: The number of individual providers involved in a particular person's care continues to grow. How can you effectively inform the entire team--across health systems--what's most important for an individual now? How do you make sure nobody drops the ball?
To give you an idea of the scale, I have two examples. The first is MD Anderson Cancer Center in Houston. They used to have 200+ engineers working on their sophisticated home-grown EMR. It was a huge undertaking. But even with MDACC's revenue, that development was unsustainable, and they moved to a 3rd party EMR vendor.
Second is the Mayo Health System, another huge provider with facilities not just in flagship Rochester, MN, but at several other sites. Again, the reality was that even at this scale, internal development isn't sustainable across the board, and they wound up with a $100M+ adoption of a 3rd party vendor.
And this is mostly straightforward CRUD-level workflows. The technology is straightforward but the workflow expertise is not.
Now, try to solve some bigger problems. You're going to need help to do this at scale, and trying to solve it necessarily means giving access to--not control of!--medical records to drive R&D. It's happening right now, and Google is not the only player doing this at scale. They're not even the largest one.
Lastly, HIPAA controls have real teeth compared to the general consumer space (at least in the US).
I'm not certain what aspect you are trying to highlight with this example, but readers should know that the MD Anderson implementation of the EPIC EMR system led to a 77% drop in income and layoffs approaching 1,000 people (2016-2017 time frame)[1][2]. I'm not up to date enough to know whether they have ever recovered.
[1] https://www.modernhealthcare.com/article/20170106/NEWS/17010...
[2] https://www.beckershospitalreview.com/finance/md-anderson-po...
You're correct in that literal books could be written about EMR adoption gone wrong. That doesn't change the fact that even super huge mega-health systems can't afford to do it all themselves.
You place more faith in HIPAA than I do. HIPAA does not protect privacy to the degree that most people assume.
> There are numerous problems in healthcare that are too complex for individual health systems to tackle.
True, but that doesn't mean that Google is the right entity to do this. In my opinion, they're the wrong entity, because Google is not exactly trustworthy.
> Google is not the only player doing this at scale. They're not even the largest one.
But they're Google. What this sort of thing means for me is that I need to start asking medical providers if they're participating in this sort of thing with Google (or other companies that I consider bad actors), so I know which ones to avoid using.
That's correct. People would be surprised at the number of HIPAA violations that happen everyday. It is, however, among the strongest and most well-enforced data privacy laws (in the US).
> True, but that doesn't mean that Google is the right entity to do this. In my opinion, they're the wrong entity, because Google is not exactly trustworthy.
You're certainly right to be concerned. I don't share your opinion about Google per se, but this is important data for our society. I'd argue that OpSec at a large provider--let's say Microsoft--is more sophisticated than a start-up. So how does an organization decide who is the "right" entity to deal with?
> But they're Google. What this sort of thing means for me is that I need to start asking medical providers if they're participating in this sort of thing with Google (or other companies that I consider bad actors), so I know which ones to avoid using.
If this is important to you, I would strongly encourage it. Our health industry is better when consumers are better informed, and can make informed decisions. Personally, it's more important to me to be able to actually know how much a procedure is going to cost rather than who owns the AI stack behind their clinical decision support system.
Can anyone elaborate?
"Google's harvest of medical data includes names and full details of millions"
is hyperbole, when
"Google partners with health system on clinical documentation research" is more accurate.
I'm a proponent of more consumer control over their data, along the lines of GDPR. You could, theoretically, request that your covered entities give you a copy and then delete all of your records from their systems at any time.
For consumer data, Google already gives you the option to do this (e.g. takeout.google.com).
Also: the deal was only just signed, i.e. the transfer hasn't happened yet?
There's a lot of hearsay in all of this reporting...
Google has entered into similar partnerships on a much smaller scale with clients such as the Colorado Center for Personalized Medicine. But in that case all the data handed over to the search giant was encrypted, with keys being held only on the medical side.
It sounds like they are migrating some of their infrastructure to Google Cloud, and someone (and the paper) is sure trying to raise a stink about it. It's misleading in a few spots, but not quite lying; hard to tell if that's intentional or technical incompetence.
For example, all storage in Google Cloud is encrypted, and the keys are rotated every 24 hours by default. There is an audit log you can see any time anyone at Google touches your data. (Very, very few people can actually access the keys.)
https://cloud.google.com/blog/topics/inside-google-cloud/our...
The real problem, though, is that this data was transferred without even notifying patients, let alone getting their consent.
Honestly, it's one of the areas where (in the right hands) I think the benefits of collecting large amounts of data for targeting services to people are justified.
Is Google ‘the right hands’? That’s probably worth debating.
https://www.hhs.gov/hipaa/for-individuals/guidance-materials...
>What about patient data? All of Google’s work with Ascension adheres to industry-wide regulations (including HIPAA) regarding patient data, and come with strict guidance on data privacy, security and usage. ... To be clear: under this arrangement, Ascension’s data cannot be used for any other purpose than for providing these services we’re offering under the agreement, and patient data cannot and will not be combined with any Google consumer data.
That said, most people do not understand how HIPAA works (I am in no way saying you are one of these people). Unless you are a healthcare provider (think doctor) or a business that is supporting those providers (think 3rd party tools built specifically for managing healthcare records) it's pretty difficult to have a legitimate HIPAA complaint made against you.
If google is able to get these, what’s stopping anyone else?
"In addition, business associates of covered entities must follow parts of the HIPAA regulations."
All vendors must comply with HIPAA laws (ie EMR systems)
Not because I think what Google did breaks HIPAA laws - there are many sub-threads below that can explain that better than I that this doesn't violate HIPAA - but rather the question helps highlight where we should truly be upset.
What Google did was legal; because of that, we should be upset that the government and regulatory bodies created an environment in which this was legal. Railing against a publicly traded company of hundreds of thousands of employees for doing something "immoral" is not a productive use of your energy.
(The irony here is I usually am _against_ more regulation!)
It's a similar argument I made during the whole Martin Shkreli debacle: senators and congressmen/women got their picture day grilling him with the whole "how could you price gouge these poor, sick people?" But his consistent response was, essentially, the inverse: "How could you create an environment where this is totally 100% legal?"
But suddenly this company, Google, makes it immoral? It seems to me that if you care this much about private companies having your data, you should switch to a publicly owned healthcare system.
This is the most scary part[0]. I'm sure plenty here would disagree, but I simply don't (yet) share your optimism for A.I.
[0] Not that the rest isn't scary.
The casual data-grabbing that they're doing is the scariest part to me, since I would hope that my data would never be as readily accessible to them/others as I now suspect it is.
Can I please have my life back please?
It'd be convenient if we could assume perfect personal responsibility, but human behavior doesn't align with that assumption.
(i.e. the question in their minds is "Is the data safer in the source repositories?" And it's probably not).
After all, many companies trial new ideas and technology in house. So it would be insightful to see what companies like Google do in-house.
[0]: https://www.zdnet.com/article/google-employees-protest-dont-...
[1]: https://www.theverge.com/2019/7/16/20695964/google-protest-l...
[2]: https://www.cnn.com/2019/05/01/tech/google-employees-protest...
[3]: https://www.vox.com/recode/2019/6/19/18691870/google-employe...
[4]: https://www.theverge.com/2019/5/1/18525473/google-employee-s...
Complaint Requirements

Anyone can file a health information privacy or security complaint. Your complaint must:

* Be filed in writing by mail, fax, e-mail, or via the OCR Complaint Portal
* Name the covered entity or business associate involved, and describe the acts or omissions you believe violated the requirements of the Privacy, Security, or Breach Notification Rules
* Be filed within 180 days of when you knew that the act or omission complained of occurred. OCR may extend the 180-day period if you can show "good cause"
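To make the 180-day window concrete, here's a minimal sketch of the deadline arithmetic (the function name and the example date are hypothetical, not from OCR's materials):

```python
from datetime import date, timedelta

def ocr_filing_deadline(date_known: date, extension_days: int = 0) -> date:
    """Last day to file an OCR complaint: 180 days after you knew of the
    act or omission, plus any 'good cause' extension OCR might grant."""
    return date_known + timedelta(days=180 + extension_days)

# Example: you learned of a suspected violation on 2019-11-12.
deadline = ocr_filing_deadline(date(2019, 11, 12))
print(deadline)  # 2020-05-10
```

Note that the clock starts when you knew of the act or omission, not when it occurred.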
I mean, filing a complaint is free but I imagine it should have a grievance attached to be taken seriously.
Log in, click on the wheel, and the delete link is at the bottom.
That has no effect on the central theme of the story (which is health care firms partnering with Google as a Business Associate, and thereby sharing patient data).
> Among the documents are the notes of a private meeting held by Ascension operatives involved in Project Nightingale.
The whole article is written like they are trying to tell a spy story, which calls into question the credibility of the claim that there's any wrongdoing.
As a UK-based paper, the Guardian could at least focus on British issues.
tl;dr: Unrelated avenue in a field they've been interested in for a long time.
Google and other large companies have made some significant AI advances in the last decade & I think it's in all of our interests to see if these advances can lead to improvements in health care.
Yes, it's scary how much data these companies have collected about us, but there are other things in the world which are even more scary, like heart attacks and cancer. I think we need to stop having an automatic knee-jerk reaction every time a company gets access to our data, especially if proper legal protocols with privacy protections are being followed, as it appears to be in this case.
Of course, I would love to live in a world with 100% perfect personal privacy AND perfect treatments for all diseases, but we don't live in that world: In our world, as we move forward, there are going to be difficult tradeoffs between health innovation and patient data access: We should try to navigate these tradeoffs in a level-headed way, without just insisting on greater walls around all data in every instance.
The last thing we should do is have radically open medical data. Some busybody parent could go out and search for all the kids in her kids' school who might have HIV or something. Or imagine all the crazies out there searching for a list of women in their town who have had abortions.
The only thing you do with open medical data is ratchet up the "crazy" in society. In an ideal world where everyone is rational, it's fine. But that world doesn't exist.
Sure, we may want to have properly-designed legislation to come up with standards across databases that make the use of such sensitive data for combining records less necessary, but we better make sure it's well designed or we could end up slowing down medical innovations.
Encouraging is fine, but in the end, consent from individuals should be obtained and if it isn't, that data should be omitted.
If that's the standard society wants to adopt, so be it- But it might come at a dramatic cost in slowing down medical innovations. Personally, that seems like a bad tradeoff to me, but who can say for sure?
For different takes on the same story, a "Previously" note with a link to the earlier discussion may be better.
For evergreen topics (e.g., Bertrand Russell's "In Praise of Idleness", submitted many times through the years, and again a day or so back), an "earlier submissions" note, listing the years of the 2-3 top instances, can point to earlier interesting discussion.