How will my “Personal Information” be used?
One of the most annoying emails in my inbox is from a digital address app. Emails come from people I do not know asking me, via the app, to “update my contact information”.
The app’s terms clearly state that “Personal Information” can be given or sold to a third party, yet emails from the app tout that the information is private. I never directly shared my email address with the sender, although there are numerous ways the sender or the app may have obtained it. But I do not have a relationship with the sender or the app.
Without a relationship, there can be no trust.
The Issue: We Need a Trust Revolution
“The digital revolution needs a trust revolution. Huge shifts are occurring as the world moves towards comprehensive information sharing via social media, cloud computing and big data. Systems of record (such as email) have become systems of engagement (such as social media) and are now moving towards systems of intelligence (data analytics). However, this progress cannot occur unless customers trust how their data is used. The challenge: more than 90% of consumers feel they have lost control of their data.”
Recent Privacy Concerns in Healthcare
Before I write a post for HL7standards.com, I generally have read and collected quite a few articles on a particular topic. My “Consent of the User” list was overflowing. I am going to limit this post to three timely concerns in healthcare: Healthcare.gov, “matchbacks”, and 23andMe.
In case you missed it, Healthcare.gov was saving personal health data in referrer URLs from people using the system. This personal health data was also being shared with “third parties”, at least 14, according to the Electronic Frontier Foundation:
EFF researchers have independently confirmed that healthcare.gov is sending personal health information to at least 14 third party domains, even if the user has enabled Do Not Track. The information is sent via the referrer header, which contains the URL of the page requesting a third party resource. The referrer header is an essential part of the HTTP protocol, and is sent for every request that is made on the web. The referrer header lets the requested resource know what URL the request came from. This would for example let a website know who else was linking to their pages. In this case however the referrer URL contains personal health information.
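The mechanics of that leak are easy to sketch. In the hypothetical example below (the domain and query parameters are illustrative, not the actual Healthcare.gov URL), any third party that receives the referrer header can trivially parse personal details out of the query string:

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical referrer header value of the kind EFF describes:
# the full page URL, including details the user entered into a form,
# is sent along with every request for a third-party resource.
referer = ("https://exchange.example.gov/see-plans"
           "?age=40&smoker=1&zip=85601&income=35000")

# A third-party ad or analytics server only has to parse the query string.
leaked = parse_qs(urlparse(referer).query)
print(leaked)
# {'age': ['40'], 'smoker': ['1'], 'zip': ['85601'], 'income': ['35000']}
```

Modern sites can limit this with the `Referrer-Policy` header (for example, `Referrer-Policy: origin` sends only the domain, not the full URL), or simply by keeping sensitive form input out of the URL altogether.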
According to MEDCITYNews, “At first, the administration defended the current standing of privacy standards, but advocates and lawmakers became very vocal and demanded changes.”
According to Bloomberg News, “matchbacks” are a little-known process of assigning patients unique codes based on their prescription drug records. Marketers can then send tailored Web ads to those patients. Federal regulators were not aware of this practice when contacted by Bloomberg News. It may be legal, but many do not consider it ethical. According to Bloomberg, matchbacks were also not addressed in privacy policies.
De-Identified, Anonymous and Confidential Have Different Meanings
Just because data are de-identified does not mean they are anonymous. Most people do not realize that de-identified, anonymous, and confidential all have different meanings, especially when it comes to research, which brings us to 23andMe.
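A small sketch (with fabricated, illustrative data) shows the gap: removing the direct identifier de-identifies a record, but the quasi-identifiers that remain can still single a person out. Latanya Sweeney’s well-known result is that ZIP code, birthdate, and sex alone uniquely identify a large majority of the US population.

```python
# Illustrative record; all values are made up.
record = {
    "name": "Jane Doe",
    "zip": "02139",
    "birthdate": "1970-01-01",
    "sex": "F",
    "diagnosis": "type 2 diabetes",
}

# De-identified: the direct identifier is removed...
deidentified = {k: v for k, v in record.items() if k != "name"}

# ...but not anonymous: ZIP + birthdate + sex are quasi-identifiers
# that can often be linked against a voter roll or other public
# dataset to re-identify the person behind the record.
quasi_identifiers = {"zip", "birthdate", "sex"}
print(quasi_identifiers <= deidentified.keys())  # True: all still present
```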
Opt-In vs. Opt-Out
Vendors and apps often say that you can always opt out. However, most people prefer a choice to opt in. If technology wants to build trust, opt-in will need to be the model.
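As a design matter, the difference between the two models is just a default, which makes it easy to express in code. A minimal sketch, with hypothetical class and flag names:

```python
from dataclasses import dataclass

@dataclass
class SharingPreferences:
    # Opt-in model: every sharing flag defaults to off, and data is
    # shared only after the user explicitly turns a flag on.
    # An opt-out model would default these to True instead.
    share_with_partners: bool = False
    share_for_research: bool = False

def may_share(prefs: SharingPreferences, purpose: str) -> bool:
    """Share only when the user has opted in; unknown purposes are denied."""
    return getattr(prefs, purpose, False)

prefs = SharingPreferences()  # a brand-new user
print(may_share(prefs, "share_with_partners"))  # False until the user opts in
```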
A Set of Universal Principles for Data Protection
At the WEF Annual Meeting, panelists called for a set of universal data protection principles.
- First, “consent” must always be requested and granted.
- Second, how personal data is used must be fully “transparent.”
- Third, heightened “accountability” must accompany higher levels of data access.
Is the Enterprise Cloud a Model for the Consumer Cloud?
“We all have to step up to another level of transparency, especially the vendors. So whether you are an enterprise vendor or a consumer vendor, we all need to open up a lot more to be able to say exactly where is the data, what’s going on with the data, who has the data, and if there’s a problem with the data – a security problem or some other issue with the data – immediate disclosure, complete and total transparency. No secrets. Because only through that transparency are we going to get to a higher level of trust. That is not where we are today.

“We’re the enterprise cloud. Our customers are the GEs, the Philips, the BMWs, it’s their data. We can’t do anything without our customers saying what we can do. It’s their data. They tell us where they want it, how they want to use it, what applications are using it. We can’t see it, the data is black to us, it’s encrypted. But that very much is a model for where the consumer companies are going to have to go. Enterprise companies can’t do anything without their customers saying it’s okay. That’s our agreement with our customers that we sign with them. In the consumer world, you don’t know what’s going on, and that is going to have to change. Total disclosure is critical.”
“Trust is about weighing trade-offs – how much privacy do I have, how secure do I feel, and what benefits do I get in exchange for that? You need to afford the individual choice and control. Users own their data. They should be able to examine it, take it with them, bring it to other sites, bring it to other vendors that they trust more. Basically, have a system and a market that helps people make these trade-offs and these decisions. But they should have control over how they use the system, or whether they use the system at all. People have trouble making some of these trade-offs because the vendors are not being transparent enough, not providing enough controls and choice.”
Tim Berners-Lee said that at MIT they are working on a new architecture for how we store data, and proposed “Beneficent Apps.”
“Is what I am doing beneficent? Basically, is it good for users? Suppose we have a brand: this is a beneficent app. That means while I am writing the app, you are going to pay me for the app, and I am going to think about what you want. That’s the business model we are going to see.”
Terms of Service, Privacy Policies
The moderator of the WEF panel, Nick Gowing, said that Terms and Conditions are not the small print: “Terms and Conditions? No, that’s the Big Print.”
Terms of service and privacy policies may not identify what third parties can do with data. So even if you trust an app or service, you may not know what a third party can do with your data. This will become increasingly important with the growth of consumer health data that is not necessarily patient data. In a world of convergence (the Internet of Things, wearable technologies, and integrated health app platforms), we need to build with the consent of the user.
Consent means: we won’t use your data for any other purpose unless you approve it.