(Or, why do we accept a world without opt-out?)
Earlier this week, I got an email update from CrashPlan notifying me about updates to the terms and conditions of their service. Included in their email summary of the upcoming changes was the following line:
We learn a lot about your solution needs through aggregated and anonymized system usage data. We, along with third parties we may engage, may use this information.
Now there is nothing new or unusual about this phrase, and that is precisely what has got me writing this post. Namely:
The collection of aggregated and anonymised data has become an accepted, and mandatory, condition of using almost anything.
Be it my television, my telephone, or even my lightbulbs: if I want to use a product, I have to give the manufacturer the right to harvest data about how I use it. Shouldn’t I have the right to decide whether they can use my data or not?
I applaud software developers like the OmniGroup (and in this specific case, OmniFocus) who ask you, when you first start their products, whether they can collect usage data from you. Unfortunately, they are far from the norm.
And Americans in particular deserve a call-out here. As a non-American acquaintance once put it, “Americans will give up everything and their grandmother’s pants (underwear) for a free packet of chips (french fries).” Whilst not elegantly put, it is essentially true: Americans will trade privacy for services, or for cheaper prices, or both. As a consumer population, we have also become so complacent about phrases like we may collect data, aggregate it, use it and sell it to anyone we want that we don’t even flinch when a lightbulb or a toy: 1) has a EULA, 2) reports back to the manufacturer on how you use it, and 3) doesn’t give you the ability to opt out.
Some of this is coming to light as companies deliver greater transparency about how they collect, use and share the data they gather from their users. Unfortunately, that transparency reveals the very unpleasant nature of the data collection. For example, when Microsoft, in anticipation of their data-mining juggernaut release of Windows 10, updated their EULA in July 2015, they granted themselves very broad authority to collect nearly everything you do on a Windows box and use it “to make Windows better.” EDRi published a good analysis of this, and the results make one very afraid to use Windows, even for legitimate purposes.
And AT&T advertises a fibre-optic “GigaPower” service at a price that, by default, includes the mining of your usage data, which AT&T is enabled and authorised to use for its own purposes. Users have to pay an extra $30 per month (if they can even find out how to opt out – it is not easy) to not be data mined.
And once each year I get a note from Chase that says “hey, you, consumer. You use our credit card and we are totally going to sell that data to anyone who will buy it, and you can’t do a darn thing about it, since we are increasingly becoming a cashless society and you need cards to function.” OK, that is not an exact quote, but the actual language roughly translates into my paraphrasing above.
I understand why companies want to do all this data collection and the related sale of it – money! Your privacy has value to them, and they want to reap as much of that value as possible. And now we know an approximate value, since AT&T charges $360/year/household to keep your usage private. It’s no wonder they want to “monetise” this.
Back(up) to CrashPlan
So now that we have been through the data collection and resale landscape, and the seeming acceptance of it by the masses, we head back to the main topic:
Why on earth should the people to whom I give my drive backups – every lick of every piece of data I own – not give me a way to opt out of data collection about my data?! Now it may be that they have no intention of using the data beyond the “is our app running right” notion, but they don’t say that in the EULA. They do tell me, in big, bold letters, that I still own the data (well, thank goodness for that):
CODE42 ACKNOWLEDGES THAT YOU HAVE ALL RIGHT, TITLE, AND INTEREST IN THE USER DATA (AS BETWEEN YOU AND CODE42), AND IN NO WAY DOES THIS EULA GIVE CODE42 ANY OWNERSHIP RIGHTS TO THE USER DATA. YOU SHOULD ONLY USE THE CODE42 PRODUCTS AND SERVICES WITH USER DATA TO WHICH YOU HAVE FULL RIGHT, TITLE OR LICENSE.
But the part on Usage Data is vague: it does not say what the data is, what it will be used for, or who they will share it with.
“Usage Data” means any and all aggregated information reflecting the access or use of the CrashPlan Software and Code42 Products or Services by or on behalf of you, including visit-, session-, or stream-data and any statistical or other analysis, system health information or data based on or derived from any of the foregoing.
So I have no idea whether the data I am backing up over their service is subject to analysis and sale or not (e.g., “people who live in Ann Arbor and are 6’0″ and male have 35% of their backed-up documents in .FIT format – let’s market more Garmin gear to them!”).
Libraries want to do it right
I have been lucky enough to be part of the Consensus Framework to Support Patron Privacy in Digital Library and Information Systems project over the past year, and have seen a great model of how two key principles can build trust in your data collection efforts and give people the option to choose how their data is collected, used and shared.
Transparency: Quite simply, tell your users that you are collecting data, what specific data you are collecting, what you are going to do with the data, who else will get to see it, and how they will use it. By doing this, you ensure that your users are fully aware and can approach the next principle (opt-out) able to make an informed decision.
Opt-Out: Let the users choose whether they want you to collect data; don’t assume that they do and force them to figure out how to make it stop. Make them explicitly say, “Yes. I know what I am getting into, and the value I will get back as a result. Go ahead and have this data. It’s A-OK!” Of course, some data are simply part of the use of an application – call those out (see Transparency) – but much is kept “for marketing purposes” or “to make our system better” and doesn’t actually contribute to making the system better, just to making the bottom line better thanks to selling the data. If you let the end-user make the choice, they may decide to let you, but they also may not.
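To make the two principles concrete, here is a minimal sketch, in Python, of what opt-in, transparent telemetry could look like. All names here (Telemetry, DISCLOSURE, etc.) are hypothetical illustrations, not anything from CrashPlan’s software or the NISO framework: the disclosure states exactly what is collected and why, and nothing is recorded until the user explicitly says yes.

```python
# Hypothetical sketch of opt-in telemetry: transparency + explicit consent.
# The disclosure a real app would show before asking for consent.
DISCLOSURE = (
    "We would like to collect anonymised usage data (features used, "
    "session length) solely to improve the app. It is never sold or "
    "shared with third parties. Allow collection? [yes/no]: "
)

class Telemetry:
    def __init__(self):
        self.consented = False   # default state: opted OUT, not opted in
        self.events = []

    def request_consent(self, answer: str) -> bool:
        # Only an explicit, affirmative "yes" enables collection.
        self.consented = answer.strip().lower() == "yes"
        return self.consented

    def record(self, event: str) -> None:
        if self.consented:       # without consent, events are dropped
            self.events.append(event)

t = Telemetry()
t.record("app_start")            # fires before consent: nothing is stored
t.request_consent("no")          # user declines
t.record("feature_used")         # still nothing is stored
print(len(t.events))             # 0
```

The design choice worth noting is the default: `consented` starts as `False`, so a user who never answers the prompt is never tracked, which is the opposite of the default in most of the EULAs discussed above.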
There are many other great principles that are part of this framework, and I encourage you to go to the NISO web site to take a deeper look, or ask me for more details and I will be happy to share my perspectives on the framework as a whole.
I worry about what data will be given to companies so that they can sell us more things, with no ability for us to opt out. I worry that, with the advent of IoT devices (all with EULAs granting the right to ubiquitous and vague data collection) and data breaches on the rise, Idiocracy is not too far away. Did I see a billboard for Brawndo on the way home?