TL;DR: It is vitally important to be transparent and honest about how data is collected, used and shared from the outset of your service, or as early as possible. Otherwise you risk losing the trust of your customers to the point that once (if) you do become transparent, they may not believe your intentions. Case in point: Facebook.

Last week, in the wake of the landmark ruling by the EU Court of Justice, Facebook’s CSO, Alex Stamos, issued a blog post on the company’s use of the datr cookie (the one that is the primary subject of the recently updated Facebook privacy policy and of the court’s ire).

One of the key tenets that I believe all companies should follow is transparency (and, by extension, honesty) about how data is collected, used and with whom it is shared.

In this post, Stamos discusses the ways the datr cookie is used for security purposes, which, as a fellow CISO, I understand and support.  However, Facebook has a history of being less than complete in its descriptions of how data is collected and used.  One such example is Facebook Beacon, a programme from 2007 that Facebook said would only report users’ activity in a limited way in order to build a targeted advertising platform.  Stefan Berteau, a researcher at CA, discovered that Beacon not only tracked users whilst they were using Facebook, it also tracked them after they had logged out of Facebook, any time they visited a third-party site participating in the Beacon programme.  Facebook later ended Beacon and settled a number of privacy-based lawsuits.  Since then, the fundamentals of Beacon have trickled back into Facebook, and a very robust tracking mechanism has grown within the platform.

For too long, Facebook has been less than transparent (one might even say opaque) about the ways it collects, uses and shares data, and as a result people have developed a significant distrust of Facebook when it comes to privacy.  When I informally polled friends and colleagues about the statement Stamos made on 15 October 2015, the response was universally one of disbelief and underlying mistrust.  The best quote I heard was “this may be a true statement to the letter of what is written, but you can bet they are also mining the shit out of the data they do get for the time that they have it.”  Again, distrust.

Let’s say for a moment that the information presented by Stamos is completely accurate.  Even so, his statement offers no account of how else the data is used, nor a declaration that the data is not used for any other purpose.  Had either been included, the appropriate transparency might finally be present and might begin to re-establish Facebook as a transparent company when it comes to how data is collected, used and shared.

For you, reader, the request (and recommendation) is this: as you develop your privacy policies on data, please consider the benefit of starting out with transparency, so that your users can begin their interaction with you with a level of trust that has been squandered by many current consumer-grade providers.  Let your users make educated decisions about how they want their data to be used before they sign up for your service; help them understand the benefits they get in return for the use of their data, and they can enter into the relationship with their eyes wide open.  Lastly, establish an ethical review process, akin to an institutional review board (IRB), for current and future uses of your customers’ data, and publish the underlying principles so that your customers know who they are really getting in bed with.  As with in-person relationships, if you trick someone into using your service, or if you hide your underlying intentions, the relationship will not be sound.

Back to Facebook for a moment… on Saturday, a post from Facebook attempted to show the security value of the way Facebook monitors and tracks usage, this time invoking the spectre of state-sponsored hacking.  I read this less as a way to protect you and more as a visible defence of the tracking that they do, with a focus on a single benefit and no review of the offsetting detractors.  This is not unlike the physical “protect your security vs defend your privacy” debates that have gone on around airport security, border security and other security topics since 2001.  In this case, however, it feels like posturing, with intended misdirection and some level of theatre for the benefit of the EU regulators currently reviewing how Facebook tracks usage and whether it may continue to move such data outside the EU.