Working Toward a Privacy Framework for the “Big Data” Era

Posted by Peter Cullen
Chief Privacy Strategist, Microsoft

Over the past several months, we’ve been convening discussions with some of the world’s foremost privacy thinkers, including representatives of regulatory bodies, government policymakers, academia, NGOs and industry, to explore alternate models for privacy in a modern information economy. At meetings in Washington, D.C.; Brussels; Singapore; Sydney; and São Paulo, we’ve debated how best to evolve the notice, choice and consent model to better meet changing societal needs. Yesterday, we advanced those discussions at a global forum here in Redmond, Washington.

Microsoft has a long-standing commitment to privacy and, as part of Trustworthy Computing’s 10-year milestone last January, Corporate Vice President Scott Charney suggested that, in a world of connected devices, technology-enabled information use, and the emergence of “big data,” it’s time to consider evolving the frameworks that have governed aspects of the protection of personal data. He proposed a model that shifts focus toward acceptable use of data, and he suggested specific ways to hold organizations accountable for its management, as opposed to the current common themes of collection limitation, notice and choice. 

First, some background. Forty years ago, the first information privacy statutes were enacted. After intense discussions in North America and Europe at the end of the 1970s, a number of privacy principles emerged under the concept of Fair Information Practices and later became the foundation of the OECD Privacy Guidelines. They form the basis of most privacy legislation around the world. At their core, they require that the processing of personal information be lawful, which in practice means either that it is explicitly permitted under the law or that the individual whose personal data is being processed has, after being informed of the reason, context and purpose of the processing, given her consent.

Almost everywhere an individual ventures online, she will be presented with long and complex notices, routinely written by lawyers for lawyers, and then asked either to “consent” or to abandon the use of the desired Internet service. That binary choice is not what the privacy architects of four decades ago envisioned when they imagined an empowered individual making an informed decision about whom to permit to process his or her personal data. And, in practice, it is certainly not an optimal mechanism for ensuring that an individual’s information privacy is protected.

Asking people to police the use of their data leads to a disproportionate, unsustainable burden of responsibility, and doesn’t create the right incentives to protect privacy. Equally challenging is the fact that in the age of big data, much of the value or utility of information is not apparent at the time of collection, when notice and consent would normally be exchanged; the value, often to society, is determined through analysis. 

The previous global discussions and this week’s forum are only the beginning of what we believe is a much-needed examination of how privacy practices must evolve to support individuals in a more complex and data-driven society, while at the same time allowing all of us to benefit from the value of information-driven innovation. It is a process that must happen collaboratively across geographic boundaries and across public and private sector interests.

I look forward to sharing further developments stemming from our global discussion series in the coming months, as well as identifying the next steps we all must take to redefine privacy protections.

Comments (1)

  1. Sanchezjb says:

    Peter, very much enjoyed reading your post.

    You wrote that "in the age of big data, much of the value or utility of information is not apparent at the time of collection."  I respectfully disagree.  The potential value of an individual's personal information, his/her contacts and relationships, and what they do online is very apparent.  It's another matter for analysis to make this potential value real.

    If organizations want to benefit "from the value of information-driven innovation" [read: data that originates from individuals], they will need to determine how to also make that innovation meaningful and valuable to the individuals the data comes from.  People are becoming much more privacy-conscious than they were two to three years ago.  Pew Research just released a report that found "more than half of mobile application users have uninstalled or avoided certain apps due to concerns about the way personal information is shared or collected by the app."  See…/Mobile-Privacy.aspx.

    There are four key challenges here:  1) Move away from an ad hoc approach to protecting individuals' online privacy.  2) Make privacy policies easy for people to understand.  3) Establish significant penalties when privacy tenets are violated, rather than the all too common "slap on the wrist" reprimands we read about today.  4) Provide meaningful and valuable incentives to individuals if organizations want to share their data.
