Posted by Brad Smith, General Counsel & Executive Vice President, Legal & Corporate Affairs, Microsoft
This week, I had the opportunity to speak in Uruguay at the 34th International Conference of Data Protection and Privacy Commissioners. This conference brings together leading authorities on privacy from more than 50 countries, including many of the key government officials and regulators responsible for privacy policies around the world. It provides a great opportunity to engage in a dialogue about one of the most important topics facing our industry today. The theme of the conference was “Privacy and Technology in Balance,” a theme that describes well both the challenge we face and how we think about the goal.
In my remarks, I focused on a few key questions. First, does privacy still matter? And second, how has technology changed the nature of privacy? I also talked about the way we all need to come together – the technology industry, advertisers, government, publishers and others – to shape a thoughtful and consistent approach to privacy that respects the needs and expectations of consumers while balancing the many other benefits that today’s technology and use of data can provide.
Some people in our industry argue that privacy no longer matters to consumers – just look at Facebook and the fact that a billion people are freely sharing all sorts of information about themselves every day. In Microsoft’s view, it is clear that privacy still matters, and the success of Facebook is in fact a good example of this. Five years ago, MySpace was the leading social network with 100 million users, while Facebook had 24 million users. Today, Facebook has more than a billion users. The different ways the two companies handled privacy defaults were an important part of this story. The default settings on MySpace five years ago allowed everyone to see your profile.
On Thefacebook (as it was originally called), the default allowed people to see only the profiles of others at the same school, or the profiles of people they had explicitly friended. Facebook had a degree of privacy built in and, as it turned out, people preferred greater control over how, and with whom, they shared their information. In a similar way, a recent Pew Research study found that 56 percent of consumers had decided not to complete an online transaction because of the data they were expected to share, and 30 percent had uninstalled an app from their phone over privacy concerns. So, yes, privacy matters to consumers.
But technology has changed the nature of privacy. It used to be that privacy was equated with secrecy. This in fact has defined legal thinking and analysis over the past century. But it is clear today, in a world where people share so much, that people no longer equate privacy with secrecy. People want to share more personal information, but they want to decide who they share information with, and they want to determine how their information is used. Privacy practices have not always kept pace with these changes in expectations or technology. Privacy practices have frequently focused on “notice and consent” – telling customers what you are planning to do with their data and getting their consent.
The problem with this approach, as a recent study at Carnegie Mellon concluded, is that the average consumer would need to spend 76 days a year reading the privacy notices that they are asked to review and approve for the Internet services they use. This, coupled with the fact that 41 percent of people hardly ever or never read privacy policies, illustrates the fact that this model, by itself, is insufficient to protect consumer privacy needs. It is clear we need some fresh approaches.
In thinking about privacy, we need to strike a balance and ensure that we also take advantage of the benefits of technology and innovation. In my speech, I spoke about the positive impacts of capabilities like “big data”, the use of data to improve products and services, and the role data plays in combating security risks and nuisances like spam. And I talked about the importance of preserving the broader ecosystem of partners including websites, content owners, advertisers and publishers. We believe part of the answer lies in approaches like Fair Information Practices, which reflect a consensus on best practices around transparency, accountability, security, and fair limitations on the use of data, as well as notice and consent and other concepts.
We also think part of the approach has to rely on thoughtful regulation – providing clear and consistent approaches in every country around the world. We need self-regulation, including new industry standards. Finally, we think any solution needs to include market-based innovation as well.
We believe our recent decision to turn on the Do Not Track (DNT) signal in Internet Explorer 10 and Windows 8 builds on these concepts. Based on our research, this reflects what our customers want: 75 percent of the consumers we surveyed in the U.S. and Europe said they wanted DNT on by default. But as I discussed in my keynote, we recognize that this is not the end of the story.
We in fact need four things to come together to make DNT a success. First, we need a final and effective DNT standard that is adopted by the W3C. We need a standard that provides real privacy protection to consumers, and we need a standard that recognizes the legitimate and reasonable needs of all participants in the ecosystem.
Second, we believe that privacy will benefit if we all recognize that browser vendors should have the ability to turn the DNT signal on or off when they ship a product. If you look at standards around the world, they specify the technology but they don't tell companies whether they have to turn it on or keep it off.
Third, we believe that browser vendors should clearly communicate to consumers whether the DNT signal is turned on or off and make it easy for them to change the setting. We made changes in August to do this for IE 10 and Windows 8. We recognize that you cannot have privacy without transparency, and we recognize that we have an obligation to ensure that it is clear to consumers how our product is configured. And there is room for an ongoing conversation across the industry and more broadly about the best ways for vendors to communicate this information to consumers and the best ways to enable them to change this setting as they use the product themselves.
Finally, there's a fourth piece that has gotten too little attention in our view. There needs to be an easy and effective way for responsible advertisers and ad networks to inform consumers and obtain persistent consent for their services even if the DNT signal is turned on. Just because the signal is turned on doesn't mean that a consumer wants no services that involve tracking. It means instead that consumers are empowered to make their own choices, including selecting services that involve tracking from advertisers and ad networks they trust.
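Mechanically, the DNT signal is simply an HTTP request header (`DNT: 1`) that the browser sends with each request, as described in the W3C Tracking Preference Expression drafts. The following minimal sketch illustrates the fourth point above – a service honoring the signal while still respecting an explicit, per-site consent a user has granted. The `tracking_allowed` function and the consent-store shape are illustrative assumptions, not any particular vendor's implementation.

```python
def tracking_allowed(headers, consented_sites, site):
    """Decide whether tracking is permitted for one request.

    headers         -- dict of HTTP request headers
    consented_sites -- set of sites the user has explicitly opted in to
                       (a hypothetical consent store)
    site            -- the site requesting to track
    """
    dnt = headers.get("DNT")  # "1" means the user prefers not to be tracked
    if dnt != "1":
        # No preference expressed; the site's own policy applies.
        return True
    # DNT is on, but an explicit user-granted exception takes precedence:
    # the signal empowers choice, it does not forbid consented services.
    return site in consented_sites
```

For example, with DNT on, `tracking_allowed({"DNT": "1"}, {"news.example"}, "news.example")` returns `True` because the user opted in to that site, while the same request for an unconsented site returns `False`.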
Whether we’re thinking about DNT in the narrow sense or privacy more broadly, we're reaching an important moment. Technology has changed and consumer expectations are changing too. We need to work together to shape a thoughtful way forward – a path that puts people first by ensuring that privacy and technology remain in balance.
You can read a transcript of my remarks and see my slides here.