Blog Post:

Big Data and Small Agencies: Reflections on the 33rd International Conference of Data Protection and Privacy Commissioners

I first attended the annual meeting of the International Data Protection and Privacy Commissioners in 1987 in Quebec City.  In those years, the global community of commissioners amounted to about 30 delegates from around 10 countries.   No corporate representatives were invited, although there were a handful of academic and journalistic observers.  David Flaherty gave the keynote, drawn from his forthcoming book Protecting Privacy in Surveillance Societies.  He warned of the creeping levels of surveillance, the weakness of data protection authorities, and advised of the need for a broader social perspective beyond individual rights and redress.

Fast forward 24 years.   Hundreds of participants congregated in Mexico City under the gracious and efficient hospitality of the Mexican Federal Institute for Access to Information and Protection of Personal Data (IFAI).  The delegates represented data protection authorities from around the world, national governments, corporate interests, NGOs, and academics.  A good time was had by all, lubricated by generous amounts of Tequila at the social events.  And the disparate community once again discussed the creeping levels of surveillance, lamented the weakness of data protection authorities and spoke of the need for a broader social perspective beyond individual rights and redress.

Of course, many things have changed since 1987.  Data has got “bigger” for one.  I still don’t entirely know what big data means.  But it is the latest fashionable buzzword for describing the era that big corporations tell us we currently inhabit.  Gus Hosein of Privacy International said he was tired of the glib descriptions of new eras and societies.  So am I.  And although the keynote speaker, Kenneth Neil Cukier of The Economist, was probably better than the average software developer, CEO, futurist, or technological guru who has seen fit to pronounce on the nature and future of the contemporary human condition, I was still left with the feeling that big data suffers from the same technological determinism as the other glib labels that have populated our discourse over the last 30 years.

Nevertheless, big data it was.  This was the new problem — producing remarkable human benefits, of course.  Inexorable and unquestionably good, but carrying new challenges to privacy protection that need to be debated and resolved by this big new privacy community, all gathered for the ostensible purpose of figuring out how we can have our cake and eat it too.  It is not a zero-sum game, we are told.  Privacy protection is a means to innovation, a method of gaining consumer trust, a partner to security, a value that cannot and should not stand in the way of innovation.  I do not want to discount the valuable debate at this conference.  But for the most part, I came away with the feeling that the challenges of big data are being framed and debated through a string of clichés and platitudes, many of which could have been heard back in 1987.  Here are a few of my favorites.

Privacy is good for business.  Wrong!  Some elements of privacy protection are sometimes in the interest of some businesses at some times.  There are heaps of academic literature on the complex conditions under which business does and does not act in the public interest.  Charles Raab and I discussed these issues in our 2006 book The Governance of Privacy.  For the most part, the nuances in the literature get overwhelmed by the cozy narrative that corporations should and do embrace privacy in all its manifestations.  Of course, the reality is that most corporations would be far better off if they could process any personal data they liked, anywhere in the world, without restriction.  And it is that tussle which is at the heart of the Internet economy, and which explains why so many corporate actors and their lawyers were present in Mexico, playing their part in the privacy show, claiming that we are all in this together, but also taking every opportunity to lobby the commissioners behind the scenes.

Privacy is a fundamental human right.  Privacy should not, as David Vladeck of the Federal Trade Commission said, be up for a vote.  In the abstract, of course, he is correct.  The problem, however, is that privacy is never understood or negotiated in the abstract.  Some people do get more privacy than others.  The struggle for privacy rights tends to pit vulnerable individuals, or poorly resourced civil liberties groups, against very powerful public or private organizations.  And the more that we hear this truism, the less effect it has.

Across the street from the plush Hilton Reforma Hotel, where the conference was taking place and the majority of delegates were cocooned in an island of advanced industrial affluence, is the Alameda Park.  There one finds another aspect of Mexico: street vendors scraping a living selling street food, trinkets, chocolate bars, cigarettes and so on.  Here is the visible manifestation of center and periphery economies.  A chocolate bar in the Hilton cost US$3; in the Alameda it cost 10 pesos (one fifth of the price).  So what would this economy, the world outside the Hilton, the world that struggles to gain the basic material needs of life, make of the argument that privacy is a fundamental and universal human right?  That world hardly entered the radar screen of the 33rd International Conference of Data Protection and Privacy Commissioners, save for one reference in the powerful closing speech of the Canadian Commissioner, Jennifer Stoddart, who pleaded with us to see this issue in broader global and economic terms.  It was a world we were warned against, with dire warnings about walking the streets, buying the street food and so on.  The fact that the representatives of Facebook were given their own bulletproof vehicle to get to and from the Hilton in itself accentuates the two worlds, the two privacies: the one of big data and the other of real people.

The problem is not the technology, but the people who use it.  Here is another assumption that tends to resonate with the privacy community.  This one allows us to embrace innovation, to accept big data, and then to direct our many messages and solutions at the human dimension, at fallible and/or rapacious individuals and organizations.  This tension was revealed in an exchange between Peter Fleischer of Google and the representative of the Spanish Data Protection Agency concerning current litigation in Spain over the right to be forgotten.  Google is a neutral search intermediary, Fleischer asserted, and should not be put in a position of censoring legal content.  Perhaps, but Fleischer ignores the fact that context is everything.  The same data buried in a local Spanish newspaper can carry far greater risks to an individual when it appears on the first page of a Google search.  Technology is not a neutral force, and data is not like oxygen, as Ken Cukier argued.  Technology carries bias, and that bias can be pro-privacy or pro-surveillance.

So we need to build privacy into our systems from the bottom up, right? — Privacy by Design.  I have a high regard for the work of my friends at the Ontario Information and Privacy Commissioner’s office.  They have pushed a powerful message, which now has a resonance in many quarters of the community.  The message is also being translated into formal policy guidance in many countries.  We now have Privacy Impact Assessments for Privacy by Design: PIAs for PbD.  The message is, however, in danger of losing focus.  When PbD was originally coined, it tended to relate to specific privacy-enhancing technologies that could, for some organizations in some contexts, allow them to advance their goals without collecting identifiable information and without, therefore, subjecting individuals to enormous risk.  Today, however, PbD embraces the entire suite of desirable organizational and management practices designed to ensure that privacy is not a last-minute add-on.  And that message would have been heard 24 years ago as well.  I feel that it has lost focus and is susceptible to the criticism that there is really nothing new.

Regulators need to speak softly and carry a big stick.  The DPAs have always talked about the correct blend of carrots and sticks necessary to promote better data protection.  The debate tends to be framed in terms of a number of false dichotomies: education versus enforcement; conciliation versus coercion; reaction versus pro-action; positive versus negative.  At this year’s conference there was a sharp exchange between the British and Dutch data protection commissioners, Christopher Graham and Jacob Kohnstamm.  The former took exception to the latter’s expression of frustration with the private sector and his insistence that we need stronger enforcement.  The reality, of course, is that every commissioner needs to be strategic and pick the right tool for the right problem.  That choice will only partially be dictated by the personality of the regulator and the content of the law.  More often, it is dictated by the scale of the problem and the power of the enemy.

The United States is different.  I am really tired of hearing American commentators from government and the private sector insist that there is something inherently different about the values and political culture of the United States that makes a comprehensive legislative solution along European lines simply incompatible with the American way of doing things.  Every country has distinctive cultural and political characteristics.  The question is whether those differences should make a difference.  The reality, of course, is that the early theory of information privacy actually stemmed from US policy debates.  The decision to adopt a more pragmatic and sectoral approach, at least in relation to the private sector, was a political choice.  As new technologies have made it impossible to determine where one sector ends and another begins, US policy makers have come to the belated conclusion that a comprehensive consumer bill of privacy rights, enacted in law, is necessary.  But the means of enforcement, through codes of practice and FTC regulation of unfair and deceptive trade practices, will obviously not provide truly enforceable privacy rights, as would be evident if they bothered to study the experience with privacy codes of practice elsewhere.

There seems to be a conventional wisdom afoot that US and European approaches are converging.  I don’t see it.  The EU is certainly trying to simplify its directive, as well as embrace some of the new buzzwords of the community, accountability and interoperability being the most fashionable.  At the same time, and even if current legislative proposals are enacted, the American approach to privacy in the private sector will continue to be fragmented and reactive.  The reason has little to do with fundamental differences in attitudes or approach.  Rather, the American political system is so susceptible to pressure from private interests that no Congress will dare to enact the kind of robust protections in force in Europe.  There are now 76 countries with comprehensive data privacy laws, as a new study by Graham Greenleaf reveals.  There has been a remarkable convergence.  But that convergence tends to stop at the US border.

Although there is now a good deal of highly sophisticated debate about problems and solutions, there is also much that is depressingly familiar and repetitive within the discourse surrounding privacy rights and new technologies.  The constant repetition of these, and other, platitudes reinforces another powerful myth: that this is one big community (a broad church) all seeking the same thing, better privacy protection.  There is a constant narrative about DPAs, advocates, corporations, governments and consultants all in the same boat, rowing in the same direction: tonterías (look it up!).  For all the cozy bonhomie at the social events, it is wrong to forget the real differences and conflicts over these issues.  People do not mean the same thing when they speak about privacy.  Scratch the surface of the vague rhetoric, ask penetrating questions about what words like accountability, consent, access, interoperability and so on actually mean, and you get very different answers.  Corporate actors do not agree with one another.  DPAs do not agree.  And certainly privacy advocates do not agree.

I have come to the conclusion, therefore, that these conferences do less good than they once did.  I have a deep sympathy for the plight of the DPAs.  They need to be able to meet and discuss strategy alone, away from the lobbying of the corporate sector and the pressure from privacy advocates.  At the closed session, they countenanced the possibility of organizing future conferences as closed sessions for the DPAs alone.  That is a positive step.  No DPA should be put in the position of having to organize such a huge jamboree of a conference and to compromise its independence by seeking money from the very corporations that it could be regulating.  At the end of the day (and there is another cliché), the image that will remain in my mind of the 33rd Annual International Conference of Data Protection and Privacy Commissioners is that of independent commissioners sitting on the stage surrounded by the aggressive corporate branding of the very companies, like Google, that have been the focus of their attention and investigation.  We all enjoyed the corporate hospitality, but I wonder whether the price of admission is still worth paying.