On Wednesday, May 27, 2020, the Atlantic Council’s GeoTech Center and Accenture hosted Dr. Jennifer King, Director of Consumer Privacy at the Center for Internet and Society at Stanford Law School, and Ms. Jana Gooth, legal policy advisor to MEP Alexandra Geese, for the inaugural episode of the jointly presented Data Salon Series, which will host private roundtables preceded by publicly recorded presentations concerning data policymaking and governance.
The event was co-hosted by Mr. Steven Tiell, Senior Principal for Responsible Innovation at Accenture, and Dr. David Bray, Director of the GeoTech Center at the Atlantic Council.
Dr. King’s presentation began by identifying a source of frustration shared among data scientists and policymakers: the consent and privacy agreements we have all mindlessly clicked through act as a barrier to helping consumers make informed decisions and serve as little more than a liability shield for companies. Further complicating the dilemma is the system’s lack of scalability—there are simply too many terms of service agreements, updated too often, for consumers to meaningfully process that information. It is a process made by and for lawyers that has failed to adjust to consumer concerns about data usage, and Dr. King argued that a paradigm shift is required, not just tinkering within the existing framework.
Dr. King went on to describe several approaches to begin addressing that policy gap, gathered from research and forums. The first pushed for the development of personal user agents—software that helps users aggregate and coordinate their relationship with their data and its privacy, much like a password manager. Others hoped to expand the knowledge base of policymakers, especially concerning human-computer interactions, through data visualization tools. A third approach sought to consider data technology in terms of public spaces, given its impending ubiquity in the forms of IoT, facial recognition, smart cities, and so on—how can we accommodate people in public places who don’t want to be recorded in some way? Similarly, a fourth approach emphasized the importance of proactive efforts to include the interests of marginalized, vulnerable communities not traditionally considered in the tech design process.
From a more regulatory perspective, data trusts were identified as a possible way to shift data ownership to the community level and away from the individual, particularly regarding genetic data. In addition, some sought to incentivize companies to use data responsibly rather than to use prohibitions and penalties. Further, many subscribed to the concept of algorithmic explainability—the notion that people providing data should be able to understand exactly what happens to it, who controls it, and what decisions it ultimately guides. Finally, some hoped to legislate limits on widespread public surveillance and develop a system of metrics for the harm associated with certain uses of data through an independent body. Dr. King ended her presentation with an appeal to reconsidering the Fair Information Practice Principles as a framework that constrains the ethical dynamics that must be considered in legislating data.
In her follow-up, Ms. Gooth provided context from the EU perspective that aligned with Dr. King’s main points: the EU’s GDPR and Data Privacy Directive still operate within the traditional notice-and-consent framework, with no provisions regarding design. The closest thing to design-focused policy was a potential requirement for web browsers to default to the highest privacy settings, though that legislation has been stuck for years.
While the Data Salon format usually holds a portion of the discussion under the non-attribution Chatham House rule, our audience participants unanimously and graciously allowed for the unabridged recording of the entire event to be made publicly available. First, participants inquired about the possibility of using different groups or regulatory bodies to produce applications that would regulate an individual’s privacy settings for them, and about the possibility of requiring companies to track data use in the same way they keep records of financial flows (which is already required under the GDPR). One recurrent issue was the lack of appropriately skilled manpower to enforce future or extant regulations.
There were also concerns about the future of data: how likeness rights and associated laws would change in response to inferred information, how post-COVID data legislation would deal with regulations slackened during the pandemic, what will be done when most machines have collected enough data to carry out their tasks without further training, and how legislation would handle the myriad edge cases that abound in a technology-infused world—for example, what rights does a pedestrian have when an automated car uses their presence as data, or a shopper meandering through a mall?
Underlying the many questions raised by the discussion was the premise of design. One participant asked whether something could be changed in the business model behind data to move companies from avoidance of penalization to pursuit of some good, perhaps through a new understanding of fiduciary responsibility. Another considered how the lens of antitrust law might be applied, given that data collection often requires a stated end, limiting its use in yet-unknown studies. That consideration tapped back into discussions of data trusts, a construct that would pool data indefinitely, doing away with the purpose-limitation framework so common today. Others considered whether more granular consent might emerge—say, separate consent for collection, for specific inferences, and for degrees of generalizability from those inferences. Another participant looked toward the narrow banking model, which would empower individuals to choose whether their data could be put to use, presumably in return for some fee like interest on a savings account, or simply stored safely and left untouched.
Ultimately, the constant frustration of discussants was the sheer diversity of places data is gathered from and put to use in. Generalizable policy is difficult in that environment, particularly when trying to imagine a paradigm shift. Both speakers concluded on a similar note, though: successful policies about notice, consent, and privacy will require cooperation between industry, government, and consumers, and trust must be built among the three—particularly between regulators and industry, and between consumers and the products they use.