
GeoTech Cues

September 25, 2020

Transfer of EU user data to the United States halted

By Richard J. Cordes

Richard J. Cordes is a researcher focused on Narrative and Memetic Warfare, Complex Systems, Knowledge Management Systems, and Cybernetics. He founded the Cognitive Security and Education Forum and contributes research to a variety of working groups and committees across DoD, IEEE, and the Private Sector on topics like Gray Zone Warfare, Knowledge Management Technology, Optimization of Human Learning, and Decentralized Systems. 

Why We Need Next-Generation Data Governance 

Ireland’s Data Protection Commission, an EU privacy regulator, has issued a preliminary order to Facebook Inc. demanding that it suspend transfers of EU user data to the United States, where regulations on user data are less stringent. Facebook must comply by mid-September or be fined 4 percent of its annual revenue, amounting to billions of dollars. Facebook is unlikely to be the only company to receive such an order, and Ireland’s Data Protection Commission is unlikely to be the only privacy regulator to issue one, as more than five thousand companies are expected to be affected by these kinds of challenges to trans-Atlantic data flows. The European Data Protection Board has already announced that it is forming a task force to investigate other companies on similar grounds.

This order to Facebook isn’t the result of some sudden hypocrisy on the part of an Ireland that hosts Facebook’s regional headquarters and was happy to embrace and boast about a massive Facebook data processing center just outside Dublin. It is instead the result of an EU-wide legal decision in July, which invalidated a European Union-United States data transfer agreement known as “Privacy Shield” on the basis that it was found to be in conflict with EU privacy law. The conflict stems from Section 702 of the US Foreign Intelligence Surveillance Act (FISA), which allows for mechanisms of data surveillance and aggregation that contradict the privacy rights granted to EU citizens by the General Data Protection Regulation (GDPR), the EU Charter of Fundamental Rights, and the European Convention on Human Rights.

While this recent data protection order may be considered a win for privacy activists now, it’s not clear that it will be one in the long run. Privacy Shield is itself a replacement framework, developed in haste after its predecessor, Safe Harbor, was struck down on functionally identical grounds. Talks are already underway to replace Privacy Shield with some new system, and it is likely that whatever system comes next will also run counter to the interests of privacy activists.

It could be argued that Safe Harbor, Privacy Shield, and whatever system comes next are all replacements for what should have been a common narrative between Western nations on the nature and meaning of data, as well as a common narrative between their citizens about the nature and meaning of the word “free” in a world where so many services are driven by advertising revenue. Even with data governance as it exists now, and with no prolonged storage of the users’ data points themselves, there is nothing stopping a firm from using the data as it is generated to build a “model of your behavior” from which it can source and sell predictive analytics.

This move to halt trans-Atlantic data flows in the name of privacy comes at the same time as calls to increase data sharing in response to COVID-19. While privacy and sharing may seem to be mutually exclusive, this is not necessarily the case. The EU’s GDPR compliance requirements don’t disallow sharing of data; they just require consent, transfer accountability, breach reporting, and anonymization in that sharing, as well as the clear assignment of a person or persons to be held accountable for meeting these standards while the data is under an organization’s control. These requirements map directly onto the common standards for research on human subjects, like the research being done on COVID-19, which require documentation, anonymization, and accountability. While the tendency has been to pursue these goals through regulation, these are all problems that can be solved through new data governance standards.
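As a rough illustration of how those obligations might be captured in practice, here is a minimal, hypothetical Python sketch of a sharing record that carries consent, anonymization, accountability, and transfer-log fields. The class and field names are assumptions made for this example, not any existing standard or schema.

```python
# Hypothetical sketch only: a metadata record for a shared dataset capturing
# the GDPR-style obligations discussed above (consent, anonymization,
# accountability, and an auditable record of transfers and breaches).
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class SharedDatasetRecord:
    dataset_id: str
    consent_reference: str            # where the subjects' documented consent lives
    anonymized: bool                  # whether direct identifiers have been removed
    accountable_person: str           # named person responsible while the data is held
    transfer_log: list = field(default_factory=list)    # (recipient, timestamp) pairs
    breach_reports: list = field(default_factory=list)  # incidents that must be reported

    def record_transfer(self, recipient: str) -> None:
        """Log each onward transfer so accountability can be demonstrated later."""
        self.transfer_log.append((recipient, datetime.now(timezone.utc).isoformat()))


# Example: a research dataset shared with a (hypothetical) COVID-19 study group.
record = SharedDatasetRecord(
    dataset_id="covid-serology-2020",
    consent_reference="consent-form-batch-117",
    anonymized=True,
    accountable_person="Data Protection Officer, Research Institute X",
)
record.record_transfer("university-study-group-a")
```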

This is about securing data supply chains end to end.

Dr. Michael Zargham, CEO, Blockscience

“Computation Verification, Proof of Authority, and Hardware Integration” are some of the most important requirements for next-generation data governance, said Dr. Michael Zargham, CEO of Blockscience, visiting researcher at the University of Vienna’s Interdisciplinary Institute for Cryptoeconomics, and an expert on data governance. Dr. Zargham designed and implemented automated decision systems for data management in the advertisement technology industry and has since worked on the economics and governance systems of very large, decentralized networks.

Computation verification refers to the ability to verify whether some function was carried out correctly. Blockchain, made famous by the cryptocurrency Bitcoin, derives its name from its form of computation verification, in which each “block” in a “chain” is verified by multiple parties to ensure accuracy without a centralized ledger to check against.
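To make the mechanism concrete, here is a minimal, hypothetical Python sketch of hash-chain verification, in which each block commits to the hash of its predecessor so any party can re-check the chain. The block fields and function names are illustrative assumptions, not Bitcoin’s actual data structures or protocol.

```python
# Hypothetical sketch only: a toy hash chain illustrating computation
# verification. The block fields are illustrative, not Bitcoin's real format.
import hashlib
import json


def block_hash(block: dict) -> str:
    """Hash a block's contents deterministically."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()


def verify_chain(chain: list) -> bool:
    """Check that every block correctly commits to the hash of its predecessor."""
    for prev, current in zip(chain, chain[1:]):
        if current["prev_hash"] != block_hash(prev):
            return False  # the chain was tampered with somewhere before this point
    return True


genesis = {"index": 0, "data": "genesis", "prev_hash": ""}
block_1 = {"index": 1, "data": "transfer record", "prev_hash": block_hash(genesis)}
print(verify_chain([genesis, block_1]))  # True

genesis["data"] = "rewritten history"    # tamper with an earlier block...
print(verify_chain([genesis, block_1]))  # ...and the chain no longer verifies: False
```

Because altering any earlier block changes its hash and breaks every later link, independent parties can each re-run this check and agree on whether the record is intact, with no centralized ledger to consult.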

“Proof of authority is essentially a type of consensus algorithm where you assert that you have the right to verify a block because you can prove that you are acting as an entity who has the right to do so,” said Zargham. As an example, before a new block of Bitcoin transactions can be confirmed, a majority of the network holding the blockchain ledger needs to participate in its verification. This method limits transactions-per-second (TPS), but there are proof-of-authority methodologies which do not rely on large networks, such as those which make use of data trusts, proof of stake, or specially made hardware.
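A minimal sketch of the idea, under the assumption of a pre-registered set of validator keys: instead of a network-wide vote, a block is accepted if its attestation checks out against a recognized authority’s key. A production system would use asymmetric signatures; HMAC with pre-shared keys is used here only to keep the example short, and the validator names are hypothetical.

```python
# Hypothetical sketch only: proof of authority as "prove you hold a recognized
# key" rather than a network-wide vote. Real deployments would use asymmetric
# signatures; HMAC with pre-shared keys keeps this example short.
import hashlib
import hmac

AUTHORITY_KEYS = {"registry-node-a": b"demo-secret-key"}  # hypothetical validator registry


def sign_block(block_digest: str, validator: str) -> bytes:
    """A recognized validator attests to a block digest with its registered key."""
    return hmac.new(AUTHORITY_KEYS[validator], block_digest.encode(), hashlib.sha256).digest()


def verify_attestation(block_digest: str, validator: str, attestation: bytes) -> bool:
    """Accept the block only if the attestation comes from a registered authority."""
    key = AUTHORITY_KEYS.get(validator)
    if key is None:
        return False  # unknown party: no right to verify blocks
    expected = hmac.new(key, block_digest.encode(), hashlib.sha256).digest()
    return hmac.compare_digest(expected, attestation)


digest = "3fa1c0..."  # stand-in for a block digest
tag = sign_block(digest, "registry-node-a")
print(verify_attestation(digest, "registry-node-a", tag))  # True: one check, no network vote
print(verify_attestation(digest, "unknown-node", tag))     # False: not a recognized authority
```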

Specially made hardware allows for far more complicated computation verification than the distributed software that verifies Bitcoin transactions, and it can perform that verification without needing to check with the majority of the network. Dr. Zargham noted that “if you’re forcing people to go all the way to spoof physical devices rather than hijack the information system, you’re making it far more expensive to attack the system.” Speaking to hardware integration’s role in the implementation of next-generation data governance technologies, Dr. Zargham stated that “hardware integration is critical in that it reduces the operational complexity in application of these technologies, mitigates security risks by binding roles and rights to specific devices, and allows for applications in the internet of things (IoT), robotics, and defense without opening new attack vectors… Ultimately, computation verification, proof of authority, and integration of hardware—this is about securing data supply chains end to end.”

Data standards that meet these requirements, paired with the right incentives for widespread adoption, could usher in a new era of individual governance over the usage of data while also allowing for sharing and accessibility at scale. Governance of this kind would allow users to anonymize their data and make it accessible for specific use cases, as well as to know where their data lives and when it was accessed, at any time. If paired with identification standards, individuals could also be alerted when data was being created about them and informed of who owns it. If the standards were made reasonable for legal use cases, there would be no reason they could not be applied at scale in healthcare and healthcare research as well. During the COVID-19 pandemic, there was a significant increase in positive sentiment in the UK regarding sharing health and general personal data, and it’s likely that this change could have been far greater if participants felt they had clearer expectations of privacy and transparency.

Trust is essential to organizational cohesion at all scales. With that in mind, it should be unsurprising that many of the social tools responsible for the development of human civilization, such as contracts, courts, codified language, and governments, can be understood as mechanisms meant to make it easier to extend trust to one another in the absence of preexisting relationships. In a time when institutional trust is in free-fall, new tooling to help institutions be more transparent in their usage of data may be exactly what is needed.
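As a rough sketch of the kind of individual governance described above, the hypothetical Python example below ties every read of a record to a declared use case and an audit log that the data subject can inspect at any time. The class and method names are assumptions for illustration, not an existing standard or API.

```python
# Hypothetical sketch only: an access-audited record in which every read is tied
# to a declared use case and logged, so the data subject can see where the data
# lives, who touched it, and when. Not an existing standard or API.
from datetime import datetime, timezone
from typing import Optional


class AuditedRecord:
    def __init__(self, subject_id: str, data: dict, permitted_uses: set):
        self.subject_id = subject_id
        self._data = data
        self._permitted_uses = permitted_uses
        self._access_log = []

    def access(self, accessor: str, use_case: str) -> Optional[dict]:
        """Release the data only for a permitted use case; log the attempt either way."""
        allowed = use_case in self._permitted_uses
        self._access_log.append({
            "accessor": accessor,
            "use_case": use_case,
            "allowed": allowed,
            "timestamp": datetime.now(timezone.utc).isoformat(),
        })
        return self._data if allowed else None

    def subject_view(self) -> list:
        """What the data subject can inspect at any time: every access attempt."""
        return list(self._access_log)


record = AuditedRecord("user-123", {"age_band": "30-39"}, permitted_uses={"covid-research"})
record.access("university-study-group-a", "covid-research")   # permitted, logged
record.access("ad-broker-x", "targeted-advertising")          # refused, still logged
print(record.subject_view())
```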

If we want next-generation privacy and sharing, we will need next-generation data governance. And if we want next-generation data governance, we will need next-generation data standards that allow for individual governance through chain-of-custody and computation verification in business, operations, legal, technical, and social use cases. Such a standard would likely remove any need for yet another replacement system for Privacy Shield; to echo Thomas Jefferson, the data could be “bound down from mischief by the chains of [its] constitution.” The development of such a standard would also help to align US and European understandings of data ownership.

Other data governance articles

GeoTech Cues

Sep 15, 2020

Why data governance matters: Use, trade, intellectual property, and diplomacy

By Pari Esfandiari, PhD, Gregory F. Treverton, PhD

Global data and internet governance is scattered, multi-stakeholder, bottom-up, and driven by loose coordination among various players. Data governance can be thought of as incorporating a triangle of individuals and their privacy, nation-states and their interests, and the private sector and its profits. Its current status and prospects might be thought of along several lines of activity, which are interrelated but, for the sake of clarity and with some danger of oversimplification, are discussed in the following sections: privacy and data use; regulating to police content; using antitrust to dilute data monopolies; self-regulation and digital trade; intellectual property rights; and digital diplomacy.
