Report of the Commission on the Geopolitical
Impacts of New Technologies and Data

Chapter 3. Enhanced trust and confidence in the digital economy

Enhanced trust and confidence in the digital economy are founded upon personal privacy, data security, accountability for performance and adherence to standards, transparency of internal decision-making algorithms, and regulations and governance for digital products and services. Trust and confidence in the digital economy are diminished by practices that do not protect privacy or secure data, and by a lack of legal and organizational governance to advance and enforce accountability.[1: Amon, “Toward a New Economy of Trust.”] Data breaches, malware embedded in downloaded apps, unfiltered mis- and disinformation, and the lack of governance models to effectively address these harms all contribute to the degradation of social and civic trust. This degradation undermines economic and civic confidence, is costly,[2: World Economic Forum, “Why trust in the digital economy is under threat,” accessed March 26, 2021, http://reports.weforum.org/digital-transformation/building-trust-in-the-digital-economy/, citing an estimate by McAfee that the costs associated with cybersecurity incidents approximated $575 billion in 2014; Accenture, Securing the Digital Economy: Reinventing the Internet for Trust, 16, accessed March 26, 2021, https://www.accenture.com/us-en/insights/cybersecurity/_acnmedia/Thought-Leadership-Assets/PDF/Accenture-Securing-the-Digital-Economy-Reinventing-the-Internet-for-Trust.pdf#zoom=50, which cites a five-year loss of forgone revenue from 2019 to 2023 of $5.2 trillion, calculated using a sample of 4,700 global public companies.] constrains the growth of the digital economy,[3: Congressional Research Service, Digital Trade and U.S. Trade Policy, 11, May 21, 2019, accessed March 26, 2021, https://crsreports.congress.gov/product/pdf/R/R44565; Alan B. Davidson, “The Commerce Department’s Digital Economy Agenda,” Department of Commerce, November 9, 2015, accessed March 26, 2021, https://2014-2017.commerce.gov/news/blog/2015/11/commerce-departments-digital-economy-agenda.html. Davidson identifies four pillars: promoting a free and open Internet worldwide; promoting trust online; ensuring access for workers, families, and companies; and promoting innovation.] and has destabilizing effects on society, governments, and markets. Trust and confidence in the digital economy are essential for open societies to function, and for resilience against cascading effects of local, regional, or national economic, security, or health instabilities.

Finding 3: To enhance trust and confidence in artificial intelligence and other digital capabilities, technologies must objectively meet the public’s needs for privacy, security, transparency, and accountability.

The growth of digital economies is changing how trust is valued by institutions, businesses, and the public.[4: Frank Dickson, “The Five Elements of the Future of Trust,” IDC, April 22, 2020, accessed March 26, 2021, https://blogs.idc.com/2020/04/22/the-five-elements-of-the-future-of-trust/.] The traditional view of trust is expressed in terms of the security of a business transaction. The increase in cyberattacks, identity theft, social media disinformation campaigns, and the use of autonomous decision-making software introduces new factors that affect trust. Trust in a firm’s reputation and ethical practices, in its privacy protections, and in how it uses personal data depends on technology, on business practices, and on the public’s perception of how well these components of trust are protected.

Not everyone has the same perception of what is trustworthy. However, reaping the benefits of the digital economy requires a high level of trust among users. Therefore, government and industry should work to enhance the transparency and accountability of digital systems to improve trustworthiness. Challenges include the following: (i) views on personal privacy protection are context-dependent, vary by culture or location, and may be formalized in different terms across nations, regions, and states; and (ii) as automated decision-making algorithms proliferate, new applications reveal trust weaknesses regarding implicit bias, unethical use of personal data, and lack of identity protection.

Trustworthiness needs to be prioritized and empirically demonstrated in the evolving market. Building trust involves educating all participants on the fundamental value of trust in the digital economy and ensuring digital systems reflect individual and societal conceptions of trust. There must be national and international standards for judging how well technologies and systems protect trust. Professional organizations that audit for trust in the digital economy will strengthen accountability.

As European Union (EU) member nations work to conform national rules and laws to the General Data Protection Regulation (GDPR), the European Commission notes that these steps may strengthen trust relationships. Other nations propose a global framework for cross-border Internet policies that could protect data security and privacy while still accommodating national laws and regulations, provided certain trust relationships are maintained. In both approaches, a set of rules or principles provides the foundation for trust.

The GDPR[6: “General Data Protection Regulation,” Intersoft Consulting, https://gdpr-info.eu/.] establishes regulations for data security and privacy that apply to any organization that collects or uses data related to people in the EU. The entire data chain is covered by the GDPR, including data collection, processing, storage, and management.

The GDPR comprises principles that govern data protection and accountability for those who process data. There are technical measures for data security and organizational design principles for data protection. Data privacy is expressed in terms of privacy rights, including the rights to be informed, of access, to rectification, to erasure, to restrict processing, to data portability, and to object, as well as rights in relation to automated decision-making and profiling. The governance mechanism centers on Data Protection Authorities, which work to align each EU member nation’s approach to data security and privacy with the GDPR. These Data Protection Authorities have enforcement powers, including the ability to levy fines when a GDPR rule is violated.

Data privacy protection is vulnerable to advanced data analytics that can infer personally identifiable information by joining loosely related data sources. As a result, the growing use of current machine learning methods applied to large, multi-source data sets highlights potential limitations in the GDPR: such computational methods can infer data originally made private. The development of new data science capabilities may require research on new privacy-preserving technologies for nations to remain compliant with the GDPR. With increasing amounts of personal medical and genetic information being held in data repositories, this need is urgent.
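
The mechanics of this inference risk can be shown with a short sketch. The following Python example is a minimal illustration of a linkage attack, in the spirit of well-known re-identification studies; all data, field names, and the choice of quasi-identifiers are hypothetical:

    # A public dataset: names attached to quasi-identifiers.
    voter_roll = [
        {"name": "A. Jones", "zip": "20301", "birth_year": 1975, "sex": "F"},
        {"name": "B. Smith", "zip": "20301", "birth_year": 1982, "sex": "M"},
    ]

    # A "de-identified" dataset: names removed, but the same
    # quasi-identifiers remain.
    medical_records = [
        {"zip": "20301", "birth_year": 1975, "sex": "F", "diagnosis": "asthma"},
    ]

    QUASI_IDENTIFIERS = ("zip", "birth_year", "sex")

    def link(records, roll):
        """Re-identify records by joining on shared quasi-identifiers."""
        for record in records:
            key = tuple(record[q] for q in QUASI_IDENTIFIERS)
            matches = [p for p in roll
                       if tuple(p[q] for q in QUASI_IDENTIFIERS) == key]
            if len(matches) == 1:  # a unique match names the individual
                yield matches[0]["name"], record["diagnosis"]

    print(list(link(medical_records, voter_roll)))
    # [('A. Jones', 'asthma')] -- the "anonymized" diagnosis is now named

Even this naive join defeats simple anonymization, which is why the privacy-preserving technologies discussed under Finding 3.4 aim to provide guarantees that hold even when auxiliary data sources are available.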

Finding 3.3: Evolving US data privacy approaches consider outcome-based versus prescriptive methods.

The development of data privacy laws in the United States is an evolving patchwork, with more than one hundred and fifty state data privacy laws proposed in 2019.[8: “2019 Consumer Data Privacy Legislation,” National Conference of State Legislatures, January 3, 2020, accessed March 26, 2021, https://www.ncsl.org/research/telecommunications-and-information-technology/consumer-data-privacy.aspx.] There is no overall federal data privacy law.

One instance of federal legislation for data privacy proposed in the 117th Congress[9: “Information Transparency and Personal Data Control Act,” fact sheet, accessed March 26, 2021, https://delbene.house.gov/uploadedfiles/delbene_consumer_data_privacy_bill_fact_sheet.pdf; Information Transparency & Personal Data Control Act, H.R. 2013 — 116th Congress (2019-2020), accessed April 2, 2021, https://delbene.house.gov/uploadedfiles/delbene_privacy_bill_final.pdf.] includes the following key privacy features, which are viewed as outcome-based.[10: “Developing the Administration’s Approach to Consumer Privacy,” Federal Register, September 26, 2018, accessed March 26, 2021, https://www.federalregister.gov/documents/2018/09/26/2018-20941/developing-the-administrations-approach-to-consumer-privacy; Alan Charles Raul and Christopher Fonzone, “The Trump Administration’s Approach to Data Privacy, and Next Steps,” Sidley Austin LLP, October 2, 2018, accessed March 26, 2021, https://datamatters.sidley.com/the-trump-administrations-approach-to-data-privacy-and-next-steps.]

  • Transparent communication of the privacy and data use policy
  • Affirmative opt-in and opt-out consent
  • Preemption, in which the proposed statute would preempt most state laws, with limited exceptions for data breaches and certain other situations
  • A right of action, enforced at the federal or state level, to address alleged violations
  • Independent audit of the effectiveness and appropriateness of the privacy policy for each entity providing data services

Several bills[11: Setting an American Framework to Ensure Data Access, Transparency, and Accountability (SAFE DATA Act), S. 4626 — 116th Congress (2019-2020), https://www.congress.gov/116/bills/s4626/BILLS-116s4626is.pdf; Online Privacy Act of 2019, H.R. 4978 — 116th Congress (2019-2020), https://www.congress.gov/bill/116th-congress/house-bill/4978/text; COVID-19 Consumer Data Protection Act of 2020, S. 3663 — 116th Congress (2019-2020), https://www.congress.gov/bill/116th-congress/senate-bill/3663.] introduced in the 116th Congress addressed a subset of the above features or focused on COVID-19 contact tracing, health status, and identifiers. In addition, several bills introduced in the 116th Congress addressed disclosure of how data are used or monetized by social media companies, and enhancement of the accessibility and portability of a user’s data across devices.[12: Designing Accounting Safeguards to Help Broaden Oversight and Regulations on Data Act, S. 1951 — 116th Congress (2019-2020), accessed March 26, 2021, https://www.congress.gov/bill/116th-congress/senate-bill/1951 (the informal reference, DASHBOARD Act, is found in articles about this bill); Public Health Emergency Privacy Act, S. 3749 — 116th Congress (2019-2020), accessed March 26, 2021, https://www.congress.gov/bill/116th-congress/senate-bill/3749 (reintroduced in the 117th Congress; Mark R. Warner, Warner, Blumenthal, Eshoo, Schakowsky & DelBene Introduce the Public Health Emergency Privacy Act, press release, January 28, 2021, https://www.warner.senate.gov/public/index.cfm/2021/1/warner-blumenthal-eshoo-schakowsky-delbene-introduce-the-public-health-emergency-privacy-act); Augmenting Compatibility and Competition by Enabling Service Switching (ACCESS) Act of 2019, S. 2658 — 116th Congress (2019-2020), accessed March 26, 2021, https://www.congress.gov/bill/116th-congress/senate-bill/2658.]

The National Institute of Standards and Technology (NIST) Privacy Framework describes a risk- and outcomes-based approach to establishing privacy protection practices in an organization. Organizations can vary the technologies and the design of their privacy protections, provided the required performance outcomes are satisfied. This flexibility is advantageous when technologies and applications are changing at a fast pace, e.g., artificial intelligence (AI) and the Internet of Things (IoT).[13: National Institute of Standards and Technology, “NIST Privacy Framework: A Tool for Improving Privacy through Enterprise Risk Management, Version 1.0,” January 16, 2020, accessed March 26, 2021, https://www.nist.gov/system/files/documents/2020/01/16/NIST%20Privacy%20Framework_V1.0.pdf.]

While there are several federal data privacy laws specific to certain industries or groups, e.g., the Health Insurance Portability and Accountability Act (HIPAA),[14: Congressional Research Service, Data Protection Law: An Overview, March 25, 2019, accessed March 26, 2021, https://fas.org/sgp/crs/misc/R45631.pdf.] the eventual form and scope of US data protection laws will depend on policy and legal considerations. A key decision concerns the model for data protection laws. The EU GDPR model is prescriptive; GDPR compliance involves demonstrating that the procedural rules were followed. An alternate model for data protection laws is outcome-based, which allows flexibility in how to achieve data protection.[15: Ibid., 56.]

A choice between prescriptive and outcome-based approaches must assess their relative costs and benefits, as well as how the two approaches can work together. The bills proposed in the 116th Congress identify a robust set of data privacy features while promoting flexibility and innovation in their implementation; the GDPR model has greater worldwide traction, creating opportunities for harmonized regulatory treatment.

Finding 3.4: New information technologies compel automated compliance testing.

New information technologies and advanced data capabilities challenge current methods of compliance and enforcement. The variety of ways to collect, process, and analyze data is growing rapidly, while compliance is often determined case by case by regulatory and legal experts. To keep pace, automated testing for compliance with data privacy regulations is necessary.
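
As a simplified illustration of what automated compliance testing could look like, the Python sketch below encodes a consent requirement, a purpose restriction, and a retention limit as machine-readable rules and checks a processing record against them. The policy values and record fields are hypothetical; a production system would derive its rules from the governing statute or regulation:

    from datetime import date, timedelta

    # Hypothetical machine-readable policy.
    POLICY = {
        "max_retention_days": 365,
        "allowed_purposes": {"billing", "service_delivery"},
        "consent_required": True,
    }

    def check_record(record, today):
        """Return the list of policy violations for one processing record."""
        violations = []
        if POLICY["consent_required"] and not record.get("consent"):
            violations.append("no affirmative consent on file")
        if record["purpose"] not in POLICY["allowed_purposes"]:
            violations.append("purpose '%s' outside declared scope" % record["purpose"])
        if today - record["collected_on"] > timedelta(days=POLICY["max_retention_days"]):
            violations.append("retention limit exceeded")
        return violations

    record = {"consent": True, "purpose": "marketing",
              "collected_on": date(2020, 1, 15)}
    print(check_record(record, today=date(2021, 3, 26)))
    # ["purpose 'marketing' outside declared scope", 'retention limit exceeded']

Run continuously over an organization's processing inventory, checks of this kind could surface violations as data practices change, rather than waiting for case-by-case expert review.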

Table 3 presents some of the challenges and solutions for achieving automated compliance testing. The Big Data Value Association research agenda identifies three key developments: standards, new privacy-preserving technologies, and automated methods to establish compliance.

Table 3. Big Data Value Association Strategic Research and Innovation Agenda

Challenge | Solution
A general, easy-to-use, and enforceable data protection approach | Guidelines, standards, law, and codes of conduct
Maintaining robust data privacy with utility guarantees | Multiparty computation, federated learning approaches, and distributed ledger technologies
Risk-based approaches calibrating data controllers’ obligations | Automated compliance and risk assessment tools
Combining different techniques for end-to-end data protection | Integration of approaches; toolboxes, overviews, and repositories of privacy-preserving technologies

Source: Timan and Mann 2019.[16: Timan and Mann, Data protection.]

Privacy-preserving technologies are an active research area and include the following:[17: Big Data UN Global Working Group, UN Handbook on Privacy-Preserving Computation Techniques, accessed March 26, 2021, https://marketplace.officialstatistics.org/privacy-preserving-techniques-handbook.] secure multiparty computation, (fully) homomorphic encryption, trusted execution environments, differential privacy, and zero-knowledge proofs.

The value of privacy-preserving technologies involves trade-offs between privacy and utility—how useful the resulting data are—both of which are context dependent.[18: Daniel Bachlechner, Karolina La Fors, and Alan M. Sears, “The Role of Privacy-Preserving Technologies in the Age of Big Data,” proceedings of the 13th Pre-ICIS Workshop on Information Security and Privacy, San Francisco, December 13, 2018, accessed March 26, 2021, https://www.albany.edu/wisp/papers/WISP2018_paper_11.pdf; Felix T. Wu, “Defining Privacy and Utility in Data Sets,” University of Colorado Law Review 84 (2013), accessed March 26, 2021, http://lawreview.colorado.edu/wp-content/uploads/2013/11/13.-Wu_710_s.pdf.] Affecting these trade-offs are the technical methods, the technical definitions of privacy, and the specifications of the privacy laws. The technical methods (e.g., anonymization, sanitization, and encryption) operate on data in different ways. The technical definition of privacy varies by application and by the user’s perception of risk versus the benefit of making personal data available. Privacy laws vary across nations, challenging the uniform application of technical methods. For both professionals and members of the public, making trade-offs between privacy and utility remains challenging. This is partially due to the absence of definitions of, and standards for measuring, privacy and the social benefits obtained from making data available for use by others.
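
Differential privacy, one of the techniques listed above, makes this trade-off explicit through a single parameter. The sketch below, with illustrative data and epsilon values, answers a counting query using the standard Laplace mechanism: a smaller epsilon gives a stronger privacy guarantee but a noisier, less useful answer.

    import random

    def laplace_noise(scale):
        """Laplace(0, scale) noise: the difference of two i.i.d.
        exponential draws is Laplace-distributed."""
        return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

    def private_count(values, predicate, epsilon):
        """A count query has sensitivity 1, so Laplace noise with scale
        1/epsilon yields epsilon-differential privacy."""
        true_count = sum(1 for v in values if predicate(v))
        return true_count + laplace_noise(1.0 / epsilon)

    ages = [34, 41, 29, 56, 62, 47, 38, 51, 44, 30]  # true count of >= 40 is 6
    for eps in (0.1, 1.0, 10.0):
        noisy = private_count(ages, lambda a: a >= 40, eps)
        print("epsilon=%4.1f -> noisy count %.1f" % (eps, noisy))
    # Small epsilon (strong privacy) produces large noise; large epsilon
    # (weak privacy) returns answers close to the true count.

Which epsilon is "private enough" for a given application is a policy question, not a purely technical one; it is precisely the kind of question left open by the absence of privacy measurement standards noted above.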

Finding 3.5: Trust and confidence in digital capabilities require businesses and governments to focus on the responsible use of technology.

Increasing trust and confidence in emerging technologies, such as AI, requires recognition by both businesses and governments that they have an obligation to use technology responsibly, ensuring that technology has a positive impact on society, especially with regard to equality and inclusion.[19: Kirsten Martin, Katie Shilton, and Jeffrey Smith, “Business and the Ethical Implications of Technology: Introduction to the Symposium,” Journal of Business Ethics 160, 307–317 (2019), accessed April 16, 2021, https://doi.org/10.1007/s10551-019-04213-9.] Developing and innovating responsibly means ensuring that (i) ethical frameworks and policies exist to guide organizations during all aspects of a product’s development and deployment, (ii) fairness in design is emphasized from the outset, and (iii) questions about how technologies will be used are given the same rigorous examination as technical issues. As technological capabilities evolve and become more deeply intertwined in all aspects of society, businesses and governments must put ethics at the center of everything they do.

Approach 3: Build in trust-enabling technologies, measure performance against standards, and conduct independent compliance audits.

The digital economy relies on achieving a high level of trust and confidence on a continuing basis as technologies evolve. Trust- and confidence-enabling technologies must be developed and built into the components of the digital economy infrastructure; a detailed understanding of the trade-offs between privacy and utility is an essential foundation. Such technologies must be paired with similar civic norms, practices, and rules designed to enhance confidence in the digital economy. To assure businesses that they remain compliant with data protection regulations as they modernize their practices, automated compliance testing, accompanied by standards of performance, is needed. To establish transparency for automated decision-making algorithms, standards for measurable performance, i.e., the output results, are necessary. Independent assessments of compliance testing and algorithmic transparency by professional auditing organizations could enhance trust among all participants in the digital economy and aid accountability and governance; such methods should be explored. However, mechanisms for compliance testing and auditing by regulators are also necessary.[20: Nicholas Confessore, “Audit Approved of Facebook Policies, Even After Cambridge Analytica Leak,” New York Times, April 19, 2018, accessed March 26, 2021, https://www.nytimes.com/2018/04/19/technology/facebook-audit-cambridge-analytica.html.]

Recommendation 3: Develop international standards and best practices for a trusted digital economy that accommodate national rules and regulations, and streamline the process of independently assessing adherence to these standards.

Recommendation 3.1: Develop a US data privacy standard.

Congress should create a national data privacy standard that embodies the following principles:

  • Appropriate use of data: define the intended purpose for the collected data, the scope of what can be collected, the needed security, and the entities covered by the principle.
  • Nondiscriminatory use: the collected data cannot be used to discriminate against protected classes.
  • Informed participation: individuals must receive the privacy policies in a transparent manner before data are collected and must provide affirmative express consent, including the ability to revoke consent and to require destruction of the data, or movement of the data as directed by the individual (i.e., portability).
  • Public reporting: covered entities must periodically report on the data collected, retained, and destroyed, and on the groups of individuals from whom the data were collected.
  • Independent audit: covered entities’ performance against the data privacy standard must be audited annually by an independent auditing organization, with parallel mechanisms to accommodate auditing and review by regulatory agencies.
  • Enforcement: federal and state enforcement organizations are given the authority to pursue violations of the laws for data privacy protection.
  • Preemption: the standard would preempt state privacy laws that are inconsistent with the proposed national standard.
  • Consumer protection laws: the privacy standard would not interfere with consumer protection laws on issues apart from data privacy.

The data privacy standard should recognize gradations in the sensitivity of personal data—some personal data are treated more strictly than others. Affirmative express consent should be structured based on the types of data and how they will be used.

Congress should work to develop a national data privacy standard that can achieve global interoperability and should request an analysis of emerging privacy standards and issues that limit this achievement. Congress also should use the proposed national data privacy standard to inform the development of transparent national consumer data privacy laws that preserve individuals’ control of their personal data and facilitate the development of trusted networks and applications.

The resulting legislation should establish federal data privacy standards for personal data, establish standards for content moderation by information providers, and regulate platform providers’ ability to conduct experiments or surveys with users and user data without prior consent.

Recommendation 3.2: Develop privacy-preserving technologies for the digital economy and demonstrate in a full-scale test their conformance with the General Data Protection Regulation.

The administration should direct NIST to establish and test privacy-preserving technologies that enable a risk- and outcomes-based approach to trust in the digital economy. The test should evaluate, at scale, conformance with relevant GDPR rules, conformance with existing US laws governing data privacy, and robustness with respect to innovations and advances in information technologies and data capabilities, especially those based on AI, machine learning, and the IoT. This work should include the development of technical definitions of privacy and application-specific measures of the utility of analyses that are based on privacy-protected data. The tests should include end user evaluations.

The administration should establish a near-term program that demonstrates privacy-preserving technologies to aid the trusted collection and sharing of data for the purpose of improving individuals’ access to healthcare during large-scale biological events. This program should be jointly managed by NIST, the Department of Health and Human Services (HHS), the National Institutes of Health (NIH), and the National Science Foundation (NSF). This program will monitor system performance to inform the development of standards for the ethical use of the shared data and how data governance will be formulated.

Recommendation 3.3: Create measurement methods and standards for evaluating trust in the digital economy.

The administration should direct NIST to establish methods for evaluating users’ trust in the digital economy, given the increasing use of AI, big data analytics, and automated decision-making algorithms. This work builds on the Commission on Enhancing National Cybersecurity’s Report on Securing and Growing the Digital Economy[21: Commission on Enhancing National Cybersecurity, Report on Securing and Growing the Digital Economy, December 1, 2016, accessed March 26, 2021, https://www.nist.gov/system/files/documents/2016/12/02/cybersecurity-commission-report-final-post.pdf.] and the National Strategy for Trusted Identities in Cyberspace.[22: White House, “National Strategy for Trusted Identities in Cyberspace, Enhancing Online Choice, Efficiency, Security, and Privacy,” April 2011, accessed March 26, 2021, https://obamawhitehouse.archives.gov/sites/default/files/rss_viewer/NSTICstrategy_041511.pdf.] One example assessment framework[23: Bhaskar Chakravorti, Ajay Bhalla, and Ravi Shankar Chaturvedi, “How Digital Trust Varies Around the World,” Harvard Business Review, February 25, 2021, accessed April 16, 2021, https://hbr.org/2021/02/how-digital-trust-varies-around-the-world#:~:text=To%20that%20end%2C%20in%20partnership,user%20experience%3B%20the%20extent%20to.] describes measures of: “(i) user trust in the digital environment, e.g., data privacy, security, private sector efforts to control the spread of misinformation, and private sector adherence to cybersecurity best practices; (ii) the user experience, i.e., the effort needed to interact with the digital environment; (iii) user attitudes, e.g., how trusted are government and business leaders; and (iv) user behavior, i.e., how much do users interact with the digital environment.”[24: Appendix A provides several references on the topics of trust and countering digital misinformation.]
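
By way of illustration only, a national assessment built on such a framework might aggregate the four categories of measures into a composite index. The weights, category names, and sub-scores in this Python sketch are invented for the example and are not drawn from the cited framework:

    # Hypothetical composite digital-trust index over the four measure
    # categories quoted above; weights and scores are illustrative only.
    WEIGHTS = {
        "digital_environment": 0.35,  # privacy, security, misinformation control
        "user_experience": 0.25,      # friction of interacting with the environment
        "user_attitudes": 0.20,       # trust in government and business leaders
        "user_behavior": 0.20,        # actual engagement with the environment
    }

    def trust_index(scores):
        """Weighted average of sub-scores normalized to a 0-100 scale."""
        assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9
        return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

    print("%.2f" % trust_index({"digital_environment": 72, "user_experience": 81,
                                "user_attitudes": 58, "user_behavior": 77}))
    # 72.45

The hard, substantive work NIST would need to lead is standardizing how each sub-score is defined and measured; the aggregation itself is straightforward once those definitions exist.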

The administration should create a coalition to develop international standards for achieving trust in the digital economy. The coalition should include representatives from NIST, the Federal Trade Commission (FTC), private industry, Federally Funded Research and Development Centers (FFRDCs), University Affiliated Research Centers (UARCs), and international standards organizations. The United States and like-minded nations and partners should develop national assessments of trust in the digital economy using these standards.

Recommendation 3.4: Empower an organization to audit trust in the digital economy.

Congress should establish or empower an organization to audit the efficacy of measures designed to ensure trust in the digital economy and to assess conformance with current and future standards designed to enhance and maintain such trust. The auditing function could be housed in independent third parties or in the Government Accountability Office (GAO).

As part of this process, the auditing organization could provide recommendations to Congress on legislation that would enhance existing trust measures, develop new trust measures, and create trust performance standards. The auditing organization should also provide a mechanism through which the public and industry can raise topics and concerns for attention and, for cases where assessments or audits were done, include an ombudsman function for assessment appeals, identification of new information, or adjudication of concerns, insulated from political influence.

The administration should work to establish a similar auditing program with EU members of the International Organization of Supreme Audit Institutions.

Recommendation 3.5: Assess standards relating to the trustworthiness of digital infrastructure.

Congress should direct an assessment by the National Academies of Sciences, Engineering, and Medicine of the current national and international standards relating to the trustworthiness of the digital infrastructure that supports the digital economy. “Trustworthiness of an information system is defined as the degree to which an information system (including the information technology components that are used to build the system) can be expected to preserve the confidentiality, integrity, and availability of the information being processed, stored, or transmitted by the system across the full range of threats.”[25: National Institute of Standards and Technology, Security and Privacy Controls for Information Systems and Organizations, Special Publication 800-53, Revision 5, September 2020, accessed April 16, 2021, https://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.800-53r5.pdf.]

Due to the increasing complexity of the digital infrastructure, the assessment should also review design standards for complex systems-of-systems from the perspective of trustworthiness. The overall assessment should focus on systems that support the digital economy. The study should assess the sufficiency of existing standards to guide improvements in trustworthiness, identify where new standards are needed, and recommend the data collection and testing methods that would enable ongoing assessments.

Recommendation 3.6: Educate the public on trustworthy digital information.

Congress should establish a grant program led by NSF for the purpose of developing a curriculum on the trustworthiness of information—distinct from the trustworthiness of information systems—in the digital age. This curriculum should be created by a consortium headed by a university or coalition of universities. The program should be administered by select universities, with the participation of US information providers. The goal should be to educate the public on how to assess the trustworthiness of information—its credibility, truthfulness, and authenticity—and to develop tools that students and members of the public can use and benefit from on a regular basis.

Recommendation 3.7: Conduct demonstration projects involving artificial intelligence to improve delivery of public- and private-sector services at local, state, and federal levels.

Congress should authorize and appropriate funds for AI demonstration projects that improve the delivery of public services.[26: A potential source for the types of initiatives of interest is the OECD Network of Experts on AI (ONE AI). This group provides policy, technical, and business expert input to inform OECD analysis and recommendations. “OECD Network of Experts on AI (ONE AI),” OECD.AI, accessed March 26, 2021, https://www.oecd.ai/network-of-experts.] The overall program would be managed by one of the National Laboratories or by a newly created FFRDC with the mission to leverage technology to improve the delivery of public services. These testbed projects would be supported by local and state grants, cross-cutting federal government efforts, and public-private partnerships (PPPs) to employ AI to improve healthcare, workforce training, food production and distribution, and other areas. The overarching goals are to increase public trust in, understanding of, and confidence in AI; to learn how to use AI in ways that reduce inequality and enhance, rather than replace, human work; and to improve access, affordability, and availability of such services. At local, state, and federal levels, individual government agencies will gain long-term benefits by acquiring the necessary data infrastructure to employ AI to improve the delivery of public services.

Recommendation 3.8: Produce a framework for assessing ethical, social, trust, and governance considerations associated with specific current and future use cases for AI.

The administration should request the National Academy of Sciences to produce a framework for assessing ethical, social, trust, and governance considerations associated with specific current and future use cases for AI solutions. The framework should identify where new federal standards and rules are needed. This guidance should be developed with the participation of relevant executive branch departments and agencies, and in consultation with private industry, academia, members of the public, and government and industry representatives from foreign partners.