Notes from the Vault
Larry D. Wall
The recent breach at Equifax is more than an inconvenience for a large number of people—it will result in significant hardship for some. It represents an attack on one of the foundations of a modern economy. For a very large share of economic transactions, it is critical that each party be confident that its counterparty is who it claims to be and will likely honor any future commitments. As one of the three major credit bureaus in the United States, Equifax provides identity verification services and is a source of information on the reliability of individuals in meeting their financial commitments. The breach allows the attackers to claim falsely to be someone they are not and, by doing so, create the impression that a person whose identity was stolen does not honor his or her commitments.
Although private-sector participants have incentives to manage carefully their information on an individual's identity and reliability, some of these incentives may be significantly different from what would be socially optimal. In response, governments around the world have increasingly adopted laws and regulations that allocate rights and responsibilities both to enhance economic efficiency and reflect other social preferences. However, as would be expected in a rapidly evolving field, governments are moving at different rates and adopting different approaches. The European Union (EU) is in the process of adopting an approach to individual privacy that differs in some important ways from the U.S. one. This post will provide a high-level summary of the recently adopted EU data privacy rules after first summarizing the recent Equifax breach and related rules currently in place in the United States.
Equifax first issued a press release on September 7 announcing that a cybersecurity incident had occurred, affecting an estimated 143 million U.S. consumers. Information such as consumer names, Social Security numbers, and birthdates had been accessed.1 Additionally, the breach involved some consumers in Canada and the United Kingdom.
Some background on the breach, including a brief history, is found in a memo from the majority staff of the U.S. House of Representatives Committee on Energy and Commerce. The breach arose because of a failure to patch a security flaw in at least one of Equifax's websites that relied on an open-source web application framework called Apache Struts. The security flaw in this software was announced and a patch provided on March 7, 2017. However, Equifax did not install the patch, and according to Equifax, the breach first occurred around May 13. Equifax security identified suspicious traffic associated with the breach on July 29 and took the affected application down on July 30. The firm contacted a cybersecurity firm on August 2 and made its first public announcement on September 7. Thus, more than a month passed between the time the firm had evidence it may have been breached and the time it disclosed the breach. Additionally, an article by Lily Hay Newman, Wired security reporter, reveals Equifax stored some consumer data in plain text rather than in encrypted form, which likely made it easier for the intruders to obtain usable data.
U.S. data protection and breach disclosure laws
The United States has an assortment of laws that impose various data protection and breach disclosure requirements. The requirements are based on the type of information (sectoral requirements) and which state(s) are affected.
Several federal laws are potentially relevant to the financial information released by Equifax. Section 501 of the Gramm-Leach-Bliley Act (GLBA) imposes three requirements on financial institutions: (1) ensure the security and confidentiality of customer records; (2) protect against anticipated threats to the security or integrity of the records; and (3) protect against unauthorized access or use of the information that could result in substantial harm or inconvenience to consumers. A Federal Trade Commission (FTC) web page is devoted to discussing the requirements under GLBA for nonbank firms, such as Equifax. Additionally, Sections 503 and 504 of GLBA give consumers the right to opt out of information sharing by their financial service provider with unaffiliated parties (see, for example, the Federal Deposit Insurance Corporation web page Your Rights to Financial Privacy).
Section 5 of the Federal Trade Commission Act bans "unfair and deceptive acts or practices in or affecting commerce." The FTC uses this section to prohibit companies from making deceptive claims regarding the privacy and security of consumer information. The FTC has used this authority in enforcement actions and obtained settlements in about 60 cases. Additional requirements arise under the Fair Credit Reporting Act, which limits access to credit data and use of related reports to those with a legally permissible purpose. See an article by Lee Matheson, an International Association of Privacy Professionals Westin Fellow, for a discussion of how federal laws, including the Fair Credit Reporting Act, may apply to Equifax.
Along with the relevant federal laws, 48 states have enacted breach notification requirements and 12 have laws dealing with commercial data security. These laws typically mandate that a company have "reasonable" security.2
EU data protection and breach disclosure laws
The EU recently adopted two important rules with implications for the control and security of personal data. The most relevant is the General Data Protection Regulation (GDPR), which was adopted in April 2016 and will become effective on May 25, 2018. A narrower measure, Payment Services Directive 2 (PSD2), was approved in December 2015 and will be implemented by the EU member states on January 13, 2018.
The European Commission's press release explaining the GDPR lists several benefits, including providing uniform rules for Europe that will enable "people to better control their personal data" and allowing businesses "to make the most of opportunities of the Digital Single Market by cutting red tape" and reinforcing consumer trust. Provisions giving individuals more direct control over their data include the right to data portability (data transfer to other service providers) and a limited "right to be forgotten."
The GDPR is intended to create a data security framework sufficiently flexible to work in the future. Thus, its scope is broad enough to include data categories such as genetic and biometric data. Its geographic reach will include many U.S. entities, as it applies to all companies that process data in offering goods or services to EU individuals, irrespective of the company's location, and companies that process but do not own individual EU data (such as Amazon Web Services). This extraterritorial scope requires that the treatment of data transferred outside the EU be "essentially equivalent" (Recital 81). This comes after the striking down of the previous U.S.-EU "safe harbor" data transfer agreement by the EU Court of Justice in an October 2015 decision. The GDPR also establishes a single EU standard in order to avoid regulatory arbitrage within the EU.
The GDPR will try to achieve these goals by requiring that data protection be a critical part of the entire product life-cycle of firms. Organizations will need to:
- Assess the data protection impact of their activities and flag particularly high-risk ones (Article 35)
- Take security-enhancing measures into account when designing products and services (known as "security by design") and, by default, select techniques that are most secure (Article 23)
- Receive explicit consent from clients to use their data
- Continuously test, assess, and evaluate data protection systems/protocols (Article 32)
- Maintain data processing records (Article 28).
The GDPR's prescriptions are in general terms rather than in specific technical standards. Furthermore, protection must be risk-based: the degree of data protection must be predicated on an assessment of the underlying security risks (Articles 25, 28, 33, and 34). Firms must assess the state of the art in data protection and the nature and scope of the risks, as well as implementation costs. The GDPR encourages, without necessarily requiring, measures such as encryption and pseudonymization, or the processing of personal data in such a way that the data can no longer be attributed to a specific data subject without the use of additional information (Article 4 (3b)).
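To illustrate pseudonymization in this sense, a direct identifier can be replaced with a keyed hash, with the key held separately from the data set; without the key, the token cannot be attributed back to the data subject. The sketch below is a minimal illustration of the concept, not a GDPR compliance recipe, and the key and field names are hypothetical.

```python
import hmac
import hashlib

# Secret key held separately from the pseudonymized data set
# (in practice it would live in a key-management system, not in code).
SECRET_KEY = b"example-key-held-elsewhere"

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a keyed hash token.

    The same identifier always maps to the same token, so records can
    still be linked, but the token cannot be reversed without the key.
    """
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()

# Hypothetical record: the Social Security number is replaced by a token.
record = {"ssn": "123-45-6789", "balance": 2500}
safe_record = {"subject_token": pseudonymize(record["ssn"]),
               "balance": record["balance"]}
```

Because the mapping is deterministic, a firm can still aggregate or link records for analysis while keeping re-identification contingent on access to the separately held key.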
To enforce the data security provisions at the firm level, the GDPR requires new governance measures that increase accountability. First, many companies subject to the GDPR, including banks, will need to appoint a data protection officer (DPO) who is to be involved in all issues related to the protection of personal data (Article 38).3 Additionally, under Article 38 the DPO must be independent ("not receive any instructions"), receive the active support of senior management, have adequate resources, and must not be dismissed or penalized for carrying out his or her duties. Equally significant, companies will be required to disclose data breaches quickly. If the breach may cause damage to the rights or freedoms of individuals, the organization will have to report it to the supervisory authority within 72 hours of discovery of the breach (unless there is justifiable cause for delay). The firm will also have to notify the individual(s) affected without undue delay if the breach is considered "high-risk" (Article 34). Firms that comply with the GDPR will qualify for a seal or mark that would signal the quality of their data protection standards to the market (Article 42).
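The 72-hour clock runs from discovery of the breach, so the notification deadline is simple to compute; a hypothetical discovery time makes the contrast with the Equifax timeline (roughly five weeks from detection to disclosure) concrete:

```python
from datetime import datetime, timedelta

# Hypothetical discovery time for a breach; under the GDPR the
# supervisory authority must normally be notified within 72 hours.
discovered = datetime(2018, 5, 28, 9, 30)
notify_by = discovered + timedelta(hours=72)
print(notify_by)  # 2018-05-31 09:30:00
```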
In the event that internal controls and enforcement are not sufficient, the GDPR may impose significant external sanctions/disincentives. Articles 82 and 83 provide for both governmental fines and compensation to individuals who suffered damage. The maximum fines may be "only" the greater of €10,000,000 or 2 percent of global annual revenue for infringement of some requirements, but as high as the greater of €20,000,000 or 4 percent of global annual revenue for some other types of infringement.
PSD2's focus is on creating a single market for financial services and enhancing competition by setting rules requiring payment service providers (such as banks) to share data with third parties that provide payment initiation and account information services. However, PSD2 starts from the same premise as the GDPR insofar as individuals should be able to exercise control over their data, including being confident appropriate security measures are being taken. Thus, PSD2 provides various standards for determining whether an individual has authorized data transfer and for secure channels to share data via application programming interfaces.4 The GDPR applies to the handling of data that may be shared under PSD2, while PSD2 assigns liability to the payment service provider (primarily, banks) for failing to properly authenticate that the consumer has granted account access.
The United States has a variety of state and federal data security and breach disclosure requirements that appear applicable to the Equifax breach. Which requirements are relevant, and whether Equifax complied with those requirements, will likely be the subject of future lawsuits filed by various governmental agencies and private individuals.5
An analysis of the merits of the United States adopting the GDPR's and PSD2's provisions is far beyond the scope of this short post. We would, however, like to suggest several considerations related to data security.
First, the full potential of internet-based commerce between individuals who do not know one another cannot be realized if the participants are unable to verify the identity of the counterparty and the likelihood that the counterparty will honor the transaction.
Second, the frequency of intrusion and breach announcements raises the question of whether current systems are more vulnerable than necessary.6 The GDPR standard of "security by design" seems a better approach to security than adding security features after a system has been designed. Fully implementing this approach will take time, however, and is more likely to succeed if it extends beyond more effective security at financial institutions alone.
Third, even with the best systems, individuals and organizations still need to act responsibly. Here, economics would suggest that wherever possible, incentives should be aligned so that whoever is in the best position to provide enhanced security also bears most of the costs of lax security.
Larry D. Wall is executive director of the Center for Financial Innovation and Stability at the Atlanta Fed, and Steven Zitzer is professor of global financial regulation and fintech at Colegio Universitario de Estudios Financieros (CUNEF) in Madrid, Spain, as well as a private investor. The authors thank Scott Frame for helpful comments. The views expressed here are the authors' and not necessarily those of the Federal Reserve Bank of Atlanta or the Federal Reserve System. If you wish to comment on this post, please email firstname.lastname@example.org.
4 However, the technical standards for the application programming interfaces will not be effective until the third quarter of 2018 at the earliest.