Notes from the Vault
Larry D. Wall
June 2019

The Atlanta Fed's 2019 Financial Markets Conference, "Mapping the Financial Frontier: What Does the Next Decade Hold?," featured policy panels, academic paper presentations, and three keynote speakers exploring developments that are having a significant impact on the financial system. One panel discussed concerns about data privacy and availability. A second panel discussed the potential for blockchain technology to reshape wholesale financial markets. A third panel reviewed some of the major changes that have occurred in the markets for residential mortgage-backed securities.

This post reviews these panels and two related academic paper presentations. A companion macroblog post reviews a panel on how the Fed plans to implement monetary policy. It also discusses a keynote speech by Federal Reserve Board Chairman Jerome Powell on the risks posed by corporate debt and a keynote speech by Simon Johnson, the Kurtz Professor of Entrepreneurship at the Massachusetts Institute of Technology, on revitalizing the U.S. economy. More information on all of the sessions is available at the conference agenda page, which has links to the various sessions' audio, transcripts, and slide decks.1

How do we navigate between under- and overregulation of data?
An increasingly important issue is that of control over data, including an individual's privacy rights. The policy panel on data began with Douglas J. Elliott, a partner at Oliver Wyman, setting the scene for the discussion that followed. Elliott's presentation, which drew heavily from his recent paper, made three main points. First, control over data will "matter greatly for financial policymakers in the years ahead." Second, the policy issues in this area are "really complex." Indeed, Elliott counted at least seven different ways of framing the problem, making the design of policy a "seven-dimensional chess" game. Third, public and private members of the financial sector need to come together to agree on principles. Financial data differ in a variety of ways from other sorts of data, so it is important that the public policies being adopted work for the financial system.

Elliott then turned the podium over to Alessandro Acquisti, a professor at Carnegie Mellon University, to present his paper that reviews the theoretical and empirical economic literature on privacy and data. Acquisti's review highlighted both how complicated the topic of data privacy is and how little we know about the economic issues. He noted that at the micro (individual) level, there are valid arguments for privacy protection, but there are also benefits from sharing information. At the macro (societal) level, the picture is even more mixed. For example, privacy rules may limit the ability to obtain social gains, such as improved medical outcomes, but in other circumstances, the lack of privacy regulation can encourage the inefficient collection of data. Further, existing studies abstract from the real world in ways that may have important implications for regulatory policy. For example, the studies typically assume consumers are fully informed and economically rational, but "the world of data is afflicted by pervasive information asymmetries." Moreover, most studies analyze the effect of different rules on the attainment of only one policy goal, whereas policymakers typically consider a mix of goals.

A presentation by Tara Sinclair, a professor at George Washington University and a senior fellow at Indeed, followed Acquisti's. Sinclair discussed the potential for the enormous amounts of data collected by private firms to provide important new insights into "economic conditions, social trends, and that sort of thing." One of the benefits of using data collected by private firms is that these data generally reflect individuals' actual behavior rather than survey responses that measure what people think they might do. However, the data collected typically also have several limitations. Private firms collect data based on their own needs, which are not necessarily the data most informative about public concerns. The data are "rarely representative" of the entire population or of any subgroup other than those who voluntarily interact with the firm.

The last panelist to give a presentation in the data policy session was James Stoker, the director of analytics innovation at SunTrust Banks, who is responsible for areas such as fraud and anti-money laundering. Stoker stated that he appreciated the damage that can be done by a lack of privacy and that, in his experience, the trade-off between data privacy and forecast accuracy is not that large. He also explained the importance of separating the task of constructing a data set from that of analyzing the data. Such separation greatly reduces the risk of inadvertently using data items for inappropriate purposes.

Prior to the policy panel, Christopher Tonetti, a professor at Stanford University, presented a paper titled "Nonrivalry and the Economics of Data." As the title suggests, Tonetti's paper emphasizes that the use of data by one agent does not preclude its usefulness to another agent. This is in contrast to other factors of production, such as machines and labor, whose use by one firm necessarily precludes their use by a different firm at the same time. Building on this property, Tonetti analyzes a model in which data are a valuable input into firms' production processes. This gives firms an incentive to collect and hoard data to gain a competitive advantage. From a consumer and social perspective, however, this hoarding may be suboptimal: although an individual firm's profits may decline, society would be better off if multiple firms could use the same data to improve their production processes. In this model, giving consumers control over their data leads to greater sharing and improved social outcomes.
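A stylized way to see the role of nonrivalry (a simplified illustration only, not the paper's actual model or notation) is to let each firm \(i\) produce with its own labor \(L_i\) but with a shared stock of data \(D\):

\[
Y_i = A\,D^{\alpha}L_i^{1-\alpha}, \qquad \sum_i L_i \le L,
\]

where labor must be divided across firms but the same \(D\) can enter every firm's production function at once. Total output therefore rises when more firms are permitted to use the same data, which is the source of the social gains from data sharing that the model highlights.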

Discussant John Abowd, chief scientist and associate director for research and methodology at the U.S. Census Bureau, noted that in Tonetti's model, privacy is a binary issue: once the consumer has surrendered his or her information to someone else, the privacy loss is complete. However, Abowd argued that a lot of work has gone into "privacy preserving ways of releasing data" that allow data users to gain valuable insights about social issues without obtaining information about individual consumers. Abowd observed that there are ways of adding noise to the data that preserve some level of privacy. However, this noise may reduce the predictive accuracy of models. The trade-off then becomes one between the degree of privacy afforded individuals and the loss in predictive accuracy. Abowd also noted that another mechanism to facilitate both privacy and data availability is to have a trusted third party, such as the U.S. Census Bureau, collect data and control its usage.
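As a rough illustration of the noise-adding approach Abowd described, the sketch below uses the Laplace mechanism that is standard in the differential privacy literature. It is a minimal, hypothetical example rather than the Census Bureau's actual methodology, and the function name and numbers are invented for illustration.

```python
import numpy as np

def laplace_count(true_count, epsilon, rng=None):
    """Release a count with Laplace noise scaled to a privacy budget epsilon.

    A smaller epsilon means stronger privacy protection but a noisier (less
    accurate) release, which is the trade-off Abowd described.
    """
    rng = rng or np.random.default_rng()
    sensitivity = 1.0  # adding or removing one person changes a count by at most 1
    return true_count + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

# The same statistic released under two privacy budgets:
print(laplace_count(12_500, epsilon=0.1))  # stronger privacy, more noise
print(laplace_count(12_500, epsilon=2.0))  # weaker privacy, less noise
```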

Will blockchains lead us to a trustless world?
Along with concerns about the privacy and availability of data, an ongoing concern is the accuracy of whatever data are available. Concerns about accuracy come in a variety of forms, including the possibility that someone may tamper with the data for fraudulent purposes. New York University professor David Yermack opened the policy panel on blockchain technology with a paper explaining how blockchains make such fraud "a profoundly more difficult problem." As a result, a large number of major financial institutions have been exploring blockchain technology for a variety of potential applications.
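A minimal sketch of why tampering is so much harder on a blockchain (a toy example, not any production system; the names and transactions are made up): each block commits to a hash of the block before it, so rewriting an earlier record breaks every later link.

```python
import hashlib
import json

def block_hash(block):
    """Hash a block's contents, including the previous block's hash."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain, transactions):
    """Append a block that commits to the hash of the block before it."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "transactions": transactions})
    return chain

def verify(chain):
    """Tampering with any earlier block breaks every later prev_hash link."""
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

chain = []
append_block(chain, ["Alice pays Bob 10"])
append_block(chain, ["Bob pays Carol 4"])
print(verify(chain))                                 # True
chain[0]["transactions"] = ["Alice pays Bob 1000"]   # attempt to rewrite history
print(verify(chain))                                 # False: the tampering is detectable
```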

Yermack pointed out that the various technological developments underlying blockchains were first put together in the white paper proposing bitcoin. He observed, however, that the so-called permissionless version of the underlying blockchain technology distributes the entire history of transactions (the ledger) to everyone who wants a copy. That openness is not a good database property for many wholesale financial applications. As a result, Yermack noted that virtually all of the applications of blockchain technology to wholesale financial markets use so-called permissioned blockchains, in which access to the ledger can be restricted.

Among the many financial applications that have been considered, Yermack focused on two popular use cases for blockchains: improving the speed and accuracy of transactions between banks, and using them in shipping and logistics. Yermack noted that the estimated gains from using blockchains for interbank transaction settlement, such as the one recently announced by JPMorgan, are approximately $20 billion. He further observed that blockchains could speed settlement in shipping, thereby reducing the capital tied up in logistics by potentially as much as $5 trillion.

Atlanta Fed visiting scholar Warren Weber, the panel's moderator, introduced another potential application of blockchains. He observed that the problem of verifying a firm's financial accounts could be simplified if transactions between two firms were shared with a central third party (such as an accounting firm). If both firms agreed on the terms of the transaction, it would be recorded permanently on a blockchain.

The other two panelists agreed with Yermack's description of the capabilities of blockchain and distributed ledger technology, but they pointed to a number of issues involved in its development and implementation. Keith Pritchard, a director at JDX Consulting, discussed two questions that he said should be addressed before starting a blockchain project in wholesale financial markets. His first question was, "Is the capability that the blockchain brings—is that solving a problem that we have?" Although the answer to this question depends upon the specific application, for a large number of financial applications the answer was essentially, "Yes, it is solving a real problem in allowing all of the parties to see the same data."

Pritchard's second question was, "Is there an existing technology that also brings that capability…?" If the answer is yes, he argued, we should consider distributed ledgers and blockchains to "just be part of our tool kit" along with other technologies. Answering his own question, Pritchard said that for many use cases there are other ways to distribute data that are more efficient than blockchain technology.

Martin Walker, the director of banking and finance for the Center for Evidence-Based Management, listed in his presentation several "fundamental challenges" that arise in the development of any information technology system. He asked whether blockchain technology helps overcome these problems. Walker's first challenge concerned the accuracy of the data. Although blockchain technology addresses the problem of tampering with data after they enter the system, the use of a blockchain per se does nothing to ensure that the data entering the system are correct.

Another challenge Walker raised is that of getting the relevant parties to agree on standards for how to model and structure data. Such an agreement is needed so that computers know what to do with the data and people understand them. Agreement on standards can become rather difficult when different organizations, or even different parts of the same organization, have to reach consensus. Walker concluded by observing, "People are building plausible and interesting systems…using blockchain technology." However, he added, blockchains are not the magic ingredient.

Along with the policy panel focused on permissioned blockchains in wholesale financial markets, the conference also included a presentation of an academic paper titled "The Economic Limits of Bitcoin and the Blockchain," by University of Chicago professor Eric Budish. The paper focuses on the vulnerability of bitcoin's mechanism for reaching consensus, called proof-of-work. Private, permissioned blockchains limit the set of parties who can write to the blockchain to a select few that trust one another. In contrast, the permissionless blockchain underlying bitcoin allows anyone to participate. This leaves bitcoin potentially open to the double-spending problem, in which someone tries to rewrite history so that he or she can respend a bitcoin previously paid to someone else. Bitcoin's proof-of-work has so far successfully prevented such double-spending. However, Budish's analysis suggests this is largely because transaction sizes on bitcoin have been too small, relative to the cost of conducting such an attack, to make an attack worthwhile. If sufficiently large transactions ever arise on bitcoin, attackers are likely to be motivated to attack bitcoin successfully. As a result, Budish is skeptical about the ability of this sort of blockchain to become a major component of the global financial system.

The discussion by David Andolfatto, an economist at the Federal Reserve Bank of St. Louis, highlighted the key insight of Budish's paper. The "miners" who write records to the blockchain must not only be compensated for their services, but that compensation must be sufficiently large that they do not have an incentive to attempt double-spending. The incentive to double-spend is directly proportional to the maximum feasible transaction size; thus, as transaction sizes increase, the fees earned by the miners must also increase.
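A stylized version of this argument (a back-of-the-envelope sketch, not the paper's exact notation or conditions) runs as follows. Free entry by miners drives the per-block reward down to the cost of the computing power devoted to mining, and an attacker must bear roughly that cost for as long as the attack lasts:

\[
\underbrace{p_{\text{block}}}_{\text{per-block reward}} \;\approx\; N^{*}c
\qquad\text{and}\qquad
t\,N^{*}c \;\gtrsim\; V_{\text{attack}}
\;\;\Longrightarrow\;\;
p_{\text{block}} \;\gtrsim\; \frac{V_{\text{attack}}}{t},
\]

where \(c\) is the per-block cost of a unit of computing power, \(N^{*}\) is the equilibrium amount of power devoted to mining, \(t\) is the number of blocks over which an attack must be sustained, and \(V_{\text{attack}}\) is the value the attacker could capture by double-spending. Because the recurring payment to miners must scale with the largest transaction worth attacking, larger transactions require larger block rewards or fees.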

How has the housing finance system evolved since the crisis and where is it headed?
Both a keynote conversation and a policy panel addressed postcrisis developments in housing finance. In the keynote conversation, Atlanta Fed president Raphael Bostic interviewed Freddie Mac chief executive officer Donald Layton. Layton made the point that prior to the crisis, the housing-related government-sponsored enterprises (GSEs) "had lots of problems that needed fixing." However, he argued that "almost every one of them [the big weaknesses] has been addressed … in a reasonable way."2 As one example of a problem that had been addressed, Layton summarized some of the progress that has been made on reducing the GSEs' credit risk via credit risk transfer (CRT). In particular, for about the last year, Freddie Mac has put "almost all the catastrophic risk out to investors."

The policy panel on the housing finance system included four presentations, with one providing more detail on the GSEs' use of credit risk transfer. Moody's Analytics chief economist Mark Zandi's presentation included a chart showing the distribution of credit risk since 2001. He found that since the adoption of CRT in 2013, such transfers have grown to account for almost 20 percent of the risk in residential mortgage originations. Although Zandi called CRTs a "slam-dunk success," he also raised two possible points of concern. First, they have only been issued in a "pristine credit environment," leaving open the question of how they will perform in a "risk-off" environment. Second, CRTs may be more costly than having the GSEs issue more capital.

The panel's moderator, Atlanta Fed economist W. Scott Frame, also discussed a bigger-picture issue related to the funding of mortgage-backed securities (MBS): the changing proportions of securities held by different investors. The two GSEs, Freddie Mac and Fannie Mae, held about one-third of MBS in 2002. That proportion declined to 15 percent in 2007 and has continued to decline postcrisis because of limitations imposed on the GSEs as part of their bailout and conservatorship. Shortly after the crisis, the Federal Reserve started implementing its large-scale asset purchase program (also known as quantitative easing, or QE), which resulted in the Fed holding 31 percent of all agency MBS. However, the proportion of MBS held by the Fed has fallen in recent years. Frame ended his presentation with some discussion and questions about who will replace the share of MBS previously held by the GSEs and the Fed.

The other two panelists addressed specific issues that could have a significant impact on the housing finance system. Andrew Davidson, president of Andrew Davidson & Co., discussed the rules related to the so-called qualified mortgage. A regulation issued by the Consumer Financial Protection Bureau requires mortgage lenders to verify borrowers' ability to repay their loans. The regulation allows lenders to avoid detailed analysis if a loan meets either of two conditions: (a) the borrower's debt-to-income ratio does not exceed 43 percent or (b) the loan is approved by one of the GSEs. This second condition is referred to as the "QM patch" and is set to expire in 2021. With that background, Davidson analyzed the predictive power of the 43 percent debt-to-income ratio in determining which mortgage loans default. He found that the 43 percent ratio is not by itself a good measure. Indeed, loans above the 43 percent ratio but otherwise conservatively underwritten had lower default rates than loans below that ratio. Davidson's results suggest the need for a more comprehensive measure of mortgage loan risk.
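For illustration only, the two conditions described above reduce to a simple decision rule. The sketch below is a hypothetical simplification (real ability-to-repay determinations involve many more factors, and the function name and inputs are invented), but it shows why a single threshold carries so much weight in the current rule.

```python
def qm_safe_harbor(dti_ratio: float, gse_approved: bool) -> bool:
    """Simplified sketch of the two qualified-mortgage conditions described above.

    Condition (b), GSE approval, is the "QM patch" that is set to expire in 2021.
    """
    return dti_ratio <= 0.43 or gse_approved

print(qm_safe_harbor(dti_ratio=0.45, gse_approved=True))   # True, via the QM patch
print(qm_safe_harbor(dti_ratio=0.45, gse_approved=False))  # False: detailed analysis required
```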

The final panelist, Michael Bright, chief executive officer of the Structured Finance Industry Group Inc., discussed a specific weakness in the housing finance system that could have major consequences. Bright focused on the process by which mortgage payments by homeowners are aggregated and passed on to MBS investors. This process is important because of the large sums going through it each month, roughly $120 billion. It is also important because the U.S. government stands behind the MBS guaranteed by Ginnie Mae, raising the possibility that a late payment by Ginnie Mae would constitute a default by the U.S. government. Bright noted a number of weaknesses in this chain. One is that individuals' payments are first sent to mortgage servicers, which are not regulated for safety and soundness and in many cases have limited ability to deal with loan defaults. A second is the risk that operational problems could impede payments; Bright observed that such problems occurred twice during his tenure as head of Ginnie Mae. He called for increased regulatory attention to the way payments pass through the financial system to reduce the risk of delayed payments to MBS investors.

Conclusion
The financial system is constantly evolving in response to changes in technology and government policy. The Atlanta Fed's 2019 Financial Markets Conference helped to map the emerging new frontiers in privacy, blockchain technology, and mortgage finance.

Larry D. Wall is executive director of the Center for Financial Innovation and Stability at the Atlanta Fed. The author thanks Mark Jensen for helpful comments. The views expressed here are the author's and not necessarily those of the Federal Reserve Bank of Atlanta or the Federal Reserve System. If you wish to comment on this post, please email atl.nftv.mailbox@atl.frb.org.

_______________________________________

1 The agenda page also has links to post-panel interviews with some of the session participants and conference attendees.

2 Layton's comments about GSE developments are referring to changes made at the GSE he heads, Freddie Mac, and at Fannie Mae.