Policy Hub: Macroblog provides concise commentary and analysis on economic topics including monetary policy, macroeconomic developments, inflation, labor economics, and financial issues for a broad audience.
September 23, 2022
How Has the Market Responded to Restoring Price Stability?
The Federal Open Market Committee (FOMC) implements monetary policy chiefly through changes in its federal funds rate target, and market participants form expectations about the evolution of future monetary policy decisions based on data they think are relevant to policymakers. Between the June and July FOMC meetings, incoming data started to suggest some parts of the economy might already be feeling the effects of tighter monetary policy. On the other hand, data since July tell a different story, one that suggests the FOMC still has a way to go in its efforts to fight inflation and restore price stability. So how have market participants interpreted these disparate pieces of data, and what could they mean for future monetary policy decisions?
In this post, I use the Atlanta Fed's Market Probability Tracker to understand how data since the June and July FOMC meetings have affected the market's expectations about the path of future monetary policy, similar to the analysis I did with Atlanta Fed economist Mark Jensen in a Macroblog post late last year. As another Atlanta Fed colleague, Mark Fisher, and I discussed in a Notes from the Vault article, the Market Probability Tracker generates estimates of the expected federal funds rate path from eurodollar futures and options on eurodollar futures. These contracts settle to three-month LIBOR (the London interbank offered rate), an interest rate closely linked to the federal funds rate. Additionally, eurodollar futures and options are among the most liquid of financial instruments, with contracts expiring several years in the future regularly traded. As a result, the Market Probability Tracker generates our best estimates of expected rate paths by incorporating all the available data that the market believes will affect future policy decisions.
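The Tracker's full model fits a probability distribution to options prices, which is beyond the scope of this post, but the basic building block is simple: a eurodollar futures contract settles to 100 minus three-month LIBOR, so a quoted price directly implies the rate the market expects at that expiration. A minimal sketch, using invented prices (not actual quotes or the Tracker's methodology):

```python
# Toy sketch of the futures-to-rate link: a eurodollar futures contract
# settles to 100 minus three-month LIBOR, so each price implies the rate
# the market expects at that expiration. Prices below are hypothetical.

def implied_rate(futures_price: float) -> float:
    """Annualized rate (in percent) implied by a eurodollar futures price."""
    return 100.0 - futures_price

# Hypothetical prices for successive quarterly expirations.
prices = [95.50, 95.10, 94.80, 94.85, 95.00]
expected_path = [implied_rate(p) for p in prices]
print(expected_path)  # a path that rises, peaks, then gently declines
```

Chaining these implied rates across expirations is what traces out an expected rate path like the lines in the figures below; the options on the same contracts are what let the Tracker attach probabilities to paths above and below it.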
Figure 1 below uses the expected rate paths produced by the Market Probability Tracker to illustrate how market expectations between the June and July FOMC meetings responded to new information about the economy. The solid black line in figure 1 shows the federal funds rate path expected by market participants after the press conference following the FOMC's June 15 meeting. The black dotted and dashed lines represent, respectively, expectations after Fed chair Jerome Powell's Senate Banking Committee testimony on June 22 and after the release of the US Department of Labor's weekly report of unemployment insurance initial claims on June 30. Although economic updates occurred throughout the intermeeting period (the New York Fed maintains a list of important releases going back to 2018 in its Economic Indicators Calendar), the changes in expectations on these two dates best summarize the overall change in the market's expectations. Lastly, the solid orange line represents expectations following the FOMC's press conference on July 27. I also include (shaded in gray) the target range of 225 to 250 basis points announced at that meeting.
Starting with the solid black line, after the June FOMC meeting, market participants expected the federal funds rate target range to reach 375 to 400 basis points by the first quarter of 2023 and then fall 75 basis points over the course of the next two years. During his June 22 Senate Banking Committee testimony, Chair Powell said that an economic downturn triggered by rate hikes to tame inflation was "certainly a possibility," although he added that such a downturn was neither the Fed's intent nor, in his view, necessary. His statement was important, albeit qualitative, information for the market because the FOMC in the past has often responded to economic downturns by lowering interest rates, which market participants interpreted as implying lower overall federal funds rates in the future (represented by the dotted black line). Expectations fell even lower after the Department of Labor's June 30 report hinted at a moderating labor market, with initial claims near five-month highs (represented by the dashed black line). By the July FOMC meeting (represented by the orange line), market participants expected rate hikes to end this year, with rates then falling 100 basis points during the next two years, as subsequent initial unemployment claims reports remained elevated and the Philadelphia Fed's July 21 manufacturing survey showed a drop in activity.
Figure 2 shows the Market Probability Tracker's estimates of the federal funds rate path expected by market participants following the July FOMC meeting (the solid orange line). It also shows the expected rate path following the August 5 release of the July employment situation report from the US Bureau of Labor Statistics (represented by the dotted black line), the expected rate path following the September 8 release of the initial unemployment claims report (represented by the dashed black line), and the expected rate path following the September 13 release of the August consumer price index (represented by the dot-dash line). Much like figure 1, these dates best summarize how market participants reacted to the evolving data since the July FOMC meeting (despite many other data releases occurring throughout the intermeeting period). Lastly, the solid black line represents the expected rate path on September 20, the day before the press conference following the FOMC meeting.
In the eight weeks since the July FOMC meeting, the data that the Committee said it would use to evaluate future policy moves came in much stronger than expected. Rather than moderating, labor markets appeared to tighten, with the Bureau of Labor Statistics reporting a 528,000 increase in nonfarm payrolls in its July jobs report and the Department of Labor reporting a decline in initial unemployment claims throughout August that culminated, in the September 8 report, in their lowest reported levels since May. The consumer price index for August, which many had expected to fall on a month-over-month basis, showed inflation increasing 0.1 percent from July. The less volatile core measure of inflation, which excludes energy and food prices, also rose 0.6 percent from July, twice the expected rate. Given the data's direction leading up to the September FOMC meeting, market participants expected a more aggressive pace of rate hikes through the end of the year, from 325 basis points after the July FOMC meeting to nearly 450 basis points. They also expected rates to remain much higher for much longer, with rates at the end of 2025 near 350 basis points, at least 100 basis points higher than the 225 to 250 basis points that the chair described as the "neutral" policy rate at his July press conference.
Figure 3 shows the Market Probability Tracker's estimates of the federal funds rate paths that market participants expected the day before the press conference following September's meeting (the solid black line), and after the press conference on September 21 (the solid orange line). Chair Powell, in his opening remarks, commented that tight labor markets "continue to be out of balance" with demand and that "price pressures remain evident across a broad range of goods and services." The information contained in both the press conference and the material released by the Committee did not significantly change market expectations about the future path of monetary policy, which already incorporated recent data on inflation and labor market conditions.
Turning back to the question posed by this post's title, the rate path movements seen in reaction to the incoming data show that, initially, market participants expected rate hikes to end in 2022. But the data, which came in much stronger after the July FOMC meeting, led the market to expect the Fed to raise rates higher than had been expected following the June FOMC meeting. Market participants also expect the Fed to keep those rates much higher for longer in order to cool demand—as Chair Powell put it in his August 26 speech at the Jackson Hole economic policy symposium, "until we are confident the job is done."
More importantly, the movements in the rate paths highlight the insights we can gain from the Market Probability Tracker into how information about the economy affects the market's expectations of future monetary policy decisions. Chair Powell observed during the June press conference that "monetary policy is more effective when market participants understand how policy will evolve." With the rate paths produced by the Market Probability Tracker each day, we can begin to make that assessment.
July 6, 2022
Workshop on Monetary and Financial History: Day Two
In yesterday's post, I discussed the first day of the Atlanta Fed's two-day virtual workshop on monetary and financial history. In today's post, I'll discuss the workshop's second day, which began with presentations by Maylis Avaro (University of Pennsylvania) and Caroline Fohlin (Emory University).
Avaro's paper, coauthored with Vincent Bignon (Banque de France), examined the historical (19th-century) credit policies of the Banque de France using a comprehensive dataset of credits that the Banque extended during one year (1898). In its credit operations, the Banque attempted to offset regional economic shocks by providing directed credit to the affected regions. It provided credit only against collateral, and it intensely monitored counterparties, especially for the riskiness of their business models. Econometric analyses show that the Banque tended to extend credit at branches experiencing regional shocks, but usually only to parties judged to be sufficiently prudent. This analysis also shows that the Banque favored banks over nonfinancial firms and existing over new counterparties. Avaro concluded by arguing that this historical example illustrates the potential benefits of central bank credit operations when appropriate risk management can limit moral hazard.
The discussant for this paper was Angela Redish (University of British Columbia). Redish argued that the credit operations documented in the paper were somewhat different from what might be expected in other lender-of-last-resort situations, in which market disruptions might hinder assessments of risk and impair the value of collateral. Redish also questioned why private banks did not lend against the same sorts of collateral as the Banque, suggesting that some of the Banque's lending success might have been the result of its market power as the monopoly issuer of banknotes within France.
Fohlin's presentation described a research program, undertaken with Stephanie Collet (Deutsche Bundesbank), to construct a comprehensive dataset of interwar German stock prices. Fohlin's presentation focused on data from the 1920s. These data span a number of major disruptions to the German economy, including the 1921–23 hyperinflation, the 1927 stock market crash, and the rise of the Nazi party. Volatility of individual stock prices and bid-ask spreads were high during this unsettled period. Despite this volatility, micro analysis of the data shows that the German stock market was surprisingly liquid for most of the sample, with buy orders typically exceeding sell orders. Another surprise was that shares of new companies were as liquid as those of existing companies. Market illiquidity increased during the late 1920s, however, in the wake of the 1927 stock market bubble and ensuing market crash.
The discussant was Eugene White (Rutgers University), who suggested that the data collected by the authors could be applied to a number of interesting research topics. For example, the data could provide additional perspective on the performance of the interwar stock market if its performance were contrasted with the pre-1913 market. A greater understanding of the institutional background of the market—for example, regulatory structure and stock voting rights—would also be useful. White also questioned how much the release from wartime capital controls in 1919 was behind the apparent vitality of the 1920s market. Finally, White suggested that the authors investigate the impact of Reichsbank regulatory policy, margin requirements in particular, on market liquidity.
Looking back, and ahead
The fourth session of the workshop consisted of a panel discussion of the past and future of money. The panelists were François Velde (Federal Reserve Bank of Chicago), Gary Gorton (Yale University), and Marc Flandreau (University of Pennsylvania).
Velde's presentation considered possible roles for central bank digital currencies (CBDC) in the context of historical examples of monetary innovation. Velde observed that central banks arose from earlier, coin-based monetary systems to fill gaps in those systems, first through giro transfers (a type of payment transfer between banks) and later through circulating currency. Originally most central banks dealt only with large-value payments, and even today most central banks are not retail-oriented. That could change with CBDCs, Velde noted, especially if future CBDCs incorporate innovative features such as smart contracts, although property rights over the information collected on CBDC users will be a contentious policy issue. Another unresolved issue regarding CBDCs is their role with respect to private digital currencies. Velde argued that competition between public and private moneys could be beneficial. Velde concluded by noting that monetary innovations are often the product of accidents or strong underlying trends, rather than conscious policy choices.
Gorton's presentation focused on new forms of private money and associated policy issues. Gorton argued that all private money inherently has the problem of information asymmetry and that the classic solution to this problem is to create money with a par value so that its value does not have to constantly be reassessed in market transactions. Typically, par money is debt that is backed by other debt. This solution creates another problem, Gorton argued, which is that debt-backed money can be subject to runs and sudden loss of value when confidence is lost in the money's backing. Stablecoins—digital tokens that have safe asset backing—are susceptible to the same problems as paper-based forms of private money, as recent runs on stablecoins show. Gorton concluded by drawing on the history of paper currency to suggest that the only viable long-term solution to the run problem will be central bank monopoly of digital token issue.
Flandreau's presentation considered the impact of monetary innovations on international currency competition. Flandreau rejected the "unipolar" and "multipolar" interpretations of monetary history literature (basically, a tendency to converge toward one or more dominant currencies), instead arguing that the true nature of monetary evolution has been one of currency competition regimes. As an example, he cited a currency competition regime that centered around the bill of exchange, an important international payment instrument from the 14th through the early 20th centuries. Major European currencies competed for international status within this regime by developing dense markets for bills of exchange. However, latecomers to this currency competition (Germany, Japan, and the United States) increased their competitiveness through new infrastructure, such as international branch banking, and new payment instruments, such as telegraphic transfers. These innovations supported the rise of the US dollar as an international currency when bills of exchange fell from use during the 1930s. Flandreau saw this history as illustrating the idea that currency regimes depend on their underlying financial infrastructure and monetary instruments.
The conference's fifth session featured paper presentations by Sasha Indarte (University of Pennsylvania) and Marc Weidenmier (Chapman University). Indarte's paper analyzed a dataset of sovereign bond defaults from 1869 to 1914, which was matched to a dataset of sovereign bond prices during the same period. Indarte described how a critical aspect of sovereign bond issue during this period was the reputation of the party underwriting the bond in the London financial markets. Econometric analyses show the presence of underwriter-related spillovers. More specifically, default of one sovereign bond issue typically depressed the prices of bonds with the same underwriter, after taking into account other observable factors. The reputation spillover effect is economically significant and evident for at least two years following a default. Indarte concluded by observing that this same pattern of underwriter spillover effects might be present in modern contexts such as syndicated lending.
Jonathan Rose (Federal Reserve Bank of Chicago) provided the discussion. Rose noted that the effect of underwriter reputation, while well documented in the paper, was perhaps more critical in historical than in modern contexts (citing mortgage-backed securities as a modern example), due to the sovereign bonds' lack of regulation or credit enhancement features. Rose also raised the possibility of self-selection in the data sample, with weaker bond issuers seeking out underwriters who were more willing to risk their reputations.
Weidenmier presented new data series of US industrial production in the decades surrounding the Civil War (1840–1900), taken from a recent paper coauthored with Joseph Davis (Vanguard Group). The data series use hand-collected, city-level data and are separated by Northern and Southern states. The data show that growth in Northern-state industrial production was little affected by the war. Southern-state industrial production, on the other hand, fell precipitously during the war and did not return to prewar levels until about 1875. Capital-intensive industrial production in Southern states was especially slow to recover. However, rapid growth resumed in the 1880s, perhaps because of the resolution of uncertainty regarding investor property rights following the end of Reconstruction.
The discussant for this paper was Mark Carlson (Board of Governors), who noted that some of the regional differences documented in the paper might have been the result of the population's westward expansion, which was more pronounced in the North. He also suggested that the quality of some of the immediate postwar data in the South might have been poor, possibly biasing statistical results. Those concerns aside, a striking feature of the data is that the North did not see a postwar contraction, as occurred in the United States after World War II. Finally, Carlson proposed that the observed regional differentials might be attributable to the disruption of the banking system that the South experienced during the Civil War and its immediate aftermath.
The conference's final panel was a discussion of the evolution of the Fed's mission and governance. The panelists were Sarah Binder (George Washington University), Lev Menand (Columbia University), and Ned Prescott (Federal Reserve Bank of Cleveland).
Binder began the panel with an analysis of the Fed's political independence drawn from her recent book with Mark Spindel. Binder proposed that the conventional view of the Fed as an agency insulated from short-term political pressures is incorrect, arguing that this view is overly rooted in struggles to control inflation during the 1970s and 1980s. Moreover, it overlooks both the historical importance of financial crises in shaping the Fed and the partisan context in which the Fed operates. Under this view, the Fed and Congress display interdependence: the Fed gains political support, and Congress gains the Fed's ability to react quickly in crises, as well as its capacity to absorb blame for unfavorable economic outcomes. Binder went on to argue that this interdependence has driven the structural evolution of the Fed, in the form of cycles whereby successive crises result in Congressional reforms of the Fed. Sensitivity of Fed policymakers to this cycle of reform has influenced many Fed policy decisions, Binder noted, the Fed's structural independence notwithstanding.
Menand's presentation focused on the role of the Fed within the broader legal framework of the US monetary system. Although some legal scholars see the Fed as a "hodgepodge agency" with many unrelated functions, Menand argued that the Fed's role is a coherent one within the US monetary tradition, which "outsources" money creation to independent entities, including chartered commercial banks but, since 1913, also the Fed. The structure of the Fed also resonates with the US tradition of geographic diffusion of banking, as well as the tradition of bank supervision. In addition, independence of the Fed from the executive branch coheres with the general US tradition of outsourcing the task of money creation. However, Menand proposed that more recently, the Fed has ventured beyond its traditional boundaries through its interactions with shadow banks, which are entities that issue deposit-like liabilities but lack bank charters. Menand concluded by arguing that Fed actions to support shadow banks during financial disruptions in 2008 and 2020 have eroded traditional political limits regarding what the Fed is expected to accomplish.
Prescott's presentation focused on the evolution of the research function at the Reserve Banks, drawing on a recent paper coauthored with Michael Bordo. Prescott described how research was not emphasized at the Reserve Banks until the 1951 Treasury-Fed Accord, which granted the Federal Open Market Committee more independence in setting monetary policy. Prescott noted that during the 1950s and 1960s, this independence led to an increased emphasis on research, in part because more economists had become Reserve Bank presidents. During this period, the St. Louis Fed assumed the role of a "dissenting Reserve Bank," articulating policy positions based on monetarist ideas. During the 1970s, research conducted at the Minneapolis Fed fostered new approaches to monetary policy that incorporated the concept of rational expectations. Later, ideas promoted by researchers at the Richmond Fed (policy transparency) and the Cleveland Fed (inflation targeting) also came to influence Fed policymaking. Prescott concluded with the observation that these historical examples illustrate the value of the Fed's decentralized structure, in which alternative approaches to policy can be formulated and incorporated into the policy process.
July 5, 2022
Workshop on Monetary and Financial History: Day One
On May 23 and 24, the Federal Reserve Bank of Atlanta hosted a workshop on monetary and financial history. The workshop was organized by Atlanta Fed economist William Roberds, in cooperation with Michael Bordo (Rutgers University) and Warren Weber (Federal Reserve Bank of Minneapolis, retired). The workshop featured seven paper presentations, along with three panel discussions and a keynote lecture.
Exploring fiscal-monetary interaction
The first session of day one of the workshop featured three papers that examine interactions between monetary and fiscal policy.
The first paper of the session was presented by Michael Bordo and Oliver Bush (Bank of England) and coauthored with Ryland Thomas (Bank of England). Their paper examines causes of the 1970s inflation in the United Kingdom, which was higher than in other advanced economies at the time. Bordo and Bush presented structural decompositions of the 1970s UK inflation. These decompositions suggest that a combination of fiscal responses to external shocks and passive monetary policy was the principal causal factor. The same decompositions suggest that fiscal reforms enacted during the 1980s and early 1990s enabled the Bank of England to reduce inflation to more acceptable levels.
In the discussion, Joshua Hausman (University of Michigan) proposed that the authors emphasize narrative aspects of this historical episode. What economic models were most conducive to policies that led to double-digit inflation? Given that other countries were using the same models, why did reliance on these models result in worse inflation outcomes in the UK than elsewhere? Hausman also noted that the lower inflation rates achieved from the 1980s onward did not lead to uniformly better economic outcomes, in the form of higher trend economic growth, lower unemployment, and higher growth of real wages.
The fiscal-monetary theme continued with the second paper of the session, which was presented by George Hall (Brandeis University) and coauthored with Thomas Sargent (New York University). Their paper compares the financing of US government expenditures associated with the COVID-19 pandemic to the financing of World Wars I and II, which gave rise to expenditures of comparable magnitude. Hall presented an accounting framework that decomposes wartime financing into three main components: taxes, debt, and money creation. This decomposition indicates that in contrast to expenditures during the two world wars, COVID-19 expenditures have been funded very little by taxes and largely by debt. Also different from the world wars is the fact that much of the COVID-19 debt has taken the form of monetary assets: interest-bearing Fed reserves and reverse repos. Hall suggested that the experience of the world wars indicates that inflation will eventually amortize much of the debt induced by COVID-19.
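The accounting framework rests on the identity that expenditures must be covered by some combination of taxes, debt issuance, and money creation. The sketch below illustrates that bookkeeping with invented numbers; they are for illustration only and are not the paper's estimates:

```python
# Illustrative bookkeeping for a Hall-Sargent-style decomposition:
#   expenditures = taxes + debt issuance + money creation.
# All figures are invented for illustration, NOT the paper's estimates.

def financing_shares(taxes: float, debt: float, money: float) -> dict:
    """Fraction of total financing supplied by each source."""
    total = taxes + debt + money
    return {
        "taxes": taxes / total,
        "debt": debt / total,
        "money": money / total,
    }

# A tax-heavy, world-war-style mix versus a debt-heavy, pandemic-style mix.
war_mix = financing_shares(taxes=40.0, debt=45.0, money=15.0)
covid_mix = financing_shares(taxes=5.0, debt=75.0, money=20.0)
print(war_mix, covid_mix)
```

Comparing the two hypothetical mixes makes the paper's qualitative point concrete: the pandemic-style mix leans far more heavily on debt (and on money-like liabilities) than on taxes.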
The Hall and Sargent paper was discussed by Chris Meissner (University of California, Davis). Meissner argued that the COVID-19 shock was different from the world wars insofar as it was largely unanticipated, simultaneously global, and associated with widespread financial market disruptions. For this reason, reliance on debt funding might have been a more appropriate policy than for either of the war episodes. However, Meissner also suggested that reliance on debt financing might have significantly reduced the United States' ability to respond to future external shocks.
The final paper of the first session was presented by Eric Leeper (University of Virginia) and coauthored with Margaret Jacobson (Board of Governors) and Bruce Preston (University of Melbourne). Their paper focuses on the performance of the US economy during the Great Depression and argues that the economy's initial recovery in the early years of the Roosevelt administration can be attributed to fiscal expansion combined with the repeal of the gold standard. Leeper argued that the latter policy enabled the Fed to finance much of Roosevelt's fiscal expansion via unbacked bonds, but that the recovery was then paused by more restrictive fiscal policies adopted after 1937.
Kris Mitchener (Santa Clara University) discussed this paper in the context of the large literature on the Great Depression, which has generally emphasized monetary rather than fiscal policy as a driving force in the initial Roosevelt recovery. Mitchener noted that for this reason, the paper's fiscal-monetary focus represents a new explanation of the post-1933 recovery. However, Mitchener also noted that during the Depression, most individual Treasury bond holdings were limited to higher-income households and that this heterogeneity would matter for the paper's arguments. In addition, banks held many bonds, and it would be desirable to model the effects of fiscal expansion on banks' balance sheets. Additionally, Mitchener recommended that the authors consider the effects of US policies on its international trade.
Putting inflation into historical perspective
The second session of the conference featured a panel discussion of the current inflationary outlook in the context of earlier inflationary episodes. The panelists were Robert Hetzel (Federal Reserve Bank of Richmond, retired), Jeremy Rudd (Board of Governors), and Mickey Levy (Berenberg Capital Markets).
Hetzel proposed that there are enough commonalities of the current situation with historical episodes—particularly the inflationary acceleration experienced in the 1960s and 1970s—for the Federal Open Market Committee (FOMC) to consider formally integrating monetary history into the policymaking process. He argued that this integration would lead to a more transparent statement of the FOMC's monetary standard.
Drawing on his experience at the Richmond Fed during the 1970s, Hetzel recalled the Federal Reserve's intense resistance at that time to explicitly articulating policy objectives. He argued that although there have since been improvements, the FOMC could better articulate a monetary standard through integration of historical perspectives into policymaking. More specifically, this proposal would include (1) establishment of a committee of monetary historians that would report directly to the FOMC, (2) a restructuring of the Teal Book (the briefing document prepared by the Board of Governors staff for FOMC meetings) to include a historical breakdown of how the economy got to its current state, and (3) replacement of the FOMC's current Summary of Economic Projections, which reports a collection of individual forecasts, with a consensus FOMC forecast that would be informed by consultations with the historian committee in part (1) of the proposal and the historical breakdown in part (2).
Rudd's presentation focused on potential changes in underlying inflation dynamics observed since the start of the COVID-19 pandemic. Rudd observed that prepandemic, Fed policymakers had been able to rely on stable long-term trend inflation in the US economy, as well as a flat Phillips curve (a negative correlation between unemployment and inflation), although the factors giving rise to these favorable conditions were not well understood. This lack of understanding has hindered Fed policymaking post-COVID, when inflation has increased in part due to large relative price shocks, creating uncertainty as to whether trend inflation has now moved higher. To overcome this uncertainty, Rudd argued that it might be useful to examine historical episodes and, particularly, the increase in trend inflation observed during the late 1960s.
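As a reference point for this terminology (a textbook formulation, not Rudd's specific model), an expectations-augmented Phillips curve can be written as:

```latex
\pi_t = \pi_t^{e} - \kappa\,(u_t - u^{*}) + \varepsilon_t
```

where $\pi_t$ is inflation, $\pi_t^{e}$ is expected (trend) inflation, $u_t$ is unemployment, $u^{*}$ is its natural rate, and $\varepsilon_t$ captures shocks such as relative price changes. A "flat" Phillips curve corresponds to a small $\kappa$, so swings in unemployment move inflation only slightly, while stable trend inflation corresponds to a well-anchored $\pi_t^{e}$.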
Rudd proposed that the underlying dynamics in the 1960s were different from those of the current economy, because of less anchored long-term inflation trends and a steeper Phillips curve. Hence, we should reject a repeat of 1960s-style overheating as an explanation for the recent pickup in inflation. A commonality with the 1960s, however, is that policies adopted then seemed reasonable at the time, and policymakers didn't foresee them as fostering persistent inflation. A major question for policymakers, then as now, is whether the recent acceleration in inflation reflects a fundamental shift in the structure of the economy. Rudd concluded by noting that by the time this question is answered, reversing any increase in trend inflation could be difficult.
Levy's discussion focused on a recent research paper coauthored with Michael Bordo. The paper surveys cyclical patterns of Fed policymaking over its entire history, from 1914 to the present. Bordo and Levy argue that in these cycles, the Fed has had a general tendency to wait too long to remove monetary accommodation. They cite four factors behind this tendency: shifting doctrines about how monetary policy should be conducted, ambiguity surrounding the Fed's dual mandate, misreads of data on the state of the economy, and political pressures.
Levy proposed that these same factors have been present in the most recent policy cycle, leading to delayed removal of accommodation. Movement to a neutral level of policy interest rates will now be difficult, he argued, and—given current negative real interest rates—a hard landing has a high probability. Levy concluded with three recommendations for Fed policy going forward. First, the Fed should place more emphasis on rules-based policy (for example, a type of Taylor rule) as a benchmark. Second, the Fed should adopt a less ambiguous interpretation of its dual mandate. Third, the Fed should pay more attention to the lessons of history and incorporate these lessons into its policy doctrine.
The conference's keynote lecture was delivered by Barry Eichengreen (University of California, Berkeley). Eichengreen's presentation surveyed the evolution of payments instruments over the past millennium, from medieval-era banknotes in China to today's digital forms of payment. A theme of the presentation was increasing technological efficiency: transactions in paper money were more efficient than the physical transfer of coins, and modern types of electronic transactions are more efficient still. The shift toward digital forms of payment has recently accelerated, Eichengreen observed, because of the COVID-19 pandemic, which led consumers to prefer online forms of payment. This shift has occurred in virtually every nation with a sufficiently advanced cellphone network, but especially in Sweden, which coincidentally was also an early adopter of printed banknotes.
Eichengreen noted that a factor working against more widespread adoption of advanced digital forms of payment is that in most countries (with exceptions such as Sweden) these are not ubiquitous. Network externalities may soon lead to the emergence of dominant private forms of digital payment, but such dominance could result in monopoly pricing and the need for regulation. Blockchain technology will not by itself resolve issues with digital payments. Stablecoins have the potential to supplant paper currency for many purposes, but monetary history teaches that fractionally backed stablecoins might be susceptible to runs, again suggesting a role for regulation.
The lecture concluded with the observation that retail-level central bank digital currencies (CBDCs) might offer advantages in terms of ubiquity and stability, but CBDCs would simultaneously pose operational challenges for central banks and could encourage disintermediation of commercial banks. For these reasons, Eichengreen suggested that CBDCs are more likely to be issued at the wholesale level—for example, to commercial banks—and these banks would in turn manage CBDC transactions that their customers initiate.
In his closing remarks, Eichengreen argued that although paper currency has been a ubiquitous form of money only within a relatively short period of human history, its advantages mean that it will likely persist even as digital forms of payment become more widespread. As the latter become more prevalent, strong network externalities will necessitate government involvement, both through the provision of CBDCs and the regulation of private forms of money.
In tomorrow's post, I'll cover the presentations and discussions in the workshop's second day.
April 2, 2018
Thoughts on a Long-Run Monetary Policy Framework, Part 4: Flexible Price-Level Targeting in the Big Picture
In the second post of this series, I enumerated several alternative monetary policy frameworks. Each is motivated by a recognition that the Federal Open Market Committee (FOMC) is likely to confront future scenarios where the effective lower bound on policy rates comes into play. Given such a possibility, it is important to consider the robustness of the framework.
My previous macroblog posts have focused on one of these frameworks: price-level targeting of a particular sort. As I hinted in the part 3 post, I view the specific framework I have in mind as a complement to, and not a substitute for, many of the other proposals that are likely to be considered. In this final post on the topic, I want to expand on that thought, considering in turn the options listed in part 2.
- Raising the FOMC's longer-run inflation target
The framework I described in part 3 was constructed to be consistent with the FOMC's current long-run objective of 2 percent inflation. But nothing in the structure of the plan I discussed would bind the Committee to the 2 percent objective. Obviously, a price-level target line can be constructed for any path that policymakers choose. The key is to have such a target and coherently manage monetary policy so that it achieves that target. The slope of the price-level path—that is, the underlying long-run inflation rate—is an entirely separate issue.
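To make the construction concrete, here is a minimal sketch (my own illustration, not the FOMC's actual framework or any official formula) of a price-level target line with a 2 percent slope, together with hypothetical bounds around the path:

```python
# Sketch of a price-level target path: the target compounds at the chosen
# long-run inflation rate, and the slope of the path IS that inflation rate.
# The +/-2 percent bounds below are hypothetical, chosen only for illustration.

def target_price_level(p0: float, annual_inflation: float, years: float) -> float:
    """Price-level target `years` ahead, compounding at `annual_inflation`."""
    return p0 * (1.0 + annual_inflation) ** years

# Starting from a price index of 100, the 2 percent path after 5 years:
target = target_price_level(100.0, 0.02, 5)   # about 110.41
# Hypothetical bounds: the realized price level should stay within, say,
# 2 percent of the target path.
lower, upper = target * 0.98, target * 1.02
```

Choosing a different long-run inflation rate changes only the `annual_inflation` slope; the bounded-path structure itself is unchanged, which is the point of the paragraph above.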
- Maintaining the 2 percent longer-run inflation target and policy framework more or less as is, relying on unconventional tools when needed
As noted, the flexible price-level targeting example I discussed in part 3 was constructed with a long-run 2 percent inflation rate as the key benchmark. In that regard, it is clearly consistent with the Fed's current inflation goal.
Further, a central question in the current framework is how to interpret a goal of 2 percent inflation in the longer run. One interpretation is that the central bank aims to deliver an inflation rate that averages 2 percent over some period of time. Another interpretation is that the central bank aims to deliver an inflation rate that tends toward 2 percent, letting bygones be bygones in the event that realized inflation rates deviate from 2 percent.
The bounded price-level targets I have presented do not force a particular answer to the question I raise, and both views can be supported within the framework. Hence, the framework is consistent with whichever view the FOMC might adopt. The only caveat is that deviations from 2 percent cannot be so large and persistent that they push the price level outside the target bounds.

As to the problem of the federal funds rate falling to a level that makes further cuts infeasible, nothing in the notion of a price-level target rules out (or demands) any particular policy tool. If anything, bounded price-level targets could expand the existing toolkit. They certainly do not constrain it.
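The contrast between the two interpretations can be sketched numerically (my own illustration, with a deliberately simple one-year make-up rule that is not drawn from the post):

```python
# After a year in which inflation ran at 1 percent (a 1-point shortfall from
# the 2 percent goal), the two interpretations imply different forward targets.

def bygones_target(_shortfall_pct: float) -> float:
    # "Bygones": past misses are forgiven; aim for 2 percent going forward.
    return 2.0

def makeup_target(shortfall_pct: float) -> float:
    # "Averaging": make up the prior miss (here, fully over the next year)
    # so that inflation averages 2 percent over the two years.
    return 2.0 + shortfall_pct

shortfall = 2.0 - 1.0
print(bygones_target(shortfall))  # 2.0
print(makeup_target(shortfall))   # 3.0
```

Within bounded price-level targets, either rule is admissible so long as the cumulative misses never push the price level outside the bounds.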
- Targeting nominal gross domestic product (GDP) growth
Targeting nominal GDP growth, which is the sum of real GDP growth and the inflation rate, represents a deviation from the price-level targeting I have described. In this framework, the longer-run rate of inflation depends on the longer-run rate of real GDP growth.
To see how this works, consider the period from 2003 to 2013. In 2003, the Congressional Budget Office projected an average annual potential GDP growth rate of 2.9 percent over the next 10 years. Had there been a nominal GDP growth target of 5 percent at this time, the implicit annualized inflation target would have been just over 2 percent. However, current CBO estimates indicate that actual potential GDP growth over this period averaged just 1.5 percent, which would suggest an inflation target of 3.5 percent. As data came in and policymakers saw this lower level of growth, they would have responded by shifting upward the implicit inflation target.
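The arithmetic above is simply that the implicit inflation target equals the nominal GDP growth target minus real potential growth. A quick check of the figures cited:

```python
# Implicit inflation target under a nominal GDP growth target:
# nominal target = real potential growth + inflation, so
# inflation = nominal target - real potential growth.

def implicit_inflation_target(nominal_gdp_target: float, potential_growth: float) -> float:
    return nominal_gdp_target - potential_growth

# 2003 CBO projection: 2.9 percent potential growth under a 5 percent target
print(round(implicit_inflation_target(5.0, 2.9), 1))  # 2.1, "just over 2 percent"
# Revised CBO estimate: 1.5 percent average potential growth
print(round(implicit_inflation_target(5.0, 1.5), 1))  # 3.5 percent
```

The mechanical subtraction makes the point in the text explicit: a downward revision to potential growth raises the implicit inflation target one-for-one.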
For advocates of using a nominal GDP target, shifting inflation targets is a key feature and not a bug, as it allows policy to adjust in real time to unforeseen cyclical and structural developments. What nominal GDP targeting doesn't satisfy is the principle of bounded nominal uncertainty. Eventually, price-level bounds that are set with an assumed potential real growth path will be violated if shifts in potential growth are sufficiently large. The appeal of nominal GDP targeting depends on how one weighs the benefits of inflation-target flexibility against the costs of price-level uncertainty inherent in that framework.
- Adopting flexible inflation targets that are adjusted based on economic conditions
Recently, my colleague Eric Rosengren, president of the Boston Fed, offered a proposal (here and here) that has some of the flavor of nominal GDP targeting but differs in important respects. Like nominal GDP targeting, President Rosengren's framework would adjust the target inflation rate given structural shifts in the economy. However, if I understand his idea correctly, the FOMC would deliberate specifically on the desired rate of inflation and adjust the target within a predetermined range.
Relying on the target's appropriate range opens the possibility of compatibility between President Rosengren's framework and the one I presented. Policymakers could use price-level targeting concepts in developing a range of policy options given the state of the economy. The breadth of the range of options would depend on the bounds the FOMC felt represented an acceptable degree of price-level uncertainty.
Summing all of this up, then—to me, the important characteristic of a sound monetary policy framework is that it provides a credible nominal anchor while maintaining flexibility to address changing circumstances. I think some form of flexible price-level targeting can be a part of such a framework. I look forward to a robust and constructive debate.
- Business Cycles
- Business Inflation Expectations
- Capital and Investment
- Capital Markets
- Data Releases
- Economic conditions
- Economic Growth and Development
- Exchange Rates and the Dollar
- Fed Funds Futures
- Federal Debt and Deficits
- Federal Reserve and Monetary Policy
- Financial System
- Fiscal Policy
- Health Care
- Inflation Expectations
- Interest Rates
- Labor Markets
- Latin America
- South America
- Monetary Policy
- Money Markets
- Real Estate
- Saving, Capital, and Investment
- Small Business
- Social Security
- This, That, and the Other
- Trade Deficit
- Wage Growth