The Fed’s Inflation Target and Policy Rules

The Brookings Institution held an interesting conference yesterday organized by David Wessel on “Should the Fed Stick with the 2 Percent Inflation Target or Rethink It?” Olivier Blanchard and Larry Summers argued, as they have elsewhere, that the Fed should increase its inflation target—say from 2% to 4%. Others—such as John Williams—argued that the Fed should change the target in some other way, such as by focusing on the price level. Sarah Binder, Peter Hooper and Kristin Forbes were on a panel to answer questions about political, market, and international issues, respectively. I was on that panel to answer questions about monetary policy rules, and the first question posed by David Wessel was about the inflation target in the Taylor rule. Here’s a summary of my answer and later remarks during the course of the panel:

For the policy rule that came to be called the Taylor rule, first presented in 1992, I used a 2% inflation target (π*). This was long before the official adoption of a 2% target by the Fed, the BOJ or the ECB.  The central banks of New Zealand, Chile and Canada were moving toward inflation targeting about that time, but not with the single number of 2% as a target. John Murray presented the Canadian history at the conference.

I chose 2% rather than zero back then because of the upward bias in measuring inflation, which was widely discussed at that time, and because of the zero bound problem for the interest rate.  It was not an arbitrary choice. I also chose an equilibrium real interest rate (r*) of 2%. That was not arbitrary either, with the real GDP growth rate trending a little over 2%. The actual rule for the interest rate (i) was i = π + 0.5y + 0.5(π - π*) + r* with π* = 2 and r* = 2. This meant that the equilibrium nominal rate was 4%. In equilibrium the output gap (y) equals zero and the inflation rate (π) equals π*.
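As a concrete illustration, here is a minimal sketch of that rule in Python; the function name and the default parameter values are purely illustrative, not official code from any central bank.

    # A minimal sketch of the rule described above; illustrative only.
    def taylor_rule(inflation, output_gap, target=2.0, r_star=2.0):
        """Prescribed nominal policy rate, in percent."""
        return inflation + 0.5 * output_gap + 0.5 * (inflation - target) + r_star

    # With inflation at the 2% target and a zero output gap the rule gives the
    # 4% equilibrium nominal rate noted above.
    print(taylor_rule(inflation=2.0, output_gap=0.0))  # 4.0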

Regardless of whether or not the Fed changes its inflation target π* going forward, it is important that its monetary framework be based on policy rules. The good economic performance during the Great Moderation was due largely to policy becoming more consistent with a rules-based framework, and the devastation of the Great Recession was due in part to deviating from rules-based policy.

It is also important for the new research on π* to be based on policy rules. In fact, virtually all economic research on the matter has been conducted using policy rules, including the important recent work by Fed economists Michael Kiley and John Roberts for the Brookings Papers on Economic Activity—a paper which was widely cited at the conference.  The whole new section on policy rules in the Fed’s recent Monetary Policy Report and speeches last year by Fed Chair Janet Yellen also use this approach.

All the alternative proposals considered at this conference can and should be evaluated using policy rules, including price level targeting, nominal GDP targeting, and different inflation targets. If you want to evaluate a higher inflation target, you just stick in a higher value for π*.  If you choose an inflation target of 4% with r* still at 2%, then the average nominal rate will be 6%. If r* were 0% rather than 2%, an inflation target of 4% would mean an equilibrium nominal interest rate of 4%, exactly as in the original Taylor rule. In each case one can evaluate the performance of the economy over a range of models, as in Volker Wieland’s model data base.
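To make the arithmetic explicit, the sketch above can be reused to check these equilibrium rates (in equilibrium, inflation equals the target and the output gap is zero, so the nominal rate is simply r* + π*); the loop below is just illustrative.

    # Equilibrium nominal rate (r* + π*) under the alternative assumptions discussed above.
    for target, r_star in [(2.0, 2.0), (4.0, 2.0), (4.0, 0.0)]:
        i_eq = taylor_rule(inflation=target, output_gap=0.0, target=target, r_star=r_star)
        print(f"target={target}%, r*={r_star}%  ->  equilibrium nominal rate {i_eq}%")
    # Prints 4.0%, 6.0%, and 4.0% respectively, matching the cases discussed above.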

Such policy rules or strategies would fit into the legislative language in recent bills in Congress, including the “Monetary Policy Transparency and Accountability Act,” which simply require the Fed to describe its policy strategy and compare it with policy rules of its own choosing. The rules would also help clarify the Fed’s actions to the markets and to policy makers in other countries.

The main motivation for the newer inflation targeting proposals is concern about the zero lower bound (ZLB), or the effective lower bound, on the interest rate.  But the lower bound is not a new thing in economic research. Policy rule research took that into account long ago. In my 1993 book, for example, I noted that the policy rule “must be truncated below some nonnegative value.” We used 1% then: whenever the policy rule “calls for a nominal interest rate below 1 percent, the nominal interest rate is set to 1 percent.”
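That truncation amounts to putting a floor under the rate the rule prescribes. A minimal sketch, reusing the function above and the 1% floor quoted in the text (the example numbers are illustrative):

    # Truncated rule: never prescribe a rate below the floor (1% in the 1993 example).
    def truncated_taylor_rule(inflation, output_gap, floor=1.0, **kwargs):
        return max(floor, taylor_rule(inflation, output_gap, **kwargs))

    # A severe slump with 0% inflation and a -6% output gap would otherwise call
    # for a -2% rate; the truncated rule sets 1% instead.
    print(truncated_taylor_rule(inflation=0.0, output_gap=-6.0))  # 1.0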

Another alternative is to move to a money growth regime. For example, in 1996 I noted that the interest rate rule needed “to be supplemented by money supply rules in cases of either extended deflation or hyperinflation.” Recently, Peter Ireland and Michael T. Belongia have suggested a return to money growth rules in the case of the ZLB.

Other proposals for dealing with the zero bound have been made over the years. In 1999 David Reifschneider and John Williams proposed that the interest rate be kept extra low following a ZLB period. For example, the interest rate would be kept at zero after the ZLB period until the cumulative amount by which the actual rate is held below the rule-implied rate offsets the cumulative amount by which it was held above the (negative) rule-implied rate during the ZLB period.
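For intuition, here is a rough sketch of a make-up rule in that spirit; the variable names and the simple accounting are illustrative assumptions, not the exact specification in the Reifschneider-Williams paper.

    # Rough sketch of a make-up rule in the spirit of Reifschneider-Williams (1999); illustrative only.
    def makeup_rule_path(rule_rates, floor=0.0):
        """Given a path of unconstrained rule-implied rates, return an actual-rate path
        that stays at the floor after liftoff until the earlier shortfall is made up."""
        overshoot = 0.0                      # cumulative (actual - rule) built up at the floor
        actual = []
        for r in rule_rates:
            if r < floor:
                i = floor                    # constrained: the rule calls for a lower rate
                overshoot += i - r
            elif overshoot > 0.0:
                i = floor                    # keep the rate extra low after the ZLB period...
                overshoot = max(overshoot - (r - i), 0.0)   # ...until the overshoot is offset
            else:
                i = r                        # back to the unconstrained rule
            actual.append(i)
        return actual

    # Example: the rule calls for -2%, -1%, 1%, 2%, 3%; the actual rate stays at zero longer.
    print(makeup_rule_path([-2.0, -1.0, 1.0, 2.0, 3.0]))  # [0.0, 0.0, 0.0, 0.0, 3.0]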

Many at the conference thought that the ZLB is more of a problem now than in the past because estimates of r* have fallen. But those estimates are uncertain and may reverse soon. Volker Wieland and I demonstrated this uncertainty, especially in current circumstances, in considering the influential research of Thomas Laubach and John Williams. The low estimates of r* may be due to “fog” caused by unusually low policy interest rates and unconventional monetary policies at many central banks. Permanently changing the target inflation rate may not be the best response.

There are also international considerations. As we all know, the original 2% inflation target is becoming universal for central banks around the world, and there is also a clamoring for a more rules-based international monetary system. One reason for the clamoring is research showing that the increased exchange rate and capital flow volatility of recent years has been due in part to a deviation from a rules-based system. Now is an opportune time to move in the direction of a rules-based international system by simply reporting on the policy strategy in each country. Changing the inflation target in these strategies unilaterally will make this more difficult.

For all these reasons, I would be hesitant to change the inflation target introduced 25 years ago.  But as research on policy rules at the Fed and elsewhere continues, I hope two related concerns are addressed.

First, there is a danger in the way that the numerical inflation target has come to be used in practice. It seems that even if the actual inflation rate is only a bit below the 2% inflation target—say 1.5% or 1.63%—there is a tendency for people to call for the central bank to press the accelerator all the way to the floor. This is not good monetary policy; it is not consistent with any policy rule I know, and it could create excesses or even bubbles in financial markets. This problem could be remedied as the Fed continues to clarify its strategy.

Second, the greater attention to a numerical inflation target may have reduced attention to other aspects of the policy rules, including the idea that we need a policy rule at all. In other words, trying to give more precision to π* may have led to less precision about other parameters, including the sizes of the responses. Recall that the Fed and other central banks moved toward rules-based policy well before they adopted formal numerical inflation targets.  Most of the move to rules-based policy occurred during the period when Paul Volcker and Alan Greenspan simply said that inflation should be low enough that it did not interfere with decision-making. Again, I think this problem can be remedied as the Fed continues to clarify its strategy.

Posted in Monetary Policy

Happy New Decade!

The Great Recession began exactly one decade ago this month, as later determined by the NBER business cycle dating committee chaired by my colleague, Bob Hall. There is still a great debate about the causes of the Great Recession, its depth, its length, and the Not-So-Great Recovery that followed. But there is no question that the economic growth rate over the past ten years has been dismal—only 1.4 percent per year on average. A chart of the ten-year moving average of growth rates tells the story. Let’s hope the new decade that begins tomorrow will be a happier new decade for economic growth in the United States.

I still think the explanations in my 2009 and 2012 books Getting Off Track: How Government Actions and Interventions Caused, Prolonged, and Worsened the Financial Crisis and First Principles: Five Keys to Restoring America’s Prosperity are basically correct, and I am encouraged that there has been a turnaround recently in regulatory policy and tax policy.

But more investigative research into real-time records of policy actions is essential to determine what went wrong during the past decade.  A good example is the new book, forthcoming in 2018, The Fed and Lehman Brothers: Setting the Record Straight on a Financial Disaster, by Larry Ball, which investigates the records and uncovers inconsistencies in the government’s story of its role in the panic of 2008. He shows that the Fed could have legally prevented the chaos surrounding the Lehman bankruptcy, but didn’t do so either because of political concerns or a botched implementation of its game plan.

It is also essential to review and assimilate all the policy research that has been done in the past decade, both inside and outside of government institutions. In this regard, also ten years ago this month we created an Economic Policy Working Group at the Hoover Institution with the express purpose of doing policy-related research on the crisis, focusing on the change in policy that many of us—including George Shultz, John Cogan, Darrell Duffie, Michael Boskin, Ken Scott—saw had begun a few years before. The group has grown and now includes many more economists, including John Cochrane and Josh Rauh, who moved from the University of Chicago. The 160 policy meetings and conferences organized by this Working Group have been the source of many papers and books, including the early research work on stimulus packages, quantitative easing, bankruptcy reform, international monetary reform, and, most recently, John Cogan’s The High Cost of Good Intentions.  To aid in communication and assimilation, brief summaries of all the meetings were written and collected in real time and are available on the Hoover website here. Some of the recent summaries by John Cochrane are very thoughtful essays in their own right.

Posted in Financial Crisis, Slow Recovery, Teaching Economics

What’s Past is Prologue. Study the Past

 

Each year the Wall Street Journal asks friends for their favorite books of the year. Two years ago I chose Thomas Sowell’s history of income distribution in Wealth, Poverty, and Politics and Brian Kilmeade’s Thomas Jefferson and the Tripoli Pirates. Last year I chose The Man Who Knew, Sebastian Mallaby’s biography of Alan Greenspan, and War by Other Means by Bob Blackwill and Jennifer Harris.

This year I chose two amazingly relevant books on U.S. economic history: John Cogan’s The High Cost of Good Intentions: A History of U.S. Federal Entitlement Programs and Doug Irwin’s Clashing over Commerce: A History of US Trade Policy. My reasons in brief are found in the passage below from the printed December 16 WSJ edition. In these days of big economic policy changes, history is essential, and if I had room for a third book, it certainly would be another economic history, namely Tom Hazlett’s The Political Spectrum: The Tumultuous Liberation of Wireless Technology, from Herbert Hoover to the Smartphone.

 

Posted in Budget & Debt, International Economics, Teaching Economics

A Policy Rule Presented at a Conference 25 Years Ago Today

Ed Nelson sent me a nice note today saying that the past two days (November 20-21) mark “the twenty-fifth anniversary of the Carnegie-Rochester Conference at which you laid out your rule.” I had forgotten about the specific dates, but his note reminds me how much has changed in those 25 years.

Back then, research on monetary policy rules was indicating that rules needed to be very complex with many variables and many lags. There were serious doubts about the usefulness of the research, and some expressed doubts that the results would ever be applied in practice. I had been conducting research at Stanford in the 1980s with a number of graduate students including Volker Wieland and John Williams. So Allan Meltzer (who organized the Conference Series with Karl Brunner) called me and requested that I present a paper on the subject at the November 1992 conference.

The question was: Could we design a simple practical policy rule that was consistent with our research? The answer turned out to be yes, with the interest rate—the federal funds rate—rather than the money supply or the monetary base as the instrument.  The Fed still wasn’t talking publicly about its settings for the federal funds rate, so there was criticism of that design. However, several discussions with Alan Greenspan, who was then Chair of the Fed, gave me a degree of confidence that this approach was workable. In fact, Greenspan later joked that the Fed deserved an “assist” in developing the Taylor rule.

Ben McCallum was the discussant of my paper at the conference, and he recently wrote a retrospective on the impact of the conference and the rule.  Ben describes how the request to me from Allan Meltzer originated in a meeting of the Carnegie-Rochester Advisory Board on which Ben served.  Ben’s recollection of what Allan was supposed to ask me to do was quite different from what I recall Allan actually asked me to do.  We will perhaps never know how that interesting “miscommunication” arose, but it clearly made a difference.

Posted in Monetary Policy, Teaching Economics

New Results on International Monetary Policy Presented at the Swiss National Bank

This week I gave the Swiss National Bank’s Annual Karl Brunner Lecture in Zurich, and I thank Thomas Jordan, who introduced me, and the hundreds of central bankers, bankers, and academics who filled the big auditorium. Karl was a brilliant, innovative economist who thought seriously about both policy ideas and institutions. For the lecture, I focused on ideas and institutions for international monetary policy.

Since Karl died in 1989, we can only wonder what he would think about monetary policy in the past dozen years. But we can get some hints from his former student, collaborator, friend, and great economist Allan Meltzer, who died earlier this year.

About one year ago at the annual monetary conference in Jackson Hole, Meltzer argued that the Fed’s “quantitative easing” was in effect a monetary policy of “competitive devaluation,” and he added that “other countries have now followed and been even less circumspect about the fact that they were engaging in competitive devaluation. Competitive devaluation was tried in the 1930s, and unsuccessfully, and the result was that around that time major countries agreed they would not engage in competitive devaluation ever again.”

In the lecture, I examined this idea empirically, and I found striking results. A monograph with the details will soon be published by the MIT Press, but a very short taste of the results can be given here.

I began by introducing a simple modelling framework which captures key features of recent economic policy. I focused on the balance sheet operations of the Federal Reserve, the European Central Bank, and the Bank of Japan. I concentrated on the liability side and, in particular, on reserve balances, which are used to finance asset purchases, as a measure of the balance sheet operations. For the three central banks this gives RU, which measures the Fed’s reserve balances in millions of dollars; RJ, which measures the BOJ’s current account balances in hundreds of millions of yen; and RE, which measures the ECB’s current account plus deposit facility in millions of euros. I also considered the central bank in a relatively small open economy—the Swiss National Bank.

To examine the impact of the balance sheet operations of the central banks in the three large areas, I estimated the following equations.

XJU = α0 + α1RJ + α2RU + α3RE

XJE = β0 + β1RJ + β2RU + β3RE

XUE = γ0 + γ1RJ + γ2RU + γ3RE

where XJU is the yen per dollar exchange rate; XJE is the yen per euro exchange rate; and XUE is the dollar per euro exchange rate.
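For concreteness, here is a minimal sketch of how one of these equations could be estimated by ordinary least squares in Python. The data file name and column labels are hypothetical, and this is not the estimation code behind the forthcoming monograph, which handles the data and details far more carefully.

    import pandas as pd
    import statsmodels.api as sm

    # Hypothetical data file with the reserve-balance and exchange-rate series defined above;
    # the file name and column labels are illustrative assumptions.
    data = pd.read_csv("reserves_and_exchange_rates.csv")

    X = sm.add_constant(data[["RJ", "RU", "RE"]])   # BOJ, Fed, and ECB reserve balances
    xju_model = sm.OLS(data["XJU"], X).fit()        # yen per dollar on the three balances
    print(xju_model.params)                         # the alphas in the first equation
    print(xju_model.tvalues)                        # significance of each coefficient

    # The same pattern applies to the XJE (yen per euro) and XUE (dollar per euro) equations.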

All the estimated coefficients are significant, and they show that:

  • An increase in reserve balances RJ at the Bank of Japan causes XJU and XJE to rise, or, in other words, causes the yen to depreciate against the dollar and the euro.
  • An increase in reserve balances RU at the Fed causes XJU to fall and XUE to rise, or, in other words, causes the dollar to depreciate against the yen and the euro.
  • An increase in reserve balances RE at the ECB causes XJE and XUE to fall, or, in other words, causes the euro to depreciate against the yen and the dollar.

The charts below show the patterns of reserve balances and the corresponding exchange rate movements: first there is the increase in reserve balances at the Fed with a depreciation of the dollar; second there is an increase in reserve balances at the BOJ with a depreciation of the yen; and third there is an increase in reserve balances at the ECB and a depreciation of the euro.

In other words, there are significant exchange rate effects of balance sheet operations for the large advanced countries.  In the lecture I then went on to show that there are similar effects for the Swiss National Bank and for other central banks in small open economies, which have little choice but to react to prevent these unwanted moves in their own exchange rates.

These exchange rate effects are likely to be a factor behind balance sheet actions taken by central banks and the reason for the policy contagion in recent years as countries endeavor to counteract other countries’ actions to influence exchange rates. In this sense, there is a “competitive devaluation” aspect to these actions as argued by Allan Meltzer—whether they are intentional or not.

The resulting movements in exchange rates can be a source of instability in the global economy as they affect the flow of goods and capital and interfere with their efficient allocation. They also are a source of political instability as concerns about currency manipulation are heard from many sides. They are another reason to normalize and reform the international monetary system. In my view a rules-based international system is the way to go, as I discussed in the lecture at the Swiss National Bank referring to earlier work here.

 

Posted in International Economics, Monetary Policy

Still Learning From Milton Friedman: Version 3.0

We can still learn much from Milton Friedman, as we celebrate his 105th birthday today.  Here I consider what we can learn from his participation in the monetary policy debates in the 1960s and 1970s. I draw from a paper that I presented to lead off his 90th birthday celebration in Chicago in 2002 and from two 2012 pieces: a paper I presented at the centennial of his birth and an article written on his 100th birthday. The lessons are very relevant to the debates raging during the last 15 years and continuing today.

Back in the early 1960s, the Keynesian school first came to Washington led by Paul Samuelson, who advised John F. Kennedy during the 1960 election campaign and recruited Walter Heller and James Tobin to serve on the Council of Economic Advisers. The Keynesian approach received its official Washington introduction when Heller, Tobin, and their colleagues wrote the Kennedy Administration’s first Economic Report of the President, published in 1962. The Report made the case for discretion rather than rules. For monetary policy it said that a “discretionary policy is essential, sometimes to reinforce, sometimes to mitigate or overcome, the monetary consequences of short-run fluctuations of economic activity.”

In that same year Milton Friedman published Capitalism and Freedom (1962) giving the competing view. He argued that “the available evidence . . . casts grave doubt on the possibility of producing any fine adjustments in economic activity by fine adjustments in monetary policy—at least in the present state of knowledge . . . There are thus serious limitations to the possibility of a discretionary monetary policy and much danger that such a policy may make matters worse rather than better . . .”

So there were two different views, and the fundamental difference was over discretion versus rules-based policies. From the mid-1960s through the 1970s the Samuelson view was winning with practitioners putting discretionary monetary policies into practice, mainly the go-stop policies that led to both higher inflation and higher unemployment. Friedman remained a persistent and resolute champion of the alternative view. Fortunately, Friedman’s arguments eventually won the day and American economic policy moved away from an emphasis on discretion in the 1980s and 1990s. Paul Volcker, who as chair of the Fed implemented the more rules-based policy, had to confront the disparity of views as he did so, as I described here.

But this same policy debate is back today. Economists on one side push for more discretionary monetary policy such as the quantitative easing actions and resist the notion of rules-based monetary policy. Other economists argue for a return to more predictable and rule-like monetary policy. They argue that the bouts of quantitative easing were not very effective, and that deviations from rules-based policy helped worsen the Great Recession.

Of course there are many nuances today, some related to the difficulty of distinguishing between rules and discretion when the zero lower bound is thought to be a constraint. Interestingly, you frequently hear people on both sides channeling Milton Friedman to make their case.

The debate is not merely academic. Rather, it is a debate of enormous practical consequence, with the well-being of millions of people on the line. The House of Representatives has passed a bill calling for the Fed to describe its policy rule, and recent Fed reports have talked about normalization, raising questions about a return to rules-based policy.  The same issues arise in discussions of unconventional monetary policy in Europe and Japan.

Can the disagreements be resolved? Milton Friedman was optimistic that debates could be resolved, and I am sure that this is one reason why he kept researching and debating the issue so vigorously.

Today people on both sides can learn from him. First, while a vigorous debater, he was respectful, avoiding personal attacks. Second, he had a strong belief that empirical evidence would bring people together. Yes, people would come to the issue with widely different prior beliefs, but their posterior beliefs—after evidence was collected and analyzed—would be much closer. In this way the disagreement would eventually be resolved.

Although posterior beliefs in the monetary area now seem just as far apart as prior beliefs were 50 years ago, I sense that empirical work on the policy decisions of recent years, like the empirical paper by Alex Nikolsko-Rzhevskyy, David Papell, and Ruxandra Prodan, can bring about more convergence of views.  As I said at the time of his 100th birthday, clearly we can learn a lot from Milton Friedman in deciding how to proceed.

Posted in Monetary Policy

Debate Over the Very Principles of Economics

Today is the launch of the online version of my Economics 1 course (and namesake of this Blog and my Twitter handle) on the Principles of Economics for summer 2017. This year is also the tenth anniversary of the start of the Global Financial Crisis and the Great Recession, which began in 2007.

During these ten years there has been a great deal of hand-wringing among economists and others about the subject of Economics. This is an important debate, and the different positions deserve to be covered in the basic economics course.

As early as 2009, a cover of The Economist magazine showed a book titled “Modern Economic Theory” melting into a puddle to illustrate what the writers viewed as the problem with economics. It was the most talked about issue of the year.

Some economists have been calling for a complete redo of economics—or for a return to a version of the subject popular decades ago. They say that economics failed to prevent the Great Recession and the Global Financial Crisis or even led to them. Many of these economists argued for a change in government policy, saying that John Maynard Keynes was right and Milton Friedman was wrong.

Paul Samuelson spoke this way in an interview in the New Perspectives Quarterly in 2009, saying, “today we see how utterly mistaken was the Milton Friedman notion that a market system can regulate itself… This prevailing ideology of the last few decades has now been reversed…I wish Friedman were still alive so he could witness how his extremism led to the defeat of his own ideas”.

Paul Krugman, in a piece in the New York Times Magazine in 2009, also faulted modern economics for bringing on the crisis. He said it focused too much on beauty over practicality and did not recognize the need for more government intervention to prevent and cure the crisis. His fix was to add more psychology to economics or to build better models of credit.

And over the years the debate has continued. Last year Thomas Sargent, in commenting on a Handbook by macroeconomists, said the “collection belies uninformed critics who assert that modern macroeconomics was wrong footed by the 2007-2009 financial crisis….both before and after that crisis, working macroeconomists had rolled up their sleeves to study how financial frictions, incentive problems, incomplete markets, interactions among monetary, fiscal, regulatory, and bailout policies, and a host of other issues affect prices and quantities and good economic policies.”

But also last year Paul Romer, now chief economist at the World Bank, wrote a widely discussed piece called “The Trouble with Macroeconomics.” Then Ricardo Reis of the London School of Economics wrote a paper more supportive of economics with the title “Is something really wrong with macroeconomics?” My colleague John Cochrane commented positively on the views of Reis, and Noah Smith explained in a Bloomberg View column why “So Many Critics of Economics Miss What it Gets Right.”

So the debate moves on. My view, throughout this period, has been that the Great Recession and the Global Financial Crisis do not provide evidence of a failure of economics. Rather, these events vindicate the theory. The research I have done, here for example for the Fed’s Jackson Hole Conference in 2007, points instead to a deviation of economic policy from the type of policy recommended by economic principles–a deviation from the type of policy that was responsible for the remarkably good economic performance in the two decades before the crisis. Economists call this earlier period the Long Boom or the Great Moderation because of the remarkably long expansions and short shallow recessions. In other words, the crisis did not occur because economic theory went wrong. It occurred because policy went wrong.

Posted in Financial Crisis, Teaching Economics