Still Crazy After All These Years—And What About the Next 50

Yesterday I was talking to a friend in my office about the great benefit to students from writing undergraduate honors theses in their senior year.  I have long advised students to do so, perhaps because of the rewarding experience I had years ago.  It got me thinking, so I pulled my undergraduate thesis off the shelf, and I noted that it was dated April 5, 1968–submitted exactly 50 years ago to the day. The thesis was all about policy rules, and, yes, I am still working on that topic, “still crazy after all these years.”

I got the idea from Phil Howrey, who was then on the Princeton faculty. Phil was doing research with Michio Hatanaka, who had just published Spectral Analysis of Economic Time Series with Clive Granger, who had visited Princeton.  They were all interested in dynamic stochastic models of the economy, and so I got interested. How lucky for me. I had taken a macro course from Phil, without ISLM but with many dynamic stochastic equations (quite unusual for the time), and it forced me to view the economy as a moving dynamic structure. The idea of policy as a rule made so much sense. There was no other way to do policy with those models.

I recall that I really loved working on this project. I spent long hours in a carrel in the basement of the library reading and deriving equations in the winter and spring of 1968.

I did the simulations of the differential equations at the Princeton University Computer Center on a digital computer (an IBM 7094) and on an electro-analog computer (an Electronic Associates TR-20 located at the Engineering School)—a circuit with capacitors, resistors, and amplifiers hooked up to an oscilloscope.

I simulated monetary policy rules of the kind that engineers had used to stabilize mechanical processes: proportional, derivative and integral. The monetary policy rules had the money supply on the left-hand side, rather than the interest rate.
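For readers who have not seen these engineering-style rules, here is a minimal sketch of how a proportional-integral-derivative (PID) feedback rule for money growth might look in code; the function, the parameter values, and the choice of the output gap as the feedback variable are illustrative assumptions, not the thesis's actual equations.

```python
# A minimal sketch of proportional-integral-derivative (PID) control applied to
# a money supply rule; parameter values and the feedback variable are
# illustrative assumptions, not the equations from the 1968 thesis.

def pid_money_growth(gap_history, kp=0.5, ki=0.1, kd=0.2, baseline=4.0):
    """Set money growth (the left-hand-side variable) from an output-gap history.

    gap_history: list of past output gaps (percent), most recent last.
    The proportional term reacts to the current gap, the integral term to the
    accumulated gap, and the derivative term to its most recent change.
    """
    current = gap_history[-1]
    integral = sum(gap_history)
    derivative = gap_history[-1] - gap_history[-2] if len(gap_history) > 1 else 0.0
    # Negative feedback: money growth is cut when the gap and its history are positive.
    return baseline - kp * current - ki * integral - kd * derivative

# Example: a widening positive gap calls for slower money growth.
print(pid_money_growth([0.5, 1.0, 1.5]))
```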

Milton Friedman gave his famous AEA presidential address that academic year (on December 29, 1967) in which he criticized the Phillips curve and discussed the role of monetary policy. My thesis combined two strands of A.W. Phillips's research: evaluation of policy rules and a model of cyclical growth. But, fortunately, the thesis did not exploit any long-run trade-off implicit in the Phillips curve by trying to raise money growth and inflation permanently to get permanently lower unemployment.

We have made progress in the development and application of monetary policy rules in the past 50 years, but I am optimistic that we will make much more progress in the next 50 years. Laptops are thousands of times faster than that old IBM 7094, and we now have artificial intelligence, machine learning, big data, bitcoin, now-casting, and instantaneous and global communications. And we are accumulating vast amounts of practical experience over time and in different countries. So, students, get started on that thesis.

Posted in Monetary Policy, Teaching Economics

Favorite Economics April Fools Day Jokes on Twitter

Thanks for a little humor about two of the most important economic topics of the day: monetary policy at the Fed and bankruptcy policy at Tesla. I am sure there are others, but these two are my favorites:

Monetary Policy

F 🌐 🌮 ☪ (@fmn13), April 1, 2018: “Marvel: ‘Infinity War is the most ambitious crossover event in history’ Me:” [embedded image not shown]

Bankruptcy Policy

[embedded tweet not shown]

Posted in Uncategorized

Monetary Policy Getting Back on Track

In many ways, the Fed has begun to bring monetary policy back on track as it emphasizes a strategy and the use of monetary policy rules:

On January 18 of last year, former Chair Janet Yellen described the Fed’s strategy for the policy instruments, saying that “When the economy is weak…we encourage spending and investing by pushing short-term interest rates lower….when the economy is threatening to push inflation too high down the road, we increase interest rates…”  In a speech the following day, she compared this strategy with the Taylor rule and other rules, and she explained the differences.

On February 11 of last year, former Vice-Chair Stanley Fischer gave a talk with a similar message, comparing actual policy with monetary rules and explaining how rules-based analyses feed into FOMC discussions to arrive at policy decisions.

On July 7 of last year, the Fed added, for the first time ever, a whole new section on “Monetary Policy Rules and Their Role in the Federal Reserve’s Policy Process” in its Monetary Policy Report. It noted that “key principles of good monetary policy” are incorporated into policy rules. It listed specific policy rules, including the Taylor rule and variations on that rule. It showed that the interest rate was too low for too long in the 2003-2005 period according to the Taylor rule. It showed that, according to three of the rules, the current fed funds rate should be moving up.

On February 23 of this year, the Fed, now with new Chair Jerome Powell, again included a whole section on policy rules in its latest Monetary Policy Report, elaborating on last July’s Report and thus indicating that the new approach will continue.

On February 27 and March 1 of this year, in his first testimony in the House and Senate as Fed Chair, Jerome Powell referred explicitly to making monetary policy with policy rules. He said that “In evaluating the stance of monetary policy, the FOMC routinely consults monetary policy rules that connect prescriptions for the policy rate with variables associated with our mandated objectives. Personally, I find these rule prescriptions helpful. Careful judgments are required about the measurement of the variables used, as well as about the implications of the many issues these rules do not take into account. I would like to note that this Monetary Policy Report provides further discussion of monetary policy rules and their role in the Federal Reserve’s policy process, extending the analysis we introduced in July.”  This emphasis on rules and strategy did not go unnoticed by those who follow policy: As Larry Kudlow put it: “I’ve never seen that in any testimony before….and I think that’s progress.”

On March 8 of this year, the Fed posted a new web site on the principles of sound monetary policy, Monetary Policy Principles and Practice, with a very helpful note on Policy Rules and How Policymakers Use Them.

While the Fed has not yet endorsed the “Monetary Policy Transparency and Accountability Act,” these reforms represent substantial progress in that direction and should be acknowledged.

Posted in Monetary Policy

A Better Way to End Big Bank Bailouts

David Skeel and I wrote the following on an important report on bankruptcy reform just released by the U.S. Treasury:

Yesterday the U.S. Treasury released its official response to President Trump’s memorandum of last April asking for a review of whether an improved bankruptcy law “would be a superior method for the resolution of financial companies” compared to the regulator-run resolution process embodied in the Dodd-Frank Act.  After a year of hearings, consultations, and study, Report to the President on Orderly Liquidation Authority and Bankruptcy Reform states “unequivocally” that bankruptcy should be the preferred method of resolution. The Report calls for a “more robust, effective, bankruptcy process for financial companies” along the lines of the “Chapter 14” proposal of the Hoover Institution Resolution Project, of which we are members, now written into legislation making its way through the House and Senate.

Through careful analysis and judgment of which reform is likely to work politically, financially, and internationally, the Report provides a practical road map to get the legislation passed.  Chapter 14, so called because there is now no Chapter 14 in the bankruptcy code, would rely on the rule of law and strict priority rules of bankruptcy, but would operate faster than current law—over a weekend—leaving operating subsidiaries outside of bankruptcy entirely.  After filing for Chapter 14, the parent company would transfer its operations and short-term debt to a newly created bridge company that is not in bankruptcy.  The bridge company would be recapitalized and ready to continue operations, while the long-term unsecured debt and stock would be left behind in the old company.  The old company would go through bankruptcy in a predictable, rules-based manner without harming the financial system or the economy.

The Report concludes that the Orderly Liquidation Authority (OLA), as the current resolution process under the Dodd-Frank Act is known, “confers far too much unchecked administrative discretion, could be misused to bail out creditors, and runs the risk of weakening market discipline.” A particularly glaring flaw is the absence of clear priority rules: as receiver for the distressed financial company, the Federal Deposit Insurance Corporation is permitted to pick and choose which creditors are paid first.  The Report urges the FDIC to commit to honoring the ordinary priority rules in OLA, to make it more rule-like and predictable.

OLA also gives the FDIC access to vast amounts of funding from the U.S. Treasury, which critics worry could function like a bailout.  The Report calls for much tighter constraints on the use of the funding.  The loans should be secured by good collateral, and should require substantial interest payments, in keeping with the classic approach for providing liquidity to a distressed financial institution.  For similar reasons, the Report takes aim at a provision that gives tax-exempt status to any bridge institution that is formed for the purposes of an OLA resolution.  There is no justification for this special treatment, and the Report rightly calls for its removal.

This “reform rather than repeal” approach to OLA leaves in place international arrangements through which resolution authorities in different countries can coordinate the resolution of large international financial firms.  If OLA were repealed, there would be no parallel authority in the United States.  Moreover, with Chapter 14 in place, the resolution planning process required by other provisions of the Dodd-Frank Act would work better, because large institutions could credibly outline how their distress would be handled in bankruptcy. Some of the resolution plans submitted by the large financial firms have been rejected by the Fed and the FDIC.

The Report adopts a middle ground with respect to regulators’ role in the Chapter 14 process.  Unlike OLA, which gives regulators complete control, under Chapter 14 the managers of the troubled company would be the ones to file the case.  Although our Hoover group recommended that either the company or regulators be permitted to file, the Report worries that regulators and a troubled financial company might engage in a game of chicken if both had the power to file.  The Report would not exclude regulators from the filing decision, however.  It would encourage judicial deference to a Federal Reserve determination that the Chapter 14 transaction should be approved.

The Report, like the versions of Chapter 14 currently pending in Congress, would not provide any government funding of the resolution process.  Although we have advocated access to limited governmental financing, the Report rightly recognizes that Chapter 14 would require much less new funding than a more complex and time-consuming resolution framework.  Because the new bridge company would be fully solvent, having left most of its debt behind, private lenders are likely to be willing to provide any necessary new funding.

If Chapter 14 were added to the bankruptcy code, it would become the strategy of choice for resolving the financial distress of large financial institutions.  OLA would still be available as an alternative, but it would rarely if ever be needed.

Bankruptcy reform is an essential element of an economic growth program.  The reform makes failure feasible under clear rules without disruptive spillovers. It would help prevent bailouts, diminish excessive risk-taking, remove uncertainty associated with an ad hoc bailout process, and reduce the likelihood and severity of financial crises.  Research by our colleague Emily Kapur shows the new law might have prevented the contagion associated with the failure of Lehman Brothers in 2008.

The Administration has laid out a clear path to ending too big to fail and making the financial system more resilient. Now is the time to move forward and get the job done.

Posted in Uncategorized

Application Deadline Approaching for Free Public Policy Program

After a very successful launch last summer, Stanford’s Hoover Institution is again offering a one-week public policy boot camp this coming August 19-25. This “residential immersion program” is aimed at college students and recent graduates. It consists of lectures, workshops, informal discussions, and active collaboration with study groups outside of class.  It covers the essentials of today’s national and international policy issues. It requires a 100% time commitment for the whole one-week program, but if last year is any indicator, both faculty and students will find it to be rewarding and fun.

As with last year, the teachers in the program are faculty and fellows from the Hoover Institution, which includes scholars in economics, government, political science, and related fields. This summer the lineup includes economists Terry Anderson, Michael Boskin, John Cogan, Caroline Hoxby, Edward Lazear, Joshua Rauh, George Shultz, Amit Seru, and me, along with political science and national security experts Scott Atlas, David Brady, James Ellis, Stephen Haber, Daniel Heil, Michael McConnell, Kori Schake, Kiron Skinner, and Bill Whalen. Last year I talked about monetary policy and the Fed, and I will do the same this year, updated of course.

Believe it or not, the program is free of charge to accepted participants, including lodging and meals.  Attendees will be responsible for travel costs and incidentals. There is still time to apply, but the application deadlines are fast approaching: February 1, 2018, for early-bird applicants and March 1, 2018, for all applicants. So apply now, and I hope we’ll see you next summer!

Posted in Teaching Economics

Unique Cooperative Research Effort

This week marks the 20-year anniversary of what Ed Nelson, who reminded me of it, calls a “notable conference” on monetary policy.  The conference took place at the Cheeca Lodge in the Florida Keys on January 15-17, 1998, and it resulted in the book Monetary Policy Rules, published by the University of Chicago Press for the NBER.

It was an unusual conference.  As stated on the back of the book jacket shown below, it was a “unique cooperative research effort between nearly thirty monetary experts and policymakers.” The purpose was to evaluate alternative monetary policies, all of which were described by policy rules for the interest rate. It was unique because the participants in the conference not only evaluated the performance of their own proposed policy rules with their own models, they also evaluated the performance of other participants’ proposed rules with their models. This put the focus on robustness and effectiveness in a way that had not been done before.

As I summarized in the Introduction, “we asked researchers who participated in the conference to investigate the other researchers’ proposals for policy rules using their own models. We did not specify what model (whether large or small, rational or nonrational) should be used. That decision was left up to the researchers.” It turned out that nine models were used in the evaluation exercise, each of which was described in the individual research papers given at the conference: Performance of Operational Policy Rules in an Estimated Semiclassical Structural Model by Bennett McCallum and Edward Nelson; Interest Rate Rules in an Estimated Sticky Price Model by Julio Rotemberg and Michael Woodford; Policy Rules for Open Economies by Laurence Ball; Forward-Looking Rules for Monetary Policy by Nicoletta Batini and Andrew Haldane; Policy Rules for Inflation Targeting by Glenn Rudebusch and Lars Svensson; and four models in Robustness of Simple Monetary Policy Rules under Model Uncertainty by Andrew Levin, Volker Wieland, and John Williams.

The main finding of this effort surprised people at the time. It was that “simple policy rules are more robust and more efficient than complex rules with multiple variables,” a finding that has stood the test of time and many more studies over the past two decades.

It was tough to establish uniformity in the evaluation method so that each rule was treated fairly. Thanks to the model data base of Volker Wieland and improvements in computer and information technology, it is much easier to conduct this kind of robustness study now.
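To make the cross-evaluation design concrete, here is a minimal sketch of such a robustness exercise; the rule names, model names, and loss numbers are placeholders, not the conference's actual models or results.

```python
# A minimal sketch of a rules-across-models robustness exercise. The rules,
# models, and loss numbers are placeholders, not the conference results.

def evaluate(rule_name, model_name):
    """Stand-in for simulating a model under a rule and returning a loss
    (for example, a weighted sum of inflation and output variances)."""
    # In the actual exercise each research team ran its own model; here we
    # just look up an illustrative number.
    illustrative_losses = {
        ("simple_rule", "model_A"): 1.1, ("simple_rule", "model_B"): 1.3,
        ("complex_rule", "model_A"): 0.9, ("complex_rule", "model_B"): 2.7,
    }
    return illustrative_losses[(rule_name, model_name)]

rules = ["simple_rule", "complex_rule"]
models = ["model_A", "model_B"]

# A robust rule is one whose worst-case loss across all the models is small.
for rule in rules:
    losses = [evaluate(rule, model) for model in models]
    print(rule, "worst-case loss:", max(losses))
```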

Monika Piazzesi, a Stanford graduate student at the time, prepared a very useful summary of the discussion that took place at the conference. Soon afterwards the Fed began referring to the Taylor (1999) rule, which had a higher coefficient on output than the so-called Taylor rule. (1999 was the year the book was published.) I complained because I had not proposed such a rule in the chapter in the book but simply compared it with other rules. Because of my complaint, Janet Yellen started calling that rule the balanced rule from then on, not that the name made much difference.

[book jacket image not shown]

Posted in Monetary Policy

The Fed’s Inflation Target and Policy Rules

The Brookings Institution held an interesting conference yesterday organized by David Wessel on “Should the Fed Stick with the 2 Percent Inflation Target or Rethink It?” Olivier Blanchard and Larry Summers argued, as they have elsewhere, that the Fed should increase its inflation target—say from 2% to 4%. Others—such as John Williams—argued that the Fed should change the target in some other way such as by focusing on the price level. Sarah Binder, Peter Hooper and Kristen Forbes were on a panel to answer questions about political, market, and international issues, respectively. I was on that panel to answer questions  about monetary policy rules, and the first question posed by David Wessel was about the inflation target in the Taylor rule. Here’s a summary of my answer and later remarks during the course of the panel:

For the policy rule that came to be called the Taylor rule, first presented in 1992, I used a 2% inflation target (π*). This was long before the official adoption of a 2% target by the Fed, the BOJ or the ECB.  The central banks of New Zealand, Chile and Canada were moving toward inflation targeting about that time, but not with the single number of 2% as a target. John Murray presented the Canadian history at the conference.

I chose 2% rather than zero back then because of the upward bias in measuring inflation, which was widely discussed at that time, and because of the zero bound problem for the interest rate.  It was not an arbitrary choice. I also chose an equilibrium real interest rate (r*) of 2%. That was not arbitrary either, with the real GDP growth rate trending a little over 2%. The actual rule for the interest rate (i) was i = π + .5y + .5(π - π*) + r* with π* = 2 and r* = 2. This meant that the equilibrium nominal rate was 4%. In equilibrium the output gap (y) equals zero and the inflation rate (π) equals π*.
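As a check on that arithmetic, here is a minimal sketch of the rule in code; the function name and example inputs are illustrative additions, not part of the original formulation.

```python
# A minimal sketch of the rule i = π + 0.5y + 0.5(π − π*) + r*, with the
# calibration π* = 2 and r* = 2 described above. All values are in percent.

def taylor_rule(inflation, output_gap, pi_star=2.0, r_star=2.0):
    """Nominal interest rate prescribed by the rule."""
    return inflation + 0.5 * output_gap + 0.5 * (inflation - pi_star) + r_star

# In equilibrium the output gap is zero and inflation equals the 2% target,
# so the rule gives the 4% equilibrium nominal rate.
print(taylor_rule(2.0, 0.0))  # 4.0
```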

Regardless of whether or not the Fed changes its inflation target π* going forward, it is important that its monetary framework be based on policy rules. The good economic performance during the Great Moderation was due largely to policy becoming more consistent with a rules-based framework, and the devastation of the Great Recession was due in part to deviating from rules-based policy.

It is also important for the new research on π* to be based on policy rules. In fact, virtually all economic research on the matter has been conducted using policy rules, including the important recent work by Fed economists Michael Kiley and John Roberts for the Brookings Papers on Economic Activity—a paper which was widely cited at the conference.  The whole new section on policy rules in the Fed’s recent Monetary Policy Report and speeches last year by Fed Chair Janet Yellen also use this approach.

All the alternative proposals considered at this conference can and should be evaluated using policy rules, including price level targeting, nominal GDP targeting, and different inflation targets. If you want to evaluate a higher inflation target, you just stick in a higher value for π*.  If you choose an inflation target of 4% with r* still 2%, then the average nominal rate will be 6%. If r* was 0% rather than 2%, an inflation target of 4% would mean an equilibrium nominal interest rate of 4%, exactly as in the original Taylor rule. In each case one can evaluate performance of the economy over a range of models as in Volker Wieland’s model data base.
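Using the illustrative taylor_rule sketch from above, the equilibrium arithmetic for these alternatives is simply π* + r*:

```python
# Equilibrium nominal rate under alternative (π*, r*) assumptions: set the
# output gap to zero and inflation equal to the target.
print(taylor_rule(4.0, 0.0, pi_star=4.0, r_star=2.0))  # 6.0: 4% target, r* = 2%
print(taylor_rule(4.0, 0.0, pi_star=4.0, r_star=0.0))  # 4.0: 4% target, r* = 0%
```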

Such policy rules or strategies would fit into the legislative language in recent bills in Congress, including the “Monetary Policy Transparency and Accountability Act,” which simply require the Fed to describe its policy strategy and compare it with policy rules of its own choosing. The rules would also help clarify the Fed’s actions to the markets and to policy makers in other countries.

The main motivation for the newer inflation targeting proposals is concern about the zero lower bound (ZLB), or the effective lower bound, on the interest rate.  But the lower bound is not a new thing in economic research. Policy rule research took that into account long ago. In my 1993 book, for example, I noted that the policy rule “must be truncated below some nonnegative value.” We used 1% then: whenever the policy rule “calls for a nominal interest rate below 1 percent, the nominal interest rate is set to 1 percent.”

Another alternative is to move to a money growth regime. For example, in 1996 I noted that the interest rate rule needed “to be supplemented by money supply rules in cases of either extended deflation or hyperinflation.” Recently, Peter Ireland and Michael T. Belongia have suggested a return to money growth rules in the case of the ZLB.

Other proposals for dealing with the zero bound have been made over the years. In 1999 David Reifschneider and John Williams proposed that the interest rate be kept extra low following a ZLB period. For example, the interest rate would be kept at zero after the ZLB period until the cumulative deviations of the rule-prescribed rate above zero offset, in absolute value, the cumulative deviations below zero that accrued during the ZLB period.
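Here is a minimal sketch of both ideas, the 1 percent truncation mentioned earlier and a Reifschneider-Williams-style makeup rule. It reuses the illustrative taylor_rule from above, and the paths and parameter values are made up for illustration; this is not the authors' code, just the cumulative-shortfall logic described here.

```python
# Illustrative handling of the lower bound: (1) simple truncation at a 1% floor,
# as in the 1993 book's example, and (2) a Reifschneider-Williams-style makeup
# rule that holds the rate at the effective lower bound (ELB) until the
# ZLB-period shortfall is offset. Values are assumptions for illustration only.

def taylor_rule(inflation, output_gap, pi_star=2.0, r_star=2.0):
    return inflation + 0.5 * output_gap + 0.5 * (inflation - pi_star) + r_star

def truncated_rate(inflation, output_gap, floor=1.0):
    """Whenever the rule calls for a rate below the floor, set it to the floor."""
    return max(taylor_rule(inflation, output_gap), floor)

def makeup_path(inflation_path, gap_path, elb=0.0):
    """Hold the rate at the ELB after a ZLB episode until the cumulative
    shortfall of the notional rule rate below the ELB has been worked off."""
    shortfall, rates = 0.0, []
    for pi, y in zip(inflation_path, gap_path):
        notional = taylor_rule(pi, y)
        if notional < elb:
            shortfall += elb - notional                         # accumulate during the ZLB
            rates.append(elb)
        elif shortfall > 0.0:
            shortfall = max(0.0, shortfall - (notional - elb))  # offset after the ZLB
            rates.append(elb)
        else:
            rates.append(notional)
    return rates

print(truncated_rate(0.5, -3.0))  # 1.0: the rule's 0.25 is raised to the floor
# A deep recession followed by recovery: the rate stays at zero past the point
# where the unconstrained rule would lift off.
print(makeup_path([1.0, 0.5, 1.0, 2.0, 2.5], [-6.0, -8.0, -4.0, 0.0, 1.0]))
```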

Many at the conference thought that the ZLB is more of a problem now than in the past because estimates of r* have fallen. But those estimates are uncertain and may reverse soon. Volker Wieland and I demonstrated this uncertainty, especially in current circumstances, in considering the influential research of Thomas Laubach and John Williams. The low estimates of r* may be due to “fog” caused by unusually low policy interest rates and unconventional monetary policies at many central banks. Permanently changing the target inflation rate may not be the best response.

There are also international considerations. As we all know, the original 2% inflation target is becoming universal for central banks around the world, and there is also a clamoring for a more rules-based international monetary system. One reason for the clamoring is research showing that the increased exchange rate and capital flow volatility of recent years has been due in part to a deviation from a rules-based system. Now is an opportune time to move in the direction of a rules-based international system by simply reporting on the policy strategy in each country. Changing the inflation target in these strategies unilaterally would make this more difficult.

For all these reasons, I would be hesitant to change the inflation target introduced 25 years ago.  But as research on policy rules at the Fed and elsewhere continues, I hope two related concerns are addressed.

First, there is a danger in the way that the numerical inflation target has come to be used in practice. It seems that even if the actual inflation rate is only a bit below the 2% inflation target—say 1.5% or 1.63%—there is a tendency for people to call for the central bank to press the accelerator all the way to the floor. This is not good monetary policy; it is not consistent with any policy rule I know, and it could create excesses or even bubbles in financial markets. This problem could be remedied as the Fed continues to clarify its strategy.

Second, the greater attention to a numerical inflation target may have reduced attention to other aspects of the policy rules, including the idea that we need a policy rule at all. In other words, trying to give more precision to π* may have led to less precision about other parameters, including the sizes of the responses. Recall that the Fed and other central banks moved toward rules-based policy well before they adopted formal numerical inflation targets.  Most of the move to rules-based policy occurred during the period when Paul Volcker and Alan Greenspan simply said that inflation should be low enough that it did not interfere with decision-making. Again, I think this problem can be remedied as the Fed continues to clarify its strategy.

Posted in Monetary Policy