Still Learning From Milton Friedman: Version 3.0

We can still learn much from Milton Friedman as we celebrate his 105th birthday today. Here I consider what we can learn from his participation in the monetary policy debates of the 1960s and 1970s. I draw from a paper I presented to lead off his 90th birthday celebration in Chicago in 2002 and from two pieces written for the centennial of his birth in 2012: a paper presented at the centennial celebration and an article written on his 100th birthday. The lessons are highly relevant to the debates that have raged over the last 15 years and continue today.

Back in the early 1960s, the Keynesian school first came to Washington, led by Paul Samuelson, who advised John F. Kennedy during the 1960 election campaign and recruited Walter Heller and James Tobin to serve on the Council of Economic Advisers. The Keynesian approach received its official Washington introduction when Heller, Tobin, and their colleagues wrote the Kennedy Administration’s first Economic Report of the President, published in 1962. The Report made the case for discretion rather than rules. For monetary policy it said that a “discretionary policy is essential, sometimes to reinforce, sometimes to mitigate or overcome, the monetary consequences of short-run fluctuations of economic activity.”

In that same year Milton Friedman published Capitalism and Freedom (1962) giving the competing view. He argued that “the available evidence . . . casts grave doubt on the possibility of producing any fine adjustments in economic activity by fine adjustments in monetary policy—at least in the present state of knowledge . . . There are thus serious limitations to the possibility of a discretionary monetary policy and much danger that such a policy may make matters worse rather than better . . .”

So there were two different views, and the fundamental difference was over discretion versus rules-based policy. From the mid-1960s through the 1970s the Samuelson view was winning, with practitioners putting discretionary monetary policies into practice, mainly the go-stop policies that led to both higher inflation and higher unemployment. Friedman remained a persistent and resolute champion of the alternative view. Fortunately, Friedman’s arguments eventually won the day, and American economic policy moved away from an emphasis on discretion in the 1980s and 1990s. Paul Volcker, who as chair of the Fed implemented the more rules-based policy, had to confront this disparity of views as he did so, as I described here.

But this same policy debate is back today. Economists on one side push for more discretionary monetary policy, such as the quantitative easing programs, and resist the notion of rules-based monetary policy. Other economists argue for a return to a more predictable and rule-like monetary policy. They argue that the bouts of quantitative easing were not very effective and that deviations from rules-based policy helped worsen the Great Recession.

Of course there are many nuances today, some related to the difficulty of distinguishing between rules and discretion when the zero lower bound is thought to be a constraint. Interestingly, you frequently hear people on both sides channeling Milton Friedman to make their case.

The debate is not merely academic. Rather, it is a debate of enormous practical consequence, with the well-being of millions of people on the line. The House of Representatives has passed a bill calling for the Fed to describe its policy rule, and recent Fed reports have talked about normalization, raising questions about a return to rules-based policy. The same issues arise in discussions of unconventional monetary policy in Europe and Japan.

Can the disagreements be resolved? Milton Friedman was optimistic that debates could be resolved, and I am sure that this is one reason why he kept researching and debating the issue so vigorously.

Today people on both sides can learn from him. First, while a vigorous debater, he was respectful and avoided personal attacks. Second, he had a strong belief that empirical evidence would bring people together. Yes, people would come to the issue with widely different prior beliefs, but their posterior beliefs—after evidence was collected and analyzed—would be much closer. In this way the disagreement would eventually be resolved.

Although posterior beliefs in the monetary area now seem just as far apart as prior beliefs were 50 years ago, I sense that empirical work on the policy decisions of recent years, like the empirical paper by Alex Nikolsko-Rzhevskyy, David Papell, and Ruxandra Prodan, can bring about more convergence of views. As I said at the time of his 100th birthday, clearly we can learn a lot from Milton Friedman in deciding how to proceed.

Posted in Monetary Policy

Debate Over the Very Principles of Economics

Today is the launch of the online version of my Economics 1 course (and the namesake of this blog and my Twitter handle) on the Principles of Economics for summer 2017. This year is also the tenth anniversary of the start of the Global Financial Crisis and the Great Recession, which began in 2007.

During these ten years there has been a great deal of hand-wringing among economists and others about the subject of economics. This is an important debate, and the different positions deserve to be covered in the basic economics course.

As early as 2009, a cover of The Economist magazine showed a book titled “Modern Economic Theory” melting into a puddle to illustrate what the writers viewed as the problem with economics. It was the most talked-about issue of the year.

Some economists have been calling for a complete redo of economics—or for a return to a version of the subject popular decades ago. They say that economics failed to prevent the Great Recession and the Global Financial Crisis, or even led to them. Many of these economists argued for a change in government policy, saying that John Maynard Keynes was right and Milton Friedman was wrong.

Paul Samuelson spoke this way in an interview in the New Perspectives Quarterly in 2009 saying, “today we see how utterly mistaken was the Milton Friedman notion that a market system can regulate itself… This prevailing ideology of the last few decades has now been reversed…I wish Friedman were still alive so he could witness how his extremism led to the defeat of his own ideas”.

Paul Krugman, in a piece in the New York Times Magazine in 2009, also faulted modern economics for bringing on the crisis. He said it focused too much on beauty over practicality and did not recognize the need for more government intervention to prevent and cure the crisis. His fix was to add more psychology to economics or to build better models of credit.

And over the years the debate has continued. Last year Thomas Sargent, commenting on a Handbook by macroeconomists, said the “collection belies uninformed critics who assert that modern macroeconomics was wrong footed by the 2007-2009 financial crisis….both before and after that crisis, working macroeconomists had rolled up their sleeves to study how financial frictions, incentive problems, incomplete markets, interactions among monetary, fiscal, regulatory, and bailout policies, and a host of other issues affect prices and quantities and good economic policies.”

But also last year Paul Romer, now chief economist at the World Bank, wrote a widely discussed piece called “The Trouble with Macroeconomics.” Then Ricardo Reis of the London School of Economics wrote a paper more supportive of economics with the title “Is something really wrong with macroeconomics?” My colleague John Cochrane commented positively on the views of Reis, and Noah Smith explained in a Bloomberg View column why “So Many Critics of Economics Miss What it Gets Right.”

So the debate moves on. My view, throughout this period, has been that the Great Recession and the Global Financial Crisis do not provide evidence of a failure of economics. Rather, these events vindicate the theory. The research I have done (here, for example, for the Fed’s Jackson Hole Conference in 2007) points instead to a deviation of economic policy from the type of policy recommended by economic principles, a deviation from the type of policy that was responsible for the remarkably good economic performance in the two decades before the crisis. Economists call this earlier period the Long Boom or the Great Moderation because of the remarkably long expansions and short, shallow recessions. In other words, the crisis did not occur because economic theory went wrong. It occurred because policy went wrong.

Posted in Financial Crisis, Teaching Economics

Economics 1 Online. No Charge.

This summer I will be offering my Stanford course Principles of Economics online for free. You can find out more and register for the course, Economics 1, on Stanford’s open online platform Lagunita. The course starts at 8 am PT on Monday, July 17, when I will post the first week’s videos and reading material, and it runs through 11:59 pm PT on September 18. It is possible to register and join the course at any time throughout this period.

The course is based on my on-campus course at Stanford. Each day after giving a 50-minute lecture, I recorded the same lecture divided into smaller segments for online viewing. We added graphs, photos, and other illustrations–just as in the on-campus course; we captioned and indexed the videos–an attraction not in the on-campus course; and we added study material, reviews, quizzes, and a discussion forum.

The first week of the course covers “The Basic Core of Economics” focusing on such ideas as opportunity cost and the supply and demand model with practical applications. Just learning this Basic Core is a significant and worthwhile accomplishment. The course then goes on to consider many topics in microeconomics and macroeconomics including key economic policy issues. I draw on experience in government and the private sector. The course stresses the key idea that economics is about making purposeful choices with limited resources and about people interacting with other people as they make these choices. Most of those interactions occur in markets, and the course is mainly about markets, including labor markets and capital markets.

People who participate in the open online course and take the short quizzes following each video will be awarded a Statement of Accomplishment, or a Statement of Accomplishment with Distinction. The latest version 8.0 of my textbook with Akila Weerapana, Principles of Economics, and its shorter versions, Principles of Microeconomics and Principles of Macroeconomics, can be purchased and downloaded to go along with the course thanks to FlatWorld.

I am pleased to say that there has already been a buzz about the course on Twitter during the past 24 hours. Here is a sampling:

  • Russell Roberts‏ @EconTalker: Great class. Great teacher. No charge. Get your basics right here.
  • Ike Brannon‏ @coachbuckethead: The most entertaining economist I know.
  • Brian Wesbury‏ @wesbury:  If you want to learn Economics from one of the best, click on this link!  What great news!
  • Chris Pippin‏ @ChrisPippin: Wow. Can’t recommend highly enough. This is the class and the professor that made me choose Econ as a major.
  • Juan Carlos Martinez‏ @juank700410: Educación gratuita y de calidad (free, high-quality education)

Thank you!


Posted in Teaching Economics

A Whole New Section on Policy Rules in Fed’s Report

The Federal Reserve Board’s semi-annual Monetary Policy Report issued by Chair Janet Yellen last Friday contains a whole new section called “Monetary Policy Rules and Their Role in the Federal Reserve’s Policy Process.” The section contains new information and is well worth reading. Below is an excerpt which first lists three “key principles of good monetary policy” that the Fed says are incorporated into policy rules; it then lists five policy rules, including the Taylor rule and four variations on that rule that the Fed uses, with helpful references in notes which are also excerpted below.

The three principles sound quite reasonable. On the third, called the “Taylor Principle” by Mike Woodford and others, the Fed is quite specific: it gives the numerical range for the response of the policy rate, the federal funds rate, to the inflation rate. The policy instrument is not mentioned specifically for the other two principles.

More information, including some algebra, is given in Figure A, which is reproduced below. It is good that one of the five policy rules, which the Fed calls the “Taylor (1993) rule, adjusted,” is based on the important 2000 research paper by David Reifschneider and John Williams on the zero lower bound, which I have written about here. Note that the Fed describes these rules using the unemployment rate rather than real GDP, relying on an empirical connection between the real GDP/potential GDP gap and the unemployment rate (Okun’s law). Note also that what the Fed calls the “balanced-approach rule” is the Taylor rule with a different coefficient on the cyclical variable.
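The algebra behind these rules is simple enough to sketch in a few lines. Here is a minimal illustration of the Taylor rule and the balanced-approach variant using the standard published coefficients (a 2 percent equilibrium real rate and a 2 percent inflation target); the unemployment numbers are made up for illustration and are not the Fed’s actual inputs:

```python
def taylor_rule(inflation, output_gap, r_star=2.0, pi_star=2.0, gap_coef=0.5):
    """Funds-rate prescription of the Taylor (1993) rule: the equilibrium
    real rate plus inflation, plus responses to the inflation gap and the
    output gap. The balanced-approach rule doubles the output-gap coefficient."""
    return r_star + inflation + 0.5 * (inflation - pi_star) + gap_coef * output_gap

def okun_gap(u_natural, u_actual, okun_coef=2.0):
    """Output gap implied by Okun's law: roughly two percentage points of
    GDP per point of unemployment above the natural rate."""
    return okun_coef * (u_natural - u_actual)

# Unemployment one point above the natural rate implies a -2% output gap,
# and the two rules then diverge by a full percentage point:
gap = okun_gap(u_natural=4.5, u_actual=5.5)                      # -2.0
print(taylor_rule(inflation=2.0, output_gap=gap))                # 3.0
print(taylor_rule(inflation=2.0, output_gap=gap, gap_coef=1.0))  # 2.0
```

This makes concrete why the balanced-approach rule prescribes lower rates than the Taylor rule whenever the economy is operating below potential.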

The Fed’s Report then goes on to compare the FOMC’s settings for the federal funds rate with the rules, as summarized in the next chart. It shows that the interest rate was too low for too long in the 2003-2005 period according to the Taylor rule (it is not clear whether the Fed was looking at the other rules back then), and that according to three of the rules the current fed funds rate should be moving up. (The Fed makes these calculations using its estimate of time variation in the neutral rate of interest.)

In reporting on well-known policy rules, the Fed is doing part of what is called for in the legislation that recently passed the House as Title X, Section 1001 of H.R. 10. However, aside from being positive about the three principles, the Report does not say much about the Fed’s own policy strategy, as is also called for in the legislation.

In addition, the report focuses extensively on differences, rather than similarities, in the policy rules, and on the differences in inputs to the policy rules. The differences in measures of inflation, the neutral interest rate, and other variables are part of monetary policy making and always will be. In reality they are a reason to use policy rules as a means of translating these differences in measurement into differences about policy in a systematic way. Such differences do not imply that policy rules or strategies are impractical, as the Report seems to suggest, at least based on some financial reporting.

Chair Yellen will testify at the Financial Services Committees of the House on Wednesday and at the Banking Committee of the Senate on Thursday of this week on the Report and other matters. The testimony and the questions and answers about the Report at the hearings will be well worth following.





Posted in Monetary Policy

Seeing Through the Fog of Federal Budget Forecasting

Every summer since 2010 I’ve charted the latest Congressional Budget Office (CBO) long-term projection of the federal debt, noting the similarity with the Fourth of July fireworks. But during these years, the CBO has changed its procedures several times, fogging up comparisons over time and lessons from experience.

Starting with my first post on this topic in 2010, CBO reported projections of the debt as a percentage of GDP going out 75 years based on their “alternative fiscal scenario,” which is more realistic than the “baseline scenario” that assumes no change in current law. Here is what the CBO projections made in 2009 and 2010 looked like. You can see the explosions clearly: the debt-to-GDP ratio forecast reached 767% by 2083 in the 2009 estimate, and went even higher in the 2010 estimate. I also sketched in an optimistic, but sensible, idea of what could happen the next year, 2011.

Perhaps because the explosions looked so bad, CBO implemented procedural changes that made their projections look less like fireworks. First, starting with the 2011 projections, CBO stopped reporting the debt-to-GDP ratio once it exceeded 200 percent of GDP, which turned out to be in 2031, ending the exercise 50 years earlier than the previous projection. So, in my blog post about the 2011 projections I had to calculate my own debt projection. It was based entirely on CBO assumptions and is shown in the next chart, which made clear that the explosion was still there even though CBO stopped publishing it. (As you can see, the 2011 projection was not what I had hoped for.)

The second CBO procedural change was to discontinue the use of the “alternative fiscal scenario” in the long-term projections, which made it impossible to update my plots and make comparisons as before. So, in the 2013 post, I simply superimposed CBO’s longer-term 2009 projection on their shorter-term 2013 projection, and thereby simulated a comparison, as in the next chart. As the chart shows quite clearly, the debt picture had not improved at all.

The third change at CBO was to stop using the alternative fiscal scenario completely, and instead rely only on the “baseline scenario” which unrealistically assumes current law remains fixed.

This change was unfortunate in my view. In fact, it turns out that the “alternative fiscal scenario” has been more accurate than the “baseline scenario.” To show this, I compared the CBO 2010 projection of the debt for the years 2011 to 2016 with the actual debt just reported by CBO in their March 2017 long-term budget outlook. The next chart shows how much closer the alternative fiscal scenario is to what actually transpired compared with the baseline scenario.

In any case, without the alternative fiscal scenario, it is not possible to continue updating and comparing the charts in an apples-to-apples fashion. Fortunately, the Committee for a Responsible Federal Budget (CRFB) has come to the rescue by filling the void left by CBO’s omission of its alternative fiscal scenario. In a recent piece, How High Will Debt Rise If Current Policy Continues?, CRFB created and reported their own alternative fiscal scenario, writing that “as we show in this piece, debt could grow far higher if policymakers continue to act as they have in recent years.”

Here is a chart from the CRFB study which helps to illustrate the differences; it goes out to 2047 and thereby shows part of the explosion. To be sure, this projection, which was done in April, does not include the impacts of already-enacted or administratively-executed regulatory changes, and clearly does not include the tax reform, budget reform, and monetary reform that have been proposed. Without these reforms, the explosive story is the same.

But, as I have argued for a long time, with these reforms the budget projections and the economy would be quite different. The CBO assumes that real GDP will grow only 1.9 percent per year during the period of its long-term budget projection, which CBO divides into 1.6 percent for productivity growth and 0.3 percent for growth in worker-hours. With the changes in policy, including the fiscal consolidation plan implicit in the chart above that recommended budget reform in 2011, growth would be higher, say 3 percent, as I argued here, with both productivity growth and labor force participation rates rising. In sum, despite the fog that may have been created by changes in CBO’s budget procedures, the history of these forecasts and the failure of policy thus far show that the economy is still like a caged eagle ready to be set free, a fine analogy for the Fourth of July.
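A back-of-the-envelope compounding calculation (my own illustration, not CBO’s) shows how much the growth assumption matters for these debt ratios over a horizon roughly in line with the long-term projections:

```python
def cumulative_growth(rate, years):
    """Output multiple after compounding an annual growth rate."""
    return (1 + rate) ** years

years = 30
low = cumulative_growth(0.019, years)   # CBO's assumed growth rate
high = cumulative_growth(0.03, years)   # higher growth with reforms, as argued above
print(f"GDP multiple at 1.9%: {low:.2f}")   # about 1.76
print(f"GDP multiple at 3.0%: {high:.2f}")  # about 2.43
# Holding the debt path fixed, the larger GDP denominator alone cuts the
# debt-to-GDP ratio by more than a quarter:
print(f"Reduction in debt-to-GDP from growth alone: {1 - low/high:.0%}")  # 28%
```

The point of the arithmetic is simply that seemingly small differences in annual growth compound into large differences in the projected ratios.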


Posted in Budget & Debt

Macro Model Comparison Research Takes Off

Last week a new Macroeconomic Modelling and Model Comparison Network (MMCN) was launched with a research conference at Goethe University Frankfurt. Economists from the IMF, the Fed, the ECB, and other central banks presented and compared policy models along with academics from Chicago, Penn, Amsterdam and elsewhere.  Such collaboration and exchange will define the new network.  Judging from the first conference, it got off to a good start.

The conference led off with a critical review of macroeconomic models used for policy from a finance perspective by Winston Dou, Andrew Lo, Ameya Muley, and Harald Uhlig. The paper surveyed monetary models used at central banks, and it pointed out problems with current models, especially linearized versions, suggesting newer solution, estimation, and evaluation methods.  The paper also proposed research on a new generation of policy models incorporating the financial sector, the government balance sheet, unconventional monetary policy, heterogeneity, reallocation, redistribution effects, nonlinear risk-premiums, time-varying uncertainty, and imperfect product markets.

So there is much work to do, and indeed the conference showed that work had begun. Doug Laxton of the IMF presented research on new models with credit and financial constraints. Roberto Motto examined the ECB’s approach to unconventional monetary policy. John Roberts examined the implications of recent lower estimates of the equilibrium interest rates using both the ECB’s model and the Fed’s main model.  Maik Wolters showed that the evidence was weak that the equilibrium real rate has fallen. There were also parallel sessions on fiscal policy, macro-prudential policy, and international monetary policy along with poster sessions with 20 different presentations with a range of fresh new ideas on models used to find good monetary policy rules.

The network will be operated under the auspices of the Centre for Economic Policy Research in London, which is directed by Richard Baldwin. The network, which welcomes researchers interested in policy and model comparisons, is one part of a larger project called the Macroeconomic Model Comparison Initiative (MMCI) organized by Michael Binder, Volker Wieland, and me. That initiative includes the Macroeconomic Model Data Base, which already has 82 models that have been developed by researchers at central banks, international institutions, and universities. Key activities of the initiative are comparing solution methods for speed and accuracy, performing robustness studies of policy evaluations, and providing more powerful and user-friendly tools for modelers.

An essential prerequisite of the initiative is that the models be put forth in transparent ways that make replication easy.  Obviously quantitative macroeconomic models play an important role in informing policy makers about the consequences of monetary and fiscal policies, and an objective of the new initiative is to improve the interface between researchers and policy makers, and thus make better use of economics to improve policy.


Posted in Uncategorized

Reserve Balances and the Fed’s Balance Sheet in the Future

An important part of the Fed’s normalization policy is to reduce its holdings of securities and thereby the reserve balances—deposits of banks at the Fed—used to finance those holdings. As I argued when quantitative easing began in 2009, this reduction should be predictable and strategic. That view was given some empirical support by the “taper tantrum” of 2013, when Ben Bernanke abruptly said in a congressional hearing that the Fed’s purchases of securities would taper in “the next few meetings.” In contrast, when the tapering later became more predictable, markets digested it easily.

The Addendum to the Policy Normalization Principles and Plans recently issued by the Fed conforms to this gradual and predictable approach. The Fed said it intends to reduce its holdings of Treasury and mortgage-backed securities by decreasing reinvestment of principal payments to the extent that they exceed gradually phased-in caps. As stated in the Addendum, the Fed “anticipates that the caps will remain in place once they reach their respective maximums so that the Federal Reserve’s securities holdings will continue to decline in a gradual and predictable manner until the Committee judges that the Federal Reserve is holding no more securities than necessary to implement monetary policy efficiently and effectively. Gradually reducing the Federal Reserve’s securities holdings will result in a declining supply of reserve balances.”
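The caps mechanism can be sketched in a few lines of Python. The dollar figures below reflect my reading of the Addendum’s announced Treasury phase-in schedule and should be treated as illustrative:

```python
def treasury_cap(month):
    """Monthly runoff cap for Treasuries, per my reading of the Addendum's
    announced schedule: $6 billion initially, rising in $6 billion steps
    every three months until reaching a $30 billion maximum."""
    return min(6 * (month // 3 + 1), 30)

def runoff(principal_payments, cap):
    """Only principal payments above the cap are reinvested,
    so monthly runoff (in $ billions) is limited to the cap."""
    return min(principal_payments, cap)

# In month 4 the cap is $12B; $25B of maturing principal then means
# $12B rolls off the balance sheet and $13B is reinvested.
cap = treasury_cap(4)          # 12
print(runoff(25, cap))         # 12
print(25 - runoff(25, cap))    # 13 reinvested
```

The cap is what makes the decline in holdings gradual and predictable: runoff can never exceed it, no matter how lumpy the maturity schedule is in any given month.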

The statement that the supply of reserve balances will decline in a gradual and predictable manner is welcome. But there is still the important question of what the Fed is aiming for. As explained in the Addendum, the “Committee currently anticipates reducing the quantity of reserve balances, over time, to a level appreciably below that seen in recent years but larger than before the financial crisis; the level will reflect the banking system’s demand for reserve balances and the Committee’s decisions about how to implement monetary policy most efficiently and effectively in the future. The Committee expects to learn more about the underlying demand for reserves during the process of balance sheet normalization.”

It is significant that the Fed refers to reserve balances so much in this statement. There are two basic approaches to the question of what the Fed should aim for, and the level of reserve balances in the balance sheet is the key difference between them. One approach is for the Fed to aim at an eventual balance sheet and a corresponding level of reserve balances in which the interest rate is determined by the demand and supply of reserves—in other words, by market forces—rather than by an administered rate under interest on excess reserves. (To be sure, during the normalization or transition period, with inherited high reserve balances, there is no choice but to use interest on excess reserves.) Conceptually this means the Fed would eventually operate under a framework as it did in the two decades before the crisis. Most likely the level of reserve balances will be greater than the levels of 2007, but that will depend on liquidity regulations. The defining concept of this approach is a market-determined interest rate.

I think the case can be made for such a framework. The assessment of Peter Fisher, who ran the trading desk at the New York Fed for many years, is that such a framework would work. At the recent monetary policy conference at the Hoover Institution, he said, “we could get back and manage it with quantities; it’s not impossible. We could just re-engineer the system and go back to the way we were.” I agree, based on the time I spent in the markets for federal funds in those days, watching how they operated and writing up an institutional description and model of how people traded in those markets. If we went back to that framework, there would be no need for interest on excess reserves. If the Fed wanted to change the short-term interest rate, it would just adjust the supply of reserves. The amount of reserves would be set so that the supply of and demand for reserves determine the interest rate.

The Fed could also provide liquidity support if it needed to do so in this framework. Recall the events of 9/11 when the devastating physical damage led the Fed to provide effective lender of last resort loans. So you can have that kind of liquidity support in such a regime.

If it wanted to, the Fed could operate with a corridor system in this framework. There would be a lower interest rate on deposits at the floor of the corridor, a higher interest rate on borrowing at the ceiling of the corridor, and, most important, a market-determined interest rate above the floor and below the ceiling.
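A stylized sketch makes the corridor mechanics concrete; the demand curve and its parameters are my own illustration, not an estimate of actual reserve demand:

```python
def corridor_rate(reserve_supply, floor, ceiling, a=6.0, b=0.004):
    """Stylized corridor system. Demand for reserves is a downward-sloping
    line, rate = a - b * reserves (a and b are made-up illustrative
    parameters). The deposit rate acts as a floor because banks will not
    lend below it; the central bank's lending rate caps the market rate
    at the ceiling."""
    market_rate = a - b * reserve_supply
    return min(max(market_rate, floor), ceiling)

# With scarce reserves, the rate sits inside the corridor and moves
# with the supply of reserves chosen by the central bank:
print(corridor_rate(1000, floor=1.0, ceiling=3.0))  # 2.0
# Flood the system with reserves and the rate pins to the floor,
# turning the corridor into a de facto "floor" system:
print(corridor_rate(5000, floor=1.0, ceiling=3.0))  # 1.0
```

The sketch also previews the alternative: once reserves are abundant enough, the market-determined segment of the corridor disappears and only the administered floor rate matters.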

This approach creates an important connection between the Fed’s policy interest rate and the amount of reserves or money in the system. The Fed is responsible for reserves and money, and that connection is important to maintain. Without that connection, you raise the chances of the Fed being a multipurpose institution, which leads people to raise questions about its independence.  The Fed has already been involved in credit allocation with mortgage-backed securities purchases, and Charles Plosser argues it might do much more.

The second approach is a system in which the quantity of reserves supplied remains well above demand, and the interest rate is administered through interest on excess reserves, as recently discussed along with other normalization issues by Fed Governor Powell. The method is sometimes called a “floor” system, but the federal funds rate moves a bit below the floor, so it is not really a floor. In any case, the interest rate is not market determined.

Those who support the second approach argue that more reserves than the amount needed to determine the interest rate are needed for liquidity purposes. Some (see Todd Keister) argue that the payment system doesn’t function well with a smaller amount of reserves. In the past there were large daylight overdrafts. However, one could limit the size of the overdrafts, perhaps as a percentage of collateral. There also may be some regulatory changes that would reduce the demand for liquidity.

Some argue that with a large balance sheet the Fed could provide depository services to regular people, just like it provides depository services to banks, with advantages described by John Cochrane here in an earlier conference volume. The Treasury could provide that service without interfering with the Fed’s operations, however, or there may be other ways to provide the service without creating a disconnect between the interest rate and reserves.

Others argue that a permanently large balance sheet with large reserve balances would allow quantitative easing to be used regularly.  I don’t think quantitative easing has been that effective, and because there is uncertainty about its impact, it is hard to conduct a rules-based monetary policy with such interventions.  Moreover, the spreading of quantitative easing across the international monetary system adds turbulence to exchange rates and capital flows.

In sum, we should not only be thinking about how to reduce the size of the balance sheet in a predictable, strategic way. We should also be thinking about where reserve balances are going.  I think the first proposal described here makes sense. After the normalization, after the transition is finished, interest rates would again be determined by market forces.

Posted in Financial Crisis, Monetary Policy, Regulatory Policy