1/17/25 Podcast – Authors Glenn Hubbard & Tony O’Brien discuss the pros/cons of tariffs and the impact of AI on the economy.

Welcome to the first podcast of the Spring 2025 semester from the Hubbard/O’Brien Economics author team. Check back for blog updates and future podcasts, which will appear every few weeks throughout the semester.

Join authors Glenn Hubbard & Tony O’Brien as they offer thoughts on tariffs in advance of the new administration taking office. They discuss the positive and negative impacts of tariffs, as well as some of the unintended consequences. They also look at the AI landscape and how it’s reshaping the U.S. economy. Is AI responsible for the recent increase in productivity, or is the increase the result of other factors? It’s a question worth following closely as AI becomes more ingrained in our economy.

https://on.soundcloud.com/8ePL8SkHeSZGwEbm8

Did Stephen King Stumble into One of Our “Pitfalls in Decision Making”?

The cover of Stephen King’s novel The Stand. (Image from amazon.com)

In Microeconomics, Chapter 10, we have a section on “Pitfalls in Decision Making.” One of those pitfalls is the failure to ignore sunk costs. A sunk cost is one that has already been paid and cannot be recovered.

In his book On Writing: A Memoir of the Craft, King discusses his writing of The Stand (a book he describes as “the one my longtime readers still seem to like the best”). At one point, he had trouble finishing the manuscript and was considering whether to stop working on the novel:

“If I’d had two or even three hundred pages of single-spaced manuscript instead of more than five hundred, I think I would have abandoned The Stand and gone on to something else—God knows I had done it before. But five hundred pages was too great an investment, both in time and in creative energy; I found it impossible to let go.”

King seems to have committed the error of failing to ignore sunk costs. The time and creative energy he had put into writing the 500 pages were sunk: whether he abandoned the manuscript or continued writing until the book was finished, he couldn’t get back the time and energy he had expended on writing the first five hundred pages. Whether he had already written 300 pages or 500 pages wasn’t relevant to his decision, because if a cost is sunk it doesn’t matter for decision making whether the cost is large or small.

Is it relevant in assessing King’s decision that in the end he did finish The Stand, the novel sold well—earning King substantial royalties—and his fans greatly admire it? Not directly, because only with hindsight do we know that The Stand was successful. In deciding whether to finish the manuscript, King shouldn’t have worried about the cost of the time and energy he had already spent writing it. Instead, he should have compared the expected marginal cost of finishing the manuscript with the expected marginal benefit from completing the book. Note that the expected marginal benefit could include not only the royalty earnings from sales of the book, but also the additional appreciation he received from his fans for writing what turned out to be their favorite novel.

When King paused working on the manuscript after having written 500 pages, the marginal cost of finishing was the opportunity cost of not being able to spend those hours and creative energy writing a different book. Given the success of The Stand, the marginal benefit to King from completing the manuscript was almost certainly greater than the marginal cost. So, completing the manuscript was the correct decision, even if he made it for the wrong reason!

The Southern California Wildfires and Problems in the Insurance Industry 

Fire damage in the Pacific Palisades. (Photo from Reuters via the Wall Street Journal)

As of January 15, the series of devastating wildfires in Southern California has killed at least 25 people and destroyed billions of dollars’ worth of homes and businesses. Adding to the tragedy is the fact that many homeowners aren’t fully insured against the damage. As a result, they lack the funds necessary to rebuild their homes. Unfortunately for these people, the market for fire insurance in California hasn’t been working well.

In the United States, regulation of property and casualty insurance occurs at the state level with regulations differing substantially across states. In California, insurance companies face an unusually long regulatory process to receive permission to increase the premiums they charge. The delays in raising premiums have contributed to companies not renewing property insurance policies in some areas, such as those prone to wildfires. In these areas, the payouts the companies expect to make have been higher than the premiums that California regulators have allowed companies to charge policyholders.

The wildfires have ravaged the Pacific Palisades neighborhood of Los Angeles. Although housing prices in the neighborhood are among the highest in the country, an analysis by the Reuters news agency showed that: “Measured against home values, insurance costs are cheaper in the Palisades than in 97% of U.S. postal codes …” For example, the median insurance premium in the Pacific Palisades was “less than residents paid in Glencoe, Illinois, an upscale suburb of Chicago where homes are two-thirds cheaper and the risk of wildfire is minimal.”

Catastrophe modeling is a way of statistically forecasting the probability of events—such as floods or wildfires—occurring that would sharply increase claims by policyholders. Regulations had barred insurance companies from using catastrophe modeling to justify increases in premiums. (State regulators lifted the prohibition on the use of catastrophe modeling shortly before the fires.) These restrictions made it more difficult for companies to charge risk-based premiums, which are based on the probability that a policyholder will file a claim.

Insurance markets can experience adverse selection problems because the people most eager to buy insurance are those with the highest probability of requiring an insurance payout. Insurance companies attempt to reduce adverse selection problems by, among other things, charging risk-based premiums. Limiting the ability of insurance companies to charge risk-based premiums increased the adverse selection problems the companies face. To cope with the problem of companies not renewing policies, regulators began requiring companies to renew policies in some ZIP codes, particularly those in or near areas that had experienced wildfires. This policy further increased adverse selection.

By 2023, some insurers, including State Farm and Allstate—two of the largest property insurers in the United States—had decided that they were unlikely to be able to cover their costs from offering property insurance policies in California and stopped writing policies in the state. Policyholders who are unable to obtain a policy from a private insurance company typically buy a policy offered through the Fair Access to Insurance Requirements (FAIR) Plan. The FAIR Plan is sponsored by the state government, although operated by private insurance companies. The premiums charged for a FAIR Plan policy are significantly higher than the premiums charged for a traditional policy. Despite the higher premiums, the number of FAIR Plan policies doubled between 2020 and 2025, reaching nearly 500,000.

The FAIR Plan lacks sufficient funds to pay the claims of policyholders who lost their homes or businesses in the Southern California wildfires. To cover the deficit, the FAIR Plan will assess private insurance companies, which, in turn, will raise the premiums charged to their other policyholders. In this way, some of the costs of the wildfires will be borne by all property insurance policyholders in California, even those who live far from the areas affected by the wildfires.

We discuss moral hazard in insurance markets in Microeconomics and Economics, Chapter 7 (and in Money, Banking, and the Financial System, Chapter 11). In general, moral hazard refers to actions people take after they have entered into a transaction that make the other party to the transaction worse off. Moral hazard in insurance markets occurs when people change their behavior after becoming insured. The way that the insurance market is regulated in California and, in particular, the way that the FAIR Plan is administered increases moral hazard because people who own homes or businesses in areas with a greater risk of damage from wildfires don’t pay premiums that fully reflect that greater risk. In other words, more people live in fire prone areas in California than would do so if the premiums on their insurance policies fully reflected the probability of their making a claim.

Whether, following the wildfires, the California legislature will change the regulations governing the insurance market is unclear at this point. As an insurance agent quoted by the Wall Street Journal put it: “We are in uncharted territory.”

Why Were Glenn and Tony Particularly Happy in 2024?

Because inflation in chocolate chip cookie prices ended, of course! In December 2023, the average retail price of a pound of chocolate chip cookies (even we try not to eat a pound of cookies in one sitting!) was $5.12. By November 2024, the price had declined to $4.92 per pound. As the following figure shows, the days of 20 percent annual inflation in chocolate chip cookie prices that lasted from September 2022 to April 2023 are behind us—at least for now.

The following figure shows annual inflation in overall food prices as measured by the percentage change in the food component of the consumer price index (CPI) from the same month in the previous year. During 2024, food inflation has been running at an annual rate of between 2.0 percent and 2.5 percent. In contrast, chocolate chip cookies actually experienced deflation, with their price falling 1.1 percent. So the price of chocolate chip cookies relative to the prices of other food products declined by more than did their absolute price.
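As a rough check on that last claim, the change in the relative price can be computed from the two inflation rates. Below is a minimal sketch in Python, using the rates mentioned above and assuming the midpoint of the 2.0 to 2.5 percent range for overall food inflation:

```python
# Change in the price of cookies relative to food overall, computed from
# the two inflation rates discussed above (food rate is an assumed midpoint).
cookie_inflation = -1.1 / 100   # chocolate chip cookies: -1.1 percent
food_inflation = 2.25 / 100     # overall food: assumed midpoint of 2.0-2.5 percent

# Relative price change: how much cookies got cheaper compared with food overall.
relative_change = (1 + cookie_inflation) / (1 + food_inflation) - 1
print(round(relative_change * 100, 1))  # about -3.3 percent
```

So the roughly 3.3 percent fall in the relative price is indeed larger than the 1.1 percent fall in the absolute price.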

We were led to contemplate chocolate chip cookie prices by this article in the New York Times that discusses the FRED data set. (A subscription may be required to access the article.) Like many other economists, we heavily use the FRED site and greatly appreciate the efforts of those at the Federal Reserve Bank of St. Louis who have made this data resource available.

The Curious Case of the GE Refrigerators

Is the refrigerator above different from the refrigerator below?

Consumer Reports is a magazine and web site devoted to product reviews. Its November-December 2024 issue noted something unusual about the two GE refrigerators shown above. (The images are from the geappliances.com web site.) At the time the issue was printed, the first refrigerator above had a price of $2,300 and the second refrigerator had a price of $1,300. Consumer Reports noted:

“These look-alike fridges offer equally impressive performance, have the same interior features … and are from the same brand. So why the $1,000 price difference? We don’t know.”

(Note: GE refrigerators, and many other products branded with the GE name, are no longer produced by the General Electric Company, which dates back to 1892 and was co-founded by Thomas Edison. Today, GE is primarily an aerospace company, and GE appliances are produced by the Chinese-owned Haier Smart Home Company.)

If we assume that Consumer Reports is correct and the two refrigerators are identical, what strategy is the firm pursuing by charging different prices for the same product? As we discuss in Microeconomics, Chapter 15, Section 15.5, firms can increase their profits by practicing price discrimination—charging different prices to different customers. To pursue a strategy of price discrimination, the firm needs to be able to divide up—or segment—the market for its refrigerators.

Firms sometimes use a high price to signal quality. The old saying “you get what you pay for” can lead some consumers to expect that when comparing two similar goods, such as two models of refrigerators, the one with the higher price also has higher quality. In the appliance section of a large store, such as Lowe’s or the Home Depot, or online at Amazon or another site, you will have a wide variety of refrigerators to choose from. You may have trouble evaluating the features each model offers and be unable to tell whether a particular model is likely to be more or less reliable. So, if you are choosing between the two GE models shown above, you may decide to choose the one with the higher price because the higher price may indicate that the components used are of higher quality.

In this case, GE may be relying on a segmentation of the market between consumers who carefully research the features and quality of the refrigerators that different firms offer for sale and those who don’t. The consumers who do careful research and are aware of all the features of each model may be more sensitive to price and, therefore, have a high price elasticity of demand. The consumers who haven’t done the research may be relying on the price as a signal of quality and, therefore, have a lower price elasticity of demand.

If what we have just outlined was the firm’s strategy for increasing profit by charging different prices for two models that are, apparently, either identical or very similar, it doesn’t seem to have worked. Note that the model shown in the first photo above is the one that had a price of $2,300 when Consumer Reports wrote about it, but now has a price of $1,154.40. The model shown in the second photo had a price of $1,300, but now has a price of $1,399.00. In other words, the model that had the lower price now has the higher price and the model that had the higher price now has the lower price.

What happened between the time the issue of Consumer Reports was published and now? We might conjecture that there are few consumers who would be likely to pay $1,000 more for a refrigerator that seems to have the same features as another model from the same company. In other words, segmenting consumers in this way seems unlikely to succeed. Why the firm then decided to make its formerly higher-priced model its lower-priced model, though, is difficult to explain without knowing more about the firm’s pricing strategy.

The Amazing Rise of Nvidia

Nvidia’s headquarters in Santa Clara, California. (Photo from nvidia.com)

Nvidia was founded in 1993 by Jensen Huang, Chris Malachowsky, and Curtis Priem, electrical engineers who started the company with the goal of designing computer chips that would increase the realism of images in video games. The firm achieved a key breakthrough in 1999 when it invented the graphics processing unit, or GPU, which it marketed under the name GeForce256. In 2001, Microsoft used an Nvidia chip in its new Xbox video game console, helping Nvidia to become the dominant firm in the market for GPUs.

The technology behind GPUs has turned out to be usable not just for gaming, but also for powering AI—artificial intelligence—software. The market for Nvidia’s chips exploded, with technology giants Google, Microsoft, Facebook, and Amazon, as well as many startups, ordering large quantities of Nvidia’s chips.

By 2016, Nvidia CEO Jen-Hsun Huang could state in an interview that: “At no time in the history of our company have we been at the center of such large markets. This can be attributed to the fact that we do one thing incredibly well—it’s called GPU computing.” Earlier this year, an article in the Economist noted that: “Access to GPUs, and in particular those made by Nvidia, the leading supplier, is vital for any company that wants to be taken seriously in artificial intelligence (AI).”

Nvidia’s success has been reflected in its stock price. When Nvidia became a public company in 1999 by undertaking an initial public offering (IPO) of stock, a share of the firm’s stock had a price of $0.04, adjusted for later stock splits. The large profits Nvidia has been earning in recent years have caused its stock price to rise to more than $140 a share.

(With a stock split, a firm reduces the price per share of its stock by giving shareholders additional shares while holding the total value of the shares constant. For example, in June of this year Nvidia carried out a 10-for-1 stock split, which gave shareholders nine additional shares of stock for each share they owned. The total value of the shares was the same, but each share now had a price that was 10 percent of its price before the split. We discuss the stock market in Microeconomics, Chapter 8, Section 8.2, Macroeconomics, Chapter 6, Section 6.2, and Economics, Chapter 8, Section 8.2.)
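The arithmetic of a split can be sketched in a few lines of Python (the share count and pre-split price below are hypothetical, chosen only to illustrate the mechanics):

```python
# A 10-for-1 stock split: each old share becomes 10 new shares, and the
# per-share price falls to one-tenth of its pre-split level.
shares_before = 100        # hypothetical number of shares an investor owns
price_before = 1200.00     # hypothetical pre-split price per share

split_ratio = 10
shares_after = shares_before * split_ratio   # 1,000 shares
price_after = price_before / split_ratio     # $120.00 per share

# The total value of the holding is unchanged by the split.
assert shares_before * price_before == shares_after * price_after
```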

The following figure from the Wall Street Journal shows the sharp increase in Nvidia’s stock price over the past three years as AI has become an increasingly important part of the economy.

Nvidia’s market capitalization (or market cap)—the total value of all of its outstanding shares of stock—is $3.5 trillion. How large is that? Torsten Sløk, the chief economist at Apollo, an asset management firm, has noted that, as shown in the following figure, Nvidia’s market cap is larger than the total market cap—the total value of all the publicly traded firms—in each of five large economies.

Can Nvidia’s great success continue? Will it be able to dominate the market for AI chips indefinitely? As we noted in the Apply the Concept “Do Large Firms Live Forever?” in Microeconomics, Chapter 14, in the long run even the most successful firms eventually have their positions undermined by competition. That Nvidia has a larger stock market value than the total value of all the public companies in Germany or the United Kingdom is extraordinary and seems impossible to sustain. It may indicate that investors have bid up the price of Nvidia’s stock above the value that can be justified by a reasonable forecast of its future profits.

There are already some significant threats to Nvidia’s dominant position in the market for AI chips. GPUs were originally designed to improve computer displays of graphics rather than to power AI software. So one way of competing with Nvidia that some startups are trying to exploit is to design chips specifically for use in AI. Larger chips may also reduce the number of chips needed relative to GPUs, possibly lowering the total cost of the chips necessary to run sophisticated AI software. In addition, existing large technology firms, such as Amazon and Microsoft, have been developing chips that may be able to compete with Nvidia’s.

As with any firm, Nvidia’s continued success requires it to innovate sufficiently to stay ahead of the many competitors that would like to cut into the firm’s colossal profits.

Want a Raise? Get a New Job

Image of someone searching online for a job, generated by GPT-4o.

It’s become clear during the past few years that most people really, really, really don’t like inflation. Dating as far back as the 1930s, when very high unemployment rates persisted for years, many economists have assumed that most people view unemployment as a bigger economic problem than inflation. But the economic pain from unemployment is concentrated among the people who lose their jobs and their families, although some people also have their hours reduced by their employers, and in severe recessions even people who retain their jobs can be afraid of being laid off.

Although nearly everyone is affected by an increase in the inflation rate, the economic losses are lower than those suffered by people who lose their jobs during a period in which it may be difficult to find another one. In addition, we note in Macroeconomics, Chapter 9, Section 9.7 (Economics, Chapter 19, Section 19.7) that:

“An expected inflation rate of 10 percent will raise the average price of goods and services by 10 percent, but it will also raise average incomes by 10 percent. Goods and services will be as affordable to an average consumer as they would be if there were no inflation.”

In other words, inflation affects nominal variables, but over the long run inflation won’t affect real variables such as the real wage, employment, or the real value of output. The following figure shows movements in real wages from January 2010 through September 2024. Real wages are calculated as nominal average hourly earnings deflated by the consumer price index, with the value for February 2020—the last month before the effects of the Covid pandemic began affecting the United States—set equal to 100. Measured this way, real wages were 2 percent higher in September 2024 than in February 2020. (Although note that real wages were below where they would have been if the trend from 2013 to 2020 had continued.)
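The calculation behind a real wage index of this kind can be sketched as follows (the wage and CPI figures below are hypothetical stand-ins, not the actual BLS series):

```python
# Real wage index: nominal average hourly earnings deflated by the CPI,
# rescaled so the base month (February 2020) equals 100.
# All numbers below are hypothetical, for illustration only.
nominal_wage = {"2020-02": 28.50, "2024-09": 35.00}  # dollars per hour
cpi = {"2020-02": 258.7, "2024-09": 314.7}           # CPI index levels

real_wage = {m: nominal_wage[m] / cpi[m] for m in nominal_wage}
index = {m: 100 * real_wage[m] / real_wage["2020-02"] for m in real_wage}

print(round(index["2024-09"], 1))  # roughly 101: about 1 percent above the base month
```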

Although increases in wages do keep up with increases in prices, many people doubt this point. In Chapter 17, Section 17.1, we discuss a survey of the general public’s views on inflation conducted by Nobel laureate Robert Shiller of Yale. He asked in the survey how “the effect of general inflation on wages or salary relates to your own experience or your own job.” The most popular response was: “The price increase will create extra profits for my employer, who can now sell output for more; there will be no increase in my pay. My employer will see no reason to raise my pay.”

Recently, Stefanie Stantcheva of Harvard conducted a survey similar to Shiller’s and received similar responses:

“If there is a single and simple answer to the question ‘Why do we dislike inflation,’ it is because many individuals feel that it systematically erodes their purchasing power. Many people do not perceive their wage increases sufficiently to keep up with inflation rates, and they often believe that wages tend to rise at a much slower rate compared to prices.”

A recent working paper by Joao Guerreiro of UCLA, Jonathon Hazell of the London School of Economics, Chen Lian of UC Berkeley, and Christina Patterson of the University of Chicago sheds additional light on why people are skeptical that their wages will keep up with inflation once the labor market adjusts. Economists typically think of the real wage as adjusting to clear the labor market. If inflation temporarily reduces the real wage, the nominal wage will increase to restore the market-clearing value of the real wage.

But the authors of this paper note that, in practice, to receive an increase in your nominal wage you need to either (1) ask your employer to increase your wage, or (2) find another job that pays a higher nominal wage. They note that both of these approaches result in “conflict”: “We argue that workers must take costly actions (‘conflict’) to have nominal wages catch up with inflation, meaning there are welfare costs even if real wages do not fall as inflation rises.” The results of a survey they undertook revealed that:

“A significant portion of workers say they took costly actions—that is, they engaged in conflict—to achieve higher wage growth than their employer offered. These actions include having tough conversations with employers about pay, partaking in union activity, or soliciting job offers.”

Their result is consistent with data showing that workers who switch jobs receive larger wage increases than do workers who remain in their jobs. The following figure is from the Federal Reserve Bank of Cleveland and shows the increase in the median nominal hourly wage over the previous year for workers who stayed in their job over that period (brown line) and for workers who switched jobs (gray line).

Job switchers consistently earn larger wage increases than do job stayers, with the difference being particularly large during the high-inflation period of 2022 and 2023. For instance, in July 2022, job switchers saw median wage increases of 8.5 percent, compared with increases of 5.9 percent for job stayers.

The fact that to keep up with inflation workers have to either change jobs or have a potentially contentious negotiation with their employer provides another reason why the recent period of high inflation led to widespread discontent with the state of the U.S. economy.

Acemoglu, Johnson, and Robinson Win the 2024 Nobel Prize in Economics

Daron Acemoglu and Simon Johnson (Credit: Acemoglu, Adam Glanzman; Johnson, courtesy of MIT, from news.mit.edu)

James Robinson (photo from news.uchicago.edu)

Many economic studies have a relatively limited objective. For instance, a study might estimate the price elasticity of demand for soda in order to determine the incidence of a soda tax, or estimate a Keynesian fiscal policy multiplier in order to determine the effects of a change in federal spending or taxes. (We consider the first topic in Microeconomics, Chapter 6, and the second topic in Macroeconomics, Chapter 16.)

Other economic studies consider much broader questions, such as why are some countries rich and other countries poor? As the late Nobel laureate Robert Lucas once wrote: “The consequences for human welfare involved in questions like these are simply staggering: Once one starts to think about them, it is hard to think about anything else.”

Today, the Royal Swedish Academy of Sciences awarded the 2024 Nobel Prize in Economic Sciences to Daron Acemoglu and Simon Johnson of MIT, and to James Robinson of the University of Chicago, for “studies of how institutions are formed and affect prosperity.” Acemoglu, Johnson, and Robinson (AJR) have published work highlighting the key importance of a country’s institutions in explaining whether the country has experienced sustained economic growth. Their work builds on earlier studies by the late Douglass North of Washington University in St. Louis, who received the Nobel Prize in 1993.

The institutional approach to economic growth differs from other approaches that focus on variables such as temperature, prevalence of disease, ethnic fragmentation, resource endowments, or governments adopting flawed development strategies in explaining differences in growth rates in per capita income across countries.

Two of AJR’s most discussed papers are “The Colonial Origins of Comparative Development: An Empirical Investigation,” which was published in the American Economic Review in 2001 (free download here), and “Reversal of Fortune: Geography and Institutions in the Making of the Modern World Income Distribution,” which was published in the Quarterly Journal of Economics in 2002 (available here). In these papers, the authors argue that the institutions European countries established in their colonies helped determine economic growth in those countries even decades after colonization.

As with any analysis that covers many countries over long periods of time, AJR’s analysis of the effect of colonialism on economic growth has attracted critiques focused on whether the authors have gathered data properly and whether their data may be better explained with a different approach.

The authors, writing both separately and jointly, have explored many issues beyond the effects of colonialism on economic growth. The wide scope of their research can be seen by reviewing their curricula vitae, which can be found here, here, and here. The announcement by the Nobel committee can be found here.

CPI Inflation Running Slightly Higher than Expected

Image illustrating inflation, generated by GPT-4o.

This morning (October 10), the Bureau of Labor Statistics (BLS) released its monthly report on the consumer price index (CPI). As the following figure shows, the inflation rate for September measured by the percentage change in the CPI from the same month in the previous year—headline inflation (the blue line)—was 2.4 percent, down from 2.6 percent in August. That was the lowest headline inflation rate since February 2021. Core inflation (the red line)—which excludes the prices of food and energy—was unchanged at 3.3 percent. Both headline inflation and core inflation were slightly higher than economists surveyed by the Wall Street Journal had expected.

As the following figure shows, if we look at the 1-month inflation rate for headline and core inflation—that is, the annual inflation rate calculated by compounding the current month’s rate over an entire year—we see that headline inflation (the blue line) decreased from 2.3 percent in August to 2.2 percent in September. Core inflation (the red line) increased from 3.4 percent in August to 3.8 percent in September.
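Compounding a single month’s rate over a year works as in the short Python sketch below (the 0.3 percent monthly rate in the example is hypothetical):

```python
# Annualize a one-month inflation rate by compounding it over 12 months.
def annualized(one_month_pct: float) -> float:
    """Compound a single month's percentage change over a full year."""
    return ((1 + one_month_pct / 100) ** 12 - 1) * 100

# A hypothetical 0.3 percent monthly increase compounds to about 3.7 percent
# per year, somewhat more than the naive 12 x 0.3 = 3.6 percent.
print(round(annualized(0.3), 2))
```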

Overall, we can say that, taking 1-month and 12-month inflation together, the U.S. economy may still be on course for a soft landing—with the annual inflation rate returning to the Fed’s 2 percent target without the economy being pushed into a recession—but the increase in 1-month core inflation is concerning because most economists believe that core inflation is a better indicator of the underlying inflation rate than is headline inflation. Of course, as always, it’s important not to overinterpret the data from a single month, although this is the second month in a row that core inflation has been well above 3 percent. (Note, also, that the Fed uses the personal consumption expenditures (PCE) price index, rather than the CPI, in evaluating whether it is hitting its 2 percent inflation target.)

As we’ve discussed in previous blog posts, Federal Reserve Chair Jerome Powell and his colleagues on the FOMC have been closely following inflation in the price of shelter. The price of “shelter” in the CPI, as explained here, includes both rent paid for an apartment or house and “owners’ equivalent rent of residences (OER),” which is an estimate of what a house (or apartment) would rent for if the owner were renting it out. OER is included to account for the value of the services an owner receives from living in an apartment or house.

As the following figure shows, inflation in the price of shelter has been a significant contributor to headline inflation. The blue line shows 12-month inflation in shelter and the red line shows 1-month inflation in shelter. After rising in August, 12-month inflation in shelter resumed the decline that began in the spring of 2023, falling from 5.2 percent in August to 4.8 percent in September. One-month inflation in shelter—which is much more volatile than 12-month inflation in shelter—declined sharply from 6.4 percent in August to 2.7 percent in September. The members of the FOMC are likely to find the decline in inflation in shelter reassuring as they consider another cut to the target for the federal funds rate at the committee’s next meeting on November 6-7. Shelter has a weight of only 15 percent in the PCE price index that the Fed uses to gauge whether it is hitting its 2 percent inflation target, in contrast with the 33 percent weight that shelter has in the CPI.

Finally, in order to get a better estimate of the underlying trend in inflation, some economists look at median inflation and trimmed mean inflation. Median inflation is calculated by economists at the Federal Reserve Bank of Cleveland and Ohio State University. If we listed the inflation rate of each individual good or service in the CPI, median inflation is the inflation rate of the good or service in the middle of the list—that is, the inflation rate of the good or service that has an equal number of higher and lower inflation rates. Trimmed mean inflation drops the 8 percent of goods and services with the highest inflation rates and the 8 percent of goods and services with the lowest inflation rates.
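Both measures can be sketched in a few lines of Python. The component inflation rates below are hypothetical, and the sketch ignores the expenditure weights the Cleveland Fed applies to each component:

```python
# Median and trimmed-mean inflation across a list of component inflation rates.
def median_rate(rates):
    s = sorted(rates)
    n = len(s)
    mid = n // 2
    # Middle value (or average of the two middle values if the count is even).
    return s[mid] if n % 2 else (s[mid - 1] + s[mid]) / 2

def trimmed_mean_rate(rates, trim_pct=8):
    s = sorted(rates)
    k = round(len(s) * trim_pct / 100)  # items to drop from each tail
    trimmed = s[k:len(s) - k] if k else s
    return sum(trimmed) / len(trimmed)

# Hypothetical inflation rates for ten CPI components.
rates = [-3.0, -1.1, 0.5, 1.8, 2.4, 2.9, 3.3, 4.1, 5.2, 9.5]
print(median_rate(rates), round(trimmed_mean_rate(rates), 2))  # 2.65 2.39
```

Trimming discards the most extreme price movements (the -3.0 and 9.5 percent components here), so both measures are less sensitive to outliers than the ordinary mean.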

As the following figure (from the Federal Reserve Bank of Cleveland) shows, median inflation (the orange line) declined slightly from 4.2 percent in August to 4.1 percent in September. Trimmed mean inflation (the blue line) was unchanged at 3.2 percent. These data provide confirmation that core CPI inflation at this point is likely running at least slightly higher than a rate that would be consistent with the Fed achieving its inflation target.

The FOMC cut its target range for the federal funds rate by 0.50 percentage point (50 basis points), from 5.25 to 5.50 percent down to 4.75 to 5.00 percent, at its last meeting on September 17-18. Some economists and investors believed that the FOMC might cut its target by another 50 basis points at its next meeting on November 6-7. This inflation report makes that outcome less likely. In addition, the release of the minutes from the September 17-18 meeting revealed that a significant number of committee members may have preferred a 25 basis point cut rather than a 50 basis point cut at that meeting:

“However, noting that inflation was still somewhat elevated while economic growth remained solid and unemployment remained low, some participants observed that they would have preferred a 25 basis point reduction of the target range at this meeting, and a few others indicated that they could have supported such a decision.”

Investors who buy and sell federal funds futures contracts expect that the FOMC will cut its target for the federal funds rate by 0.25 percentage point at its November meeting. (We discuss the futures market for federal funds in this blog post.) As shown in the following figure, today these investors assign a probability of 80.3 percent to the FOMC cutting its target for the federal funds rate by 0.25 percentage point and a probability of 19.7 percent to the committee leaving its target unchanged.
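A stylized version of how such a probability can be backed out of futures prices: a federal funds futures contract settles at 100 minus the average effective federal funds rate for the contract month, so if only two FOMC outcomes are considered possible, the rate implied by the futures price pins down their probabilities. The sketch below uses hypothetical numbers; real calculations, such as the CME FedWatch tool, also account for where the meeting date falls within the month:

```python
# Stylized sketch of backing a probability out of a federal funds futures
# price. The contract settles at 100 minus the average effective federal
# funds rate for the month. With only two possible FOMC outcomes, the
# implied rate is a probability-weighted average of the two:
#   implied_rate = p * rate_if_cut + (1 - p) * rate_if_hold
# Solving for p gives the function below. (Hypothetical numbers; the CME
# FedWatch methodology also adjusts for the meeting date within the month.)
def implied_cut_probability(futures_price, rate_if_hold, rate_if_cut):
    implied_rate = 100.0 - futures_price
    return (rate_if_hold - implied_rate) / (rate_if_hold - rate_if_cut)

# Hypothetical inputs: holding keeps the rate at 4.83%, a 25bp cut gives 4.58%
p = implied_cut_probability(95.37, 4.83, 4.58)  # close to the roughly 80 percent cited above
```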

(Probably) the Final Word on the Non-Recession of 2022

Image generated by GPT-4o to illustrate GDP.

About one month after a calendar quarter ends, the Bureau of Economic Analysis (BEA) releases its advance estimate of real GDP. In July 2022, the BEA's advance estimates indicated that real GDP had declined in both the first and second quarters. A common definition of a recession is two consecutive quarters of declining real GDP. Accordingly, in mid-2022 there were a number of articles in the media suggesting that the U.S. economy was in a recession.
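The popular rule is mechanical enough to express in a few lines. The sketch below checks a series of quarterly growth rates for two consecutive declines; the growth figures are illustrative stand-ins for the advance and revised estimates discussed in this post, not exact BEA data:

```python
# Sketch of the popular "two consecutive quarters of declining real GDP"
# rule (not the NBER's definition of a recession).
def two_quarter_rule(quarterly_growth):
    """Return True if any two consecutive quarterly growth rates are negative."""
    return any(a < 0 and b < 0
               for a, b in zip(quarterly_growth, quarterly_growth[1:]))

# Illustrative annualized quarterly growth rates for 2022 (percent):
advance_2022 = [-1.6, -0.6]  # both negative: the rule signals "recession"
revised_2022 = [-1.0, 0.3]   # second quarter revised positive: rule no longer triggers
```

The fragility of the rule is visible here: a revision that flips one quarter's sign changes the verdict entirely, which is one reason economists prefer the NBER's broader definition.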

But, as we discussed at the time in this blog post, most economists don’t follow the popular definition of a recession as being two consecutive quarters of declining real GDP. Instead, as we discuss in Macroeconomics, Chapter 10, Section 10.3 (Economics, Chapter 20, Section 20.3), economists typically follow the definition of a recession used by the National Bureau of Economic Research (NBER): “A recession is a significant decline in activity spread across the economy, lasting more than a few months, visible in industrial production, employment, real income, and wholesale-retail trade.” 

During the first half of 2022, the other data that the NBER tracks were all expanding rather than contracting. So, it seemed safe to conclude that despite the declines in real GDP in those quarters, the U.S. economy was not, in fact, in a recession.

That conclusion was confirmed by the BEA in September 2024 when it released its most recent revisions of real GDP. As the following table shows, although the BEA still estimates that real GDP fell during the first quarter of 2022, it now estimates that it increased during the second quarter.

In the earlier post from 2022, we also noted that the BEA publishes data on gross domestic income (GDI), as well as on GDP. As we discuss in Chapter 8, Section 8.1, when considering the circular-flow diagram, the value of every final good and service produced in the economy (GDP) should equal the value of all the income in the economy resulting from that production (GDI). The BEA has designed the two measures to be identical by including in GDI some non-income items, such as sales taxes and depreciation. But as we discuss in the Apply the Concept, “Should We Pay More Attention to Gross Domestic Income?” GDP and GDI are compiled by the BEA from different data sources and can sometimes significantly diverge. 
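Because the two measures are built from different data sources, the gap between them, called the statistical discrepancy, is itself informative, and some economists track the simple average of GDP and GDI as an alternative gauge of output. The sketch below uses hypothetical figures to show both calculations:

```python
# Sketch of the statistical discrepancy between GDP and GDI, and of the
# simple average of the two that some economists track as an alternative
# gauge of output. The figures below are hypothetical, not BEA data.
def statistical_discrepancy(gdp, gdi):
    """Gap between the expenditure-side and income-side measures of output."""
    return gdp - gdi

def average_output(gdp, gdi):
    """Simple average of GDP and GDI."""
    return (gdp + gdi) / 2

gdp = 22_600.0  # hypothetical real GDP, billions of chained dollars
gdi = 22_450.0  # hypothetical real GDI, billions of chained dollars

gap = statistical_discrepancy(gdp, gdi)  # 150.0
avg = average_output(gdp, gdi)           # 22525.0
```

In principle the discrepancy should be zero; in practice, as the post notes, revisions tend to shrink it over time.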

We noted that although, according to the BEA's advance estimates, real GDP declined during the first two quarters of 2022, real GDI increased. The following figure shows movements in real GDP and real GDI using the current estimates from the BEA. The revised estimates now show real GDP falling in the first quarter of 2022 and increasing in the second quarter, while real GDI is still estimated to have risen in both quarters. The revisions closed some of the gap between real GDP and real GDI during this period by increasing the estimate for real GDP, which indicates that the advance estimate of real GDI was giving a more accurate measure of what was happening in the U.S. economy.

The figure shows that the revised estimates indicate that real GDP and real GDI moved closely together during 2021, differed somewhat during 2022—with real GDI being greater than real GDP—and differed more substantially during 2023 and 2024—with real GDP now being greater than real GDI. Because the two measures should be the same, we can expect that further revisions by the BEA will bring the two measures closer together.

It’s even possible, but unlikely, that further revisions of the data for 2022 could again present us with the paradox of real GDP declining for two quarters despite other measures of economic activity expanding.