Is it 1987 for AI?

Image generated by ChatGPT 5 of a 1981 IBM personal computer.

The modern era of information technology began in the 1980s with the spread of personal computers. A key development was the introduction of the IBM personal computer in 1981. The Apple II, designed by Steve Jobs and Steve Wozniak and introduced in 1977, was the first widely used personal computer, but the IBM personal computer had several advantages over the Apple II. For decades, IBM had been the dominant firm in information technology worldwide. The IBM System/360, introduced in 1964, was by far the most successful mainframe computer in the world. Many large U.S. firms depended on IBM to meet their needs for processing payroll, performing general accounting, managing inventories, and billing customers.

Because these firms were often reliant on IBM for installing, maintaining, and servicing their computers, they were reluctant to shift to performing key tasks with personal computers like the Apple II. This reluctance was reinforced by the fact that few managers were familiar with Apple or other early personal computer firms like Commodore or Tandy, which sold the TRS-80 through Radio Shack stores. In addition, many firms lacked the technical staffs to install, maintain, and repair personal computers. Initially, it was easier for firms to rely on IBM to perform these tasks, just as they had long been performing the same tasks for firms’ mainframe computers.

By 1983, the IBM PC had overtaken the Apple II as the best-selling personal computer in the United States. In addition, IBM had decided to rely on other firms to supply its computer chips (Intel) and operating system (Microsoft) rather than develop its own proprietary computer chips and operating system. This so-called open architecture made it possible for other firms, such as Dell and Gateway, to produce personal computers that were similar to IBM’s. The result was to give an incentive for firms to produce software that would run on both the IBM PC and the “clones” produced by other firms, rather than produce software for Apple personal computers. Key software such as the spreadsheet program Lotus 1-2-3 and word processing programs, such as WordPerfect, cemented the dominance of the IBM PC and the IBM clones over Apple, which was largely shut out of the market for business computers.

As personal computers began to be widely used in business, there was a general expectation among economists and policymakers that business productivity would increase. Productivity, measured as output per hour of work, had grown at a fairly rapid average annual rate of 2.8 percent between 1948 and 1972. As we discuss in Macroeconomics, Chapter 10 (Economics, Chapter 20 and Essentials of Economics, Chapter 14), rising productivity is the key to an economy achieving a rising standard of living. Unless output per hour worked increases over time, consumption per person will stagnate. An annual growth rate of 2.8 percent will lead to noticeable increases in the standard of living.

Economists and policymakers were concerned when productivity growth slowed beginning in 1973. From 1973 to 1980, productivity grew at an annual rate of only 1.3 percent—less than half the growth rate from 1948 to 1972. Despite the widespread adoption of personal computers by businesses, the growth rate of productivity during the 1980s increased only to 1.5 percent. In 1987, Nobel laureate Robert Solow of MIT famously remarked: “You can see the computer age everywhere but in the productivity statistics.” Economists labeled Solow’s observation the “productivity paradox.” With hindsight, it’s now clear that it takes time for businesses to adapt to a new technology, such as personal computers. In addition, the development of the internet, increases in the computing power of personal computers, and the introduction of innovative software were necessary before a significant increase in productivity growth rates occurred in the mid-1990s.
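The arithmetic of compounding shows why the gap between 2.8 percent and 1.3 percent matters so much. The short Python sketch below (our own illustration, not taken from the underlying data) computes how many years it takes output per hour to double at each growth rate:

```python
import math

def years_to_double(annual_growth_rate_pct: float) -> float:
    """Years for a quantity growing at a constant annual rate to double in size."""
    return math.log(2) / math.log(1 + annual_growth_rate_pct / 100)

# At the 1948-1972 average of 2.8 percent, output per hour doubles in about
# 25 years; at the 1973-1980 rate of 1.3 percent, doubling takes over 50 years.
print(round(years_to_double(2.8), 1))  # 25.1
print(round(years_to_double(1.3), 1))  # 53.7
```

In other words, at the earlier growth rate living standards roughly double within a single generation; at the later rate, it takes two.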

Result when ChatGPT 5 is asked to create an image illustrating ChatGPT

The release of ChatGPT in November 2022 is likely to be seen in the future as at least as important an event in the evolution of information technology as the introduction of the IBM PC in August 1981. Just as with personal computers, many people have been predicting that generative AI programs will have a substantial effect on the labor market and on productivity.

In this recent blog post, we discussed the conflicting evidence as to whether generative AI has been eliminating jobs in some occupations, such as software coding. Has AI had an effect on productivity growth? The following figure shows the rate of productivity growth in each quarter since the fourth quarter of 2022. The figure shows an acceleration in productivity growth beginning in the fourth quarter of 2023. From the fourth quarter of 2023 through the fourth quarter of 2024, productivity grew at an annual rate of 3.1 percent—higher than during the period from 1948 to 1972. Some commentators attributed this surge in productivity to the effects of AI.

However, the increase in productivity growth wasn’t sustained, with the growth rate in the first half of 2025 being only 1.3 percent. That slowdown makes it more likely that the surge in productivity growth was attributable to the recovery from the 2020 Covid recession or was simply an example of the wide fluctuations that can occur in productivity growth. The following figure, showing the entire period since 1948, illustrates how volatile quarterly rates of productivity growth are.

How large an effect will AI ultimately have on the labor market? If many current jobs are replaced by AI, is it likely that the unemployment rate will soar? That’s a prediction that has often been made in the media. For instance, Dario Amodei, the CEO of generative AI firm Anthropic, predicted during an interview on CNN that AI will wipe out half of all entry-level jobs in the U.S. and cause the unemployment rate to rise to between 10% and 20%.

Although Amodei is likely correct that AI will wipe out many existing jobs, it’s unlikely that the result will be a large increase in the unemployment rate. As we discuss in Macroeconomics, Chapter 9 (Economics, Chapter 19 and Essentials of Economics, Chapter 13), the U.S. economy creates and destroys millions of jobs every year. Consider, for instance, the following table from the most recent “Job Openings and Labor Turnover Survey” (JOLTS) report from the Bureau of Labor Statistics (BLS). In June 2025, 5.2 million people were hired and 5.1 million left (were “separated” from) their jobs as a result of quitting, being laid off, or being fired.

Most economists believe that one of the strengths of the U.S. economy is the flexibility of the U.S. labor market. With a few exceptions, “employment at will” holds in every state, which means that a business can lay off or fire a worker without having to provide a cause. Unionization rates are also lower in the United States than in many other countries. U.S. workers have less job security than in many other countries, but—crucially—U.S. firms are more willing to hire workers because they can more easily lay them off or fire them if they need to. (We discuss the greater flexibility of U.S. labor markets in Macroeconomics, Chapter 11 (Economics, Chapter 21).)

The flexibility of the U.S. labor market means that it has shrugged off many waves of technological change. AI will have a substantial effect on the economy and on the mix of jobs available. But will the effect be greater than that of electrification in the late nineteenth century, the automobile in the early twentieth century, or the internet and personal computing in the 1980s and 1990s? The introduction of automobiles wiped out jobs in the horse-drawn vehicle industry, just as the internet has wiped out jobs in brick-and-mortar retailing. People whose jobs are eliminated by technology find other jobs; sometimes the jobs are better than the ones they had and sometimes the jobs are worse. But economic historians have shown that technological change has never caused a spike in the U.S. unemployment rate. It seems likely—but not certain!—that the same will be true of the effects of the AI revolution.

Which jobs will AI destroy and which new jobs will it create? Except in a rough sense, the truth is that it is very difficult to tell. Attempts to forecast technological change have a dismal history. To take one of many examples, in 1998, Paul Krugman, later to win the Nobel Prize, cast doubt on the importance of the internet: “By 2005 or so, it will become clear that the Internet’s impact on the economy has been no greater than the fax machine’s.” Krugman, Amodei, and other prognosticators of the effects of technological change simply lack the knowledge to make an informed prediction because the required knowledge is spread across millions of people.

That knowledge only becomes available over time. The actions of consumers and firms interacting in markets mobilize information that is initially known only partially to any one person. In 1945, Friedrich Hayek made this argument in “The Use of Knowledge in Society,” which is one of the most influential economics articles ever written. One of Hayek’s examples is an unexpected decrease in the supply of tin. How will this development affect the economy? We find out only by observing how people adapt to a rising price of tin: “The marvel is that … without an order being issued, without more than perhaps a handful of people knowing the cause, tens of thousands of people whose identity could not be ascertained by months of investigation are made [by the increase in the price of tin] to use the material or its products more sparingly.” People adjust to changing conditions in ways that we lack sufficient information to reliably forecast. (We discuss Hayek’s view of how the market system mobilizes the knowledge of workers, consumers, and firms in Microeconomics, Chapter 2.)

It’s up to millions of engineers, workers, and managers across the economy, often through trial and error, to discover how AI can best reduce the cost of producing goods and services or improve their quality. Competition among firms drives them to make the best use of AI. In the end, AI may result in more people or fewer people being employed in any particular occupation.  At this point, there is no way to know.

 

Has AI Damaged the Tech Job Market for Recent College Grads?

Image generated by ChatGPT 5

“Artificial intelligence is profoundly limiting some young Americans’ employment prospects, new research shows.” That’s the opening sentence of a recent opinion column in the Wall Street Journal. The columnist was reacting to a new academic paper by economists Erik Brynjolfsson, Bharat Chandar, and Ruyu Chen of Stanford University. (See also this Substack post by Chandar that summarizes the results of their paper.) The authors find that:

“[S]ince the widespread adoption of generative AI, early-career workers (ages 22-25) in the most AI-exposed occupations have experienced a 13 percent relative decline in employment … In contrast, employment for workers in less exposed fields and more experienced workers in the same occupations has remained stable or continued to grow. Furthermore, employment declines are concentrated in occupations where AI is more likely to automate, rather than augment, human labor.”

The authors conclude that “our results are consistent with the hypothesis that generative AI has begun to significantly affect entry-level employment.”

About a month ago, we wrote a blog post looking at whether unemployment among young college graduates has been abnormally high in recent months. The following figure from that post shows that over time, the unemployment rate for the youngest college graduates (the red line) is nearly always above the unemployment rate for the population as a whole (the green line), while the unemployment rate for college graduates 25 to 34 years old (the blue line) is nearly always below the unemployment rate for the population as a whole. In July of this year, the unemployment rate for the population as a whole was 4.2 percent, while the unemployment rate for college graduates 20 to 24 years old was 8.5 percent, and the unemployment rate for college graduates 25 to 34 years old was 3.8 percent.

As the following figure (also reproduced from that blog post) shows, the increase in unemployment among young college graduates has been concentrated among males. Does higher male unemployment indicate that AI is eliminating jobs, such as software coding, that are disproportionately male? Data journalist John Burn-Murdoch argues against this conclusion, noting that data shows that “early-career coding employment is now tracking ahead of the [U.S.] economy.”

Another recent paper, written by Sarah Eckhardt and Nathan Goldschlag of the Economic Innovation Group, is also skeptical of the view that firms’ adoption of generative AI programs is reducing employment in certain types of jobs. They use a measure of how exposed particular jobs are to AI, the AI Occupational Exposure (AIOE) measure, developed by Edward Felten of Princeton University and Manav Raj and Robert Seamans of New York University. The following table from Eckhardt and Goldschlag’s paper shows the five most AI-exposed jobs and the five least AI-exposed jobs.

They divide all occupations into quintiles based on the exposure of the occupations to AI. Their key results are given in the following table, which shows that the occupations that are most exposed to the effects of AI—quintiles 4 and 5—have lower unemployment rates and higher wages than do the occupations that are least exposed to AI. 

The Brynjolfsson, Chandar, and Chen paper mentioned at the beginning of this post uses a larger data set of workers by occupation from ADP, a private firm that processes payroll data for about 25 percent of U.S. workers. Figure 1 from their paper, reproduced here, shows that employment of workers in two occupations—software developers and customer service—representative of those occupations most exposed to AI declined sharply after generative AI programs became widely available in late 2022.

They don’t find this pattern for all occupations, as shown in the following figure from their paper.

Finally, they show results by occupational quintile, with workers aged 22 to 25 being hard hit in the two occupational quintiles (4 and 5) most exposed to AI. The data show total employment growth from October 2022 to July 2025 by age group and exposure to AI.

Economics blogger Noah Smith has raised an interesting issue about Brynjolfsson, Chandar, and Chen’s results. Why would we expect the negative effect of AI on employment to be so highly concentrated among younger workers? Why would employment in the most AI-exposed occupations be growing rapidly among workers aged 35 and above? Smith wonders “why companies would be rushing to hire new 40-year-old workers in those AI-exposed occupations.” He continues:

“Think about it. Suppose you’re a manager at a software company, and you realize that the coming of AI coding tools means that you don’t need as many software engineers. Yes, you would probably decide to hire fewer 22-year-old engineers. But would you run out and hire a ton of new 40-year-old engineers?”

Both the papers discussed here are worth reading for their insights on how the labor market is evolving in the generative AI era. But taken together, they indicate that it is probably too early to arrive at firm conclusions about the effects of generative AI on the job market for young college graduates or other groups.

PCE Inflation Is Steady, but Still Above the Fed’s Target

On August 29, the Bureau of Economic Analysis (BEA) released data for July on the personal consumption expenditures (PCE) price index as part of its “Personal Income and Outlays” report. The Fed relies on annual changes in the PCE price index to evaluate whether it’s meeting its 2 percent annual inflation target.

The following figure shows headline PCE inflation (the blue line) and core PCE inflation (the red line)—which excludes energy and food prices—for the period since January 2017, with inflation measured as the percentage change in the PCE from the same month in the previous year. In July, headline PCE inflation was 2.6 percent, unchanged from June. Core PCE inflation in July was 2.9 percent, up slightly from 2.8 percent in June. Headline PCE inflation and core PCE inflation were both equal to what economists surveyed had forecast.

The following figure shows headline PCE inflation and core PCE inflation calculated by compounding the current month’s rate over an entire year. (The figure above shows what is sometimes called 12-month inflation, while this figure shows 1-month inflation.) Measured this way, headline PCE inflation fell from 3.5 percent in June to 2.4 percent in July. Core PCE inflation increased slightly from 3.2 percent in June to 3.3 percent in July. So, both 1-month PCE inflation estimates are above the Fed’s 2 percent target, with 1-month core PCE inflation being well above target. The usual caution applies that 1-month inflation figures are volatile (as can be seen in the figure), so we shouldn’t attempt to draw wider conclusions from one month’s data. In addition, these data may reflect higher prices resulting from the tariff increases the Trump administration has implemented. Once the one-time price increases from tariffs have worked through the economy, inflation may decline. It’s not clear, however, how long that may take, and it’s likely that not all the effects of the tariff increases on the price level are reflected in this month’s data.
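The two measures differ only in the compounding window. A minimal Python sketch of both calculations (the function names and example numbers are ours, for illustration only):

```python
def yoy_inflation(index_now: float, index_12_months_ago: float) -> float:
    """12-month inflation: percentage change in the price index from the
    same month one year earlier."""
    return (index_now / index_12_months_ago - 1) * 100

def annualized_monthly_inflation(pct_change_1_month: float) -> float:
    """1-month inflation: compound the month-over-month percentage change
    over 12 months to express it at an annual rate."""
    return ((1 + pct_change_1_month / 100) ** 12 - 1) * 100

# A 0.2 percent month-over-month rise in the PCE price index compounds to
# roughly 2.4 percent at an annual rate.
print(round(annualized_monthly_inflation(0.2), 2))  # 2.43
```

Because small monthly movements are magnified twelvefold by the compounding, the 1-month series is inherently much noisier than the 12-month series, which is part of the caution noted above.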

As usual, we need to note that Fed Chair Jerome Powell has frequently mentioned that inflation in non-market services can skew PCE inflation. Non-market services are services whose prices the BEA imputes rather than measures directly. For instance, the BEA assumes that prices of financial services—such as brokerage fees—vary with the prices of financial assets. So, if stock prices fall, the prices of financial services included in the PCE price index also fall. Powell has argued that these imputed prices “don’t really tell us much about … tightness in the economy. They don’t really reflect that.” The following figure shows 12-month headline inflation (the blue line) and 12-month core inflation (the red line) for market-based PCE. (The BEA explains the market-based PCE measure here.)

Headline market-based PCE inflation was 2.3 percent in July, unchanged from June. Core market-based PCE inflation was 2.6 percent in July, also unchanged from June. So, both market-based measures show inflation as stable but above the Fed’s 2 percent target.

In the following figure, we look at 1-month inflation using these measures. One-month headline market-based inflation declined sharply to 1.1 percent in July from 4.1 percent in June. One-month core market-based inflation also declined sharply to 2.1 percent in July from 3.8 percent in June. As the figure shows, the 1-month inflation rates are more volatile than the 12-month rates, which is why the Fed relies on the 12-month rates when gauging how close it is coming to hitting its target inflation rate. Still, looking at 1-month inflation gives us a better look at current trends in inflation, which these data indicate is slowing significantly.

As we noted earlier, some of the increase in inflation is likely attributable to the effects of tariffs. The effects of tariffs are typically seen in goods prices, rather than in services prices, because tariffs are levied primarily on imports of goods. As the following figure shows, one-month inflation in goods prices jumped in June to 4.8 percent, but then declined sharply to –1.6 percent in July. One-month inflation in services prices increased from 2.9 percent in June to 4.3 percent in July. Clearly, the 1-month inflation data—particularly for goods—are quite volatile.

Finally, these data had little effect on the expectations of investors trading federal funds rate futures. Investors assign an 86.4 percent probability to the Federal Open Market Committee (FOMC) cutting its target for the federal funds rate at its meeting on September 16–17 by 0.25 percentage point (25 basis points) from its current range of 4.25 percent to 4.50 percent. There has been some speculation in the business press that the FOMC might cut its target by 50 basis points at that meeting, but with inflation remaining above target, investors don’t foresee a larger cut in the target range happening.

In Jackson Hole Speech, Fed Chair Powell Signals a Rate Cut and Introduces the Fed’s Revised Monetary Policy Framework

Photo from federalreserve.gov

Federal Reserve chairs often take the opportunity of the Kansas City Fed’s annual monetary policy symposium, held in Jackson Hole, Wyoming, to provide a summary of their views on monetary policy and on the state of the economy. In these speeches, Fed chairs are careful not to preempt decisions of the Federal Open Market Committee (FOMC) by stating that policy changes will occur that the committee hasn’t yet agreed to. In his speech at Jackson Hole today (August 22), Powell came about as close as Fed chairs ever do to announcing a policy change in a speech. In addition, Powell announced changes to the Fed’s monetary policy framework that had been in place since 2020.

Congress has given the Federal Reserve a dual mandate to achieve price stability and maximum employment. To reach its goal of price stability, the Fed has set an inflation target of 2 percent, with inflation being measured by the percentage change in the personal consumption expenditures (PCE) price index. In the statement that the FOMC releases after each meeting, it generally indicates the current “balance of risks” to meeting its two goals. In a press conference on July 30 following the last meeting of the FOMC, Powell stated that while the labor market appeared to be in balance at close to maximum employment, inflation was still running above the Fed’s 2 percent annual target.

In today’s speech, Powell stated that “the balance of risks appears to be shifting” and “that downside risks to employment are rising. And if those risks materialize, they can do so quickly in the form of sharply higher layoffs and rising unemployment.” These statements seem to signal that he expects that at its next meeting on September 16–17 the FOMC will cut its target for the federal funds rate from its current range of 4.25 percent to 4.50 percent.

One indication of expectations of future changes in the FOMC’s target for the federal funds rate comes from investors who buy and sell federal funds futures contracts. (We discuss the futures market for federal funds in this blog post.) Yesterday, investors assigned a 75.0 percent probability to the committee cutting its target by 0.25 percentage point (25 basis points) to a range of 4.00 percent to 4.25 percent at its September meeting. After Powell’s speech at 10 a.m. eastern time, the probability of a 25 basis point cut increased to 85.3 percent. As the following figure from the Wall Street Journal shows, the stock market also jumped, with the S&P 500 stock index having increased about 1.5 percent at 2:00 p.m. Investors were presumably expecting that by cutting its federal funds rate target, the FOMC would help to offset some of the current weakness in the labor market. (We discussed the weakness in the latest jobs report in this blog post.)

Powell also announced that the Fed had revised its monetary policy framework, which had been in place since 2020. The previous framework was called flexible average-inflation targeting (FAIT). The policy was intended to automatically make monetary policy expansionary during recessions and contractionary during periods of unexpectedly high inflation. If households and firms accept that the Fed is following this policy, then during a recession when the inflation rate falls below the target, they would expect the Fed to take action to increase the inflation rate. Because a higher expected inflation rate results in a lower real interest rate, this expectation would have an expansionary effect on the economy. Similarly, if the inflation rate were above the target, households and firms would expect future inflation rates to be lower, raising the real interest rate, which would have a contractionary effect on the economy.
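The channel at work here is the approximate Fisher equation: the real interest rate equals the nominal rate minus expected inflation. A stylized sketch (the specific numbers are ours, chosen purely for illustration):

```python
def real_rate(nominal_pct: float, expected_inflation_pct: float) -> float:
    """Approximate Fisher equation: real rate = nominal rate - expected inflation."""
    return nominal_pct - expected_inflation_pct

# With the nominal rate held at 4 percent, a credible promise of higher future
# inflation (expected inflation rising from 1.5 to 2.5 percent) lowers the real
# rate, which is expansionary; the reverse expectation raises it.
print(real_rate(4.0, 1.5))  # 2.5
print(real_rate(4.0, 2.5))  # 1.5
```

The point of the framework is that the expectations themselves, not any immediate change in the nominal rate, do some of the stabilizing work.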

An important point to note is that with a FAIT policy, after a period in which inflation is below 2%, the Fed would aim to keep inflation above 2% for a time to “make up” for the period of low inflation. But the converse would not be true—if inflation runs above 2%, the Fed would attempt to bring inflation back to 2%, but would not push inflation below 2% for a time to make up for the period of high inflation. The result is that, on average, the economy would run “hotter,” lowering the average unemployment rate over time. Many policymakers at the Fed believed that, in the years before 2019, the unemployment rate could have been lower without causing the inflation rate to be persistently above the Fed’s target.
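The asymmetry is easy to see in a two-year average. A small sketch of the makeup arithmetic (our own illustration of the FAIT logic, not a Fed formula):

```python
def average_inflation(rates_pct: list[float]) -> float:
    """Simple average of annual inflation rates over a window of years."""
    return sum(rates_pct) / len(rates_pct)

# A year of 1.5 percent inflation followed by a 2.5 percent "makeup" year
# brings the two-year average back to the 2 percent target.
print(average_inflation([1.5, 2.5]))  # 2.0

# But a year of 3.0 percent inflation is followed only by a return to 2 percent,
# with no below-target makeup year, so the average stays above target.
print(average_inflation([3.0, 2.0]))  # 2.5
```

Because the makeup runs in only one direction, average inflation over long periods would sit at or above 2 percent, which is the sense in which the economy runs “hotter.”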

With hindsight, some economists and policymakers argue that FAIT was implemented at just the wrong time. The policy was designed to address the problem of inflation running below the 2% target for most of the period between 2012 and 2019, resulting in unemployment being higher than was consistent with the Fed’s mandate for maximum employment. But, in fact, as the following figure shows, in 2020 the U.S. economy was about to enter a period with the highest inflation rates since the early 1980s.

In his speech today, Powell noted that:

“The economic conditions that brought the policy rate to the ELB [effective lower bound to the federal funds rate, 0 percent to 0.25 percent] and drove the 2020 framework changes were thought to be rooted in slow-moving global factors that would persist for an extended period—and might well have done so, if not for the pandemic. … In the event, rather than low inflation and the ELB, the post-pandemic reopening brought the highest inflation in 40 years to economies around the world.”

Powell outlined the key changes in the policy framework:

“First, we removed language indicating that the ELB was a defining feature of the economic landscape. Instead, we noted that our ‘monetary policy strategy is designed to promote maximum employment and stable prices across a broad range of economic conditions.'”

“Second, we returned to a framework of flexible inflation targeting and eliminated the ‘makeup’ strategy. As it turned out, the idea of an intentional, moderate inflation overshoot [after a period when inflation had been below the 2 percent annual target] had proved irrelevant. … Our revised statement emphasizes our commitment to act forcefully to ensure that longer-term inflation expectations remain well anchored, to the benefit of both sides of our dual mandate. It also notes that ‘price stability is essential for a sound and stable economy and supports the well-being of all Americans.’ “

“Third, our 2020 statement said that we would mitigate ‘shortfalls,’ rather than ‘deviations,’ from maximum employment. … [T]he use of ‘shortfalls’ was not intended as a commitment to permanently forswear preemption or to ignore labor market tightness. Accordingly, we removed ‘shortfalls’ from our statement. Instead, the revised document now states more precisely that ‘the Committee recognizes that employment may at times run above real-time assessments of maximum employment without necessarily creating risks to price stability.’ … [But] preemptive action would likely be warranted if tightness in the labor market or other factors pose risks to price stability.”

“Fourth, consistent with the removal of ‘shortfalls,’ we made changes to clarify our approach in periods when our employment and inflation objectives are not complementary. In those circumstances, we will follow a balanced approach in promoting them.”

“Finally, the revised consensus statement retained our commitment to conduct a public review roughly every five years.”

To summarize, the two key changes in the framework are: 1) The FOMC will no longer attempt to push inflation beyond its 2 percent goal if inflation has been below that goal for a period, and 2) The FOMC may still attempt to preempt an increase in inflation if labor market conditions or other data make it appear likely that inflation will accelerate, but it won’t necessarily do so just because the unemployment rate is currently lower than what had been considered consistent with maximum employment.

Glenn’s Questions for the Fed

Photo from federalreserve.gov

This opinion column originally ran at Project Syndicate.

While recent media coverage of the US Federal Reserve has tended to focus on when, and by how much, interest rates will be cut, larger issues loom. The selection of a new Fed chair to succeed Jerome Powell, whose term ends next May, should focus not on short-term market considerations, but on policies and processes that could improve the Fed’s overall performance and accountability.

By demanding that the Fed cut the federal funds rate sharply to boost economic activity and lower the government’s borrowing costs, US President Donald Trump risks pushing the central bank toward an overly inflationary monetary policy. And that, in turn, risks increasing the term premium in the ten-year Treasury yield—the very financial indicator that Treasury Secretary Scott Bessent has emphasized. A higher premium would raise, not lower, borrowing costs for the federal government, households, and businesses alike. Moreover, concerns about the Fed’s independence in setting monetary policy could undermine confidence in US financial markets and further weaken the dollar’s exchange rate. 

But this does not imply that Trump should simply seek continuity at the Fed. The Fed, under Powell, has indeed made mistakes, leading to higher inflation, sometimes inept and uncoordinated communications, and an unclear strategy for monetary policy.

I do not share the opinion of Trump and his advisers that the Fed has acted from political or partisan motives. Even when I have disagreed with Fed officials or Powell on matters of policy, I have not doubted their integrity. However, given their mistakes, I do believe that some institutional introspection is warranted. The next chair—along with the Board of Governors and the Federal Open Market Committee—will have many policy questions to address beyond the near-term path for the federal funds rate. 

Three issues are particularly important. The first is the Fed’s dual mandate: to ensure stable prices and maximum employment. Many economists (including me) have been critical of the Fed for exhibiting an inflationary bias in 2021 and 2022. The highest inflation rate in 40 years raised pressing questions about whether the Fed has assigned the right weights to inflation and employment. 

Clearly, the strategy of pursuing a flexible average inflation target (implying that inflation can be permitted to rise above 2% if it had previously been below 2%) has not been successful. What new approach should the Fed adopt to hit its inflation target? And how can the Fed be held more accountable to Congress and the public? Should it issue a regular inflation report? 

The second issue concerns the size and composition of the Fed’s balance sheet. Since the global financial crisis of 2008, the Fed has had a much larger balance sheet and has evolved toward an “ample reserves model” (implying a perpetually high level of reserves). But how large must the balance sheet be to conduct monetary policy, and how important should long-term Treasury debt and mortgage-backed securities be, relative to the rest of the balance sheet? If such assets are to play a central role, how can the Fed best separate the conduct of monetary policy from that of fiscal policy? 

The third issue is financial regulation. What regulatory changes does the Fed believe are needed to avoid the kind of costly stresses in the Treasury market we have witnessed in recent years? How can bank supervision be improved? Given that regulation is an inherently political subject, how can the Fed best separate these activities from its monetary policymaking (where independence is critical)? 

Addressing these policy questions requires a rethink of process, too. The Fed would be more effective in dealing with a changing economic environment if it acknowledged and debated more diverse viewpoints about the roles of monetary policy and financial regulation in how the economy works.

The Fed’s inflation mistakes, overconfidence in financial regulation, and other errors partly reflect the “groupthink” to which all organizations are prone. Regional Fed presidents’ views traditionally have reflected their own backgrounds and local conditions, but that doesn’t translate easily into a diversity of economic views. Instead of choosing Fed officials based on how they are likely to vote at the next rate-setting meeting, Trump should put more weight on intellectual and experiential diversity. Equally, the Fed itself could more actively seek and listen to dissenting views from academic and business leaders. 

Raising questions about policy and process offers guidance about the characteristics that the next Fed chair will need to succeed. These obviously include knowledge of monetary policy and financial regulation and mature, independent judgment; but they also include diverse leadership experience and an openness to new ideas and perspectives that might enhance the institution’s performance and accountability. One hopes that Trump’s selection of the next Fed chair, and the Senate’s confirmation process, will emphasize these attributes.

Solved Problem: How Can Total Employment and the Unemployment Rate Both Increase at the Same Time?

Supports Macroeconomics, Chapter 9, Economics, Chapter 19, and Essentials of Economics, Chapter 13.

Image generated by ChatGPT

A recent article on axios.com notes that from April 2023 to July 2024, the U.S. economy generated an average net increase of 177,000 jobs per month. Despite that job growth, the unemployment rate during that period increased by 0.8 percentage point. The article observes: “At first glance, the combination of a rising unemployment rate and strong jobs growth simply does not compute.” How is it possible during a given period for both total employment and the unemployment rate to increase?

Solving the Problem
Step 1: Review the chapter material. This problem is about calculating the unemployment rate, so you may want to review Chapter 9, Section 9.1, “Measuring the Unemployment Rate, the Labor Force Participation Rate, and the Employment-Population Ratio.” 

Step 2: Answer the question by explaining how it’s possible for both the total number of people employed and the unemployment rate to increase during the same period. The unemployment rate is equal to the number of people unemployed divided by the number of people in the labor force (multiplied by 100). The labor force equals the sum of the number of people employed and the number of people unemployed.

Let’s consider the situation in a particular month. Suppose that the unemployment rate in the previous month was 4 percent. If, during the current month, both the number of people employed and the number of people unemployed increase, the unemployment rate will increase if the increase in the number of people unemployed as a percentage of the increase in the labor force is greater than 4 percent. The unemployment rate will decrease if the increase in the number of people unemployed as a percentage of the increase in the labor force is less than 4 percent.  

Consider a simple numerical example. Suppose that in the previous month there were 96 people employed and 4 people unemployed. In that case, the unemployment rate was (4/(96 + 4)) x 100 = 4.0%. 

Suppose that during the month the number of people employed increases by 30 and the number of people unemployed increases by 1. In that case, there are now 126 people employed and 5 people unemployed. The unemployment rate will have fallen from 4.0% to (5/(126 + 5)) x 100 = 3.8%.

Now suppose that the number of people employed increases by 30 and the number of people unemployed increases by 3. The unemployment rate will have risen from 4.0% to (7/(126 + 7)) x 100 = 5.3%.

We can conclude that if both the total number of people employed and the total number of people unemployed increase during a period of time, it’s possible for the unemployment rate to also increase.
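The arithmetic in this example can be collected into a short Python sketch (the employment and unemployment figures are the hypothetical ones used above):

```python
def unemployment_rate(employed, unemployed):
    """Unemployment rate: the unemployed as a percentage of the labor force."""
    labor_force = employed + unemployed
    return 100 * unemployed / labor_force

# Previous month: 96 employed, 4 unemployed
print(round(unemployment_rate(96, 4), 1))    # 4.0

# Employment rises by 30, unemployment by 1: the rate falls
print(round(unemployment_rate(126, 5), 1))   # 3.8

# Employment rises by 30, unemployment by 3: the rate rises
print(round(unemployment_rate(126, 7), 1))   # 5.3
```

Whether the rate rises or falls depends only on whether the newly unemployed, as a share of the new labor-force entrants, exceed the previous month’s unemployment rate.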

08-16-25 Podcast – Authors Glenn Hubbard & Tony O’Brien discuss tariffs, Fed independence, & the controversies at the BLS.

In today’s episode, Glenn Hubbard and Tony O’Brien take on three timely topics that are shaping economic conversations across the country. They begin with a discussion of tariffs, exploring how recent trade policies are influencing prices, production decisions, and global relationships. From there, they turn to the independence of the Federal Reserve, explaining why central bank autonomy is essential for sound monetary policy and what risks arise when political pressures creep in. Finally, they shed light on the Bureau of Labor Statistics (BLS), unpacking how its data collection and reporting play a vital role in guiding both public understanding and policymaking.

It’s a lively and informative conversation that brings clarity to complex issues—and it’s perfect for students, instructors, and anyone interested in how economics connects to the real world.

https://on.soundcloud.com/RA09RWn30NyDc8w8Tj

CPI Inflation Comes in Slightly Below Expectations, Increasing Likelihood of Fed Rate Cuts

Fed Chair Jerome Powell (left) and Vice Chair Philip Jefferson (photo from federalreserve.gov)

Today (August 12), the Bureau of Labor Statistics (BLS) released its report on the consumer price index (CPI) for July. The following figure compares headline CPI inflation (the blue line) and core CPI inflation (the red line).

  • The headline inflation rate, which is measured by the percentage change in the CPI from the same month in the previous year, was 2.7 percent in July, unchanged from June. 
  • The core inflation rate, which excludes the prices of food and energy, was 3.0 percent in July, up slightly from 2.9 percent in June. (Note that there is some inconsistency in how the core inflation rate is reported: the BLS, and some news outlets, give the value as 3.1 percent. The unrounded value is 3.0486 percent.)
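Each 12-month rate above is simply the percentage change in the CPI from the same month a year earlier. A minimal Python sketch, using hypothetical index values chosen to reproduce the unrounded core figure of 3.0486 percent:

```python
def twelve_month_inflation(index_now, index_year_ago):
    """12-month inflation: percentage change in the index from the same month a year earlier."""
    return 100 * (index_now / index_year_ago - 1)

# Hypothetical index values chosen so the unrounded rate is 3.0486 percent
rate = twelve_month_inflation(103.0486, 100.0)
print(round(rate, 4))  # 3.0486 -- reported as either 3.0 or 3.1, depending on rounding
print(round(rate, 1))  # 3.0
```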

Headline inflation and core inflation were slightly lower than what economists surveyed had expected.

In the following figure, we look at the 1-month inflation rate for headline and core inflation—that is, the annual inflation rate calculated by compounding the current month’s rate over an entire year. Calculated as the 1-month inflation rate, headline inflation (the blue line) declined from 3.5 percent in June to 2.4 percent in July. Core inflation (the red line) increased from 2.8 percent in June to 3.9 percent in July.
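The compounding described above can be sketched in Python; the index values below are hypothetical, chosen to represent a 0.2 percent month-over-month price increase:

```python
def annualized_one_month_inflation(index_now, index_prev_month):
    """Annualize a 1-month price change by compounding it over 12 months."""
    one_month_growth = index_now / index_prev_month
    return 100 * (one_month_growth ** 12 - 1)

# Hypothetical index values: a 0.2 percent increase from the previous month
print(round(annualized_one_month_inflation(100.2, 100.0), 1))  # 2.4
```

Because the monthly change is raised to the twelfth power, small month-to-month movements translate into large swings in the annualized rate, which is why the 1-month series is so much more volatile than the 12-month series.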

The 1-month and 12-month inflation rates are telling somewhat different stories, with 12-month inflation indicating that inflation is stable, although moderately above the Fed’s 2 percent inflation target. The 1-month core inflation rate indicates that inflation may have increased during July. 

Of course, it’s important not to overinterpret the data from a single month. The figure shows that the 1-month inflation rate is particularly volatile. Also note that the Fed uses the personal consumption expenditures (PCE) price index, rather than the CPI, to evaluate whether it is hitting its 2 percent annual inflation target.

A key reason for core inflation being significantly higher than headline inflation is that gasoline prices declined by 23.1 percent at an annual rate in June. As shown in the following figure, 1-month inflation in gasoline prices moves erratically—which is the main reason that gasoline prices aren’t included in core inflation.

Does the increase in inflation represent the effects of the increases in tariffs that the Trump administration announced on April 2? (Note that many of the tariff increases announced on April 2 have since been reduced.) The following figure shows 12-month inflation in three categories of products whose prices are thought to be particularly vulnerable to the effects of tariffs: apparel (the blue line), toys (the red line), and motor vehicles (the green line). To make recent changes clearer, we look only at the months since January 2021. In July, prices of apparel fell, while the prices of toys and motor vehicles rose by less than 1.0 percent.

The following figure shows 1-month inflation in the prices of these products. In July, motor vehicle prices and apparel prices increased by less than 1 percent, while toy prices increased by 1.9 percent after having soared by 24.3 percent in June. At least for these three products, it’s difficult to see tariffs as having had a significant effect on inflation in July.

To better estimate the underlying trend in inflation, some economists look at median inflation and trimmed mean inflation.

  • Median inflation is calculated by economists at the Federal Reserve Bank of Cleveland and Ohio State University. If we listed the inflation rate in each individual good or service in the CPI, median inflation is the inflation rate of the good or service that is in the middle of the list—that is, the inflation rate in the price of the good or service that has an equal number of higher and lower inflation rates. 
  • Trimmed-mean inflation drops the 8 percent of goods and services with the highest inflation rates and the 8 percent of goods and services with the lowest inflation rates. 
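The two measures in the bullets above can be illustrated with a short Python sketch. The component inflation rates below are hypothetical, and note that the actual Cleveland Fed measures weight each component by its expenditure share, which this unweighted version ignores:

```python
def median_inflation(rates):
    """Median of the component inflation rates (unweighted illustration)."""
    s = sorted(rates)
    n = len(s)
    mid = n // 2
    return s[mid] if n % 2 else (s[mid - 1] + s[mid]) / 2

def trimmed_mean_inflation(rates, trim=0.08):
    """Drop the top and bottom 8 percent of components, then average the rest."""
    s = sorted(rates)
    k = int(len(s) * trim)
    kept = s[k:len(s) - k] if k else s
    return sum(kept) / len(kept)

# Hypothetical component inflation rates (percent), including two extreme
# movers (like gasoline and toys) that the trimmed mean excludes
rates = [-23.1, -1.2, 0.5, 1.8, 2.4, 2.9, 3.1, 3.3, 3.6, 4.0, 4.4, 5.2, 24.3]

print(round(median_inflation(rates), 1))        # 3.1
print(round(trimmed_mean_inflation(rates), 1))  # 2.7
```

The point of both measures is the same: a few extreme price movements, up or down, have no effect on the estimate of the underlying trend.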

The following figure shows that 12-month trimmed-mean inflation (the blue line) was 3.2 percent in July, unchanged from June. Twelve-month median inflation (the red line) was 3.6 percent in July, unchanged from June.

The following figure shows 1-month trimmed-mean and median inflation. One-month trimmed-mean inflation declined from 3.9 percent in June to 2.9 percent in July. One-month median inflation also declined, from 4.1 percent in June to 3.7 percent in July. These data indicate that inflation may have slowed in July (the opposite of the conclusion we noted earlier when discussing 1-month core inflation), while remaining above the Fed’s 2 percent target.

What are the implications of this CPI report for the actions that the Federal Open Market Committee (FOMC), the Fed’s policymaking body, may take at its upcoming meetings? Even before today’s relatively favorable, if mixed, inflation report, the unexpectedly weak jobs report at the beginning of the month (which we discuss in this blog post) made it likely that the FOMC would soon begin cutting its target for the federal funds rate.

Investors who buy and sell federal funds futures contracts assign a probability of 94.3 percent to the FOMC cutting its target for the federal funds rate at its September 16–17 meeting by 0.25 percentage point (25 basis points) from its current target range of 4.25 percent to 4.50 percent. That probability increased from 85.9 percent yesterday. (We discuss the futures market for federal funds in this blog post.) Investors assign a probability of 61.5 percent to the FOMC cutting its target again by 25 basis points at its October 28–29 meeting, and a probability of 50.3 percent to a third 25-basis-point cut at the committee’s December 9–10 meeting.

New Census Data Highlights the Aging of the U.S. Population

Image generated by ChatGPT 5

In June, the U.S. Census Bureau released its population estimates for 2024. Included was the following graphic showing the change in the U.S. population pyramid from 2004 to 2024. As the graphic shows, people 65 years and older have increased as a fraction of the total population, while children have decreased as a fraction of the total population. (The Census considers everyone 17 and younger to be a child.) Between 2004 and 2024, people 65 and older increased from 12.4 percent of the population to 18.0 percent. People younger than 18 fell from 25.0 percent of the population in 2004 to 21.5 percent in 2024.

The aging of the U.S. population reflects falling birth rates. Demographers and economists typically measure birth rates as the total fertility rate (TFR), which is defined by the World Bank as: “The number of children that would be born to a woman if she were to live to the end of her childbearing years and bear children in accordance with age-specific fertility rates currently observed.” The TFR has an advantage over the simple birth rate—the number of live births per thousand people—because it corrects for the age structure of a country’s female population. Leaving aside the effects of immigration and emigration, a TFR of 2.1 is necessary to keep a country’s population stable. Stated another way, a country needs a TFR of 2.1 to achieve replacement level fertility. A country with a TFR above 2.1 experiences long-run population growth, while a country with a TFR of less than 2.1 experiences long-run population decline.

The following figure shows the TFR for the United States for each year between 1960 and 2023. Since 1971, the TFR has been below 2.1 in every year except for 2006 and 2007. Immigration has helped to offset the effects on population growth of a TFR below 2.1.

The United States is not alone in experiencing a sharp decline in its TFR since the 1960s. The following figure shows some other countries that currently have below replacement level fertility, including some countries—such as China, Japan, Korea, and Mexico—in which TFRs were well above 5 in the 1960s. In fact, only relatively few countries, such as Israel and some countries in sub-Saharan Africa, are still experiencing above replacement level fertility.

An aging population raises the number of retired people relative to the number of workers, making it difficult for governments to finance pensions and health care for older people. We discuss this problem with respect to the U.S. Social Security and Medicare programs in an Apply the Concept in Macroeconomics, Chapter 16 (Economics, Chapter 26 and Essentials of Economics, Chapter 18). Countries experiencing a declining population typically also experience lower rates of economic growth than do countries with growing populations. Finally, as we discuss in an Apply the Concept in Microeconomics, Chapter 3, different generations often differ in the mix of products they buy. For instance, a declining number of children results in declining demand for diapers, strollers, and toys.

Before There was Inexpensive Computing …

… there were artists in government agencies drawing time series graphs. As we discuss in this recent blog post, the Bureau of Labor Statistics (BLS) has been in the news lately—undoubtedly much more than the people who work there would like.

This post is not about the current controversy but steps back to make a bigger point: The availability of data has increased tremendously from the time when Glenn and Tony began their academic careers. In the 1980s, personal computers were becoming widespread, but the internet had not yet developed to the point where government statistics were available to download. To gather data usually required a trip to the university library to make photocopies of tables in the print publications of the BLS and other government agencies. You then had to enter the data by hand into very crude—by current standards—spreadsheet and statistical software. The software generally had limited graphing capabilities.

How were the time series figures in print government publications generated? The two photos shown above (both from the website of the Library of Congress) show that the figures were hand drawn by artists. The upper photo is from 1962 and the lower photo is from 1971.

Today, most government data is readily available online. The FRED (Federal Reserve Economic Data) site, hosted by economists at the Federal Reserve Bank of St. Louis, makes available thousands of data series. We make use of these series in the Data Exercises included in the end-of-chapter problems in our textbooks. The FRED site makes it easy (we hope!) to do these exercises, including by combining or otherwise transforming data series and by graphing them—no artistic ability required!

H/T