Was Vaccine Development an Example of Successful “Industrial Policy”?

Was the development of COVID-19 vaccines under Operation Warp Speed a successful example of “industrial policy”? On one side, it was a policy, and it led to a product–so that seems like an example. But viewed more broadly, the production of the vaccine was a narrow effort. It did not transform any particular US industry, much less the broader industrial base as a whole. So perhaps Operation Warp Speed was a success, but without falling into the “industrial policy” category?

Scott Lincicome and Huan Zhu make a case for the skeptical answer in “Questioning Industrial Policy: Why Government Manufacturing Plans Are Ineffective and Unnecessary” (Cato Institute Working Paper #63, June 16, 2021). They write (footnotes omitted):

Finally, the COVID-19 vaccines developed under “Operation Warp Speed” have been heralded as a triumph of American industrial policy, but the first vaccine to market (Pfizer/BioNTech) disproves the assertion. BioNTech was a German company that had been working on mRNA vaccines for years and began its collaboration with Pfizer (based on an earlier working relationship) months before the U.S. government began OWS [Operation Warp Speed] in May 2020 or contracted with the companies for a vaccine in July of that same year. (Management actually predicted in April 2020 that distribution of finished doses would occur in late 2020.) The
companies famously refused government funds for R&D, testing and production – efforts that instead leveraged Pfizer’s substantial pre-existing U.S. manufacturing capacity, as well as multinational research teams, global capital markets and supply chains, and a logistics and transportation infrastructure that had developed over decades. In fact, the Trump administration’s contract with Pfizer was for finished, FDA-approved vaccine doses only and expressly excluded from government reach essentially all stages of vaccine development (i.e., “activities that Pfizer and BioNTech have been performing and will continue to perform without use of Government funding”). There is even some evidence that OWS’ allocation of vaccine materials to participating companies (some of which still have not produced an approved vaccine) may have impeded non-participant Pfizer’s ability to meet its initial production targets and expand production after the vaccine was approved.

Surely, some state support (e.g., support for mRNA research and a large vaccine purchase commitment) was involved both before and during the pandemic, but it all lacked the necessary commercial, strategic, or nationalist elements of “industrial policy.” In fact, mRNA visionary Katalin Karikó actually left her government-supported position at the University of Pennsylvania “because she was failing in the competition to win research grants” and thus “moved to the BioNTech company, where she not only created the Pfizer vaccine but also spurred Moderna to competitive imitation.” The NIH grant supporting her early work actually came through her colleague, Drew Weissman, and “had no direct connection to mRNA research.” Other efforts,
such as Moderna’s mRNA vaccine, had more state support, but the BioNTech/Pfizer vaccine shows that it was not a necessary condition for producing a wildly successful COVID-19 vaccine.

Indeed, Lincicome and Zhu argue that some vaccine contractors, and their lobbying interactions with the US government, may have hindered the vaccine development process. They write:

Most recently, a New York Times investigation into Maryland vaccine manufacturer Emergent Biosolutions – a “longtime government contractor that has spent much of the last two decades cornering a lucrative market in federal spending on biodefense” – found that the company invested heavily in lobbying while ignoring various safety and manufacturing best practices; had effectively “captured” the government agency, the Biomedical Advanced Research and Development Authority, authorized to disburse and monitor pandemic-related contracts; yet, despite repeated contracting failures, was rewarded with a $628 million contract to manufacture Covid-19 vaccines. Emergent’s actions ultimately imperiled millions of doses of Johnson & Johnson vaccines and weakened the Strategic National Stockpile by monopolizing its “half-billion-dollar annual budget throughout most of the last decade, leaving the federal government with less money to buy supplies needed in a pandemic.”

One might of course object that Lincicome and Zhu have an overly narrow definition of “industrial policy,” but perhaps the broader lesson is that “industrial policy” means different things to different people. If “industrial policy” were limited to support for research and development, workforce training, and perhaps occasional government commitments to purchase successful innovations, my sense is that few free-market economists would object. In emergencies like a pandemic, many predominantly free-market economists would be willing to support government steps to prioritize key inputs in supply chains, as well. But of course, this kind of “industrial policy” is a long way from widespread government industrial planning, tariffs against imported goods, and subsidies or even government ownership of favored industries.

A further difficulty is that political discussions of industrial policy can become quite vague. The proponents of industrial policy often focus on issues of concern–say, the loss of well-paid manufacturing jobs–but they are fuzzier on holding themselves accountable for policies that will address the problem. Lincicome and Zhu quote Mancur Olson (from a 1986 book) on this issue: “Those publications that I happen to have seen advocating industrial policy are also relatively vague. Some are so vague that they invite the reaction that industrial policy is neither a good idea nor a bad idea, but no idea at all; that it is the grin without the cat.”

Some Economics of Sawdust

Cambridge University Press has published a 35th anniversary edition of The Economist’s View of the World and the Quest for Well-Being by Steven E. Rhoads. The book offers a sympathetic verbal (that is, no graphs or math) explanation of basic concepts in microeconomics: for example, the opening chapters are “Opportunity Costs,” “Marginalism,” and “Economic Incentives.”

Rhoads is an economically-minded political scientist. This book is not at all an attack on economics: indeed, I think it has sometimes been used as a textbook for a nonmathematical introduction to economics, both at the undergraduate level and in master’s degree programs in areas like public administration. I suspect that the book does a good job of building bridges with those who are skeptical or hostile to what they perceive as the field of economics, because Rhoads is quick to emphasize that economic efficiency and growth are not the only ingredients of human well-being, and that fairness and equality should also play a role. My only real quibble with the book, and it’s a small one, is that Rhoads seems in some places to think this insight will be news to economists, while my own experience is that economists have been emphasizing for decades how equality and fairness may in some contexts have tradeoffs with efficiency and growth, while in other cases they may complement each other.

The discussion throughout is based on solid explanations and a wealth of interesting examples. To provide a flavor, here’s one example from the introduction of an economic story about sawdust. The story works on several levels: as a basic story about supply and demand, a story about the intricacies of economic interconnectedness, and a parable about the perils of economic central planning. Rhoads writes (footnotes omitted):

In 2008 the price of milk was much higher than usual. An economist asked a dairy farmer, how come? The farmer said his inputs were much more expensive. (Within two years it had gone up by a factor of four for some uses.) He used sawdust to bed his cows more comfortably. They produced more milk when they were more often off their feet. The reason for the increase in the price of sawdust was the sharp downturn in the production of new housing. Since construction of new houses was down, there was less sawdust.

So, imagine you are a politician or a planner trying to satisfy citizens complaining about the high price of milk. … But another problem citizens were complaining about was homelessness and the price of affordable housing. Would you realize that using more sawdust to produce milk would increase the price of housing? Probably not. But it would increase housing costs, because sawdust is also the principal component in particleboard, which is used widely in the building industry. It is cheaper than substitutes such as lumber and plywood. You probably wouldn’t know that.

Many of your constituents also love gardening, and they would not be happy if the sawdust they use to make their mulch became more expensive because some of it was being siphoned off to help “higher-priority” users. Sawdust is also used in the production of charcoal briquettes and as part of a mix to make a lightweight material for dashboards. It would take a planner a lot of time to decide on the fair and efficient allocation of sawdust. …

Of course, no politician or planner would have time to worry about sawdust. If there were no entrepreneurs or markets, sawdust would probably be thrown away or used only for mulch; no one would know that the waste product had these other uses. Even if they eventually figured it out, how would they decide which usage was the most important and how much should go into it and how much for the second most important usage?

The lowly sawdust example shows that there is a “dense interconnection” of different kinds of scarce resources. No planner could sort out everything efficiently. This is an important reason why we need markets.

The Agricultural Marketing Research Center (a group of universities operating under a grant from the US Department of Agriculture) wrote a few years ago about the many industrial uses of sawdust, as well as the rising importance of sawdust, and wood waste in general, as a source of renewable biomass energy. Here is the AgMRC on uses of sawdust:

Shavings and sawdust may be reground into wood flours, or the wood flour may be recovered as sized “dust” materials that have been screened and separated. Wood flour has major industrial markets in industrial fillers, binders and extenders in industrial products like epoxy resins, fertilizers, adhesives, absorbent materials, felt roofing, inert explosive components, ceramics, floor tiles, cleaning products, wood fillers, caulks and putties, soil extenders and a vast array of plastics. Some wood flours like mesquite may be used in edible flavorings for human or pet consumption.
 
Shavings and sawdust can be marketed for use in molded or laminated composite wood products (e.g., toilet seats, countertops) in automotive materials and in oil and water isolation and solidification products for the environmental control industry. Other uses include fillers, bulk shavings, sawdust, hog fuel (dried bark shavings), meat-smoking chips, barbeque cooking fuels and composite fireplace logs. Landscaping applications include playground “footing,” equestrian arena and other “wood edge footing” (safety margin and walkway material) and some exhibit and tradeshow applications. A few manufacturers are using post-consumer plastic waste mixed with a sawdust extender to make high-value extruded composite decking lumber and similar products for the home improvement market. Currently, a primary use of baled dry shavings is for equine and livestock bedding or small pet bedding applications.

However, the AgMRC emphasizes a future role for sawdust and wood waste products in biomass energy production. I once had a conversation with a professor of forestry who pointed out that the “carbon cycle” in burning fossil fuels and eventually having that carbon return to the form of oil or coal or natural gas was measured in millions of years, while the carbon-cycle in burning wood products and then having that carbon reabsorbed into trees was a matter of years and decades. The AgMRC writes:

At this time and in the near future, wood wastes are and probably will be the most commonly used biomass fuel for heat and power. The most economic sources of wood fuels are usually wood residues from manufacturers (mill residues), discarded wood products or woody yard trimmings diverted from landfills, and non-hazardous wood debris from construction and demolition activities. A significant environmental benefit of using these materials for generating electricity is that their energy value is utilized while landfill disposal is avoided. As long as clean-burning combustion technologies are employed, carbon emissions to the atmosphere can be minimized. 

Recent studies indicate that quantities of available (presently unused) mill and urban wood residues exceed 39 million dry tons per year in the United States. This is enough material to supply more than 7,500 MW, doubling the existing U.S. bio-power capacity in the United States. To illustrate this point, this amount of power could supply the yearly electricity demand of the residential customers in all six New England states.

Moreover, there is agricultural innovation in the growth of “wood grass” species that will probably affect the sawdust industry as well.

The use of crop residues, livestock manures and short-rotation-intensive-culture (SRIC) plantings of fast-growing “wood grass” tree species as fuel resources can improve the economics of farming while solving some of the most intractable environmental problems in agriculture today. In SRIC systems, “wood grass” species are cultivated and then chipped on-site for use in energy production (by combustion) or wood-product manufacturing (composites). The advent of energy crops for power production is a new agricultural market. However, these crops provide soil conservation and nutrient management benefits for the land and may be compatible with government conservation set-aside incentive programs. Increased woody-biomass utilization will impact other groups including architectural and engineering firms, consultants, and processing and handling equipment vendors.

Centralized government decision-making can be a useful method of production when it is focused on a specific goal: create a specific new vaccine or fighter jet, or provide electricity. But the production and use of sawdust are multifarious. Detailed and granular knowledge of people who are deeply involved at every stage of the process, and who have personal incentives to make it work smoothly, is needed to know what is possible now and what innovations might be made. In these settings of what Rhoads calls “dense interconnection,” the decentralized decision-making of markets coordinated by individual incentives and a price mechanism can be remarkably effective.

Interview with Rucker Johnson: Supporting Children

Douglas Clement interviews Rucker Johnson about his research in the Fall 2021 issue of For All, published by the Opportunity & Growth Institute at the Minneapolis Fed (“Rucker Johnson interview: Powering potential,” subtitled “Rucker Johnson on school finance reform, quality pre-K, and integration”). One of the themes of the interview is the potential for gains to both equity and efficiency from supporting socioeconomically disadvantaged children along a variety of dimensions. Here are a couple of examples:

On disparities in spending and opportunity across K-12 schools:

Today, about 75 percent of per pupil spending disparities are between states (rather than between districts within states). And we’ve witnessed that inequality in school spending has risen since 2000. After three decades of narrowing—the ’70s, ’80s, and ’90s—primarily due to the state school finance reforms emphasized in my work with Kirabo Jackson and Claudia Persico, there has been a significant rise in inequality, especially sharply following the Great Recession.

What I want to highlight here is the current disparities nationwide in school resources. School districts with the most students of color have about 15 percent less per pupil funding from state and local sources than predominantly White, affluent areas, despite having much greater need due to higher proportions of poverty, special needs, and English language learners.

Teacher quality is often the missing link that people don’t consider directly when thinking about school resource inequities. For example, schools with a high level of Black and Latino students have almost two times as many first-year teachers as schools with low minority enrollment. And minority students are more likely to be taught by inexperienced teachers than experienced ones in 33 states across the country. … Part of it is that the invisible lines of school district boundaries are powerful tools of segregation. It’s a way of segregating and hoarding access to opportunity. And when I say access to opportunity, I mean quality of teachers, I mean curricular opportunity. For example, only a third of public schools with high Black and Latino enrollment offer calculus. Courses like that are gateways to majoring in STEM in college and having a STEM career. Or simply the fact that less than 30 percent of students in gifted and talented programs are Black or Latino.

On the importance of early interventions and health:

For example, it’s been documented that half of the achievement gap that we observe among third graders was apparent at kindergarten entry. What that reflects, in part, is the strong footprint of early childhood experiences. And that’s why access to quality pre-K can play a significant role, particularly in the lives of lower-income children. Without those public investments in early pre-K programs, they would often not have access to environments that promote nurturing interpersonal relationships and school readiness.

What’s important about this is that during the initial rollout of Head Start, the first 15 years, those programs also significantly improved health, child health. This was because immunizations increased, the quality and continuity of pediatric care significantly increased. This predated a lot of the significant public investments in Medicaid expansions. Partly, it’s that healthier children are better learners. Again, there’s that connection between education and health, pre-K and K-12.

What we’re able to do in that research is leverage the per pupil spending in pre-K programs and the timing of that set of increases at the county level, link it to the student level of children we’re following from birth to adulthood, and connect that with the level of school resources in their K-12 years via the court-ordered timing of school funding reforms in their state and district of upbringing. When we put those pieces together, we found that it was not just that public pre-K spending via Head Start has significant long-term beneficial effects. And it wasn’t only that the K-12 spending has significant positive effects. What we found was that there was a significant synergy; we call it dynamic complementarity.

We found that when children attend poorly funded K-12 environments, the long-term effects of pre-K tend to dissipate. It’s consistent with the fade-out effect that other people have documented. It’s only when the pre-K investments are followed with quality K-12 investments—where they’re going to schools that are well-funded and well-resourced—that we see sustained, positive effects of pre-K spending. Similarly, the effectiveness of K-12 spending is enhanced significantly when it’s preceded by quality pre-K access. In their K-12 years, children are more prepared to learn and to take advantage of the educational opportunities that occur in those K-12 years. When we do them in concert, the effects are more than the sum of the individual parts.

The Benefits of Slaughtering Special Interests in the 1850s

In the first half-century or so of US history, most of the legislation about businesses occurred at the state level, and the bulk of that legislation was goodies for political supporters, like special legislation allowing a politically connected person or group to start a firm or a bank. Starting around 1850, states began to rewrite their constitutions to make such political-corporate favoritism illegal. Naomi R. Lamoreaux and John Joseph Wallis tell the story in “Economic Crisis, General Laws, and the Mid-Nineteenth-Century Transformation of American Political Economy” (Journal of the Early Republic, 41: 3, Fall 2021, pp. 403-433; also available as NBER Working Paper 27400, June 2020). They write:

In 1851, in the aftermath of an economic crisis that forced the state into default, Indiana rewrote its constitution to require that laws enacted by the legislature “be general and of uniform operation throughout the state.” This directive may not seem remarkable from the standpoint of the twenty-first century; we take it for granted that that is what legislatures do. From the perspective of the mid-nineteenth century, however, the provision was groundbreaking. The first such mandate ever enacted, Indiana’s innovation spread to almost all the other U.S. states over the next few decades …

Before Indiana’s innovation, the main business of legislatures was to enact special or private bills on behalf of specific individuals, organizations, and localities. The year before its new constitution was ratified, for example, Indiana’s general assembly passed 550 acts … About half benefited particular local governments, granting them permission to spend public funds, borrow money, levy taxes, set salaries, fees, duties, and meeting times for administrators and judges, and take a variety of other actions. Almost all the rest (nearly 40 percent)
aided particular individuals or organizations. Some involved personal matters such as divorces, name changes, and the administration of decedents’ estates, but the vast majority conveyed grants of economically valuable privileges such as corporate charters to people specifically named in the bills.

Because some people, groups, and localities were better positioned than others to obtain favors from the legislature, this system of special laws was fundamentally inegalitarian. Indeed, it was precisely because it was inegalitarian that it persisted. The competitive democratic politics of the early nineteenth century drove governing elites to consolidate their political support by doling out favors to members of their factions and denying them to people who belonged to other groups. Those in power benefited from the ability to dispense charters for banks and other valuable economic organizations. Those out of power complained bitterly about this “corruption,” but they behaved in exactly the same way when they were in office, favoring supporters and freezing out opponents. To do otherwise would be to risk losing control of the government and, with that, access to banks and similar advantages.

What caused the change? The proximate cause seems to be that treating the power to start and run a bank or other company as a political favor led to rising debt, and a number of states found themselves in danger of defaulting on these debts. In Indiana, for example, the legislature in the 1830s had a goal of building a north-south canal to stimulate economic growth in the state, but the political negotiating over that general infrastructure goal meant that to get approval, goodies needed to be added for individuals, companies, and communities across the state, to the point that an enormous and unsustainable level of debt was needed to finance the canal. States recognized that one way to avoid the problem was to force private companies to stand on their own two feet.

The shift to a state-level constitutional rule that “laws be of general and of uniform operation throughout the state” had some unexpected and positive results. When corporations were created and empowered by state legislatures, it would have seemed peculiar to also set up a regulatory apparatus. But when states allowed anyone to start a bank–not just the politically favored–they also started requiring that banks file reports about their financial soundness and set aside reserves that would be held by state-level bank regulators. It may seem paradoxical, but regulating companies also required separating them from the government.

Another perhaps unexpected result of slaughtering the special interests in the 1850s was that states which did so experienced better economic growth, because it was easier for people to start companies. Lamoreaux and Wallis argue that one reason the US financial sector has been concentrated in New York for so long traces back to the fact that New York was one of the first states to allow banks to be set up freely–rather than only on a case-by-case basis at the whim of the state legislature.

Of course, there is a modern analog to this story. If legislation is passed that focuses on certain groups of people and firms, then there will be a general tendency to make sure that the legislation includes something for everyone affiliated with the party in power–and the amount of debt needed to finance such legislation will soar as a result. This is not intended as a partisan jab: it’s a dynamic one can see both in the Trump administration tax bill from back in 2017 as well as in the Democrat-backed spending legislation currently under discussion. If instead the focus is not on who gets what goodies from a law, but on the idea that “laws be of general and of uniform operation,” then economic growth, sensible regulation of business, and the public welfare are more likely to benefit.

Powers of Ten Day: Understanding Exponents

I forgot to post this old 1977 video on October 10, “Powers of Ten Day,” but if you haven’t seen it (or haven’t seen it in a while), it’s worth a few minutes of your time.

Most people don’t have accurate intuition about exponents. A common example when talking about government budgets is that people sometimes mix up “millions,” “billions,” and “trillions.” If you were at the grocery store and the checkout clerk mistakenly charged you $36 for a gallon of milk–that is, 10 times the typical price–you would surely notice. If you were charged $360 for a gallon of milk–100 times the usual price–chances are good that you would notice. But when people misspeak in a way that confuses millions and billions, they are confusing two measures that differ by a factor of 1,000, like being charged $3,600 for a gallon of milk and not really noticing. The video emphasizes powers of ten in the physical world, and thus offers a non-algebraic way to convey the power of exponents.
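As a back-of-the-envelope check, the milk-price analogy can be sketched in a few lines of Python (the $3.60 base price is an illustrative assumption, not a real grocery figure):

```python
# The gap between "million," "billion," and "trillion" is a factor of 1,000
# at each step. Scaling an everyday price the same way shows how big that is.
units = {"million": 10**6, "billion": 10**9, "trillion": 10**12}
milk = 3.60  # illustrative typical price of a gallon of milk, in dollars

for small, big in [("million", "billion"), ("billion", "trillion")]:
    factor = units[big] // units[small]
    print(f"Confusing {small}s with {big}s is an error of a factor of {factor:,},")
    print(f"  like paying ${milk * factor:,.2f} for a gallon of milk.")
```

Each step up the ladder multiplies the milk bill by another 1,000, which is the intuition the video tries to make visceral.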

Reduced Wage Inequality Since the Pandemic

Since the start of the pandemic in early 2020, wage growth in the United States has tended to be higher for those at the bottom of the wage distribution. The evidence is from Mitchell Barnes, Lauren Bauer, and Wendy Edelberg, who provide a useful figure illustrating these dynamics for one of their “11 Facts on the Economic Recovery from the COVID-19 Pandemic” (Hamilton Project at the Brookings Institution, September 2021). 

The top lines show growth in nominal wages; the bottom lines show growth in wages after adjusting for inflation. Thus, two conclusions can hold true at the same time: 1) those in the lowest income quartile are doing better than those in the top quartile; and 2) the surge of inflation in recent months has meant that the real buying power of income has dropped.

Granted, the gap here between lowest and highest quartile is not enormous: we’re talking a matter of a few percentage points. But if the change were going the other way, with wage gains weighted more toward those in the higher income group, that would be worthy of notice. This seems worthy of notice, too. Barnes, Bauer, and Edelberg write: “Some sectors have seen particularly strong wage gains. For example, over the past 12 months average hourly earnings in the leisure and hospitality sector have grown nearly twice as fast as the overall private industry average. Other sectors seeing strong gains in hourly earnings include retail trade, transportation and warehousing, and financial activities.”

Why are those at the bottom of the income distribution doing better just now? My guess is that many of the jobs at the bottom of the income distribution were more severely disrupted by the pandemic: for example, think about jobs lost and disrupted in retail stores, or in travel-related industries. As overall GDP growth has recovered, those industries are trying to re-hire. But there are still a lot of pre-pandemic workers who are staying out of the labor force, at least for now. As a result, there are lots of job vacancies along with lots of people quitting jobs (which is often a prelude to moving to a new job). Put these together, and the wages for lower-skilled workers are being bid up.

From Offshoring to Nearshoring?

One of the ongoing public narratives, beginning with President Trump’s imposition of tariffs on trade with China and others and continuing through the pandemic, has been whether the previous pattern of offshoring–that is, importing goods and inputs from abroad–would reverse. More generally, the question has been whether the US economy would become less attached to Chinese imports.

The Kearney consulting company provides some information on the actual data in its Reshoring Index. The most recent version came out in May, and a summary appears under the title “Global pandemic roils 2020 Reshoring Index, shifting focus from reshoring to right-shoring.”

Here’s one basic pattern: total US imports of manufactured goods as a share of US domestic production of manufactured goods. You can see the general rise since 2008, albeit with a small dip in 2019. US imports of manufactured goods are about one-eighth of domestic output of manufactured goods. In my experience, this proportion is considerably lower than a lot of people expect to hear. To put it another way, the potential gains for US manufacturing from being able to export into growing foreign markets around the world are much, much bigger than the potential gains from displacing imports of manufactured goods.

US imports of manufacturing goods from 14 Asian LCCs increased in 2020

What about China in particular? The report looks at imports from China as a share of total US imports from 14 low-cost Asian countries, which the report calls LCCs. The share of these imports from China starts falling in 2018, and in particular it plummets during the production and trade disruptions at the start of the pandemic in the first quarter of 2020.


The drop in China’s share of LCC exports to the US has largely been matched by a rise in Vietnam’s exports to the US. As the Kearney report notes:

US companies have long viewed Vietnam as a viable sourcing option. Since China’s labor costs began to rise in 2007, Vietnam—where labor costs now fall almost 50 percent below those in China—has successfully used its cost advantage to attract global manufacturing business. A few examples include: 

Nike and Adidas reallocated a vast majority of their manufacturing and footwear base from China to Vietnam. 

In 2019, Hasbro announced it hoped to have only 50 percent of its production coming from China by the end of 2020, shifting to new plants in Vietnam and India. Hasbro is continuing to advance its transition to Vietnam and India. 

In 2019 Samsung ended mobile telephone production in China due to rising labor costs and economic slowdowns, shifting production to India and Vietnam. 

In addition, Vietnam is actively participating in free trade agreements to reduce trade barriers, improve market access for its goods, simplify customs procedures, and offer an increasingly attractive business environment. Vietnam has already invested heavily in improving its highways and ports, spending the most on infrastructure among Southeast Asia countries as a percentage of GDP (5.7 percent).

Even if relatively little reshoring of imported manufacturing goods has happened so far, what about the future? The Kearney report offers some data from a survey of US manufacturing executives. A general theme seemed to be that less reliance on imports from China and from the low-cost Asian countries in general was likely. However, a common response was not an expansion of US manufacturing capacity, but instead to emphasize the possibilities of “nearshoring” by relocating many of these supply chains to Mexico or Canada.

In addition, a number of the US manufacturing executives in the survey mentioned the uncertain state of the US workforce during and after the pandemic. Lots of service-oriented jobs can be done virtually, but working at an actual manufacturing job is likely to mean being physically present. One more complication is that 25% of the US manufacturing workforce is 55 or older. Thus, the future of US manufacturing will to some extent be linked to whether a skilled and willing labor force is available to make it happen.

Goods, Services, and Inventories Since the Pandemic Recession

In most recessions, consumption of services doesn’t move much, while consumption of goods drops fairly sharply and then rebounds over time. The short, sharp pandemic recession was different. With a combination of stay-at-home behavior, risk aversion, and official lockdowns, consumption of services dropped substantially and has not yet fully recovered. Meanwhile, consumption of goods dropped briefly, but quickly rebounded to above the pre-pandemic levels.

Mitchell Barnes, Lauren Bauer, and Wendy Edelberg provide a useful figure illustrating these dynamics for one of their “11 Facts on the Economic Recovery from the
COVID-19 Pandemic”
(Hamilton Project at the Brookings Institution, September 2021). They point out that the US economy re-attained its pre-pandemic size in the second quarter of this year. But while production of services has not yet fully recovered, production of goods has climbed. The four lines on each graph show the four most recent recessions. The purple line shows the unusually high rise in goods consumption on the left and the unusual drop in services consumption on the right.

This unexpectedly high consumption of goods, combined with disruptions of supply chains, has created some unusual situations for inventories–that is, what is being held in stock to be sold later. Overall inventories held by businesses can be divided up into three categories: held by manufacturers, held by wholesalers, and held by retailers.

The blue line in this figure shows overall business inventories, measured relative to sales. As you can see, inventories rise during recessions, as unsold goods sit on the shelves. Firms then reduce their ordering, and when demand picks up again after the recession, inventories drop for a time. One interesting fact is that the overall level of business inventories (blue line) has not dropped to unprecedented levels–it’s a 1.25 multiple of sales, similar to levels before and after the Great Recession. But if you focus in on retail inventories, shown by the red line, you can see that retailers (understandably) tended to hold more inventory than the business sector as a whole before the pandemic. But retail inventories have plunged to historically low levels, below the overall levels for the business sector as a whole.

A more detailed breakdown within the retail category shows that some of the sectors with the biggest drop in inventories are motor vehicles, clothing stores, and department stores.

What about the wholesalers who supply the retailers? Their inventories peaked and dropped as well, but not as severely. Again, the blue line shows inventories for the overall business sector as a basis for comparison, but the green shows inventories for wholesalers. Their inventory levels are low–that is, it’s not obvious that they have a backlog of inventories waiting to be passed along to the retailers–but the overall level of inventories for wholesalers is not ultra-low compared to pre-pandemic levels.

What about manufacturers? Again, the blue line shows inventories for the overall business sector, while this time the red line shows inventories for manufacturers. These inventories show the spike and fall one would expect, but overall, they don’t appear that low. On the other side, I’ve been reading here and there about shortages for manufacturers of specific inputs, like the computer chips used by car manufacturers. Perhaps manufacturers are holding inventories of some inputs because they are hampered by shortages of other inputs.

These shifts in goods, services, and inventories are of course reflected in adjustments in actual jobs and businesses. By the standards of past recessions, the dislocations in services industries were much larger and longer than would have been expected, while in goods industries, the rebound was stronger than would have been expected.

The Importance of Basic R&D

R&D combines two ideas: “basic research,” which aims at discovering new science that doesn’t have a near-term commercial application, and all the other research, which does focus on developing a product with a near-term commercial application. For economic growth, both matter. But over time, the big breakthroughs in basic science matter more. The October issue of the semi-annual World Economic Outlook from the International Monetary Fund includes a chapter on “Research and Innovation: Fighting the Pandemic and Boosting Long-term Growth,” with a focus on the importance of basic research.

One emphasis of the report is that new research ideas flow pretty easily across national borders. Thus, other countries benefit from US R&D, and the US economy benefits from foreign R&D. The report notes:

Basic scientific research is a key driver of innovation and productivity, and basic scientific knowledge diffuses internationally farther than applied knowledge. A 10 percent increase in domestic (foreign) basic research is estimated to raise productivity by about 0.3 (0.6) percent, on average. International knowledge spillovers are more important for innovation in emerging market and developing economies than in advanced economies. Easy technology transfer, collaboration, and the free flow of ideas across borders should be key priorities.

Here are a couple of interesting figures from the chapter. One shows the rising importance of academic research in patent applications: that is, over time such patent applications are citing a rising amount of academic research.

This figure shows the gap between spending on applied and basic research around the world–that is, it shows applied minus basic. The gap has been rising slowly over time, which shows that quicker-to-pay-off applied research has been outpacing basic research.

After presenting various models of the R&D process, the IMF report recommends: “[D]oubling subsidies to private research and boosting public research expenditure by one-third could increase annual growth per capita by around 0.2 percent. Better targeting of subsidies and closer public‑private cooperation could boost this further, at lower public expense. Such investments could start to pay for themselves within a decade or so.”
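As a back-of-the-envelope illustration of how an extra 0.2 percentage points of annual per-capita growth compounds over a decade, here is a minimal sketch. The baseline growth rate and starting income index are invented round numbers, not figures from the IMF model:

```python
# Compound an assumed 1.5% baseline growth rate against a boosted 1.7% rate
# (baseline plus the IMF chapter's 0.2-point estimate) over a decade.
baseline_growth = 0.015
boosted_growth = 0.017
income = 100.0  # hypothetical per-capita income index in year 0

baseline_path = [income * (1 + baseline_growth) ** t for t in range(11)]
boosted_path = [income * (1 + boosted_growth) ** t for t in range(11)]

# Cumulative extra income after 10 years, as a percent of the baseline path
extra_after_10 = (boosted_path[10] / baseline_path[10] - 1) * 100
print(f"Extra income after 10 years: {extra_after_10:.2f}%")
```

After a decade the boosted path sits roughly 2 percent above the baseline, which gives a rough sense of why the report can argue that such research investments could start to pay for themselves within a decade or so.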

Whether your focus is on raising the standard of living, improved education and health care, a cleaner environment and reducing carbon emissions, or many other goals, improved technology offers the possibility of doing it better, faster, and cheaper. For example, it’s interesting to speculate on what the pandemic would have been like if it had hit just 25 years earlier, without the technological ability of so many people and services to operate at a distance over the internet: the choices would have narrowed down to greater contagion risks or a truly gruesome economic shutdown.

An Economics Nobel Prize for Causality: Angrist, Card, and Imbens

At first glance, it may appear that the 2021 Nobel prize in economics (more formally the Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel 2021) was given for two rather separate contributions. The award announcement says that one-half of the award is given to David Card “for his empirical contributions to labour economics,” while the other half is given jointly to Joshua D. Angrist and Guido W. Imbens “for their methodological contributions to the analysis of causal relationships.” (It must have been an interesting if slightly farcical conversation that led to splitting the prize 1/2, 1/4, 1/4, rather than 1/3, 1/3, 1/3, but of such stuff are committee decisions made.)

But the underlying connection between the three co-winners is obvious enough if you read the background materials released by the Nobel committee. Each year, the committee produces a highly readable “Popular science” explanation of the prize, this year titled “Natural experiments help answer important questions,” along with a more specialized and longer “Scientific background” paper, this year titled “Answering Causal Questions Using Observational Data.”

As a starting point for understanding the contribution here, it’s useful to begin with an old familiar warning from every introductory statistics class that “correlation does not imply causation.” If two variables A and B are correlated with each other, it’s possible that A causes B, or B causes A, or both A and B are being affected by some unnamed set of factors C, or even that, given the human tendency to look for patterns in data, if you just look at enough combinations of variables, some of them will be correlated with each other by random chance, with no actual connection between them at all. Back when I was first learning statistics in the late 1970s, it was common for these warnings all to be mentioned in the intro econometrics class, but as a practical matter, we then went right back to calculating correlations.
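That last warning is easy to demonstrate with a small simulation. The sample sizes and correlation threshold below are arbitrary choices for illustration: generate many series of pure random noise, then count how many pairs look “correlated” anyway.

```python
# Generate independent random series and count pairwise correlations that
# clear a threshold purely by chance: no series has any real connection
# to any other.
import random
import statistics

random.seed(42)
n_series, n_obs = 40, 20
series = [[random.gauss(0, 1) for _ in range(n_obs)] for _ in range(n_series)]

def corr(x, y):
    """Pearson correlation between two equal-length lists."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return sxy / (vx * vy) ** 0.5

n_pairs = n_series * (n_series - 1) // 2
spurious = sum(
    1
    for i in range(n_series)
    for j in range(i + 1, n_series)
    if abs(corr(series[i], series[j])) > 0.5
)
print(f"{spurious} of {n_pairs} pairs of pure noise exceed |r| > 0.5")
```

With short series and hundreds of pairs to compare, some spuriously “strong” correlations are essentially guaranteed, which is exactly why the textbook warning exists.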

Scientists often seek to demonstrate causation with controlled experiments. In one famous example, when Louis Pasteur developed a vaccine for sheep anthrax back in 1881, he took a herd of 100 sheep, vaccinated half of them, and then exposed the herd to anthrax. The unvaccinated sheep died and the vaccinated lived. But people aren’t sheep. So if an economist wants to address a question like whether a higher minimum wage causes unemployment (not is correlated with it!) or whether a surge of immigration leads to higher unemployment in the native-born population (not is correlated with it!), how is it possible to use real-world observational data in a way that lets a social science researcher draw conclusions about causality? To put it another way, do real-world events sometimes create a kind of “natural experiment” for researchers to study?

The Nobel committee describes some of the most prominent studies built on this “natural experiment” foundation. For example, consider the question of whether education raises one’s level of income. Yes, it’s of course true that there is a correlation between higher levels of education and income, but perhaps there are underlying personal characteristics–say, persistence or rule-following or an ability to work with others–that are correlated both with higher education and with higher income. How can we look at real-world data and obtain a causal estimate? The Nobel committee explains:

Joshua Angrist and his colleague Alan Krueger (now deceased) showed how this could be done in a landmark article. In the US, children can leave school when they turn 16 or 17, depending on the state where they go to school. Because all children who are born in a particular calendar year start school on the same date, children who are born early in the year can leave school sooner than children born later in the year. When Angrist and Krueger compared people born in the first and fourth quarters of the year, they saw that the first group had, on average, spent less time in education. People born in the first quarter also had lower incomes than those born in the fourth quarter. As adults they thus had both less education and lower incomes than those born late in the year. Because chance decides exactly when a person is born, Angrist and Krueger were able to use this natural experiment to establish a causal relationship showing that more education leads to higher earnings: the effect of an additional year of education on income was nine per cent.

This study essentially uses the idea that people born in the fourth quarter of a given year are not fundamentally different than those born in the first quarter of the next year, and so the date when you are allowed to leave school in effect created a random separation of these two equivalent groups–logically similar to Pasteur’s vaccine. Researchers began to focus on other situations with the characteristics of a “natural experiment.”
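The logic can be sketched in a toy simulation. All of the numbers below are invented for illustration, not Angrist and Krueger’s data: an effectively random trait (being born early in the year) nudges schooling, an unobserved “ability” confounds the naive comparison, and the Wald estimator–the difference in mean earnings divided by the difference in mean schooling across the two birth groups–recovers the true causal return.

```python
# Toy instrumental-variable setup: birth timing shifts schooling, an
# unobserved confounder ("ability") raises both schooling and earnings.
import random

random.seed(0)
TRUE_RETURN = 0.09  # assumed true causal return per year of schooling

people = []
for _ in range(50_000):
    early_born = random.random() < 0.5   # the "instrument": born early in the year
    ability = random.gauss(0, 1)         # unobserved confounder
    # Early-born students can leave school sooner, so slightly less schooling
    schooling = 12 + ability + (0.0 if early_born else 0.3)
    log_earnings = TRUE_RETURN * schooling + 0.5 * ability + random.gauss(0, 0.2)
    people.append((early_born, schooling, log_earnings))

def mean(xs):
    return sum(xs) / len(xs)

early = [(s, y) for e, s, y in people if e]
late = [(s, y) for e, s, y in people if not e]

# Wald estimator: difference in mean earnings / difference in mean schooling
wald = (mean([y for _, y in late]) - mean([y for _, y in early])) / (
    mean([s for s, _ in late]) - mean([s for s, _ in early])
)

# Naive regression slope, biased upward because ability raises both variables
ss = [s for _, s, _ in people]
ys = [y for _, _, y in people]
ms, my = mean(ss), mean(ys)
naive = sum((s - ms) * (y - my) for s, y in zip(ss, ys)) / sum(
    (s - ms) ** 2 for s in ss
)
print(f"naive slope: {naive:.3f}, Wald estimate: {wald:.3f}")
```

In this setup the naive slope comes out far above the true 0.09, because ability pollutes the simple correlation, while the Wald estimate lands close to 0.09: the instrument strips out the confounding.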

For example, many government programs have an eligibility cut-off, and one can reasonably believe that those who are just barely on one side of the cutoff are pretty much the same as those who are just barely on the other side of the cut-off, with the cutoff itself randomly dividing these two groups. Thus, comparing those barely included from those barely excluded can allow for a causal estimate of the effect of the program.
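Here is a minimal sketch of that cutoff logic, with a hypothetical program and invented numbers: the outcome rises smoothly with income, eligibility below a cutoff adds a fixed benefit, and comparing narrow bands on either side of the cutoff recovers the benefit.

```python
# Toy regression-discontinuity design: those barely below and barely above
# an eligibility cutoff are otherwise comparable, so the difference in
# their average outcomes estimates the program's causal effect.
import random

random.seed(1)
CUTOFF = 30_000    # hypothetical income eligibility threshold
TRUE_EFFECT = 5.0  # assumed program effect on an outcome index
BAND = 1_000       # narrow window on each side of the cutoff

def outcome(income):
    eligible = income < CUTOFF
    # Outcome rises smoothly with income; eligibility adds a discrete jump
    return income / 10_000 + (TRUE_EFFECT if eligible else 0.0) + random.gauss(0, 1)

incomes = [random.uniform(20_000, 40_000) for _ in range(100_000)]
people = [(inc, outcome(inc)) for inc in incomes]

just_below = [y for inc, y in people if CUTOFF - BAND <= inc < CUTOFF]
just_above = [y for inc, y in people if CUTOFF <= inc < CUTOFF + BAND]

rd_estimate = sum(just_below) / len(just_below) - sum(just_above) / len(just_above)
print(f"Estimated program effect at the cutoff: {rd_estimate:.2f}")
```

The estimate comes out slightly below the true 5.0 because the smooth income trend still differs a little across the band; real regression-discontinuity designs fit that trend on each side of the cutoff rather than taking simple band averages.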

Other programs are based on an element of randomness: for example, in many cities if the spots in desirable charter high schools are oversubscribed, there is a lottery for who gets admitted. In Oregon a few years back, the state decided to expand Medicaid coverage but because of limited funds, the new benefit was given out by lottery. Sometimes when a new program is implemented, it is rolled out in some areas before others–and those early-adopting areas may occur more-or-less at random. When economists and other social scientists hear about randomization in a program, they start thinking about whether it might serve as a natural experiment to provide evidence about causality.

In other situations, there can be an event or policy choice that works like a natural experiment. The Nobel committee describes a prominent study of immigration done by David Card:

A unique event in the history of the US gave rise to a natural experiment, which David Card used to investigate how immigration affects the labour market. In April 1980, Fidel Castro unexpectedly allowed all Cubans who wished to leave the country to do so. Between May and September, 125,000 Cubans emigrated to the US. Many of them settled in Miami, which entailed an increase in the Miami labour force of around seven per cent. To examine how this huge influx of workers affected the labour market in Miami, David Card compared the wage and employment trends in Miami with the evolution of
wages and employment in four comparison cities. Despite the enormous increase in labour supply, Card found no negative effects for Miami residents with low levels of education. Wages did not fall and unemployment did not increase relative to the other
cities. This study generated large amounts of new empirical work, and we now have a better understanding of the effects of immigration. For example, follow-up studies have shown that increased immigration has a positive effect on income for many groups who were born in the country, while people who immigrated at an earlier time are negatively affected. One explanation for this is that the natives switch to jobs that require good native language skills, and where they do not have to compete with immigrants for jobs.

In effect, the Cuban boatlift of 1980 was a natural experiment, addressing the question: “What would happen if an enormous number of unskilled immigrants arrived suddenly and without much warning in a major US city?” But once you start thinking along these lines, you can consider a variety of other events as natural experiments, too.

The natural experiment examples I have mentioned here are relatively straightforward. But as with many Nobel prizes, the award is given largely because the early work spawned a vast array of follow-up work and altered how economists think about these issues. Even in these relatively clear-cut cases, detailed and multi-faceted arguments have followed about exactly what can be inferred, or not, from looking at the data in different ways. The Nobel committee also describes the “natural experiment” approach as “quasi-experimental.”

Together the work by this year’s Laureates laid the ground for the design-based approach, which has drastically changed how empirical research is conducted over the past 30 years. … Quasi-experimental variation can come from the many experiments provided by nature, administrative borders, institutional rules, and policy changes. The design-based approach features a clear statement of the assumptions used to identify the causal effect and
validation of these identifying assumptions.

To put it another way, when you hear economists say that a variable is “associated” or “correlated” with another variable, they mean something quite different from when they claim to have found a causal effect. The old statement that “correlation does not equal causation” is now taken with gimlet-eyed earnestness.