A Nobel Prize for Auction Theory: Paul Milgrom and Robert Wilson

Auctions are widely used throughout the economy. The big auction houses like Christie’s and Sotheby’s are well-known for selling famous art, and many people have either attended a live auction at a fund-raising event or a flea market or participated in an online auction at a site like eBay. But the behind-the-scenes uses of auctions are far more important. The right for online advertising to appear on your screen is sold in an auction format. When the US government borrows money by selling Treasury debt, it does so in an auction format. When electricity providers sign contracts to purchase electricity from electricity producers, they often use an auction format to do so. Some proposals for buying and selling permits to emit carbon, as a mechanism for the gradual reduction of carbon emissions, would auction off the right to emit carbon. 
One useful property of auctions is that in a number of settings they can discipline the public sector to make decisions based on economic values, rather than favoritism. For example, when a city wants to sign a contract with a company that will pick up the garbage from households, companies can submit bids–rather than having a city council choose the company run by someone’s favorite uncle. When the US government wants to give companies the right to drill in certain areas for offshore oil, or wishes to allocate radio spectrum for use by phone companies, it can auction off the rights rather than handing them out to whatever company has the best behind-the-scenes lobbyists. In many countries, auctions are also used to privatize formerly government-owned companies by selling them off.
But the downside of auctions is that, like all market mechanisms, they can go sideways and produce undesirable results in certain settings. The Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel 2020–commonly known as the Nobel Prize in economics–was awarded to Paul R. Milgrom and Robert B. Wilson “for improvements to auction theory and inventions of new auction formats.” For some years now, the Nobel committee has also published a couple of useful reports with each award, one aimed at a popular audience and one with more econo-speak, jargon, and technical detail. I’ll quote here from both reports: “Popular science background: The quest for the perfect auction” and “Scientific Background: Improvements to auction theory and inventions of new auction formats.”

A useful starting point is to recognize that auctions can have a wide array of formats. Most people are used to the idea of an auction where an auctioneer presides over a room of people who call out bids, until no one is willing to call out a higher bid. But auctions don’t need to work in that way. 

An “English auction” is one where the bids are ascending, until a highest bid is reached. A “Dutch auction”–which is commonly used to sell about 20 million fresh flowers per day–starts with a high price that then declines, so that the first person to speak up wins. In an open-outcry auction, the bids are heard by everyone, but in a sealed-bid auction, the bids are private. Some auctions have only one round of bidding; others may eliminate some bidders after one round but proceed through multiple rounds. In “first-price” auctions, the winner pays what they bid; in “second-price” auctions, the winner instead pays whatever was bid by the runner-up. 
In some auctions the value of what is being bid on is mostly a “private value” to the bidders (the Nobel committee suggests thinking about bidding on dinner with a Nobel economist as an example, but you may prefer to substitute a celebrity of your choice), but in other cases, like bidding on an offshore oil lease, the value of the object is at least to some extent a “common value,” because any oil that is found will be sold at the global market price. In some auctions, the bidders may have detailed private information about what is being sold (say, in the case where a house is being sold but you are allowed to do your own inspection before bidding), while in other auctions the information about the object being auctioned may be mostly public. 

In short, there is no single perfect auction. Instead, thinking about how auctions work means considering, for any specific context, how auction rules and formats perform in that situation, given what determines the value of the auctioned objects and what kind of information and uncertainty bidders might have. 

If the auction rules aren’t set up appropriately, the results can go sideways. For some examples, Paul Klemperer wrote an article some years back on the subject of “What Really Matters in Auction Design.” 
One of his examples was about what happened in 1991, when the UK used a process of sealed-bid auctions to determine which company would be allowed to provide television services in certain areas. Klemperer writes: 
The 1991 U.K. sale of television franchises by a sealed-bid auction is a dramatic example. While the regions in the South and Southeast, Southwest, East, Wales and West, Northeast and Yorkshire all sold in the range of 9.36 to 15.88 pounds per head of population, the only—and therefore winning—bid for the Midlands region was made by the incumbent firm and was just one-twentieth of one penny (!) per head of population. Much the same happened in Scotland, where the only bidder for the Central region generously bid one-seventh of one penny per capita. What had happened was that bidders were required to provide very detailed region-specific programming plans. In each of these two regions, the only bidder figured out that no one else had developed such a plan.
Another problem arises if the bidders find a way to signal each other to hold prices down. In some cases, the bidders can use the bidding process itself to send messages. Here’s an example from Klemperer: 

In a multilicense U.S. spectrum auction in 1996–1997, U.S. West was competing vigorously with McLeod for lot number 378: a license in Rochester, Minnesota. Although most bids in the auction had been in exact thousands of dollars, U.S. West bid $313,378 and $62,378 for two licenses in Iowa in which it had earlier shown no interest, overbidding McLeod, who had seemed to be the uncontested high bidder for these licenses. McLeod got the point that it was being punished for competing in Rochester and dropped out of that market. Since McLeod made subsequent higher bids on the Iowa licenses, the “punishment” bids cost U.S. West nothing (Cramton and Schwartz, 1999).

Notice that the bids from U.S. West ended in the number 378, which was the lot number where the company wanted McLeod to back off. 

Of course, concerns like these have obvious answers. For example, set a “reserve price” or a minimum price that needs to be bid for the object, so no one gets it for (nearly) free. Also, set a rule that all bids need to be in certain fixed amounts, and that increases in bids also need to be in fixed amounts. But making these points both raises practical questions of how this should be done, and also shows some ways in which the practical rules of auctions can matter a lot. 
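The role of a reserve price can be made concrete in a stylized sketch. Everything below is invented for illustration (the bidder values, the `auction_revenue` name, and the token 0.01 winning bid); the point is just that a lone bidder can win for next to nothing unless a reserve puts a floor under the sale, echoing the UK television franchise episode.

```python
def auction_revenue(values, reserve=0.0):
    """Seller revenue in a stylized sealed-bid auction.

    Bidders whose value falls below the reserve stay out. A lone
    bidder only needs to clear the floor; with competition, the
    price gets pushed up toward the second-highest value.
    """
    serious = [v for v in values if v >= reserve]
    if not serious:
        return 0.0                 # object goes unsold
    if len(serious) == 1:
        # the Midlands outcome: one bidder, a token winning bid
        return reserve if reserve > 0 else 0.01
    return sorted(serious)[-2]     # price set by the runner-up

print(auction_revenue([10.0]))                # 0.01: almost free
print(auction_revenue([10.0], reserve=5.0))   # 5.0: the reserve binds
print(auction_revenue([10.0, 9.4, 12.5]))     # 10.0: competition does the work
```

Note the trade-off the sketch also exposes: set the reserve above the lone bidder’s value and the object goes unsold, which is why choosing reserve prices is itself a design problem.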
A more subtle but well-known problem with auctions is called the “winner’s curse.” It was first documented in the context of bidding by companies for offshore oil leases. An analysis of the bids, along with how much oil was later discovered in the area, found that the “winner” of these auctions was on average losing money. The reason is that each individual company was forming its own guess about how much oil was on the site. Naturally, some companies would be more optimistic than others, and the most over-optimistic company of all was likely to bid highest and “win” the auction. A problem is that once bidders in an auction become aware of the risk of the winner’s curse, they may become very reluctant to bid, so that the bids stop representing the actual estimates of value. 
In professional sports, this kind of scenario often plays out when free agents try to encourage bidding among teams for their services. From the player’s point of view, it only takes one high-end bidder, a bidder who perhaps is ignoring the winner’s curse, to get a great contract. But many teams may decide to avoid the risk of overpaying and the winner’s curse by not bidding at all. 
There are various possible responses to a winner’s curse in an auction format. One is to find ways for the bidders to collect more private information, so that they can be more confident in their bidding. Another is a “second-price” auction, where the winner pays the price of the second-highest bidder. This format provides some protection against the winner’s curse: everyone can feel free to bid as high as they would like, knowing that if their bid is way out of line, they will only have to pay the second-highest bid. If a second-price format greatly reduces concerns about the winner’s curse and leads to more aggressive bidding, it can (counterintuitively) end up raising more money than a first-price auction. 
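A small simulation can illustrate the winner’s curse and why a second-price rule softens it. This is a sketch under invented assumptions (a common value of 100, normally distributed estimation errors, and bidders who naively bid their own estimates), not a model of any actual auction.

```python
import random

def average_winner_profit(n_bidders=5, n_auctions=20000, noise=10.0,
                          second_price=False, seed=0):
    """Average profit of the winning bidder when everyone naively bids
    their own noisy estimate of a common value of 100."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_auctions):
        true_value = 100.0
        estimates = sorted((true_value + rng.gauss(0, noise)
                            for _ in range(n_bidders)), reverse=True)
        price = estimates[1] if second_price else estimates[0]
        total += true_value - price
    return total / n_auctions

print(round(average_winner_profit(), 2))                   # around -12
print(round(average_winner_profit(second_price=True), 2))  # around -5
```

The most optimistic bidder wins and overpays; paying the runner-up’s estimate cuts the average loss roughly in half in this setup. Note that naive bidders are still cursed even under the second-price rule, which is why the theory also predicts bid shading.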

The auctions that most people participate in are “private-value auctions,” where the issue is just how much you want it–because you are planning to use it rather than to resell it. In this setting, a live auctioneer tries to get people emotionally involved in how much they want something, and in this sense to get them to pay more than they had perhaps planned to pay beforehand. As Ambrose Bierce wrote in his Devil’s Dictionary, published back in 1906: “AUCTIONEER, n. The man who proclaims with a hammer that he has picked a pocket with his tongue.”

But auctions for oil leases, spectrum rights, privatized companies, Treasury debt, and so on have some element of being “common value” auctions, where the value of what is being sold will be similar across potential buyers. As the Nobel committee writes: “Robert Wilson was the first to create a framework for the analysis of auctions with common values, and to describe how bidders behave in such circumstances. In three classic papers from the 1960s and 1970s, he described the optimal bidding strategy for a first-price auction when the true value is uncertain. Participants will bid lower than their best estimate of the value, to avoid making a bad deal and thus be afflicted by the winner’s curse. His analysis also shows that with greater uncertainty, bidders will be more cautious and the final price will be lower. Finally, Wilson shows that the problems caused by the winner’s curse are even greater when some bidders have better information than others. Those who are at an information disadvantage will then bid even lower or completely abstain from participating in the auction.”
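Wilson’s point that greater uncertainty forces more cautious bids can be seen in a back-of-envelope simulation. Again, this is a sketch with invented numbers rather than his actual equilibrium analysis: it measures how far below their estimate bidders must shade just for the winner to break even, and shows that the required shade grows with the noise in the signals.

```python
import random

def break_even_shade(noise, n_bidders=5, n_auctions=20000, seed=1):
    """How far below their estimate bidders must shade so that the
    (most optimistic) winner just breaks even, given signal noise."""
    rng = random.Random(seed)
    overpay = 0.0
    for _ in range(n_auctions):
        # the winner's estimate exceeds the true value by the largest error
        overpay += max(rng.gauss(0, noise) for _ in range(n_bidders))
    return overpay / n_auctions

for noise in (5, 10, 20):
    print(noise, round(break_even_shade(noise), 1))
# doubling the noise roughly doubles the required shading,
# so final prices fall as uncertainty rises
```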
But when you think about it, many of these “common value” auctions actually have a mixture of private values as well. For example, consider bidding on an offshore oil lease. The value of any oil discovered may be a common value. But each individual company may have specific technology for discovering or extracting oil that works better in some situations than others. Some companies may also already be operating nearby, or have facilities nearby. In short, lots of real-world auctions are a mixture of private and common values. As the Nobel committee writes: 

In most auctions, the bidders have both private and common values. Suppose you are thinking about bidding in an auction for an apartment or a house; your willingness to pay then depends on your private value (how much you appreciate its condition, floor plan and location) and your estimate of the common value (how much you might be able to sell it for in the future). An energy company that bids on the right to extract natural gas is concerned with both the size of the gas reservoir (a common value) and the cost of extracting the gas (a private value, as the cost depends on the technology available to the company). A bank that bids for government bonds considers the future market interest rate (a common value) and the number of their customers who want to buy bonds (a private value). … The person who finally cracked this nut was Paul Milgrom, in a handful of papers published around 1980. … This particular result reflects a general principle: an auction format provides higher revenue the stronger the link between the bids and the bidders’ private information. Therefore, the seller has an interest in providing participants with as much information as possible about the object’s value before the bidding starts. For example, the seller of a house can expect a higher final price if the bidders have access to an (independent) expert valuation before bidding starts.

In addition, Milgrom has participated in setting up new kinds of auctions. When auctioning radio spectrum to telecommunications providers, for example, how much you are willing to bid for rights in one geographic area may be linked to whether you own the rights in an adjoining area. Thus, rather than auctioning off each geographic area separately–which can lead to problems of collusion between bidders–it makes sense to design a Simultaneous Multiple Round Auction, which starts with low prices and allows repeated bids across many areas, so that geographic patterns of ownership can evolve in a single process. There is also a Combinatorial Clock Auction, in which bidders might choose to bid on overall “packages” of frequencies, rather than bidding separately on each license. Milgrom also was a leading developer of the Incentive Auction, which the Nobel committee describes in this way:

The resulting new Incentive auction was adopted by the FCC in 2017. This design combines two separate but interdependent auctions. The first is a reverse auction that determines a price at which the remaining over-the-air broadcasters voluntarily relinquish their existing spectrum-usage rights. The second is a forward auction of the freed-up spectrum. In 2017, the reverse auction removed 14 channels from broadcast use, at a cost of $10.1 billion. The forward auction sold 70 MHz of wireless internet licenses for $19.8 billion, and created 14 MHz of surplus spectrum. The two stages of the incentive auction thus generated just below $10 billion to U.S. taxpayers, freed up considerable spectrum for future use, and presumably raised the expected surpluses of sellers as well as buyers.
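The core loop of the Simultaneous Multiple Round Auction mentioned above can be sketched in a few lines. This toy version (the bidder names, license names, values, and fixed increment are all invented) omits the activity rules, package synergies, and withdrawal provisions of real spectrum auctions; it only shows prices rising across all licenses at once until a round passes with no new bids.

```python
def smra(values, increment=1.0):
    """Toy Simultaneous Multiple Round Auction.

    values[bidder][license] gives each bidder's standalone value.
    Each round, any bidder not currently leading a license tops the
    standing bid by one increment, as long as the license is still
    worth it; the auction ends when a round produces no new bids.
    """
    licenses = list(next(iter(values.values())))
    price = {lic: 0.0 for lic in licenses}
    leader = {lic: None for lic in licenses}
    while True:
        new_bid = False
        for bidder, vals in values.items():
            for lic in licenses:
                if leader[lic] != bidder and vals[lic] >= price[lic] + increment:
                    price[lic] += increment
                    leader[lic] = bidder
                    new_bid = True
        if not new_bid:
            return leader, price

values = {"A": {"east": 10, "west": 4},
          "B": {"east": 6, "west": 9}}
winners, prices = smra(values)
print(winners)  # each license ends up with its high-value bidder
print(prices)
```

Because all licenses are open simultaneously, a bidder can shift its money toward the areas where it is being outbid, which is how geographic patterns of ownership can evolve within a single process.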

The economic theory of auctions is clearly tied up in intimate ways with the practice and design of real-world auctions. More broadly, close analysis of buyers and sellers in the structured environment of auctions can also offer broader insights into how non-auction markets work as well. After all, in some ways a competitive market is just an informal auction with sellers offering bids hoping to get a higher price and buyers making offers hoping to get a lower price. 

For more from Milgrom and Wilson on auctions and related economics, here are some articles from the Journal of Economic Perspectives, where I work as Managing Editor. 

Snapshots of Capital Punishment Trends

Back in 1972, the US Supreme Court struck down the capital punishment statutes then in effect across the states by a 5-4 decision in the case of Furman v. Georgia (408 U.S. 238 [1972]). All nine members of the court wrote separate opinions (!), so the exact legal reasoning behind the decision was less than crystal-clear. But broadly speaking, the opinions held that the death penalty could only be constitutional if applied in a consistent, non-arbitrary, and non-discriminatory way.  

Almost a half-century later, what are the big-picture patterns of capital punishment in the United States? Tracy L. Snell at the US Bureau of Justice Statistics lays out the patterns in “Capital Punishment, 2018 – Statistical Tables” (September 2020, NCJ 254786). Here’s the pattern of US executions over time: 

One way of interpreting this graph is that the death penalty was on its way to being imposed at only very low levels in the 1960s, but after the nine-opinion Supreme Court decision eliminating all state capital punishment laws inflamed the issue, capital punishment had a resurgence for a time, before fading again.
There are currently about 2600 prisoners under sentence of death. However, for some years now the number being removed from this category (usually by legal appeal or commutation of sentence rather than by execution) has been exceeding the number added. 
Here’s a figure showing the racial mix of those under sentence of death. 

Strong opponents of the death penalty will of course feel that one execution is too many. But the Gallup poll suggests that at the national level, supporters of the death penalty continue to outnumber opponents. Moreover, at the state level there will be places with greater support for, or majority opposition to, the death penalty. 

This report doesn’t address policy questions about the death penalty, like whether it may at least in some cases deter murder. My own suspicion is that statistical methods probably cannot answer this question–but of course, the lack of a clear-cut statistical answer means that each of us will weigh the various risks involved in different ways. For a blog post on this topic from a few years back, see “The Death Penalty and Deterrence: Why No Clear Answers?” (June 25, 2012). 

The "No Brown M&Ms" Story: Contract Theory at Work

Eddie Van Halen, the rock guitar legend of the eponymous band Van Halen, died a few days ago. I suspect that many readers already know the “no brown M&Ms” story, but it still seems worth memorializing here. 

The story is that when the Van Halen rock band was touring, they required that the promoter or the venue at each location sign an extremely long and detailed contract, with many seemingly arbitrary demands. One item in the contract stated that among the backstage munchies to be provided would be a bowl of M&Ms, but the contract then added “(WARNING: ABSOLUTELY NO BROWN ONES).”

The contract was quite clear that violation of any part was cause for the band to immediately cancel the concert, but still receive its full financial payment. I’m not aware of the band ever cancelling on these grounds, but there were stories that upon finding a brown M&M in the bowl, band members would go on a destructive rampage in the dressing room before being gradually talked down so that they were willing to perform. 

Years later, the logic behind this particular contract provision was explained. Here’s how the lead singer David Lee Roth told the story in his autobiography (according to Snopes.com): 

Van Halen was the first band to take huge productions into tertiary, third-level markets. We’d pull up with nine eighteen-wheeler trucks, full of gear, where the standard was three trucks, max. And there were many, many technical errors — whether it was the girders couldn’t support the weight, or the flooring would sink in, or the doors weren’t big enough to move the gear through.

The contract rider read like a version of the Chinese Yellow Pages because there was so much equipment, and so many human beings to make it function. So just as a little test, in the technical aspect of the rider, it would say “Article 148: There will be fifteen amperage voltage sockets at twenty-foot spaces, evenly, providing nineteen amperes …” This kind of thing. And article number 126, in the middle of nowhere, was: “There will be no brown M&M’s in the backstage area, upon pain of forfeiture of the show, with full compensation.”

So, when I would walk backstage, if I saw a brown M&M in that bowl … well, line-check the entire production. Guaranteed you’re going to arrive at a technical error. They didn’t read the contract. Guaranteed you’d run into a problem. Sometimes it would threaten to just destroy the whole show. Something like, literally, life-threatening. …

The folks in Pueblo, Colorado, at the university, took the contract rather kinda casual. They had one of these new rubberized bouncy basketball floorings in their arena. They hadn’t read the contract, and weren’t sure, really, about the weight of this production; this thing weighed like the business end of a 747.

I came backstage. I found some brown M&M’s, I went into full Shakespearean “What is this before me?” … you know, with the skull in one hand … and promptly trashed the dressing room. Dumped the buffet, kicked a hole in the door, twelve thousand dollars’ worth of fun.

The staging sank through their floor. They didn’t bother to look at the weight requirements or anything, and this sank through their new flooring and did eighty thousand dollars’ worth of damage to the arena floor. The whole thing had to be replaced. It came out in the press that I discovered brown M&M’s and did eighty-five thousand dollars’ worth of damage to the backstage area.

Well, who am I to get in the way of a good rumor?

Here’s a 2012 video of Roth telling the “no brown M&Ms” story. For economists, of course, the story lives on as an example of contract theory at work. In some cases, it will be costly and time-consuming to check every aspect of a contract (a fact that applies in business and also personal settings). Rather than paying those costs of time and money, over and over, it may save time to choose one perhaps seemingly unimportant item that will be checked first. If that one part of the contract is breached, then overreact. Your reputation will lead future contractual partners to be more careful, and you will ultimately save time and energy. 
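The logic can be put into a toy expected-cost calculation. All the numbers below are invented (including the $80,000 “disaster” figure, loosely echoing the Pueblo story), and the M&M-bowl canary is assumed to perfectly flag venues that did not read the contract; the point is only the ranking of the three policies.

```python
def per_show_cost(policy, p_careless=0.2, full_check=5000.0,
                  canary_check=50.0, disaster=80000.0):
    """Expected inspection-related cost per show under each policy."""
    if policy == "never":
        # skip all checks and eat the occasional stage-through-the-floor
        return p_careless * disaster
    if policy == "always":
        # line-check the entire production every night
        return full_check
    if policy == "canary":
        # glance at the M&M bowl; full line-check only if it fails
        return canary_check + p_careless * full_check
    raise ValueError(policy)

for policy in ("never", "always", "canary"):
    print(policy, per_show_cost(policy))
# canary (1050.0) beats always (5000.0) beats never (16000.0)
```

The cheap canary check converts an every-night cost into a cost paid only when there is real reason to worry, which is exactly the reputational mechanism described above.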

Yes, The Other Drivers Went Crazy

The relatively few times I was out on the road back in April and May, right after COVID-19 had started changing all our lives, it seemed to me that the proportion of crazy drivers on the roads was way up. Family and friends had similar hair-raising stories of other drivers blowing through red lights and juking down the highway like a fighter pilot in combat. However, I am congenitally mistrustful of whether anecdotes represent broader patterns, and thus was intrigued that statistics from the National Highway Traffic Safety Administration (NHTSA) are backing up my experience. 

One NHTSA report is “Examination of the Traffic Safety Environment During the Second Quarter of 2020: Special Report” (October 2020). Here’s a figure showing total vehicle-miles travelled in the second quarter of 2020 as compared to the second quarter of 2019. In April, for example, miles travelled were down 40% in 2020 compared to the same month in the previous year. Similarly, the number of total vehicle trips per day also fell about 40% that month. 

Use of mass transit dropped even more, by as much as 85% from the previous year. For example, here’s a graph on monthly urban rail trips: 

There isn’t any single way of measuring “unsafe” driving, but the data provides a lot of clues. Taken together, the clues strongly suggest a rise in unsafe driving among those who were on the roads in April and May 2020. 
For example, one clue is to look at the vehicle fatality rate per 100 million vehicle-miles travelled. For the last 10 years or so, it’s been about 1.1 fatalities per 100 million miles travelled, and this was the rate in the first quarter of 2020. But the fatality rate rose to 1.42 per 100 million vehicle-miles travelled in the second quarter of 2020–a rise of about 30%. Or to put it another way, vehicle-miles travelled fell sharply (by 40% in April alone), but total fatalities fell only 3% over the quarter–because those who were driving became more likely to have a fatal accident.
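Because fatalities are just the fatality rate times miles travelled, these figures can be cross-checked with one line of arithmetic. (The 40% drop is April alone, while the rate and the 3% decline cover the whole quarter; the sketch below, using only the numbers quoted above, shows why the quarter-wide drop in driving must have been smaller than 40%.)

```python
def implied_fatality_change(rate_change, vmt_change):
    """Fatalities = rate per mile x miles travelled, so the percentage
    change in fatalities follows mechanically from the other two."""
    return (1 + rate_change) * (1 + vmt_change) - 1

rate_change = 1.42 / 1.1 - 1   # the roughly 30% rise in the Q2 fatality rate

# If miles had fallen 40% for the whole quarter, fatalities would have
# dropped by more than a fifth rather than 3%:
print(round(implied_fatality_change(rate_change, -0.40), 3))

# The reported ~3% fatality decline instead implies a quarter-wide
# drop in miles travelled of about one-quarter:
print(round((1 - 0.03) / (1 + rate_change) - 1, 3))
```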
Another clue is to look at data from crashes. For example, a larger share of crashes involved “unrestrained” people who weren’t wearing seat belts and those who were ejected from their cars. The NHTSA reports: 

Individual State’s data have pointed to a trend of reduced seat belt usage during Q2 months in 2020. For example, Virginia experienced a 15.4-percent increase in the number of unrestrained fatalities between January 1 and May 21, compared to the same period in 2019 (Virginia Department of Transportation, 2020). Minnesota also reported a higher proportion of unrestrained fatalities from January 2020 to June 2020 compared to 2019 (see Figure 7) (Office of Traffic Safety, 2020).

Or here’s a figure showing the share of crashes where emergency medical services (EMS) were called and the driver was ejected from the vehicle. The rise in April 2020 (after week 12) is clear. 

Yet another clue comes from testing crash victims who are brought to trauma centers for alcohol or drugs in their system. There’s no nationally representative data here, but here are the results of a small study of trauma centers in five cities: 

Thomas et al., (in press) analyzed blood samples from patients in five trauma centers who were treated from September 10, 2019, to July 18, 2020. Using March 17 as the cutoff between pre-COVID-19 and COVID-19 periods, the researchers tested the samples for evidence of drug and alcohol use among seriously and fatally injured road users, which included drivers, passengers, bicyclists, pedestrians, and micromobility users. … [There was] significantly higher prevalence of alcohol, cannabinoids (these results are for the active components of marijuana), and opioids during the public health emergency compared to before. … In addition, there was a significant increase in the proportion of people testing positive for more than one category of drugs during the public health emergency. 

Here’s a final clue: It makes sense that average vehicle speeds were up in April and May 2020, because the sharp decline in traffic meant less congestion. However, it wasn’t just average speeds that rose, but also the number of extreme speeders. The NHTSA report notes: 

In April and May 2020, news outlets reported large increases in the number of drivers cited for excessive speeds across the country. For example, the Atlanta Journal Constitution reported that the Georgia State Police cited 140 drivers for speeds over 100 miles per hour (mph) in one 2-week period (Wickert, 2020); the Los Angeles Times reported that citations for speeds over 100 mph had increased by 87 percent (McGreevy, 2020); and, the Chicago Tribune reported an increase of 14 percent in speeding citations from automated enforcement (Wisniewski, 2020). Virginia observed that from March 13 to May 21, 2020, speed-related fatalities made up about 50 percent of the overall fatalities, compared to 42 percent in the same period in 2019 (Virginia DOT, 2020).

Why were more people driving crazy in April and May? The NHTSA suggests some possibilities, but doesn’t try to quantify them. One simple answer is that when drivers see a more wide-open road, more of them will channel their inner NASCAR. Surely another answer is that many people were stressed, and a stressed driver is more likely to be unsafe. Another is that retail alcohol sales jumped overall, and some larger share of drivers were transporting that alcohol internally.

Finally, the police backed off from traffic enforcement at this time, too, says the NHTSA:

According to a survey released by the International Association of Chiefs of Police (Lum et al., 2020), more than half of the more than 1,000 responding agencies established policies explicitly reducing proactive enforcement including traffic enforcement, in both March and May 2020 when the survey was fielded, and nearly three-quarters had policies mandating the reduction in physical arrests for minor offenses. Drawn from their regular communications with statewide law enforcement entities, NHTSA Regional Administrators (personal communication) shared States’ self-reported decreases in traffic enforcement, including decreases in seat belt enforcement, impaired driving enforcement, and speed enforcement.

I have no deeper lesson here, except to note that the ways in which people react to changes and to incentives can often have many dimensions–some better and some worse–operating all at the same time.

The Education of Angus Deaton

Gordon Rausser and David Zilberman have “A Conversation with Angus Deaton” in the Annual Review of Resource Economics (2020, 12: pp. 1-22). For those not already somewhat familiar with Deaton’s life and work, the first few pages provide a capsule overview. The interview goes into more depth about his early education, working with Richard Stone and Arthur Lewis, and his more recent work, including his thesis about “deaths of despair” in the US and his skepticism about the popular randomized control trial methodology. As a sample, here are some of Deaton’s comments on how he entered the field of economics, which for modern American academics will sound as if Deaton arrived from some different planet. Deaton says: 

For me in high school, the great virtue of mathematics was you could do it very quickly. I had lots of time for writing and music and for playing sports and things that I actually cared about a lot more. When I went to Cambridge, that really didn’t work anymore. There were a lot of really serious mathematicians there. It’s something that we observe in all of our classes, I think. You get these kids who come in and they’ve been superstars wherever they were before and suddenly realize they’re not such bright superstars compared with some of the other people in the class.

I spent a couple of years not really doing mathematics but playing cards and going to the movies and enjoying myself. Then at some point the authorities, like my tutor, said, “You can’t go on like this. You’ve really got to do something else. The mathematicians don’t want you anymore.” I said, “Well, I don’t really want them either.” Then I said, “Well, what should I do?” They said, “Well, there’s only one thing for people like you, and it’s called economics.” I had no idea what it even was. I said, “What’s economics?” …

Oh, I would be 21 really. Then I was assigned an economics tutor and some material to read over the summer. I had a summer job working on the Queen Mary and the Queen Elizabeth, going back and forth between Southampton and New York every two weeks. I was selling clothes on board the ships. I remember sitting there with a copy of Samuelson’s Foundations of Economic Analysis (1947), or not that one, but the textbook. … I thought it was wonderful. It just provided that shove, as it were, that nudge if you like. It was more of a shove than a nudge. It put me in a place that I really liked. … 

I’ve often felt sort of closed off either deliberately or because I never trained. I never went to a PhD program. I had that one year of economics as an undergraduate. There were some stars there, like James Mirrlees, a later Nobel laureate, who was teaching. But these lectures were not very comprehensible. I spent a lot of time trying to figure things out for myself. I’ve often felt that I was right outside the mainstream somehow. …

I was very envious of the superstar kids in my environment, the kids who’d done economics for three years and who were recognized geniuses. Mervyn King, for instance, was almost my exact contemporary—I think he is two years younger than I am. He was very much a superstar and never got a PhD, because in those days, if you were really good, you didn’t get a PhD. The PhDs were for the sort of failures who were trying to catch up somehow. I was certainly very envious, but over the years, as you said, I began to realize that the things you really learn are the things you figure out for yourself. Sometimes you make mistakes in figuring those things out by yourself and sometimes those mistakes can be productive. …

Mostly, you look at something, you stare at it, you say, “I just don’t understand this.” Then 999 times out of a thousand, it’s because you made an error that you don’t understand something. But the one other time is the one when everybody’s making an error and it’s just not well understood. I think it was quite helpful not to be super well trained in the conventional stuff. …

All through my career, I found people who would listen to me, ask questions, and didn't mind my ignorance. That's something that the American graduate school experience is very bad at. Students are terrified because they're so competitive and also because they're always frightened of confessing that they don't understand. Part of the benefit to me of not having gone through that education was I was always looking for people who knew stuff, who would tell me.

And here is one of Deaton's comments on economic development and growth. 

In the back of Arthur’s book on economic growth (Lewis 1955), he raises this question, which doesn’t get asked enough: “Why? Why do we care about this at all?” Because a lot of people don’t. The Pope doesn’t really seem to care about economic growth very much. I don’t know whether Arthur actually talks about Mozart, but he talks about kids growing up in absolute poverty and how they never have the opportunity to develop what may be extraordinary innate skills. There are these buried talents—lost Mozarts, or lost Einsteins—a great term someone’s been talking about recently.

What development does is give people what Amartya Sen would call capabilities, which you just don’t have if you’re living in grinding poverty. The expressions of human genius and human  creativity are going to be stifled and stamped out if you don’t have economic development. That’s why you should have development. 

US Income Inequality, According to CBO

For a basic overview of income inequality in the United States, a useful starting point is the Congressional Budget Office report "The Distribution of Household Income, 2017" (October 2020). Versions of this report have been coming out for about 40 years. The CBO uses what is called the Statistics of Income, which is based on a nationally representative sample of individual income tax returns collected by the Internal Revenue Service. Thus, the data for income in 2017 wasn't first reported on tax returns until 2018, and it takes a couple of years before the late tax returns have arrived and the comprehensive dataset can be compiled. This CBO report is really just about presenting the data: you need to draw your own policy conclusions. 

Here's some basic information on growth of income at different points in the distribution over time. The rise in incomes for households in the bottom fifth or "quintile" of the income distribution, not including taxes and transfer payments, was about 35% from 1979 to 2017. The rise in incomes for the middle quintiles of the income distribution was about the same. But here's the pattern at the top. 

Clearly, income growth at the very top of the distribution has been at higher rates. For perspective, the average income (before taxes and transfers) for those from the 99th percentile up to the 99.9th percentile was $1.11 million in 2017; for those from the 99.9th percentile to the 99.99th percentile, average income was $5.67 million; for those at the 99.99th percentile and above, average income was $48.54 million. 

Interestingly, the source of income that has grown the fastest is what is called "business income"–that is, income from owning or running a business, which at the top level often includes partnerships in large law firms, running a series of car dealerships, and other substantial enterprises. The CBO writes: 

As a share of income among households in the top 1 percent, business income rose from 11 percent in 1979 to 23 percent in 2017. Meanwhile, average capital income (including capital gains) grew at a slower pace than other forms of income. As a result, it declined as a share of income among households in the top 1 percent of the distribution, from 54 percent of income in 1979 to 41 percent in 2017. Labor income remained roughly constant at about one-third of income among such households from 1979 to 2017.

How have the effects of federal redistribution changed over time, in terms of taxes and programs that tend to redistribute income? Here's a graph showing average federal tax rates over time by income group, where the average federal tax includes income taxes, payroll taxes that finance Social Security and Medicare, and corporate income taxes attributed to those who receive such income. The general pattern is lower federal tax rates as a share of income for those at the bottom, and not much overall change in federal tax rates for those at the top. (More detailed breakdowns of federal tax rates for the top 0.1% and 0.01% look a lot like the top 1%.)

What are the effects of federal tax and spending programs in reducing inequality, and how have these effects changed over time? The Gini coefficient is a way of measuring inequality that varies from 0 to 1, with zero being complete equality of incomes. The top line in this figure shows rising inequality over time in market incomes. The second line shows the change in this inequality after taking into account Social Security and Medicare payments. The third line shows the change in inequality after taking into account means-tested federal programs aimed at those with lower incomes. The bottom line shows income inequality after taking into account all of the previous changes plus shifts in taxes. 

One basic theme that emerges from figures like this is that no matter what metric you choose, income inequality is up. However, it's also true that the combination of government programs has ameliorated the rise in market-based income inequality to some extent: for example, measured by market-based income, the Gini rises 13 percentage points from 1979 to 2017 (from .472 to .602), but looking at the bottom line that includes all income and transfers, the rise is about 8 percentage points (from .351 to .434).  
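For readers who want to see the mechanics behind a Gini coefficient, here is a minimal sketch of one standard way to compute it from a list of household incomes. The income figures below are hypothetical illustrations, not the CBO's microdata, and the CBO's own methodology involves weighting and income definitions not reproduced here.

```python
def gini(incomes):
    """Gini coefficient: 0 means complete equality of incomes.

    Uses the closed form based on sorted values:
    G = 2 * sum(i * x_i) / (n * sum(x)) - (n + 1) / n,
    where x_1 <= ... <= x_n and i runs from 1 to n.
    """
    xs = sorted(incomes)
    n = len(xs)
    total = sum(xs)
    weighted = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * weighted) / (n * total) - (n + 1) / n

# Hypothetical five-household economies:
equal = [50_000] * 5
skewed = [10_000, 20_000, 30_000, 60_000, 380_000]
print(round(gini(equal), 3))   # → 0.0  (complete equality)
print(round(gini(skewed), 3))  # → 0.624 (most income at the top)
```

The same arithmetic underlies the percentage-point comparison in the text: subtracting one Gini value from another (for example, .602 minus .472) gives the 13-point rise in market-income inequality.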

Finally, it's also interesting to note that in terms of reducing inequality, a lot of the action happens because of taking Social Security and Medicare into account–that is, it's mainly reducing inequality for elderly Americans. Also, a big chunk of that reduction in inequality–the Medicare portion–doesn't put any income directly into the pockets of the elderly, but instead goes to paying health care providers. 

Alexander Hamilton: How He Smote the Rock, Touched the Corpse, Birthed Minerva

Even at a time and place when florid public compliments were a hallmark of public oratory, Daniel Webster's comment about Alexander Hamilton at a "Public Dinner at New York" given in Webster's honor on March 10, 1831, stands out. (The text is from the 1903 collection, The Writings and Speeches of Daniel Webster, vol. 2, pp. 48-50, available via the magic of Hathitrust.) Webster said: 

He smote the rock of the national resources, and abundant streams of revenue gushed forth. He touched the dead corpse of the Public Credit, and it sprung upon its feet. The fabled birth of Minerva, from the brain of Jove, was hardly more sudden or more perfect than the financial system of the United States, as it burst forth from the conceptions of ALEXANDER HAMILTON.

But what did Hamilton do to deserve this tribute? Daniel Webster explains at greater length, and I've appended more of his discussion below. But for those looking for a little more detail on how Hamilton smote the rock, touched the corpse, and birthed Minerva, Richard Sylla and David J. Cowen provide a crisp and readable overview in "Hamilton and the U.S. Financial Revolution" (Journal of Applied Corporate Finance, Fall 2019, 31:4, pp. 10-15), which in turn is based on their 2018 book.  

Sylla and Cowen argue that a US "financial revolution" occurred in a few years of the early 1790s, led by Hamilton. In their telling, Hamilton was a student of financial history who saw how earlier small countries had developed their financial markets and then become economically and politically powerful. They write: 

Roughly two centuries before the United States gained independence, the Dutch Republic (or “the United Provinces,” as they were then called), the modern-day Netherlands, had its own financial revolution. That revolution helped the small country to win its independence from Spain, a larger and seemingly more powerful country. Armed with modern finances and an entrepreneurial mercantile spirit, the Dutch Republic became the leading and richest economy of the 17th century—and many historians view it as the world’s first truly modern economy. Amsterdam became the world’s leading financial center, and it was there that Hamilton succeeded in the 1790s in floating the foreign loans he foresaw would be required to restructure the massive U.S. war debts. And New Amsterdam was, of course, Dutch before it became New York.

Great Britain, the mother country of the United States, followed in the footsteps of the Dutch by handing its throne to a Dutch prince, Willem of Orange, in the Glorious Revolution of 1688. As William III of Britain, he brought over experienced Dutch financiers, who together with Britons launched an English financial revolution. Similar to the ways in which the Dutch exploited their access to capital—or “credit”—to defeat the Spanish, England used it to win all of its protracted wars with France, a larger country, between 1688 and 1815.
Hamilton was a student of all this financial history …
What were the main components of the US financial revolution of the early 1790s? Sylla and Cowen sum it up this way: 

To our mind it had six key components:

1. Establishing effective institutions of public finance—revenues and spending—and public debt management to finance the government, restructure its unpaid debts, and establish public credit so that the U.S. government would always be able to borrow on good terms.

2. Founding a central bank to aid the government’s finances and serve as the central node of the country’s emergent banking and financial systems.

3. Creating the U.S. dollar as the country’s unit of account and medium of exchange, and basing it on gold and silver as the monetary base into which bank notes and deposits were convertible, thereby giving it the stability of value that would make it a safe basis for long-term contracts (such as bonds) and a safe asset in which to hold savings.

4. Fostering the growth of a banking system by encouraging state governments to create more banks to aid their own finances and lend to businesses and individual entrepreneurs— in other words, to establish private credit and bank money.

5. Fostering the growth of securities markets to make financial assets—government bonds, private-sector bonds, and company equities (stocks)— liquid and transferrable for the benefit of issuers and investors.

6. Fostering the growth of business corporations, financial (such as banks and insurance companies) and non-financial (such as utilities; road, bridge, and canal companies; and manufacturers), and so encouraging the pooling of capital of individuals that would allow large enterprises to be created and economies of scale to be achieved.

As Treasury Secretary from 1789 to 1795, Hamilton had direct authority concerning the first three items on the list—public credit, the central bank, and establishment of the dollar as the unit of currency. Hamilton had to rely on others—state authorities and private entrepreneurs—to flesh out his financial revolution by implementing the other components—state banks, corporations, and securities markets. But once put in place, the first three elements, as Hamilton foresaw they would, laid the groundwork and provided the inducements for the second three.

In modern times, we have a tendency to take the benefits and workings of the financial system for granted, at least in high-income countries. But what is obvious today was not obvious to many people circa 1790. And when a nation's financial system seems shaky–say, the US economy during the Great Depression or the Great Recession, the struggles of the EU with debtor nations, China's current experience with high debt levels and a rigid financial system–Hamilton's foresight becomes even more remarkable. As Sylla and Cowen describe Hamilton: "He was ahead of his time. Indeed, since we have come so lately to appreciate what he learned long ago about the connection of finance and credit to growth and power, he has remained ahead of his time right down to the present."

_________________

Here's the full three-paragraph passage of Webster discussing Hamilton in 1831: 

How can I stand here, to speak of the Constitution of the United States, of the wisdom of its provisions, of the difficulties attending its adoption, of the evils from which it rescued the country, and of the prosperity and power to which it has raised it, and yet pay no tribute to those who were highly instrumental in accomplishing the work? While we are here to rejoice that it yet stands firm and strong, while we congratulate one another that we live under its benign influence, and cherish hopes of its long duration, we cannot forget who they were that, in the day of our national infancy, in the times of despondency and despair, mainly assisted to work out our deliverance. I should feel that I was unfaithful to the strong recollections which the occasion presses upon us, that I was not true to gratitude, not true to patriotism, not true to the living or the dead, not true to your feelings or my own, if I should forbear to make mention of ALEXANDER HAMILTON.

Coming from the military service of the country yet a youth, but with knowledge and maturity, even in civil affairs, far beyond his years, he made this city the place of his adoption; and he gave the whole powers of his mind to the contemplation of the weak and distracted condition of the country. Daily increasing in acquaintance and confidence with the people of New York, he saw, what they also saw, the absolute necessity of some closer bond of union for the States. This was the great object of desire. He never appears to have lost sight of it, but was found in the lead whenever any thing was to be attempted for its accomplishment. One experiment after another, as is well known, was tried, and all failed. The States were urgently called on to confer such further powers on the old Congress as would enable it to redeem the public faith, or to adopt, themselves, some general and common principle of commercial regulation. But the States had not agreed, and were not likely to agree. In this posture of affairs, so full of public difficulty and public distress, commissioners from five or six of the States met, on the request of Virginia, at Annapolis, in September, 1786. The precise object of their appointment was to take into consideration the trade of the United States; to examine the relative situations and trade of the several States; and to consider how far a uniform system of commercial regulations was necessary to their common interest and permanent harmony. Mr. Hamilton was one of these commissioners; and I have understood, though I cannot assert the fact, that their report was drawn by him. His associate from this State was the venerable Judge Benson, who has lived long, and still lives, to see the happy results of the counsels which originated in this meeting. Of its members, he and Mr. Madison are, I believe, now the only survivors. 
These commissioners recommended, what took place the next year, a general Convention of all the States, to take into serious deliberation the condition of the country, and devise such provisions as should render the constitution of the federal government adequate to the exigencies of the Union. I need not remind you, that of this Convention Mr. Hamilton was an active and efficient member. The Constitution was framed, and submitted to the country. And then another great work was to be undertaken. The Constitution would naturally find, and did find, enemies and opposers. Objections to it were numerous, and powerful, and spirited. They were to be answered; and they were effectually answered. The writers of the numbers of the Federalist, Mr. Hamilton, Mr. Madison, and Mr. Jay, so greatly distinguished themselves in their discussions of the Constitution, that those numbers are generally received as important commentaries on the text, and accurate expositions, in general, of its objects and purposes. Those papers were all written and published in this city. Mr. Hamilton was elected one of the distinguished delegation from the city to the State Convention at Poughkeepsie, called to ratify the new Constitution. Its debates are published. Mr. Hamilton appears to have exerted, on this occasion, to the utmost, every power and faculty of his mind. 

The whole question was likely to depend on the decision of New York. He felt the full importance of the crisis; and the reports of his speeches, imperfect as they probably are, are yet lasting monuments to his genius and patriotism. He saw at last his hopes fulfilled; he saw the Constitution adopted, and the government under it established and organized. The discerning eye of Washington immediately called him to that post, which was far the most important in the administration of the new system. He was made Secretary of the Treasury; and how he fulfilled the duties of such a place, at such a time, the whole country perceived with delight and the whole world saw with admiration. He smote the rock of the national resources, and abundant streams of revenue gushed forth. He touched the dead corpse of the Public Credit, and it sprung upon its feet. The fabled birth of Minerva, from the brain of Jove, was hardly more sudden or more perfect than the financial system of the United States, as it burst forth from the conceptions of ALEXANDER HAMILTON. 

Checking on COVID-19 Economics Research: BPEA Fall 2020

 It would be a full-time job to keep up with the flow of economics research on aspects of COVID-19. Those who wish to take a closer look might begin with COVID Economics, a quick-turnaround journal which published its first issue on April 3, and has now just published its 50th issue. In that issue, the editor Charles Wyplosz reports the journal has published 332 papers so far. However, the flow of submissions has been slowing, from 6-7 submissions per day back in April to 1-2 submissions per day now. 

The National Bureau of Economic Research is another useful source for COVID-19 related research. The NBER website has one page that lists COVID-related papers by the week they are released, with a typical week including 5-10 new papers, and another page that organizes the papers by broad subject area (like effects on asset markets, effects of social distancing and other measures, macroeconomic effects, and so on). 

But if your personal idea of the good life doesn't involve surfing through these hundreds of research papers, and yet you would still like more than a tiny taste of what economists have been doing on this subject, a useful starting point is the set of papers produced for the Fall 2020 Brookings Papers on Economic Activity. These papers both pull together a lot of the existing research and offer additional insights. Drafts of papers, presentation slides, and video are all available. Here's a list of the papers. Each link leads to a readable short overview of the main themes of the paper, and then a link to the paper itself: 

Here are a couple of illustrative figures from the paper by Fernández-Villaverde and Jones, showing the results of the COVID-19 epidemic across countries and states. The upper right of these diagrams represents places with high COVID-19 mortality and large economic losses; the upper left is low mortality but high economic losses. The lower left is low mortality and low economic losses. The lower right is high mortality but low economic losses. 
Here's a figure showing these health and economic results for countries. Countries that have performed the worst on both dimensions (upper right) would include Spain, the United Kingdom, Italy, and Belgium. The US is similar to Sweden and the Netherlands, in having had a high level of COVID-related deaths but lower economic losses. In the bottom left are places like South Korea, Japan, China, Norway, Poland, and Denmark, with economic losses similar to the US but a much lower death rate. Taiwan is the extreme outlier, with almost no COVID-19 deaths and economic gains rather than losses (shown here on the negative scale). 
 

Here's a similar graph at the level of US states, focusing on the monthly unemployment rate as the measure of economic outcomes. The worst outcomes in the upper right are for Massachusetts, New York, and New Jersey, with both high COVID-19 death rates and high unemployment. In the upper left, some western states like Hawaii, California, and Nevada (along with Pennsylvania) had large economic losses but much lower death rates. The states with both low death rates and low unemployment in the bottom left of the diagram include Utah, Idaho, Nebraska, and Montana. 

As Fernández-Villaverde and Jones emphasize, figuring out the extent to which the better results are a matter of luck or policy (or measurement issues) is an ongoing research task. Moreover, the outcomes are still evolving. Still, graphs like these offer a way of starting to think systematically about where the health and economic effects that have followed in the wake of COVID-19 have been better or worse, and thus offer a useful starting point for additional investigation. 
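The quadrant reading described above can be made concrete with a small sketch. The cutoffs and the example readings below are illustrative placeholders I have chosen for exposition, not numbers from the Fernández-Villaverde and Jones paper.

```python
def quadrant(deaths_per_million, gdp_loss_pct,
             death_cutoff=300.0, loss_cutoff=5.0):
    """Classify a place by mortality (x-axis) and economic loss (y-axis),
    relative to arbitrary illustrative cutoffs."""
    high_deaths = deaths_per_million >= death_cutoff
    high_loss = gdp_loss_pct >= loss_cutoff
    if high_deaths and high_loss:
        return "upper right: high mortality, high economic loss"
    if high_loss:
        return "upper left: low mortality, high economic loss"
    if high_deaths:
        return "lower right: high mortality, low economic loss"
    return "lower left: low mortality, low economic loss"

# Hypothetical readings, in the spirit of the figures discussed above:
print(quadrant(600, 9))  # a Spain/UK-like outcome: upper right
print(quadrant(50, 2))   # a Taiwan-like outcome: lower left
```

The same classification extends to the state-level figure by swapping in the unemployment rate as the economic-loss measure.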

When Local Governments Subsidize Firms: Some Guidelines

Certain cities and metropolitan areas have been lagging in economic growth for decades, while others have surged ahead. US regions used to be converging in economic terms, but this pattern has halted. These changes have contributed to patterns of growing inequality, as well as to political divisions. This raises an obvious question: Can "place-based" public policy be used to stimulate the economy in slower-growth areas?  

I've written from time to time about proposals along these lines. For example, a couple of years ago Benjamin Austin, Edward Glaeser, and Lawrence H. Summers considered the problem and some of the options. They pointed out that while additional infrastructure spending may be useful for other purposes, it's not clear that it's a tool that works well for touching off a wave of growth in a slow-growth area. They end up advocating employment subsidies targeted at jobs in certain geographic areas. 

Some of the proposals have focused more on spreading out research and development efforts across the country, through some combination of R&D funding at universities and technology centers. For example, last year Jonathan Gruber and Simon Johnson proposed that cities be able to bid for how they would use federal funds to support a local tech center. An independent commission would determine how the funds would be allocated, but the funds would go only to cities with a population of at least 100,000 workers from age 25-64, where the college-educated share of such workers is at least 25%, where the mean home price is less than $265,000, and where the commute is less than 30 minutes. The commission would also look at measures of patents/worker in that area, as well as whether the area already has highly-ranked graduate school programs in science and tech areas. Robert D. Atkinson, Mark Muro, and Jacob Whiton made a broadly similar proposal for "growth centers" along these lines. 

The Summer 2020 issue of the Journal of Economic Perspectives offers a couple of other perspectives on place-based policies. Timothy J. Bartik writes about "Using Place-Based Jobs Policies to Help Distressed Communities," with a particular focus on the $60 billion or so that state and local governments spend each year on incentives for businesses to locate in their area, mostly in the form of cash and tax incentives to specific companies. Bartik points out that moving people to different locations is hard, and economic and social problems in certain areas are persistent; thus, the potential payoff from additional jobs in depressed areas is high. But Bartik also points out that local politicians often prefer to offer relocation subsidies to a few large firms, which often don't even locate in the actually distressed areas. Thus, he offers some suggestions for how to make this approach cost-effective: 

First, place-based jobs policies should be more geographically targeted to distressed places. The benefits of more jobs are at least 60 percent greater in distressed places than in booming places. But our current incentive system does not significantly favor distressed places. 

Second, place-based jobs policies should be more targeted at high-multiplier industries, such as high-tech industries. Governors may claim they want to build the future economy, but state and local governments in practice do not target high-tech. One caveat: high-tech targeting should consider how to increase the access of current residents to these jobs. One model is Virginia’s recent offer for Amazon’s “Headquarters II.” Virginia’s offer included a new Virginia Tech campus in northern Virginia and increased funding at state colleges for tech-related programs. These education programs increased the odds that Virginia residents would fill Amazon’s jobs. 
Third, incentives should not disproportionately favor large firms, especially given the renewed concern in economics over excess market power in product markets and labor markets (Azar, Marinescu, and Steinbuam 2017; Gutiérrez and Phillippon 2017). 
Fourth, place-based jobs policies should put more emphasis on enhancing business inputs. Customized business services, infrastructure, and land development services have the potential to be more cost effective than incentives as ways to increase local jobs and earnings. 
Fifth, place-based policies should be a coordinated package of policies attuned to local conditions. One area may need more infrastructure; another, training; and still another, better land development processes. Place-based policies are complementary. If the local nonemployed are more skilled, job growth increases employment rates more. If more jobs are available, it is easier to design effective training programs. Business inputs are complementary—boosting infrastructure helps growth more if the local economy also has customized business services.
In another essay in the Summer 2020 JEP, Maximilian von Ehrlich and Henry G. Overman consider "Place-Based Policies and Spatial Disparities across European Cities." They point out that across the European Union, as in the United States, spatial disparities in income are large, persistent, and growing. The EU has a "cohesion" policy that seeks to reduce these differences by targeting certain metropolitan areas with subsidies, mainly for infrastructure and physical capital, but also to a more limited extent for employment training and subsidies to firms. They find that these policies have been modestly effective in ameliorating the ongoing trends, although not in reversing them. They also find that such subsidies tend to be more effective in areas that already have a relatively high number of educated workers. 
The economic forces that have led certain areas to be economically distressed for long periods of time are obviously powerful and slow to change. As I consider the proposals, I keep running into questions of political economy. Is it possible for the political system to do place-based targeting in a cost-effective way?  After all, the areas most in need of such targeting are also frequently the areas that currently lack economic and political clout. 
When national governments try to target assistance to distressed areas, it's common to see a dynamic where the definition of "distressed" keeps getting wider and wider, until it includes all major cities and all 50 states. A proposal for spreading R&D more widely across the country or starting "tech centers" or "growth centers" is likely to run into similar problems. When it comes to eligibility for employment subsidies, big companies know how to play the bureaucracy for maximum eligibility, while small companies do not. When state and local governments think about business incentives, they have a bias toward a high-profile and costly deal where the governor or the mayor can shake hands with the CEO of a prominent company, and where there will be a ceremony to stick a shovel in the ground at the site of a new plant. There also seems to be a bias toward building physical infrastructure, surely in part because of behind-the-scenes lobbying for such contracts and also because of the photo opportunities for politicians when such projects are completed. 
In short, I can believe that well-targeted and well-designed incentives could have benefits that exceed costs for areas that have experienced long-term and persistent economic distress. I'm not confident that the political system can enact a large-scale version of such a program. It would require that the political representatives of metropolitan areas and states with high per capita incomes see it as in the interest of their own area to actively support efforts to launch economic growth in other areas. 

How the US Start-Up Industry is Faltering

One of the long-term strengths of the US economy has been that it fostered the growth of new businesses. Some provided employment for only a few, while others grew into giants. That dynamic process of new businesses ultimately benefited not just those who worked in them, but also innovation, productivity, and consumers. But as I have pointed out in the past, there are a variety of signs that this business dynamism has been declining. Here are some additional pieces of evidence: 
Thomas Astebro, Serguey Braguinsky, and Yuheng Ding discuss "Declining Business Dynamism Among Our Best Opportunities: The Role of the Burden of Knowledge" (September 2020, NBER Working Paper 27787). They write: 
We employ the nationally representative Survey of Doctorate Recipients to show a decline over the past 20 years in both the rate of startups founded and the share of employment at startups by the highest-educated science and engineering portion of the U.S. workforce. The declines are wide-ranging and not driven by any particular founder demographic category or geographic region or scientific discipline. 
Here's a figure focused just on those with PhDs in science and engineering fields. As the authors note: "The figure reports the share of PhDs in science and engineering who are employed full-time with non-zero salaries in new (five years old or less) private for-profit companies (startups) compared with PhDs in science and engineering who are employed full-time with non-zero salaries in all private for-profit businesses." The dashed line shows the share of this group who are employees in startups, while the solid line shows the share who are founders of start-ups. 

They argue that when dealing with new technology, the benefits of working at an established firm may be rising. They point out that PhDs in science and engineering who are starting firms now do tend to have more business experience, suggesting that the task of running a new technology-based business might be becoming more complex, even as the potential rewards for doing so may be diminishing. 

First, entrepreneurial outcomes are immensely skewed. Only a very small subset of entrepreneurial ventures make a meaningful contribution to growth, job creation or productivity improvements. The average entrepreneurial venture typically ends up as economically marginal, under-sized and poorly performing enterprise, or a ‘Muppet’. The second finding is that the skewed distribution of outcomes seems to be decreasing over time. Positive outcomes are becoming less common. While the share of firms with growth intentions seems to be increasing, the quality of entrepreneurial ventures seems to be falling, with high-growth outcomes becoming more unlikely. The rare ‘gazelles’ and ‘unicorns’ that disproportionately propel the economy, are becoming rarer. Economically trivial ventures, are becoming more common.

They suggest that a possible answer is the rise of the "Entrepreneurship Industry," which has the goal of selling to people who want to see themselves as entrepreneurs. They write: 

The Entrepreneurship Industry leverages the Ideology of Entrepreneurialism to create products and services that can be marketed to entrepreneurs. The industry grows its own market by encouraging greater entry into entrepreneurship and persistence in entrepreneurial ventures, irrespective of their likelihood of success. In doing so, it has transformed entrepreneurship from a generally gainful economic activity into a largely wasteful form of conspicuous consumption motivated by aspirations to the socially attractive identity of ‘being an entrepreneur’. This form of wasteful entrepreneurship is what we refer to as Veblenian Entrepreneurship. That is entrepreneurship that masquerades as being innovation-driven and growth-oriented but is substantively oriented towards supporting the entrepreneur’s conspicuous identity work.

Josh Lerner and Ramana Nanda offer a different set of concerns in "Venture Capital's Role in Financing Innovation: What We Know and How Much We Still Need to Learn" (Journal of Economic Perspectives, Summer 2020, pp. 237-61). They argue that while the venture capital industry has had some great successes in the past, "venture capital financing also has real limitations in its ability to advance substantial technological change." In particular, 

Three issues are particularly concerning to us: 1) the very narrow band of technological innovations that fit the requirements of institutional venture capital investors; 2) the relatively small number of venture capital investors who hold and shape the direction of a substantial fraction of capital that is deployed into financing radical technological change; and 3) the relaxation in recent years of the intense emphasis on corporate governance by venture capital firms. We believe these phenomena, rather than being short-run anomalies associated with the ebullient equities market from the decade or so up through early 2020, may have ongoing and detrimental effects on the rate and direction of innovation in the broader economy.

They argue that venture capital firms have become narrower in their focus, looking for firms where the uncertainty about whether the business will succeed is likely to be resolved fairly quickly, and thus less willing to take on a wider variety of start-up ideas where the uncertainty will remain for a substantial time–and where the direct involvement of the venture capital firm in corporate governance over an extended time might make the difference between success and failure. As one example, here's how the focus of Charles River Ventures has evolved over time:  

Charles River Ventures was founded by three seasoned executives from the operating and investment worlds in 1970. Within its first four years, it had almost completely invested its nearly $6 million first fund into 18 firms. These included classes of technologies that would be comfortably at home in a typical venture capitalist’s portfolio today: a startup designing computer systems for hospitals (Health Data Corporation), a software company developing automated credit scoring systems (American Management Systems), and a firm seeking to develop an electric car (Electromotion, which, alas, proved to be a few decades before its time). Other companies, however, were much more unusual by today’s venture standards: for instance, startups seeking to provide birth control for dogs (Agrophysics), high-strength fabrics for balloons and other demanding applications (N.F. Doweave), and turnkey systems for pig farming (International Farm Systems). Only eight of the 18 initial portfolio companies—less than half—were related to communications, information technology, or human health care.

The portfolio of Charles River Ventures looks very different in December 2019. Of the firms listed as investments, about 90 percent are classified as being related to information technology comprising social networks, applications for consumers, and software and services related to enhancing business productivity. Approximately 5 percent of investments are classified as being related to health care, materials, and energy. This shift in Charles River’s portfolio reflects the patterns of the industry at large …

I don't feel as if I have a good handle on all the reasons for the decline in US startup firms. But it does seem to me that a lot of the private sector has become highly focused on start-up firms that involve web-based networking in one way or another. Fortunes can be made in such firms with a relatively small number of employees. In contrast, as Lerner and Nanda point out, start-ups in areas like clean energy or new materials may not have as clear a path to follow, and those thinking about starting such firms may not find it easy to get support from venture capital or other parts of the finance system for start-ups.