Both the budgets and methods of US statistical agencies like the Bureau of Labor Statistics and the Bureau of Economic Analysis have come under political challenge. This double-edged attack raises a question of sincerity: If the real goal is to enable these agencies to update their methods for the information age–for example, to rely less on survey-based instruments and more on administrative data collected for other purposes–then that goal requires an expansion of their budgets. Given that the budgets of these statistical agencies are barely a ripple in the ocean of federal spending, raising them would make no perceptible difference in the long-run trajectory of federal spending, deficits, and debt.
On the other hand, if the budgets and staffing of US statistical agencies are to be continually cut, as has been the pattern over the last decade or so and even more ferociously at present, then expressing a concern that these agencies should update their methods is just empty talk. It gives rise to a reasonable suspicion that the objection is not really to their statistical methods: indeed, one suspects many of those complaining about the quality of these agencies’ output don’t actually know the mixture of methods used, or why the result is a combination of early and revised estimates. Instead, the critics tend to praise the economic numbers they like, while claiming that all the numbers they don’t like are just political bias, without any recognition that all the numbers are produced using common methods.
Nathan Goldschlag of the Economic Innovation Group offers “A Q&A with John Haltiwanger,” someone who really knows this stuff, in “US statistical agencies deserve support and funding. They also need to reform” (August 27, 2025). Here are some comments from Haltiwanger:
The U.S. federal statistical system continues to produce timely and high-quality economic indicators, but it is built on a foundation that was largely conceptualized and developed in the mid-20th century. This system remains heavily survey-centric, relying on a mix of high- and low-frequency surveys of households and businesses. With the digitization of nearly all aspects of modern life, administrative records and private-sector digital data are now more accessible than ever. …
At the same time, the current system is under increasing strain and requires fundamental re-engineering. Survey response rates have been steadily declining, a trend that accelerated during the pandemic. Reaching households by phone has become more difficult, and many businesses find that survey instruments — even online forms — do not align with their internal information systems. In a telling development, the U.K.’s Office for National Statistics (ONS) recently suspended (temporarily) the publication of unemployment estimates based on household surveys due to critically low response rates.
Beyond operational challenges, the accelerated pace of economic change increasingly challenges the capacity of current statistical systems to measure innovation and productivity growth effectively. Accounting for the effects of product turnover and quality change in estimates of inflation, real output, and productivity has become increasingly difficult under the prevailing survey-based framework.
Federal statistical agencies are acutely aware of these challenges and the need to modernize. However, the imperative to maintain the flow of official statistics along with the lack of investment and limited resources overall have only permitted incremental steps to modernize. The time has come to invest in a 21st-century statistical system that fully harnesses the potential of the digital economy. Such a system would deliver more accurate, timely, and detailed data while reducing the reporting burden on households and businesses. During the transition, it will be essential to maintain continuity and comparability; for example, legacy and modern systems will need to operate in parallel for a period of time. In addition, a modern system will likely blend survey, administrative, and private-sector data. Investing now is critical to building a future-ready infrastructure for economic measurement.
As one example that Haltiwanger points out, data for measures of inflation, like the Consumer Price Index, have traditionally been collected by hundreds of actual “shoppers” going to actual stores all over the country every month and recording the prices for a specific selection of items. As barcodes have become commonplace, however, it is now possible to collect barcode data on goods with specific characteristics, along with data on the prices and quantities sold of those goods. But redesigning the measure of inflation around this data isn’t a simple task–and ultimately may not be a cheaper approach, either.
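To see what a price index built from barcode data might involve, here is a toy sketch of a matched-model Laspeyres index computed from scanner records. All of the barcodes, prices, and quantities are hypothetical, and real CPI methodology involves sampling, expenditure weighting, and quality adjustment far beyond this illustration.

```python
# Toy sketch: a matched-model Laspeyres price index from hypothetical
# barcode (scanner) data. Each entry maps a barcode to a
# (price, quantity sold) pair for one period.
base_period = {
    "0123456789012": (2.00, 100),  # e.g., a loaf of bread
    "0987654321098": (3.50, 40),   # e.g., a carton of eggs
}
current_period = {
    "0123456789012": (2.20, 95),
    "0987654321098": (3.85, 38),
}

def laspeyres_index(base, current):
    """Price index using base-period quantities as fixed weights.

    Only barcodes present in both periods are matched; items that
    enter or exit the market (product turnover) are simply dropped
    here, which is exactly the hard measurement problem the
    statistical agencies must solve in practice.
    """
    matched = base.keys() & current.keys()
    base_cost = sum(base[b][0] * base[b][1] for b in matched)
    current_cost = sum(current[b][0] * base[b][1] for b in matched)
    return 100.0 * current_cost / base_cost

print(round(laspeyres_index(base_period, current_period), 1))  # → 110.0
```

The fixed-basket logic is the easy part; the sketch makes visible where the difficulty lives, in deciding how to handle goods that appear, disappear, or change quality between periods.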
In addition, if the government statistical agencies are going to rely more on data collected by private firms–like actual barcode data from consumer purchases at actual stores–there will be questions about making sure that privacy is protected and the data remains anonymized. Again, this seems quite possible, but it adds a layer of complexity and cost.
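One small piece of that privacy machinery can be sketched in code: replacing raw identifiers with keyed hashes before data leaves a private firm, so records can still be linked across datasets without exposing who they belong to. The identifier and salt below are hypothetical, and real disclosure-avoidance practice at statistical agencies involves much more than hashing.

```python
import hashlib
import hmac

# Hypothetical secret key, held by the data provider and never shared
# with downstream users of the data.
SECRET_SALT = b"replace-with-a-secret-key"

def pseudonymize(identifier: str) -> str:
    """Replace a raw identifier with a keyed HMAC-SHA256 digest.

    The same input always maps to the same token (so records link
    consistently), but the token cannot be reversed without the key.
    """
    return hmac.new(SECRET_SALT, identifier.encode(), hashlib.sha256).hexdigest()

token = pseudonymize("customer-41778")   # hypothetical identifier
print(token != "customer-41778")          # original value is not exposed
print(token == pseudonymize("customer-41778"))  # links consistently
```

Even this simple step illustrates the added cost: someone must manage the keys, audit the pipeline, and guard against re-identification from the linked data itself.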
For politicians, all that matters in government statistics is the final number that pops out of the calculation. But for statisticians and economists, what matters is spelling out a systematic method that can be used over time to produce comparable results. A systematic method can also be understood, criticized, and allowed to evolve. From this perspective, the actual numbers that emerge at the end of the month or the quarter are less important than using a systematic and well-specified approach to estimating them.
Aug 27, 2025