
The NBER‑hosted conference “Inflation: Frontiers of Research and Policy” highlighted a growing consensus that the United States’ inflation‑measurement framework is anchored in 20th‑century survey methods and is ill‑suited to today’s fast‑changing economy. Speakers, including Jim Poterba and the Reset Demonstration team, outlined how the Economic Measurement Research Institute and its Reset project aim to modernize price and quantity statistics by tapping private‑sector scanner data and AI‑driven analytics. Key points included the heavy respondent burden of traditional surveys, the mismatch between revenue and price data, and the rapid turnover of products that current systems cannot capture efficiently.

By partnering with data aggregators such as Surkana and Circona, the Reset team is constructing near‑census, monthly price and quantity indices covering roughly two‑thirds of consumer goods, using item‑level UPC/SKU information to generate Laspeyres‑type and superlative indices that are more timely and granular than the official CPI. Notable examples included a teenage‑founded AI firm, now valued at $1 billion, that can automate survey responses, and the adoption of similar scanner‑based approaches by statistical agencies in New Zealand, Australia, and the Netherlands. The presenters also demonstrated a pilot price index for food‑at‑home items, comparing it directly to the CPI and highlighting the potential to address chain drift and product turnover in future iterations.

The implications are significant: policymakers could receive near‑real‑time, high‑resolution inflation signals, improving monetary‑policy decisions and reducing the reporting lag that hampers economic analysis. Moreover, the role of federal statistical agencies may shift from data collection to data stewardship, overseeing the integration of private‑sector streams while ensuring confidentiality and methodological consistency.
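The Laspeyres‑type and superlative indices mentioned above can be illustrated with a minimal sketch. The prices, quantities, and function names below are hypothetical and do not reflect the Reset project's actual pipeline; the formulas themselves are the standard textbook index‑number definitions.

```python
# Minimal sketch of a Laspeyres and a superlative (Fisher ideal) price index
# computed from item-level (UPC/SKU) price and quantity data.
# All data are invented for illustration.

def laspeyres(p0, p1, q0):
    """Base-period-quantity-weighted index: sum(p1*q0) / sum(p0*q0)."""
    return sum(p * q for p, q in zip(p1, q0)) / sum(p * q for p, q in zip(p0, q0))

def paasche(p0, p1, q1):
    """Current-period-quantity-weighted index: sum(p1*q1) / sum(p0*q1)."""
    return sum(p * q for p, q in zip(p1, q1)) / sum(p * q for p, q in zip(p0, q1))

def fisher(p0, p1, q0, q1):
    """Superlative (Fisher ideal) index: geometric mean of Laspeyres and Paasche."""
    return (laspeyres(p0, p1, q0) * paasche(p0, p1, q1)) ** 0.5

# Hypothetical month-over-month prices and quantities for three UPCs.
p0, p1 = [2.00, 5.00, 1.50], [2.10, 5.10, 1.65]
q0, q1 = [100, 40, 250], [95, 42, 230]

print(round(laspeyres(p0, p1, q0), 4))
print(round(fisher(p0, p1, q0, q1), 4))
```

The Laspeyres index holds quantity weights fixed at the base period, which is why scanner data's item‑level quantities matter: a superlative index such as Fisher's also uses current‑period quantities, capturing substitution that a fixed‑basket index misses.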

The NBER‑sponsored event titled “Assessing the U.S. Medical Innovation System” convened economists, health‑policy scholars, and industry experts to examine how public funding mechanisms shape biomedical research. Organizers highlighted the central question: does the NIH peer‑review process penalize investigators who...

Researchers presented a quantitative study showing that slower fiscal adjustment after recessions (that is, delaying the tax increases used to shrink budget shortfalls) can act as a powerful dynamic automatic stabilizer in an economy with non‑Ricardian households. Using an overlapping‑generations (perpetual‑youth) model...

David Card, a Nobel‑winning economist, presented a fourteen‑year project linking Austria’s and Germany’s social‑security databases to measure the earnings impact of international migration. By matching individuals via name and date of birth, the team assembled a quarterly panel of 168,000...
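The cross‑register linkage described above can be sketched in miniature. The records, field names, and earnings figures below are invented for illustration and bear no relation to the actual Austrian or German administrative data.

```python
# Sketch of linking two administrative registers on a (name, date-of-birth) key.
# All records are fabricated for illustration.

austria = [
    {"name": "Anna Gruber", "dob": "1978-03-14", "earnings_q": 9800},
    {"name": "Max Huber",   "dob": "1985-11-02", "earnings_q": 11200},
]
germany = [
    {"name": "Anna Gruber", "dob": "1978-03-14", "earnings_q": 12400},
    {"name": "Lena Vogel",  "dob": "1990-06-21", "earnings_q": 10100},
]

# Index one register by the (name, dob) key, then probe it with the other.
de_index = {(r["name"], r["dob"]): r for r in germany}
linked = [
    {"name": a["name"], "dob": a["dob"],
     "earnings_at": a["earnings_q"], "earnings_de": m["earnings_q"]}
    for a in austria
    if (m := de_index.get((a["name"], a["dob"]))) is not None
]

print(len(linked))  # individuals appearing in both registers
```

Exact matching on name and birth date is the simplest deterministic linkage rule; real projects of this kind typically add cleaning and disambiguation steps, which are omitted here.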

The NBER Economic Fluctuations and Growth program featured Kunal Sangani’s paper on “complete pass‑through in levels,” which challenges the conventional view that upstream cost shocks are only partially transmitted to downstream prices. Sangani shows that when pass‑through is measured in absolute dollars...
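A stylized numerical example clarifies the levels‑versus‑percentages distinction. The numbers below are illustrative assumptions, not figures from the paper: with an additive downstream markup, a cost shock passed through dollar‑for‑dollar looks only partially passed through in percentage terms.

```python
# Stylized illustration: complete pass-through in dollar levels can appear
# incomplete when measured in percentages, if the downstream price includes
# a fixed dollar markup. Numbers are invented for illustration.

cost0, cost1 = 1.00, 1.20            # upstream cost rises by $0.20
markup = 1.00                        # fixed dollar margin added downstream
price0, price1 = cost0 + markup, cost1 + markup

# Pass-through in levels: change in price per dollar of cost change.
dollar_passthrough = (price1 - price0) / (cost1 - cost0)

# Pass-through in percentages: price growth per unit of cost growth.
pct_passthrough = (price1 / price0 - 1) / (cost1 / cost0 - 1)

print(round(dollar_passthrough, 2))  # 1.0  -> complete in levels
print(round(pct_passthrough, 2))     # 0.5  -> looks incomplete in percentages
```

Here the $0.20 cost increase raises the price by exactly $0.20, yet the 10% price rise against a 20% cost rise implies only 50% pass‑through in the conventional elasticity metric.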

The paper presented examines how long‑term fiscal policy, specifically the speed of debt‑and‑deficit adjustment, interacts with monetary policy in a New Keynesian framework that incorporates overlapping generations and non‑Ricardian (hand‑to‑mouth) households. By replacing the standard Ricardian assumption with a more realistic liquidity‑constrained consumer base, the...

The NBER‑Peterson Foundation conference brought together leading public‑finance scholars to reassess how long‑term fiscal health should be measured. Organizers highlighted a debate sparked years earlier between Marcus and Ricardo over the relevance of the r − g (interest‑rate‑minus‑growth‑rate) relationship and the proper...