
Published 28 August 2025 by Andrei Mihai

Measuring What Works: The New Frontier of Econometrics

Guido W. Imbens during his Lindau premiere

Economists aren’t usually in a hospital when they’re planning their next research. But Joshua D. Angrist was.

The Laureate signed up for Moderna’s mRNA vaccine trial in 2021. He sat through long doctor visits, received a shot (which turned out to be a placebo), and then got to thinking about how the trial was designed.

“I started to wonder why they don’t worry more (or at least worry in the way that I think they should) about non-compliance. And in particular, I wondered why trialists don’t do more ‘IV’, meaning more instrumental variables.”

An instrumental variable is used in statistical analysis to estimate the causal effect of another variable on an outcome, especially when there’s potential for bias. It gets to the heart of econometrics, a branch of economics that has become one of the most powerful ways in which economists make sense of the world.
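To make the idea concrete, here is a minimal simulated sketch (purely illustrative, not from the talks): a hidden confounder biases the naive regression slope, while the instrument, which moves the treatment but has no direct path to the outcome, recovers the true causal effect. All numbers and variable names here are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

u = rng.normal(size=n)                      # hidden confounder
z = rng.binomial(1, 0.5, size=n)            # instrument: moves x, not y directly
x = 0.8 * z + u + rng.normal(size=n)        # "treatment", driven by z and u
y = 2.0 * x + 3.0 * u + rng.normal(size=n)  # true causal effect of x on y is 2.0

# Naive regression slope is badly biased by the confounder u (comes out ~3.4).
naive = np.cov(x, y)[0, 1] / np.var(x, ddof=1)

# IV (Wald) estimate uses only the variation induced by z, recovering ~2.0.
iv = np.cov(z, y)[0, 1] / np.cov(z, x)[0, 1]
```

The key assumption, of course, is that the instrument affects the outcome only through the treatment; finding variables that plausibly satisfy this is the hard part.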

At the 2025 Lindau Nobel Laureate Meeting in Economics, Angrist and his fellow Laureate Guido W. Imbens returned to the basics of econometrics: what it means to claim one thing causes another, and why the answer often determines lives, policies, and billions of dollars. Alongside them, a younger generation showed how econometrics is shifting, embracing LinkedIn profiles, electricity prices, and even large language models.

The Past, Present, and Future of Causality

Correlation can be a trickster. Two things may rise and fall together, like wages and housing prices or coffee drinking and longevity — but that doesn’t mean one causes the other. Causality is the real prize. Without it, we risk mistaking coincidence for truth, pouring billions into programmes or treatments that look promising on paper but do nothing in reality.

Guido W. Imbens

Yet, as Guido Imbens reminded the audience, causation was not always given this much importance in research. “It was essentially zero up to the late 1980s and then has been going up fast. Now about 15 percent or so of the papers in the leading statistics journals are about causality.”

He pointed back to Jan Tinbergen, the Dutch economist who is sometimes regarded as the “father” of econometrics. Tinbergen tried to link prices to potato flour demand nearly a century ago. “It’s a remarkable paper,” Imbens says. “He only has nine observations but he still manages to get some fairly credible estimates. He essentially shows how you can use instrumental variables there to estimate the demand function using instruments for supply.”

Yet surprisingly, although Tinbergen was also awarded the Nobel Prize, econometrics got little recognition for a few decades. Things changed when economists found better ways to use real-life events as a laboratory. They leaned on “natural experiments,” situations where chance or policy created random-like divisions. A draft lottery could reveal the effects of military service, for instance, or a new school admissions policy could mimic random assignment. These tools gave economists the confidence to look for causal connections more consistently.

But Imbens says econometrics is just getting started. He turned to the prospects of the field, and mentioned that artificial intelligence could play a significant role.

He described how large language models (the same AI technology that powers chatbots) can help economists brainstorm new instruments, those elusive variables that allow them to tease out cause from correlation. Researchers can essentially use AI as a sort of research assistant, but they can also develop “agentic experiments,” where large language models simulate human agents and run virtual trials. These experiments can be strongly predictive in some instances, the Laureate noted.

Messy Trials and the Power of Instruments

If Imbens painted the wide landscape, Joshua Angrist dug deeper into specifics. His talk was laser-focused on the stubborn messiness of real-world data. In particular, he focused on clinical trials. Clinical trials are the closest thing we have to a controlled experiment in human health. Participants are randomly assigned to treatments or placebos, and doctors track outcomes to see what works. The randomization is supposed to strip away bias, leaving only the true effect of a drug or procedure.

But reality is much messier: people drop out, skip doses, or seek treatments outside the study. Trials may be randomized in design, yet the messiness of human behavior can blur their results. And that’s exactly where econometric tools, like the ones Angrist champions, can sharpen the picture.

An analysis called “intention-to-treat” is the standard in medicine. It includes all randomized participants in the groups they were originally assigned to, regardless of whether they adhered to the study protocol. But that lumps in all the no-shows and crossovers. The common alternative, a “per-protocol” analysis, only counts people who stuck to the plan. But then you get the opposite problem: selection bias. Those who comply may be healthier, wealthier, or more motivated than those who don’t.
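Both pitfalls are easy to see in a toy simulation (illustrative only; the numbers, compliance rates, and effect sizes are invented, not data from any real trial). Here compliers are healthier at baseline, so the intention-to-treat estimate is diluted toward zero while the per-protocol estimate is inflated by selection:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

z = rng.binomial(1, 0.5, size=n) == 1  # random assignment to treatment arm
complier = rng.random(n) < 0.6         # 60% would actually take the treatment
d = z & complier                       # actual take-up (one-sided noncompliance)

# Compliers are healthier at baseline (+1), which creates selection bias.
baseline = np.where(complier, 1.0, 0.0) + rng.normal(size=n)
y = baseline + 2.0 * d                 # true effect among the treated: 2.0

# Intention-to-treat: compare by original assignment (diluted toward zero).
itt = y[z].mean() - y[~z].mean()       # comes out ~1.2, not 2.0
# Per protocol: compare by actual take-up (inflated by healthier compliers).
pp = y[d].mean() - y[~d].mean()        # comes out ~2.6, not 2.0
```

Neither simple comparison lands on the true effect: one is watered down by non-compliers, the other contaminated by who chooses to comply.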

So What’s an Economist to Do?

Angrist’s instrumental variables approach doesn’t identify the compliers in the data one by one. Instead, it uses the original random assignment as a statistical lever. The randomization is clean: it’s uncorrelated with motivation, health, or wealth. IV essentially asks, “What’s the effect on those who actually obeyed their random assignment?” This isolates the causal effect for the compliers without bringing in hidden differences between rule-followers and rule-breakers. Where per protocol risks fooling us with selection bias, IV rescues the original spirit of the trial, creating a fairer test of cause and effect.
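As a rough sketch of the lever in action (again a simulation with invented numbers, not any trial Angrist discussed): with random assignment as the instrument, the Wald/IV estimator divides the effect of assignment on the outcome by the effect of assignment on take-up, recovering the effect for compliers.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

z = rng.binomial(1, 0.5, size=n) == 1  # random assignment (the instrument)
complier = rng.random(n) < 0.6         # 60% take the treatment if assigned
d = z & complier                       # actual take-up (one-sided noncompliance)

# Compliers are healthier at baseline, so comparing by take-up would mislead...
baseline = np.where(complier, 1.0, 0.0) + rng.normal(size=n)
y = baseline + 2.0 * d                 # true effect among compliers: 2.0

# ...but random assignment z is clean, so use it as the statistical lever:
itt_y = y[z].mean() - y[~z].mean()     # effect of assignment on outcome (~1.2)
itt_d = d[z].mean() - d[~z].mean()     # effect of assignment on take-up (~0.6)
late = itt_y / itt_d                   # complier effect, back to ~2.0
```

Intuitively, the diluted intention-to-treat effect is rescaled by the share of people whose behavior the assignment actually changed.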

The difference brought by this analysis was striking. Trials that once suggested screening barely made a difference suddenly suggested that for people who actually got screened, cancer deaths dropped significantly. “There’s a sense in which there’s a puzzle that’s resolved here,” Angrist said.

It was a powerful demonstration of how econometrics can make a difference, and an illustration of the kind of work for which the 2021 Nobel Prize in Economics was awarded to Angrist and Imbens.

The Next Generation

After the Laureates, the stage turned over to younger researchers. Their projects hinted at what econometrics might look like in the next decade: eclectic, data-rich, and unafraid to “dance” with other fields.

Moritz Seebacher

One presentation turned to the modern labor market. Moritz Seebacher from the ifo Institute mined LinkedIn profiles to measure multidimensional skills as a “measure of human capital”. Instead of relying on degrees or test scores, his work tapped into real-time data about what workers can actually do (or claim they can do). This approach has the potential to be more revealing than classical measures, and the findings also shed light on gender disparities: women often acquire similar qualifications early on, but diverge in their late twenties and thirties, reflecting career breaks and unequal opportunities.

Amar Venugopal

Another frontier came from Stanford’s Amar Venugopal, who focused on causal inference with outcomes learned from text.

His work pushes econometrics into natural language, asking how policies shape not just numbers, but the words people use. Could large-scale text analysis reveal hidden impacts of laws, speeches, or corporate reports? It’s a radical expansion of what “data” even means.

Other young researchers explored robustness in panel data, better inference for regression discontinuities, equivalence testing (proving two treatments are really the same), and forecasting electricity prices. The breadth of the work showed just how far-reaching econometrics and its tools can really be.

Together, all the sessions featuring econometrics painted a vivid picture of economics escaping its old confines. It’s not just about wages, GDP, or interest rates anymore. It’s about health trials, digital footprints, climate policy, and words. It’s about figuring out what’s actually true in real life and society; and we definitely need more of that in current times.

Andrei Mihai

Andrei is a science communicator and a PhD candidate in geophysics. He co-founded ZME Science, where he tries to make science accessible and interesting to everyone and has written over 2,000 pieces on various topics – though he generally prefers writing about physics and the environment. Andrei tries to blend two of the things he loves (science and good stories) to make the world a better place – one article at a time.