The “Trouble with Macroeconomics,” according to a working paper by Paul Romer posted on his website, lies in dishonest identification assumptions, in particular in DSGE models used for policy analysis. Romer singles out calibration, assumptions about distribution functions, and strong priors as culprits.
Romer argues that
[b]eing a Bayesian means that your software never barfs
and
I agree with the harsh judgment by Lucas and Sargent (1979) that the large Keynesian macro models of the day relied on identifying assumptions that were not credible. The situation now is worse. Macro models make assumptions that are no more credible and far more opaque.
Romer also offers a meta-model of himself as a critic of “post-real models” and “facts with unknown truth value.”