The “Trouble with Macroeconomics,” according to a working paper by Paul Romer posted on his website, lies in dishonest identification assumptions, particularly in the DSGE models used for policy analysis. Romer singles out calibration, assumptions about distribution functions, and strong priors as the culprits.
Romer argues that
[b]eing a Bayesian means that your software never barfs
and
I agree with the harsh judgment by Lucas and Sargent (1979) that the large Keynesian macro models of the day relied on identifying assumptions that were not credible. The situation now is worse. Macro models make assumptions that are no more credible and far more opaque.
Romer also offers a meta-model of himself as a critic of “post-real models” and “facts with unknown truth value.”