This Concept Map, created with IHMC CmapTools, has information related to: Parameter ranges, The Debate about the Stern Review and the Economics of Climate Change, visualized according to the rules and conventions of Logical Argument Mapping (LAM). Click on the small, bent arrow at the bottom right of this text box to get back to the Stern Review's main argumentation.

Start here. Central claim (Stern 2006e, 153): "The parameter ranges used as model inputs are calibrated to the scientific and economic literatures on climate change."

Supporting the claim: "PAGE2002 in effect summarises the range of underlying research studies."

Objection (Spash, 709): Stern (2006: 61) draws "very specific inferences from just four scenarios," although "the forty scenarios which informed the IPCC work under the third assessment report were explicitly stated to be 'equally valid with no assigned probabilities of occurrence.'" "Stern then employs expected utility modelling, which is known to be an inadequate representation of human behaviour, e.g., assuming away loss aversion (Perrings, 2003). Subjective probability density functions then give precise computer-generated outcomes. This belies the fact that prices cannot be predicted by economists with such accuracy over short time horizons, let alone over 200 years, and that climate change is endogenous to economic production systems, so causing all prices to change with every scenario. Comparative statics, shifting from one equilibrium to another, conceal complex processes of change. Thus Stern manages to convert unknown and unknowable futures into events with known probabilities, and miraculously strong uncertainty becomes weak uncertainty."

Objection (Baer & Spash, 178): Regarding extreme and catastrophic impacts, the more pessimistic estimates by Cline (1992) have been ignored. "Instead, the PAGE2002 model used in the SR broadly follows Nordhaus and Boyer (2000) by including an aggregated probabilistic formulation in which, in every year of each model run, there is some probability (proportional to temperature) of extra GDP losses attributed to unspecified catastrophic impacts. The incorporation of this calculation in the end has the simple effect of raising the expected damages at any particular temperature, and thus at any specified level of emissions. The particular way in which the catastrophic damage function is calculated is necessarily quite arbitrary, as there is no well established basis for any such function or associated PDF."

Objection (Baer & Spash, 176): "a Monte Carlo model is used to create a probability density function (PDF) of climate outcomes and associated economic damages for a specified emissions pathway, based on 1000 'runs' of the model." However, PAGE2002 "requires subjective PDFs for over 30 crucial inputs, everything from the climate sensitivity to the ratio of climate damages in different regions in response to temperature increase. In practice, for only climate sensitivity is there any significant literature on an appropriate PDF; for the remainder, the authors simply use their judgment based on any available evidence, however scanty. Furthermore, the PDFs used are triangular, which means there is zero probability of a value above or below some arbitrarily specified point."
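The triangular-PDF point in the last objection can be made concrete with a toy Monte Carlo sketch. This is an illustration only, not the actual PAGE2002 model: the parameter values (bounds, mode, run count) are hypothetical, chosen merely to show that a triangular distribution assigns zero probability to any value beyond its arbitrarily specified endpoints.

```python
import random

def triangular_samples(low, mode, high, n, seed=0):
    """Draw n Monte Carlo samples from a triangular PDF, as PAGE2002-style
    models do for uncertain inputs. The density is exactly zero outside
    [low, high] -- the truncation Baer & Spash criticise."""
    rng = random.Random(seed)
    # Note the stdlib argument order: triangular(low, high, mode).
    return [rng.triangular(low, high, mode) for _ in range(n)]

# Hypothetical uncertain input (e.g. a climate-sensitivity-like parameter),
# sampled over 1000 runs as in the quoted description of the SR's setup.
samples = triangular_samples(low=1.5, mode=3.0, high=4.5, n=1000)

# Every draw falls inside the chosen cut-offs: outcomes above `high`
# receive zero probability by construction, however plausible they are.
print(min(samples) >= 1.5 and max(samples) <= 4.5)
```

The design point the critics make is visible here: however many runs are performed, no sample can ever land in the tail beyond the analyst's chosen endpoints, so "catastrophic" values outside the triangle contribute nothing to the resulting outcome PDF.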