Predictions of doom scaled back

The authors below rightly point out that there is great and unpredictable variability in climate events, and they say that natural variability alone is enough to account for the recent "pause" in warming.  They also point out that only the more modest predictions of doom fit what has actually happened.

So they show that the models are not necessarily wrong, but they can offer no evidence that the models are right.  To do that they would have to dig some of the mythical warmth out of the deep oceans.

They do not face the fact that their own research shows that all the climate variations to date fall within the range of natural variability -- so manmade CO2 effects do not need to be invoked to explain any climate events so far.

I have appended the journal abstract to the article below.  I have paragraphed it in the hope of making it more widely comprehensible.


Global warming more moderate than worst-case models, empirical data suggest

Summary:

A study based on 1,000 years of temperature records suggests global warming is not progressing as fast as it would under the most severe emissions scenarios outlined by the Intergovernmental Panel on Climate Change. Natural decade-to-decade variability in surface temperatures can account for some much-discussed recent changes in the rate of warming. Empirical data, rather than climate models, were used to estimate this variability.

We are seeing "middle of the road" warming, new data suggests.

A new study based on 1,000 years of temperature records suggests global warming is not progressing as fast as it would under the most severe emissions scenarios outlined by the Intergovernmental Panel on Climate Change (IPCC).

"Based on our analysis, a middle-of-the-road warming scenario is more likely, at least for now," said Patrick T. Brown, a doctoral student in climatology at Duke University's Nicholas School of the Environment. "But this could change."

The Duke-led study shows that natural variability in surface temperatures -- caused by interactions between the ocean and atmosphere, and other natural factors -- can account for observed changes in the recent rates of warming from decade to decade.

The researchers say these "climate wiggles" can slow or speed the rate of warming from decade to decade, and accentuate or offset the effects of increases in greenhouse gas concentrations. If not properly explained and accounted for, they may skew the reliability of climate models and lead to over-interpretation of short-term temperature trends.

The research, published today in the peer-reviewed journal Scientific Reports, uses empirical data, rather than the more commonly used climate models, to estimate decade-to-decade variability.

"At any given time, we could start warming at a faster rate if greenhouse gas concentrations in the atmosphere increase without any offsetting changes in aerosol concentrations or natural variability," said Wenhong Li, assistant professor of climate at Duke, who conducted the study with Brown.

The team examined whether climate models, such as those used by the IPCC, accurately account for natural chaotic variability that can occur in the rate of global warming as a result of interactions between the ocean and atmosphere, and other natural factors.

To test how accurately climate models account for variations in the rate of warming, Brown and Li, along with colleagues from San Jose State University and the USDA, created a new statistical model based on reconstructed empirical records of surface temperatures over the last 1,000 years.
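
As a rough illustration of how decade-to-decade variability might be estimated empirically from a long temperature record, here is a minimal Python sketch.  It is not the authors' actual procedure: the 1,000-year series, the 50-year smoothing window, and the noise level are all synthetic stand-ins.

    # Sketch: estimating unforced decade-to-decade variability from a long
    # temperature reconstruction. Synthetic stand-in data, not the paper's.
    import numpy as np

    rng = np.random.default_rng(0)

    # Stand-in for a 1,000-year reconstructed annual GMT anomaly series (deg C).
    years = np.arange(1000, 2000)
    reconstruction = (0.1 * np.sin(2 * np.pi * years / 60)
                      + 0.15 * rng.standard_normal(years.size))

    # 1. Remove the slow (forced/centennial) component with a long running
    #    mean, leaving an estimate of the unforced "noise".
    window = 50
    slow = np.convolve(reconstruction, np.ones(window) / window, mode="same")
    noise = reconstruction - slow

    # 2. Characterize decade-to-decade wiggles: linear trends over sliding
    #    10-year windows of the noise.
    decade_trends = np.array([
        np.polyfit(np.arange(10), noise[i:i + 10], 1)[0]   # deg C per year
        for i in range(noise.size - 10)
    ])
    print(f"std of unforced 10-year trends: {decade_trends.std():.4f} deg C/yr")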

"By comparing our model against theirs, we found that climate models largely get the 'big picture' right but seem to underestimate the magnitude of natural decade-to-decade climate wiggles," Brown said. "Our model shows these wiggles can be big enough that they could have accounted for a reasonable portion of the accelerated warming we experienced from 1975 to 2000, as well as the reduced rate in warming that occurred from 2002 to 2013."
Further comparative analysis of the models revealed another intriguing insight.

"Statistically, it's pretty unlikely that an 11-year hiatus in warming, like the one we saw at the start of this century, would occur if the underlying human-caused warming was progressing at a rate as fast as the most severe IPCC projections," Brown said. "Hiatus periods of 11 years or longer are more likely to occur under a middle-of-the-road scenario."

Under the IPCC's middle-of-the-road scenario, there was a 70 percent likelihood that at least one hiatus lasting 11 years or longer would occur between 1993 and 2050, Brown said. "That matches up well with what we're seeing."
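
To see why hiatus length can discriminate between emissions scenarios, consider a toy Monte Carlo in the same spirit: the steeper the assumed forced trend, the less room unforced noise has to produce an 11-year stretch of flat or falling temperatures.  The forced rate and the AR(1) noise parameters below are illustrative assumptions, not values from the paper.

    # Toy Monte Carlo: given a forced warming rate plus AR(1) noise, how often
    # does at least one 11-year window with a non-positive trend appear over a
    # 58-year period (1993-2050)? All parameters are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(1)
    n_years, n_sims = 58, 2000
    forced_rate = 0.02        # assumed forced warming, deg C per year
    phi, sigma = 0.6, 0.1     # assumed AR(1) persistence and innovation std

    def has_hiatus(series, length=11):
        """True if any `length`-year window has a non-positive linear trend."""
        t = np.arange(length)
        return any(np.polyfit(t, series[i:i + length], 1)[0] <= 0
                   for i in range(series.size - length + 1))

    count = 0
    for _ in range(n_sims):
        noise = np.zeros(n_years)
        for yr in range(1, n_years):
            noise[yr] = phi * noise[yr - 1] + sigma * rng.standard_normal()
        if has_hiatus(forced_rate * np.arange(n_years) + noise):
            count += 1

    print(f"fraction of runs with an 11-year hiatus: {count / n_sims:.2f}")

Lowering forced_rate in this sketch raises the hiatus frequency, which is the qualitative point being made about middle-of-the-road versus severe scenarios.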

There's no guarantee, however, that this rate of warming will remain steady in coming years, Li stressed. "Our analysis clearly shows that we shouldn't expect the observed rates of warming to be constant. They can and do change."

Journal Reference:
Patrick T. Brown, Wenhong Li, Eugene C. Cordero and Steven A. Mauget. Comparing the Model-Simulated Global Warming Signal to Observations Using Empirical Estimates of Unforced Noise. Scientific Reports, April 21, 2015. DOI: 10.1038/srep09957

SOURCE
Comparing the model-simulated global warming signal to observations using empirical estimates of unforced noise

Patrick T. Brown et al.

Abstract

The comparison of observed global mean surface air temperature (GMT) change to the mean change simulated by climate models has received much public and scientific attention. For a given global warming signal produced by a climate model ensemble, there exists an envelope of GMT values representing the range of possible unforced states of the climate system (the Envelope of Unforced Noise; EUN).

Typically, the EUN is derived from climate models themselves, but climate models might not accurately simulate the correct characteristics of unforced GMT variability. Here, we simulate a new, empirical, EUN that is based on instrumental and reconstructed surface temperature records.

We compare the forced GMT signal produced by climate models to observations while noting the range of GMT values provided by the empirical EUN. We find that the empirical EUN is wide enough so that the interdecadal variability in the rate of global warming over the 20th century does not necessarily require corresponding variability in the rate-of-increase of the forced signal.

The empirical EUN also indicates that the reduced GMT warming over the past decade or so is still consistent with a middle emission scenario's forced signal, but is likely inconsistent with the steepest emission scenario's forced signal.
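
The consistency test described in the abstract can be sketched in a few lines: widen a scenario's forced signal by empirical quantiles of the unforced noise to form the envelope, then check whether the observations stay inside it.  Everything below is a synthetic placeholder, assuming a simple linear forced signal and Gaussian noise draws, not the paper's data or code.

    # Sketch of an EUN-style consistency check with synthetic placeholders.
    import numpy as np

    rng = np.random.default_rng(2)
    years = np.arange(2000, 2014)

    forced_signal = 0.02 * (years - years[0])          # assumed forced trend (deg C)
    noise_draws = 0.1 * rng.standard_normal(10000)     # stand-in empirical noise
    lo, hi = np.quantile(noise_draws, [0.025, 0.975])  # 95% envelope of noise

    # Stand-in "observations" warming more slowly than the forced signal.
    observed = 0.01 * (years - years[0]) + 0.02 * rng.standard_normal(years.size)

    inside = (observed >= forced_signal + lo) & (observed <= forced_signal + hi)
    print("observations consistent with this scenario:", bool(inside.all()))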

SOURCE

