99.999% certainty humans are driving global warming: new study

from The Conversation

 

A new study finds overwhelming odds that humans have contributed to higher global temperatures – so how much are we willing to gamble that it’s wrong? Kraevski Vitaly/Shutterstock

There is less than 1 chance in 100,000 that global average temperature over the past 60 years would have been as high without human-caused greenhouse gas emissions, our new research shows.

Published in the journal Climate Risk Management today, our research is the first to quantify the probability of historical changes in global temperatures and to examine their links to greenhouse gas emissions using rigorous statistical techniques.

Our new CSIRO work provides an objective assessment linking global temperature increases to human activity, pointing to a near-certain probability of human influence that exceeds 99.999%.

Our work extends existing approaches undertaken internationally to detect climate change and attribute it to human or natural causes. The 2013 Intergovernmental Panel on Climate Change Fifth Assessment Report provided an expert consensus that:

It is extremely likely [defined as 95-100% certainty] that more than half of the observed increase in global average surface temperature from 1951 to 2010 was caused by the anthropogenic [human-caused] increase in greenhouse gas concentrations and other anthropogenic forcings together.

Decades of extraordinary temperatures

July 2014 was the 353rd consecutive month in which global land and ocean average surface temperature exceeded the 20th-century monthly average. The last time the global average surface temperature fell below that 20th-century monthly average was in February 1985, as reported by the US-based National Climatic Data Center.

This means that anyone born after February 1985 has not lived a single month where the global temperature was below the long-term average for that month.
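That count is easy to verify for yourself: it is just the number of whole months from March 1985 (the first month after that last below-average February) through July 2014.

```python
# Verify the consecutive-month count: March 1985 through July 2014 inclusive.
start_year, start_month = 1985, 3   # first month after February 1985
end_year, end_month = 2014, 7
months = (end_year - start_year) * 12 + (end_month - start_month) + 1
print(months)  # -> 353
```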

We developed a statistical model that related global temperature to various well-known drivers of temperature variation, including El Niño, solar radiation, volcanic aerosols and greenhouse gas concentrations. We tested it to make sure it worked on the historical record and then re-ran it with and without the human influence of greenhouse gas emissions.
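The paper’s actual statistical machinery is more elaborate than anything that fits here, but a minimal sketch of the general idea, fitting an ordinary least-squares model to the drivers and then re-predicting with the greenhouse-gas term frozen, might look like the following (the file name and column names are hypothetical placeholders, not taken from the study):

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical monthly data: temperature anomaly plus driver series.
df = pd.read_csv("drivers.csv")  # columns: temp, enso, solar, volcanic, ghg

# Fit the full model, including the anthropogenic greenhouse-gas term.
X_full = sm.add_constant(df[["enso", "solar", "volcanic", "ghg"]])
model = sm.OLS(df["temp"], X_full).fit()

# Counterfactual run: hold greenhouse-gas forcing at its start-of-record level.
X_natural = X_full.copy()
X_natural["ghg"] = X_natural["ghg"].iloc[0]

with_ghg = model.predict(X_full)
without_ghg = model.predict(X_natural)
print((with_ghg - without_ghg).describe())  # human contribution implied by the fit
```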

Our analysis showed that the probability of getting the same run of warmer-than-average months without the human influence was less than 1 chance in 100,000.

We do not use physical models of Earth’s climate; instead, we use observational data and rigorous statistical analysis, which has the advantage of providing independent validation of the results.

Detecting and measuring human influence

Our research team also explored the chance of relatively short periods of declining global temperature. We found that rather than being an indicator that global warming is not occurring, the observed number of cooling periods in the past 60 years strongly reinforces the case for human influence.

We identified periods of declining temperature by moving a 10-year window (1950 to 1959, 1951 to 1960, 1952 to 1961, and so on) through the entire 60-year record, and found 11 such short periods in which global temperatures declined.
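A simplified stand-in for that windowing step, taking a negative least-squares slope over each 10-year window as the definition of a declining period (the paper’s exact criterion may differ), is only a few lines:

```python
import numpy as np

def count_cooling_windows(annual_temps, window=10):
    """Count sliding windows whose least-squares temperature trend is negative."""
    temps = np.asarray(annual_temps, dtype=float)
    count = 0
    for start in range(len(temps) - window + 1):
        years = np.arange(start, start + window)
        slope = np.polyfit(years, temps[start:start + window], 1)[0]
        if slope < 0:
            count += 1
    return count
```

Run over the observed 60-year record, a count like this can then be compared with the counts produced by model runs that exclude the greenhouse-gas term.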

Our analysis showed that in the absence of human-caused greenhouse gas emissions, there would have been more than twice as many periods of short-term cooling as are found in the observed data.

There was less than 1 chance in 100,000 of observing 11 or fewer such events without the effects of human greenhouse gas emissions.

CSIRO scientists Dr Steve Rintoul, Dr John Church and Dr Pep Canadell explain how and why the Earth’s climate is warming.

The problem and the solution

Why is this research important? For a start, it might help put to rest some common misunderstandings about there being no link between human activity and the observed, long-term trend of increasing global temperatures.

Our analysis – as well as the work of many others – shows beyond reasonable doubt that humans are contributing to significant changes in our climate.

Good risk management is all about identifying the most likely causes of a problem, and then acting to reduce those risks. Some of the projected impacts of climate change can be avoided, reduced or delayed by effective reduction in global net greenhouse gas emissions and by effective adaptation to the changing climate.

Ignoring the problem is no longer an option. When weighing action on climate change against doing nothing, with a probability exceeding 99.999% that the warming we are seeing is human-induced, we certainly shouldn’t take the chance of doing nothing.


Leave a comment

Filed under Climate Change

Finally! The Greenland deglaciation paradox sorted.

from ScienceDaily

Bo Vinther prepares an ice core for visual inspection. Credit: Photograph by Christian Morel

A new study of three ice cores from Greenland documents the warming of the large ice sheet at the end of the last ice age — resolving a long-standing paradox over when that warming occurred.

Large ice sheets covered North America and northern Europe some 20,000 years ago, during the coldest part of the ice age, when global average temperatures were about four degrees Celsius (seven degrees Fahrenheit) colder than during pre-industrial times. Then changes in Earth’s orbit around the sun increased the solar energy reaching Greenland, and beginning some 18,000 years ago, the release of carbon from the deep ocean led to a gradual rise in atmospheric carbon dioxide (CO2).

Yet past analysis of ice cores from Greenland did not show any warming response as would be expected from an increase in CO2 and solar energy flux, the researchers note.

In this new study, funded by the National Science Foundation and published this week in the journal Science, scientists reconstructed air temperatures by examining ratios of nitrogen isotopes in air trapped within the ice instead of isotopes in the ice itself, which had been used in past studies.

Not only did the new analysis detect significant warming in response to increasing atmospheric CO2, it also documented a warming trend at a rate closely matching what climate models predict should have happened as Earth shifted out of its ice age, according to Christo Buizert, a postdoctoral researcher at Oregon State University and lead author on the Science article.

“The Greenland isotope records from the ice itself suggest that temperatures 12,000 years ago during the so-called Younger Dryas period near the end of the ice age were virtually the same in Greenland as they were 18,000 years ago when much of the northern hemisphere was still covered in ice,” Buizert said. “That never made much sense because between 18,000 and 12,000 years ago atmospheric CO2 levels rose quite a bit.”

“But when you reconstruct the temperature history using nitrogen isotope ratios as a proxy for temperature, you get a much different picture,” Buizert pointed out. “The nitrogen-based temperature record shows that by 12,000 years ago, Greenland temperatures had already warmed by about five degrees (Celsius), very close to what climate models predict should have happened, given the conditions.”

Reconstructing temperatures by using water isotopes provides useful information about when temperatures shift but can be difficult to calibrate because of changes in the water cycle, according to Edward Brook, an Oregon State paleoclimatologist and co-author on the Science study.

“The water isotopes are delivered in Greenland through snowfall and during an ice age, snowfall patterns change,” Brook noted. “It may be that the presence of the giant ice sheet made snow more likely to fall in the summer instead of winter, which can account for the warmer-than-expected temperatures because the snow records the temperature at the time it fell.”

In addition to the gradual warming of five degrees (C) over a 6,000-year period beginning 18,000 years ago, the study investigated two periods of abrupt warming and one period of abrupt cooling documented in the new ice cores. The researchers say their leading hypothesis is that all three episodes are tied to changes in the Atlantic meridional overturning circulation (AMOC), which brings warm water from the tropics into the high northern latitudes.

The first episode caused a jump in Greenland’s air temperatures of 10-15 degrees (C) in just a few decades beginning about 14,700 years ago. An apparent shutdown of the AMOC about 12,800 years ago caused an abrupt cooling of some 5-9 degrees (C), also over a matter of decades.

When the AMOC was reinvigorated about 11,600 years ago, it caused a jump in temperatures of 8 to 11 degrees (C), which heralded the end of the ice age and the beginning of the climatically warm and stable Holocene period, which allowed human civilization to develop.

“For these extremely abrupt transitions, our data show a clear fingerprint of AMOC variations, which had not yet been established in the ice core studies,” noted Buizert, who is in OSU’s College of Earth, Ocean, and Atmospheric Sciences. “Other evidence for AMOC changes exists in the marine sediment record and our work confirms those findings.”

In their study, the scientists examined three ice cores from Greenland and looked at the gases trapped inside the ice for changes in the isotopic ratio of nitrogen, which is very sensitive to temperature change. They found that temperatures in northwest Greenland did not change nearly as much as those in southeastern Greenland — closest to the North Atlantic — clearly suggesting the influence of the AMOC.

“The last deglaciation is a natural example of global warming and climate change,” Buizert said. “It is very important to study this period because it can help us better understand the climate system and how sensitive the surface temperature is to atmospheric CO2.”

“The warming that we observed in Greenland at the end of the ice age had already been predicted correctly by climate models several years ago,” Buizert added. “This gives us more confidence that these models also predict future temperatures correctly.”

 

From Science

Greenland deglaciation puzzles

Louise Claire Sime, British Antarctic Survey, High Cross, Cambridge, CB23 7PP, UK.

About 23,000 years ago, the southern margins of the great Northern Hemisphere ice sheets across Europe and North America began to melt. The melt rate accelerated ∼20,000 years ago, and global sea level eventually rose by ∼130 m as meltwater flowed into the oceans. Ice cores from the Greenland and Antarctic ice sheets show the rise in atmospheric CO2 concentrations that accompanied this shift in global ice volume and climate. However, discrepancies in the temperature reconstructions from these cores have raised questions about the long-term relationship between atmospheric CO2 concentrations and Arctic temperature. On page 1177 of this issue, Buizert et al. (1) report temperature reconstructions from three locations on the Greenland ice sheet that directly address these problems.

Abstract

Greenland ice core water isotopic composition (δ18O) provides detailed evidence for abrupt climate changes but is by itself insufficient for quantitative reconstruction of past temperatures and their spatial patterns. We investigate Greenland temperature evolution during the last deglaciation using independent reconstructions from three ice cores and simulations with a coupled ocean-atmosphere climate model. Contrary to the traditional δ18O interpretation, the Younger Dryas period was 4.5° ± 2°C warmer than the Oldest Dryas, due to increased carbon dioxide forcing and summer insolation. The magnitude of abrupt temperature changes is larger in central Greenland (9° to 14°C) than in the northwest (5° to 9°C), fingerprinting a North Atlantic origin. Simulated changes in temperature seasonality closely track changes in the Atlantic overturning strength and support the hypothesis that abrupt climate change is mostly a winter phenomenon.

 

Original Science Daily article here

Science paper here

Leave a comment

Filed under Uncategorized

It’s All About Fresh Water — Rapid Sea Level Rise Points To Massive Glacial Melt in Antarctica

Originally posted on robertscribbler:

It’s all about fresh water. In this case, massive freshwater outflows from the vast glaciers covering Antarctica.

This week, a new scientific report published in the journal Nature found that from 1992 through 2012, freshwater outflow from Antarctica’s massive glaciers exceeded 400 gigatons each year. An immense flood of cold, fresh water. One that helped push sea levels rapidly higher around the Antarctic continent.

But with glacial melt on the rise and with mountains of ice now inexorably sliding seaward, these freshwater flows may just be the start of even more powerful outbursts to come. And such prospective future events have far-ranging implications for sea level rise, global weather, sea ice, human-caused climate change, and world ocean health.

Flood of Fresh Water Drives More Sea Level Rise Than Expected

The researchers discovered the tell-tale signature of this vast freshwater flood through chemical analysis of the seas surrounding Antarctica. The…

View original 1,132 more words

Leave a comment

Filed under Uncategorized

The BOM and data manipulation

All the right-wing nut jobs, unable to produce primary scientific data that disproves anthropogenic climate change and global warming, tend to resort to the time-honored but completely dishonorable tradition of attacking the messenger. The usual suspects have been attacking the Bureau of Meteorology because they don’t like the message that the globe is warming and humans are to blame. The BOM, faced with data that has discontinuities due to instrument changes, location changes and encroachment by urbanisation, has had to homogenise its data to remove any factors in the temperature series that aren’t either natural or caused by anthropogenic climate change. Sounds pretty reasonable to me. Of course the idiots out there hear about it and automatically assume the BOM scientists are deliberately fudging data to “tell the global warming story”. I can only assume that none of those idiots have even a basic understanding of statistics, because if they did, they would surely leap at the chance to have their statistical takedowns published in scientific journals and bathe in the glory of their legitimised statistical brilliance.

From the Conversation

No, the Bureau of Meteorology is not fiddling its weather data

Australia’s weather records need careful analysis to correct any introduced errors. Photographic Collection from Australia/Wikimedia Commons, CC BY

Over the past week or so, the Bureau of Meteorology has stood accused of fudging its temperature data records to emphasise warming, in a series of articles in The Australian. The accusation hinges on the method that the Bureau uses to remove non-climate-related changes in its weather station data, referred to as “data homogenisation”.

If true, this would be very serious because these data sets underpin major climate research projects, including deducing how much Australia is warming. But it’s not true.

Crunching the numbers

Data homogenisation techniques are used to varying degrees by many national weather agencies and climate researchers around the world. Although the World Meteorological Organization has guidelines for data homogenisation, the methods used vary from country to country, and in some cases no data homogenisation is applied.

Homogenisation can be necessary for a range of reasons: sometimes stations move, instruments or reporting practices change, or surrounding trees or buildings at a site are altered. Changes can be sudden or gradual. These can all introduce artificial “jumps” (in either direction) in the resulting temperature records. If left uncorrected, these artifacts could leave the data appearing to show spurious warming or cooling trends.

There are many methods that can be used to detect these “inhomogeneities”, and there are other methods (although much harder to implement) that can adjust the data to make sure it is consistent through time. The Bureau uses such a technique to create its Australian Climate Observations Reference Network – Surface Air Temperature (ACORN-SAT) data set. These data are then used to monitor climate variability and change in Australia, to provide input for the State of the Climate reports, and for other purposes too.
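As a toy illustration of the detection step only (this is not the ACORN-SAT algorithm, which uses multiple reference stations and far more careful statistics), one can scan a candidate-minus-neighbour difference series for the largest single shift in its mean:

```python
import numpy as np

def largest_mean_shift(candidate, reference, buffer=5):
    """Find the index with the largest apparent step change in the
    candidate-minus-reference series. A large shift hints at a station
    move or instrument change rather than a real climate signal."""
    diff = np.asarray(candidate, dtype=float) - np.asarray(reference, dtype=float)
    best_index, best_shift = None, 0.0
    for k in range(buffer, len(diff) - buffer):  # keep a few years each side
        shift = abs(diff[k:].mean() - diff[:k].mean())
        if shift > best_shift:
            best_index, best_shift = k, shift
    return best_index, best_shift
```

Because the difference with a nearby station largely cancels the shared regional climate, whatever shift remains is a good candidate for a non-climatic artefact that adjustment should remove.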

In a statement about its climate records, the Bureau said:

The Bureau measures temperature at nearly 800 sites across Australia, chiefly for the purpose of weather forecasting. The ACORN-SAT is a subset of this network comprising 112 locations that are used for climate analysis. The ACORN-SAT stations have been chosen to maximise both length of record and network coverage across the continent. For several years, all of this data has been made publicly available on the Bureau’s web site.

Complex methods

Australia has played a leading role in developing this type of complex data-adjustment technique. In 2010, the Bureau’s Blair Trewin wrote a comprehensive article on the types of inhomogeneities that are found in land temperature records. As a result, the International Surface Temperature Initiative (ISTI) has set up a working group to compare homogenisation methods.

Some of our own research at the ARC Centre of Excellence for Climate System Science has tried, with the help of international colleagues, to assess the impacts that different choices can make when using these different homogenisation methods. Much of our work focuses on temperature extremes. We have studied the impacts on large-scale extreme temperature data of changing station networks, different statistical techniques, homogenised versus non-homogenised data, and other uncertainties that might arise.

Our data on extreme temperature trends show that the warming trend across the whole of Australia looks bigger when you don’t homogenise the data than when you do. For example, the adjusted data set (the lower image below) shows a cooling trend over parts of northwest Australia, which isn’t seen in the raw data.

Trends in the frequency of hot days over Australia – unadjusted data using all temperature stations that have at least 40 years of record available for Australia from the GHCN-Daily data set.

Trends in the frequency of hot days over Australia – adjusted ACORN-SAT data. The period of trend covers 1951-2010 when both datasets have overlapping data. All data used in figures are available from http://www.climdex.org

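For readers curious what a “frequency of hot days” trend actually involves, here is a simplified sketch; real climdex indices use calendar-day percentile baselines and quality-controlled inputs, and the column names here are hypothetical:

```python
import numpy as np
import pandas as pd

def hot_day_trend(daily: pd.DataFrame) -> float:
    """Trend, in days per decade, of the annual count of days whose maximum
    temperature exceeds the station's own 90th percentile."""
    threshold = daily["tmax"].quantile(0.90)
    annual = (daily["tmax"] > threshold).groupby(daily["year"]).sum()
    slope = np.polyfit(annual.index.astype(float), annual.values.astype(float), 1)[0]
    return slope * 10  # per-year slope -> per-decade trend
```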

High-quality data

Far from being a fudge to make warming look more severe than it is, most of the Bureau’s data manipulation has in fact had the effect of reducing the apparent extreme temperature trends across Australia. Cherrypicking weather stations where data have been corrected in a warming direction doesn’t mean the overall picture is wrong.

Data homogenisation is not aimed at producing a predetermined outcome, but rather is an essential process in improving weather data by spotting where temperature records need to be corrected, in either direction. If the Bureau didn’t do it, then we and our fellow climatologists wouldn’t use its data because it would be misleading. What we need are data from which spurious warming or cooling trends have been removed, so that we can see the actual trends.

Marshalling all of the data from the Bureau’s weather stations can be a complicated process, which is why it has been subjected to international peer-review. The Bureau has provided the details of how it is done, despite facing accusations that it has not been open enough.

Valid critiques of data homogenisation techniques are most welcome. But as in all areas of science, from medicine to astronomy, there is only one place that criticisms can legitimately be made. Anyone who thinks they have found fault with the Bureau’s methods should document them thoroughly and reproducibly in the peer-reviewed scientific literature. This allows others to test, evaluate, find errors or produce new methods.

This process has been the basis of all scientific advances in the past couple of centuries and has led to profoundly important advances in knowledge. Abandoning peer-reviewed journals in favour of newspaper articles when adjudicating on scientific methods would be profoundly misguided.

 

Original article here

UPDATE: Today The Conversation expanded on the BOM data article by giving advice on how to do your own simple calculations to track temperature changes over time using raw unadjusted data. So it shouldn’t be long before these wannabe right-wing nut jobs start doing their own analyses and bombarding journals with their fantastic findings that show AGW isn’t happening and that it’s all a giant conspiracy implemented by the Jewish Bankers to bring in socialism masquerading as environmentalism so that when the lizard people want to fake another moon landing they will be better able to control the masses with their global chemtrail program.
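In that spirit, the do-it-yourself version really is short. A sketch, assuming a hypothetical CSV of raw daily maxima with date and tmax columns downloaded from the Bureau’s climate data pages:

```python
import numpy as np
import pandas as pd

# Hypothetical file layout: one row per day with 'date' and 'tmax' columns.
raw = pd.read_csv("station_raw.csv", parse_dates=["date"])
annual = raw.set_index("date")["tmax"].resample("YS").mean().dropna()
slope = np.polyfit(annual.index.year, annual.values, 1)[0]
print(f"Raw trend: {slope * 10:+.2f} C per decade")
```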

2 Comments

Filed under Climate Change

A Time to Kill: Meryl Dorey and her AVsN give medical advice on seizures.

Originally posted on reasonablehank:

After an extensive investigation into the Australian Vaccination skeptics Network, the NSW Health Care Complaints Commission issued a Public Health Warning on April 30 2014 [my bold]:

Public warning

The Commission has established that AVN does not provide reliable information in relation to certain vaccines and vaccination more generally. The Commission considers that AVN’s dissemination of misleading, misrepresented and incorrect information about vaccination engenders fear and alarm and is likely to detrimentally affect the clinical management or care of its readers.

Given the issues identified with the information disseminated by AVN, the Commission urges general caution is exercised when using AVN’s website or Facebook page to research vaccination and to consult other reliable sources, including speaking to a medical practitioner, to make an informed decision.

The Commission has recommended that AVN amend its published information with regard to the above issues and the Commission will monitor the implementation of these…

View original 448 more words

Leave a comment

Filed under Uncategorized

Russia’s warming faster than the rest of the planet—and seeing disease, drought, and forest fires as a result

Originally posted on Quartz:

When Vladimir Putin declined to support the Kyoto Protocol, a treaty to limit carbon emissions, he famously quipped that higher temperatures might actually benefit Russia since its people would have to spend less on fur coats.

Well, he’s getting his wish. Changes in wind and ocean currents caused by global warming shift heat around unevenly, causing some areas to heat up dramatically even as other regions cool. Russia, it turns out, is in the unusually hot category. Between 1976 and 2012, average Russian temperatures rose 0.43°C (0.8°F) a decade—more than twice the global average of 0.17°C—according to a new report out from Russia’s climate and environment agency (pdf, link in Russian).

Trends in Russia’s average temperatures.

This is a big problem for a variety of reasons, say Russia’s climate scientists. Hotter temperatures appear to be driving a spike in episodes of dangerous extreme weather:

Chart: dangerous extreme weather incidents in Russia (e.g. floods, drought, cyclones), 5-year moving average.

The frequency of forest fires

View original 205 more words

Leave a comment

Filed under Uncategorized

The cost of man-made disasters

About once a week or so, I receive an email from random people/businesses wanting me to post something they think is relevant to my blog. More often than not, they just want to generate traffic and/or get a bit of free advertising. So far I have yet to post anything, because I don’t wish to be used in that way and don’t wish to promote private businesses that I know nothing about. Today I am making an exception, because the infographic I was sent does two things.

First, it highlights just how stupid we humans are in how we treat our home. It shows oil spills, nuclear disasters, the great plastic garbage patch in the Pacific and a few others, and it puts a price tag on them.

The second thing it does is fail to mention the cost of anthropogenic climate change, which I thought was interesting, because the annual cost of that is orders of magnitude greater than the one-off costs of the disasters it lists. More on that in a moment.

Here is the infographic. Note: in no way do I endorse the educational courses this mob are promoting. I don’t know enough about them, whether the courses are legitimate or value for money or whatever. I just like the picture.

manmade-disasters

Chernobyl is listed here as the most expensive man-made disaster, at $235 billion. I don’t know whether that figure is a direct cost or whether ongoing opportunity costs are factored in, and I’m not going to bother checking, because it pales into insignificance against the cost of anthropogenic climate change.

In September 2012, a large study entitled Climate Vulnerability Monitor: A Guide to the Cold Calculus of a Hot Planet was published by the Europe-based DARA group and the Climate Vulnerable Forum. Commissioned by 20 governments, it was written by more than 50 scientists, economists and policy experts. From the executive summary…

Climate change caused economic losses estimated close to 1% of global GDP for the year 2010, or 700 billion dollars (2010 PPP). The carbon-intensive economy cost the world another 0.7% of GDP in that year, independent of any climate change losses. Together, carbon economy- and climate change-related losses amounted to over 1.2 trillion dollars in 2010.

That combined cost figure of 1.7% of global GDP (the 1% of climate change losses plus the 0.7% cost of the carbon-intensive economy) is expected to rise to 3.2% annually by 2030. That’s a lot of money for a human-caused disaster, but hey, it’s only money. The report also estimates that human deaths caused by climate change will reach 100 million by 2030. Sobering thought.
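As a rough consistency check on those figures (assuming a 2010 global GDP of about 70 trillion dollars PPP, which is my inference from the numbers quoted, not a figure taken from the report):

```python
gdp_2010 = 70e12                      # assumed 2010 global GDP (PPP), USD
climate_losses = 0.010 * gdp_2010     # ~1% of GDP -> ~$0.7 trillion
carbon_economy = 0.007 * gdp_2010     # another ~0.7% of GDP
total = climate_losses + carbon_economy
print(f"${total / 1e12:.1f} trillion")  # ~$1.2 trillion, matching the report
```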
That report is here.
The infographic website is here.

Leave a comment

Filed under Climate Change