All the right-wing nut jobs, unable to produce primary scientific data that disproves anthropogenic climate change and global warming, tend to resort to the time-honoured but completely dishonourable tradition of attacking the messenger. The usual suspects have been attacking the Bureau of Meteorology because they don’t like the message that the globe is warming and humans are to blame. The BOM, faced with data that has discontinuities due to instrument changes, location changes and encroachment by urbanisation, has had to homogenise its data to remove any factors in the temperature series that aren’t either natural or caused by anthropogenic climate change. Sounds pretty reasonable to me. Of course, the idiots out there hear about it and automatically assume the BOM scientists are deliberately fudging data to “tell the global warming story”. I can only assume that none of those idiots has even a basic understanding of statistics, because if they did, surely they would leap at the chance to have their statistical takedowns published in scientific journals and bathe in the glory of their statistical brilliance made legitimate.
From the Conversation
No, the Bureau of Meteorology is not fiddling its weather data
Over the past week or so, the Bureau of Meteorology has stood accused of fudging its temperature data records to emphasise warming, in a series of articles in The Australian. The accusation hinges on the method that the Bureau uses to remove non-climate-related changes in its weather station data, referred to as “data homogenisation”.
If true, this would be very serious because these data sets underpin major climate research projects, including deducing how much Australia is warming. But it’s not true.
Crunching the numbers
Data homogenisation techniques are used to varying degrees by many national weather agencies and climate researchers around the world. Although the World Meteorological Organization has guidelines for data homogenisation, the methods used vary from country to country, and in some cases no data homogenisation is applied.
Homogenisation can be necessary for a range of reasons: sometimes stations move, instruments or reporting practices change, or surrounding trees or buildings at a site are altered. Changes can be sudden or gradual. These can all introduce artificial “jumps” (in either direction) in the resulting temperature records. If left uncorrected, these artefacts could leave the data appearing to show spurious warming or cooling trends.
There are many methods that can be used to detect these “inhomogeneities”, and there are other methods (although much harder to implement) that can adjust the data to make sure it is consistent through time. The Bureau uses such a technique to create its Australian Climate Observations Reference Network – Surface Air Temperature (ACORN-SAT) data set. These data are then used to monitor climate variability and change in Australia, to provide input for the State of the Climate reports, and for other purposes too.
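To make the detect-and-adjust idea concrete, here is a toy sketch of my own (not the Bureau’s actual method, which is far more sophisticated; all numbers are invented). It plants an artificial +1.2 °C jump in a synthetic station record, finds it with a crude before/after comparison of means, and shifts the earlier segment to line up with the later one:

```python
# Toy homogenisation sketch (invented data, not the Bureau's method):
# detect a single step change in a temperature series and remove it.
from statistics import mean

# Synthetic annual mean temperatures (°C): a station move after year 10
# adds an artificial +1.2 °C jump on top of a mild 0.02 °C/year trend.
raw = [20.0 + 0.02 * y for y in range(10)] + \
      [21.2 + 0.02 * y for y in range(10, 20)]

def detect_jump(series, window=5):
    """Return the index with the largest gap between the means of the
    `window` values before and after it (a crude changepoint test)."""
    best_idx, best_diff = None, 0.0
    for i in range(window, len(series) - window + 1):
        diff = abs(mean(series[i:i + window]) - mean(series[i - window:i]))
        if diff > best_diff:
            best_idx, best_diff = i, diff
    return best_idx, best_diff

def homogenise(series, breakpoint, shift):
    """Remove the artificial jump by shifting the pre-break segment."""
    return [x + shift for x in series[:breakpoint]] + list(series[breakpoint:])

bp, jump = detect_jump(raw)
# Note the crude estimate (about 1.3) slightly over-corrects, because the
# difference of means also picks up a little of the underlying trend.
adjusted = homogenise(raw, bp, jump)
```

Real homogenisation schemes also compare a station against its neighbours and handle gradual changes, multiple breakpoints and seasonal effects; the sketch above only shows the basic shape of the problem.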
In a statement about its climate records, the Bureau said:
The Bureau measures temperature at nearly 800 sites across Australia, chiefly for the purpose of weather forecasting. The ACORN-SAT is a subset of this network comprising 112 locations that are used for climate analysis. The ACORN-SAT stations have been chosen to maximise both length of record and network coverage across the continent. For several years, all of this data has been made publicly available on the Bureau’s web site.
Australia has played a leading role in developing this type of complex data-adjustment technique. In 2010, the Bureau’s Blair Trewin wrote a comprehensive article on the types of inhomogeneities that are found in land temperature records. As a result, the International Surface Temperature Initiative (ISTI) has set up a working group to compare homogenisation methods.
Some of our own research at the ARC Centre of Excellence for Climate System Science has tried, with the help of international colleagues, to assess the impacts that different choices can make when using these different homogenisation methods. Much of our work focuses on temperature extremes. We have studied the impacts on large-scale extreme temperature data of changing station networks, different statistical techniques, homogenised versus non-homogenised data, and other uncertainties that might arise.
Our data on extreme temperature trends show that the warming trend across the whole of Australia looks bigger when you don’t homogenise the data than when you do. For example, the adjusted data set (the lower image below) shows a cooling trend over parts of northwest Australia, which isn’t seen in the raw data.
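To see why an uncorrected record can exaggerate a trend, here is a second self-contained sketch (again my own invented numbers, not the Centre’s actual analysis): a +1.0 °C step left in a 50-year record substantially inflates the least-squares warming rate, and removing the step recovers the true rate:

```python
# Toy demonstration (invented numbers): an uncorrected step change
# inflates the fitted warming trend of a temperature record.

def slope(series):
    """Ordinary least-squares slope of series against its index."""
    n = len(series)
    xbar = (n - 1) / 2
    ybar = sum(series) / n
    num = sum((i - xbar) * (y - ybar) for i, y in enumerate(series))
    den = sum((i - xbar) ** 2 for i in range(n))
    return num / den

true_rate = 0.01  # °C per year of genuine warming
# A site change at year 25 adds a spurious +1.0 °C step.
record = [20.0 + true_rate * y + (1.0 if y >= 25 else 0.0) for y in range(50)]
# Homogenisation removes the step from the later segment.
homogenised = [t - (1.0 if i >= 25 else 0.0) for i, t in enumerate(record)]

# slope(record) is several times true_rate; slope(homogenised) recovers it.
```

In this made-up example the step roughly quadruples the fitted rate, which is the sense in which raw data can look like it is warming (or cooling) faster than it really is.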
Far from being a fudge to make warming look more severe than it is, most of the Bureau’s data manipulation has in fact had the effect of reducing the apparent extreme temperature trends across Australia. Cherrypicking weather stations where data have been corrected in a warming direction doesn’t mean the overall picture is wrong.
Data homogenisation is not aimed at producing a predetermined outcome, but rather is an essential process in improving weather data by spotting where temperature records need to be corrected, in either direction. If the Bureau didn’t do it, then we and our fellow climatologists wouldn’t use its data because it would be misleading. What we need are data from which spurious warming or cooling trends have been removed, so that we can see the actual trends.
Marshalling all of the data from the Bureau’s weather stations can be a complicated process, which is why it has been subjected to international peer-review. The Bureau has provided the details of how it is done, despite facing accusations that it has not been open enough.
Valid critiques of data homogenisation techniques are most welcome. But as in all areas of science, from medicine to astronomy, there is only one place where criticisms can legitimately be made. Anyone who thinks they have found fault with the Bureau’s methods should document them thoroughly and reproducibly in the peer-reviewed scientific literature. This allows others to test, evaluate, find errors or produce new methods.
This process has been the basis of all scientific advances in the past couple of centuries and has led to profoundly important advances in knowledge. Abandoning peer-reviewed journals in favour of newspaper articles when adjudicating on scientific methods would be profoundly misguided.
Original article here