Robert Rohde 20 Mar 17
. is now providing "raw" temperature series and grids without outlier detection or homogenization.
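The corrected series differ from these raw ones partly through outlier screening. A minimal sketch of that kind of screening is below: flag readings that sit far from the series median in robust (MAD) units. This is illustrative only, and is not Berkeley Earth's actual algorithm; the `threshold` parameter and the helper name are assumptions for the example.

```python
# Sketch of statistical outlier screening for a temperature series:
# flag readings far from the median in robust (MAD) units.
# Illustrative only; not Berkeley Earth's actual algorithm.

def flag_outliers(temps, threshold=5.0):
    """Return indices of readings more than `threshold` scaled-MAD
    units away from the series median."""
    n = len(temps)
    median = sorted(temps)[n // 2]
    deviations = sorted(abs(t - median) for t in temps)
    mad = deviations[n // 2] or 1e-9       # guard against zero spread
    scaled_mad = 1.4826 * mad              # ~std. dev. for normal data
    return [i for i, t in enumerate(temps)
            if abs(t - median) / scaled_mad > threshold]

# Example: a spurious spike in otherwise stable daily readings.
series = [14.2, 14.5, 13.9, 14.1, 54.0, 14.3, 14.0, 13.8, 14.4]
print(flag_outliers(series))  # -> [4], the index of the 54.0 reading
```

A robust statistic like the MAD is used here rather than the standard deviation so that the outlier itself does not inflate the spread estimate used to detect it.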
Robert Rohde 20 Mar 17
Replying to @RARohde
Such data, uncorrected for various measurement artifacts, is intended for diagnostic and teaching purposes.
Robert Rohde 20 Mar 17
Replying to @RARohde
When studying climate, one has to be careful with weather data. Changes in instrumentation and site location often introduce artifacts.
Robert Rohde 20 Mar 17
Replying to @RARohde
By looking at metadata and comparing stations with their neighbors, many artifacts can be identified and corrected for.
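The neighbor-comparison idea above can be sketched in a few lines: take the difference between a station and the mean of its neighbors, look for a persistent step in that difference series, and remove the offset. This is a toy illustration under simplifying assumptions (aligned, gap-free series; a single break); real pairwise homogenization algorithms are far more involved.

```python
# Sketch of neighbor-based homogenization: compare a station to the
# average of nearby stations; a persistent step in the difference
# series suggests a non-climatic artifact (e.g. a station move).
# Toy illustration only, not an operational algorithm.

def detect_break(target, neighbors):
    """Return (break_index, offset) for the largest step change in the
    target-minus-neighbor-mean difference series."""
    neighbor_mean = [sum(vals) / len(vals) for vals in zip(*neighbors)]
    diff = [t - m for t, m in zip(target, neighbor_mean)]
    best_i, best_step = 0, 0.0
    for i in range(1, len(diff)):
        before = sum(diff[:i]) / i
        after = sum(diff[i:]) / (len(diff) - i)
        if abs(after - before) > abs(best_step):
            best_i, best_step = i, after - before
    return best_i, best_step

def homogenize(target, neighbors):
    """Remove the detected step from the target series."""
    i, step = detect_break(target, neighbors)
    return target[:i] + [t - step for t in target[i:]]

# Example: the station reads ~1.5 degC high from index 4 onward,
# while its neighbors stay flat (mimicking a station move).
neighbors = [[10.0] * 8, [10.2] * 8]
target = [10.1] * 4 + [11.6] * 4
i, step = detect_break(target, neighbors)
print(i, round(step, 2))  # -> 4 1.5
```

Differencing against neighbors first is what makes the step detectable: shared regional weather largely cancels in the difference series, leaving station-specific artifacts exposed.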
Robert Rohde 20 Mar 17
Replying to @RARohde
Globally, the impact of such corrections on land average temperatures is small.
Robert Rohde 20 Mar 17
Replying to @RARohde
Locally, however, some regions are more strongly impacted. For example, US changes in technology and procedures created large biases.
Robert Rohde 20 Mar 17
Replying to @RARohde
Comparing the full data set, with quality control and artifact corrections, to the "raw" version allows these effects to be explored directly.
Victor Venema 20 Mar 17
Replying to @RARohde
Homogenisation does not change the global mean warming since 1960 much, but is hugely important to get patterns right.
Victor Venema 20 Mar 17
Replying to @RARohde
For regional warming homogenisation matters a lot, thus also for climate modes, past climate impacts & decadal climate prediction.
Olof Reimer 22 Mar 17
Replying to @RARohde
Actually, the raw dataset agrees better with models, especially in the early part. Wonder if skeptics like that?