This story was originally published by The Huffington Post and is reproduced here as part of the Climate Desk collaboration.
Three hundred and twenty miles north of the Arctic Circle, a weather station in America’s northernmost city of Utqiaġvik (formerly Barrow), Alaska, has been quietly collecting temperature data since the 1920s.
Early this month, while preparing a report on US climate, experts at the National Centers for Environmental Information (NCEI) noticed something odd: They were missing data from Utqiaġvik for all of 2017, and some of 2016.
It turns out the temperatures recorded at Utqiaġvik over that time were warmer than any that had been seen there before. So much so, in fact, that an automated computer system set up to police the data and remove irregularities had flagged the readings as unreal and excluded them from the report.
Here’s how Deke Arndt, chief of NOAA’s Climate Monitoring Branch, explained the event:
In an ironic exclamation point to swift regional climate change in and near the Arctic, the average temperature observed at the weather station at Utqiaġvik has now changed so rapidly that it triggered an algorithm designed to detect artificial changes in a station’s instrumentation or environment and disqualified itself from the NCEI Alaskan temperature analysis, leaving northern Alaska analyzed a little cooler than it really was.
In his ensuing in-depth breakdown of how something like this could happen, Arndt noted that, over time, things like a weather station’s precise location, temperature recording equipment and basic procedures can change, leading to variations in its data.
To account for that, the NCEI has developed an algorithm that helps filter out the noise and alert scientists if something—a broken sensor, say—needs to be checked out.
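NCEI's actual quality-control and homogenization procedures are far more sophisticated, but the basic idea can be sketched in a few lines: compare a station's readings against the average of nearby stations and flag months that depart from that reference by an unusual amount. The data, thresholds, and method below are purely illustrative and are not NOAA's algorithm.

```python
# Toy illustration of automated station QC: flag months where one
# station departs from the average of its neighbors by an unusual
# amount. Everything here (data, thresholds, method) is made up for
# illustration; NCEI's real algorithm is considerably more involved.
import numpy as np

def flag_suspect_months(station, neighbors, threshold=3.5):
    """Flag months where `station` departs from the neighbor-station mean
    by more than `threshold` robust standard deviations (median/MAD)."""
    diff = station - neighbors.mean(axis=0)        # station minus neighbor average
    med = np.median(diff)
    mad = 1.4826 * np.median(np.abs(diff - med))   # robust estimate of typical spread
    return np.abs(diff - med) > threshold * mad

# Toy data: 36 months of temperature anomalies, with the final 6 months
# running about 4 degrees warmer than the neighboring stations suggest.
rng = np.random.default_rng(42)
neighbors = rng.normal(0.0, 0.5, size=(5, 36))
station = rng.normal(0.0, 0.5, size=36)
station[-6:] += 4.0

# Should flag the final, anomalously warm months.
print(np.where(flag_suspect_months(station, neighbors))[0])
```

In a sketch like this, a persistent run of flagged months could mean a broken sensor or a moved station, which is exactly why rapid but genuine warming, as at Utqiaġvik, can look like an instrument problem to the filter.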