Tim McDonnell joined Climate Desk after stints at Mother Jones and Sierra magazine. He remains a cheerful guy despite covering climate change all the time. Originally from Tucson, Tim loves tortillas and epic walks.
Ever since the 2009 climate talks in Copenhagen, world leaders have agreed on 2 degrees Celsius (3.6 degrees F) as the maximum acceptable global warming above preindustrial levels to avert the worst impacts of climate change (today we're at about 0.8 degrees C). But a new study, led by climatologist James Hansen of Columbia University, argues that emissions plans aimed at that target would still result in "disastrous consequences," from rampant sea level rise to widespread extinction.
A major goal of climate scientists since Copenhagen has been to convert the 2-degree limit into something useful for policymakers: a specific total amount of carbon we can "afford" to dump into the atmosphere, mostly from burning fossil fuels in power plants (this is known as a carbon budget). This fall, the UN's Intergovernmental Panel on Climate Change pegged the number at 1 trillion metric tons of carbon, or about twice what we've emitted since the late 19th century; if greenhouse gas emissions keep growing as they have for the last few decades, we're on track to burn through the remaining budget by the mid-2040s, at which point we'd have to cease emissions entirely to stay under the warming target.
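That timeline is simple compound arithmetic, and can be sketched in a few lines. The current emissions rate and the 2 percent annual growth below are illustrative assumptions for the sake of the example, not figures from the IPCC report:

```python
# Rough sketch of the carbon-budget arithmetic, with illustrative numbers.
# Assumed figures (not from the IPCC report): ~515 GtC already emitted,
# ~10 GtC/yr current emissions, growing ~2 percent per year.
BUDGET_GTC = 1000.0  # total budget: 1 trillion metric tons of carbon
emitted = 515.0      # roughly half the budget already spent (assumed)
rate = 10.0          # current annual emissions in GtC/yr (assumed)
growth = 0.02        # assumed annual growth in emissions

year = 2013
while emitted < BUDGET_GTC:
    emitted += rate
    rate *= 1 + growth
    year += 1

print(year)  # lands in the mid-to-late 2040s under these assumed rates
```

Nudging the assumed rates shifts the exhaustion year by a few years in either direction, which is one reason the budget is usually quoted as a decade rather than a date.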
The study, which was co-authored by Columbia economist Jeffrey Sachs and published today in the journal PLOS ONE, uses updated climate models to argue that the IPCC's carbon budget would in fact produce warming up to twice the international limit, and that even the 2-degree limit would likely yield catastrophic impacts well into the next century. In other words, the study says, two of the IPCC's fundamental figures are wrong.
"We should not use [2 degrees] as a target," Hansen said in a meeting with reporters on the Columbia campus in Manhattan. "It doesn't have any scientific basis."
The IPCC's climate models leave out the effect of some slow natural systems, like changes in the area of ice sheets and the release of methane from melting permafrost.
A better target to avoid devastating climate impacts, Hansen said, would be 1 degree Celsius of warming (only slightly above what we've already experienced), although he readily admitted that such a goal is essentially unattainable. According to IPCC estimates, human activities have already committed us to that level of warming even if we suddenly stopped burning all fossil fuels today. A grim, but perhaps more realistic, vision of what the end of this century will hold comes from the International Energy Agency, which predicts that temperatures could rise as much as 6 degrees Celsius by 2100 if greenhouse gas emissions continue unabated.
As we reported this week, some of the world's richest nations are lagging behind on their climate protection pledges. Most often, these commitments follow the formula: "We aim to reduce greenhouse gas emissions X percent below year Y levels by year Z." It seems like a straightforward proposition, but have you ever wondered where those numbers come from? The answer is a scientific concept known as the carbon budget, and like a teenager with her first credit card, we're well on our way to blowing right through it.
In the video above, Kelly Levin, a climate policy expert at the World Resources Institute, explains what our carbon budget is, how much we've already "spent," and why it matters.
Back in 2009, delegates to the UN climate summit in Copenhagen agreed that in order to avoid the worst potential impacts of climate change, global temperature rise should be limited to 2 degrees Celsius above preindustrial levels. For their report this fall, scientists on the UN's Intergovernmental Panel on Climate Change looked at how emissions of carbon dioxide and other greenhouse gases have warmed the planet since the Industrial Revolution, and extrapolated how much more we could emit before breaking the Copenhagen limit, the same way you might draft a budget to keep your checking account balance above zero.
Children stand amidst the rubble of Cebu, Philippines, after Typhoon Haiyan.
Aid agencies are still digging through rubble in the Philippines in the wake of Super Typhoon Haiyan, which was just one of many record-smashing oceanic storms to spring up in the last decade. Insurance adjusters have already pegged Haiyan's price tag alone—counting damage to homes, businesses, and farms—at $14.5 billion. Today, as politicians and policy wonks dive into a second week of UN climate talks in Warsaw, the Philippines' lead delegate has called for developed nations whose industrial emissions drive climate change to foot the bill for disasters like this. It could be one hell of a bill: Natural disasters altogether have cost the world $3.8 trillion since 1980, according to a new report from the World Bank.
Using data from Munich Re, the world's largest reinsurance firm (reinsurers sell insurance to insurers), World Bank analysts found that 74 percent of that cost arose from weather-related disasters like hurricanes and droughts. They also found, as the chart below shows, that annual costs are on the rise, from around $50 billion a year in the 1980s to $200 billion a year today, thanks to a rising number of disasters and growing economic development:
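That quadrupling implies a steady compound growth rate in annual losses. A quick back-of-the-envelope check, treating the gap between the 1980s average and today as roughly 30 years (an assumption made purely for illustration):

```python
# Back-of-the-envelope: implied compound annual growth in disaster costs.
# The ~30-year span between the 1980s average and today is assumed.
start, end, years = 50e9, 200e9, 30
growth = (end / start) ** (1 / years) - 1
print(f"{growth:.1%}")  # roughly 4.7% per year
```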
Deforestation is a tricky problem to nail down. We know forests are shrinking, but knowing exactly where and by how much often means compiling locally reported data that can be shoddy, incomplete, or outdated, according to University of Maryland geographer Matthew Hansen. Better data would be an invaluable tool for resource managers looking to preserve trees, and for climate scientists who want to calculate how much carbon forests can store, Hansen realized. So he set out to create the highest-resolution map of global forests ever made, partnering with Google Earth Engine to process some 650,000 images taken by NASA satellites over the last decade.
The result, published today in Science, is a stunning series of time-lapse maps, along with an interactive mapping tool, that reveal the Earth lost about 888,000 square miles of forest between 2000 and 2012, roughly the area of the US east of the Mississippi River. The loss, which was most dramatic in the tropics, was primarily due to logging, urban development, strip mining, and other human impacts, Hansen said, but the figure also includes loss from fires, earthquakes, and other natural disasters. The maps are accurate down to 30-meter units, close enough to see logging roads and individual stands of trees, which gave the researchers an unprecedented look at the full extent and rate of deforestation at both global and hyper-local scales.
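Spread evenly over the study period, the headline figure works out to a striking annual rate. This is simple division, assuming the 2000-2012 span covers 12 years of change:

```python
# Average annual forest loss implied by the study's headline total.
total_loss_sq_mi = 888_000  # global forest loss, 2000-2012
years = 12                  # assumed length of the study period
annual = total_loss_sq_mi / years
print(round(annual))        # 74000 square miles per year
```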
In the exclusive video above, Hansen takes us on a tour of his new maps and the startling situation they reveal.
Producing super-detailed maps like these would have been impossible without Google's massive computer power; Hansen estimated his own computer would have taken 15 years to process all the images, while Google's servers churned them out in a few days.
The data could be used to track the impact of forest protection policies, and hold a microscope to the forested areas most at risk.
"It's a big leap forward in terms of a set of facts, a set of observations on what this dynamic is," Hansen said.
One of the next steps, Hansen said, is to use the data to gauge exactly what this deforestation means for climate change. Trees are one of the largest "sinks" for carbon dioxide; previous studies suggest forests absorb a third of the carbon released by burning fossil fuels.
Michael Gerrard, director of Columbia Law School's Center for Climate Change Law, said his team combed through all 50 reports to see how accurately and comprehensively climate change was taken into consideration, if at all, and grouped them into four ranked categories:
1. No discussion of climate change, or inaccurate discussion of climate change.
2. Minimal mention of climate change-related issues.
3. Accurate but limited discussion of climate change, and/or brief discussion with acknowledgment of the need for future inclusion.
4. Thorough discussion of climate change impacts on hazards and of climate adaptation actions.
While FEMA itself acknowledged this summer that climate change could increase areas at risk from flooding by 45 percent over the next century, states are not required to discuss climate change in their mitigation plans. The Columbia analysis didn't take into account climate planning outside the scope of the mitigation plans, like state-level greenhouse gas limits or renewable energy incentives. And as my colleague Kate Sheppard reported, some government officials have avoided using climate science terminology even in plans that implicitly address climate risks; states that didn't use terms like "climate change" and "global warming" in their mitigation plans were docked points in Columbia's ranking algorithm.
Gerrard said he wasn't surprised to find more attention paid to climate change in coastal states like Alaska and New York that are closest to the front lines. But he was surprised to find that a plurality of states landed in the least-prepared category, suggesting a need, he said, for better communication of non-coastal risks like drought and heat waves.
"We had hoped that more of the states would have dealt with [climate change] in a more forthright way," he said.