Tim McDonnell joined the Climate Desk after stints at Mother Jones and Sierra magazine, where he nurtured his interest in environmental journalism. Originally from Tucson, Tim loves tortillas and epic walks.
Deforestation is a tricky problem to nail down. We know forests are shrinking, but knowing exactly where and by how much often means compiling locally reported data that can be shoddy, incomplete, or outdated, according to University of Maryland geographer Matthew Hansen. Better data would be an invaluable tool for resource managers looking to preserve trees, and for climate scientists who want to crunch how much carbon they can store, Hansen realized. So he set out to create the highest-resolution map of global forests ever made, partnering with Google Earth to process some 650,000 images taken by NASA satellites over the last decade.
The result, published today in Science, is a stunning series of time-lapse maps, along with an interactive mapping tool, that reveal the Earth lost about 888,000 square miles of forest between 2000 and 2012, roughly the area of the US east of the Mississippi River. The loss, which was most dramatic in the tropics, was primarily due to logging, urban development, strip mining, and other human impacts, Hansen said, but the figure also includes loss from fires, earthquakes, and other natural disasters. The maps are accurate to 11-square-mile units, close enough to see logging roads and individual stands of trees, which gave the researchers an unprecedented look at the complete extent and rate of deforestation on a global and hyper-local scale.
In the exclusive video above, Hansen takes us on a tour of his new maps and the startling situation they reveal.
Producing super-detailed maps like these would have been impossible without Google's massive computing power; Hansen estimated his own computer would have taken 15 years to process all the images, while Google's servers churned them out in a few days.
The data could be used to track the impact of forest protection policies, and hold a microscope to the forested areas most at risk.
"It's a big leap forward in terms of a set of facts, a set of observations on what this dynamic is," Hansen said.
One of the next steps, Hansen said, is to use the data to gauge exactly what this deforestation means for climate change. Trees are one of the largest "sinks" for carbon dioxide; previous studies suggest forests absorb a third of the carbon released by burning fossil fuels.
Michael Gerrard, the Center's director, said his team combed through all 50 reports to see how accurately and comprehensively climate change was taken into consideration, if at all, and grouped them into four ranked categories:
1. No discussion of climate change, or inaccurate discussion of climate change.
2. Minimal mention of climate change-related issues.
3. Accurate but limited discussion of climate change, and/or brief discussion with acknowledgement of the need for future inclusion.
4. Thorough discussion of climate change impacts on hazards and climate adaptation actions.
While FEMA itself acknowledged this summer that climate change could increase areas at risk from flooding by 45 percent over the next century, states are not required to discuss climate change in their mitigation plans. The Columbia analysis didn't take into account climate planning outside the scope of the mitigation plans, like state-level greenhouse gas limits or renewable energy incentives. And as my colleague Kate Sheppard reported, some government officials have avoided using climate science terminology even in plans that implicitly address climate risks; states that didn't use terms like "climate change" and "global warming" in their mitigation plans were docked points in Columbia's ranking algorithm.
Gerrard said he wasn't surprised to find more attention paid to climate change in coastal states like Alaska and New York that are closest to the front lines. But he was surprised to find that a plurality of states landed in the least-prepared category, suggesting a need, he said, for better communication of non-coastal risks like drought and heat waves.
"We had hoped that more of the states would have dealt with [climate change] in a more forthright way," he said.
Climate deniers like to point to the so-called global warming "hiatus" as evidence that humans aren't changing the climate. But according to a new study, exactly the opposite is true: The recent slowdown in global temperature increases is partially the result of one of the few successful international crackdowns on greenhouse gases.
Back in 1987, more than 40 countries, including the US, signed the Montreal Protocol, an agreement to phase out the use of ozone-depleting gases like chlorofluorocarbons (today the Protocol has nearly 200 signatories). According to the EPA, CFC emissions are down 90 percent since the Protocol, a drop that the agency calls "one of the largest reductions to date in global greenhouse gas emissions." That's a blessing for the ozone layer, but also for the climate. CFCs are potent heat-trapping gases, and a new analysis published today in Nature Geoscience finds that slashing them has been a major driver of the much-discussed slowdown in global warming.
"The recent decrease in warming, presented by global warming skeptics as proof that humankind cannot affect the climate system, is shown to have a direct human origin."
Without the Protocol, environmental economist Francisco Estrada of the Universidad Nacional Autónoma de México reports, global temperatures today would be about a tenth of a degree Celsius higher than they are. That's roughly an eighth of the total warming documented since 1880.
Estrada and his co-authors compared global temperature and greenhouse gas emissions records over the last century and found that breaks in the steady upward march of both coincided closely. When emissions leveled off or dropped, as during the Great Depression, the trend was mirrored in temperatures; likewise when emissions climbed.
"With these breaks, what's interesting is that when they're common that's pretty indicative of causation," said Pierre Perron, a Boston University economist who developed the custom-built statistical tests used in the study.
The findings put a new spin on investigation into the cause of the recent "hiatus." Scientists have suggested that several temporary natural phenomena, including the deep ocean sucking up more heat, are responsible for this slowdown. Estrada says his findings show that a recent reduction in heat-trapping CFCs as a result of the Montreal Protocol has also played an important role.
"Paradoxically, the recent decrease in warming, presented by global warming skeptics as proof that humankind cannot affect the climate system, is shown to have a direct human origin," Estrada writes in the study.
Federal agencies are required to clear the way for more climate change adaptations, like this house being raised out of the floodplain in Virginia.
Just a few days after the Treasury Department announced it would no longer back funding for most overseas coal-fired power plants, today President Obama issued a new executive order that lays the groundwork for how the US will prepare for climate change within its borders. The order is the latest in a series of policies stemming from the president's Climate Action Plan; earlier this year, for example, the administration issued new greenhouse gas emission limits for power plants and cars. But rather than addressing carbon pollution, per se, today's plan focuses on how cities and states can prepare for the climate impacts already on the way.
"We need to work on bipartisan solutions, and put politics aside," said Mayor James Brainard of Carmel, Indiana, a Republican who is one of the local officials taking part in a new advisory task force created by today's order. "The climate is changing, and we need to be prepared for it."
So what does the order call for? Here's what you need to know:
Prioritize climate-ready projects: In the wake of Superstorm Sandy, many civic planning experts called for future infrastructure plans—for bridges, roads, housing development, and the like—to emphasize climate resilience (a popular buzzword among climate wonks that means being able to quickly bounce back from disasters).
Today's order requires federal agencies to support and incentivize "smarter, more climate-resilient investments" through grants, guidance, and other forms of assistance. These could include moving roads away from crumbling coasts or requiring seaside homes to be built higher above the floodplain. The order also directs agencies to "identify and seek to remove or reform barriers that discourage" resilient investments—for example, policies that currently encourage cities to apply weak rebuilding standards after natural disasters.
"What we're seeing here is a promise that resources that might have been dedicated just to rebuilding, there would now be a mandate to rebuild in a more resilient fashion," said Rachel Cleetus, a climate economist at the Union of Concerned Scientists.
The order gives a nod to natural systems, too: Federal agencies are required to look for ways to protect places like watersheds, marshes (which are themselves an important protective barrier from sea level rise), and forests from climate impacts, and are directed to deliver specific recommendations to the White House within nine months.
One year ago tomorrow, storm surge from Hurricane Sandy set off a fire in Breezy Point, Queens, that leveled more than 100 homes. Now, construction is underway to rebuild the community from the ground up. But in July 2012, Congress decided to slash subsidies for federal flood insurance, and many residents now worry that rising rates could soon make this quiet beachside neighborhood unaffordable.