Blue Marble - August 2010

Are Email Attachments Bad for the Environment? Part II

| Mon Aug. 16, 2010 2:30 AM PDT

Last week, I wrote about the environmental impact of email attachments. This stirred up lots of discussion among commenters here on the Blue Marble (almost as much as the oatmeal and Greek yogurt Econundrums) and also over at Kevin Drum's blog, Andrew Sullivan's blog, and the New York Times' Freakonomics blog. While some of the comments did tend toward the all-caps "you idiot" end of things, many readers made interesting points. For example, one of Sullivan's commenters says:

I can't say for sure (given that her source didn't go into the details about his 17.5 kettles number), but that seems more like a lifecycle analysis assuming the pictures are kept rather than viewed and deleted to free up memory space. The same thinking can apply to the power use of the devices used to view the e-mails: I'm using my computer all day, so the marginal power consumption caused by receiving an e-mail with four attachments is probably negligible, and could even be negative; if I were running a program that required all of the computing power of my machine, but replaced some of that time with looking at LOLcats that my friend had sent me (without shifting the higher computer [use] to another time), then distracting e-mails could actually prove a net power saver.

Matthew Yeager, a data storage expert who works for the UK data services and solutions company Computacenter, was my main source for the piece, and he has responded to some of the points that readers brought up in an email. Here's a portion of it:

Yes, at first glance I agree that the context of boiling a kettle 17.4x being equivalent to emailing a 4.7MB attachment seems too fantastic to believe...as does the worldwide datacentre industry being at parity with the airlines for CO2 production, with 2% of all man-made CO2 coming from computers and communications technology.

The sources for both of these statistics can be found both directly in the body of the blog post and below, for the BBC interview.

(Sources: Life-cycle analysis carried out by Mark Mills of Digital Power Group, translated into kettle boilings with help from the Energy Savings Trust [UK]; Green IT: A New Industry Shock Wave by Simon Mingay [Gartner])

It is important to remember that the point of the BBC piece was to discuss the 'data deluge' and how the creation of data is affecting our planet and, in this instance, Computacenter customers. Indeed, a recent statistic putting data growth in perspective has us creating as much data in two days as we did in all of 2003. "The real issue is user-generated content," Schmidt said. He noted that pictures, instant messages, and tweets all add to this. (Source: Eric Schmidt, Google)

I note that in many comments (as well as that of the Andrew Sullivan commenter) commentators give what they believe to be well-thought-out arguments and related mathematical equations; however, your focus remains on the server alone and does not take into account the power/cooling and related environmentals for the datacentre where the server resides, the networking equipment required to connect the server, the data storage array(s) attached to the server, et al.

It is important to take into account that the power calculations come from existing equipment and not the power to deploy it in the first place; there is much wastage at all levels of a traditionally deployed technology infrastructure of siloed storage, servers, networks, et al, which can be calculated largely from product datasheets as well as from physical measurements such as those conducted and verified by the Digital Power Group [citation above].

Equally, a traditional datacentre can further complicate and exacerbate power issues...only 30% of the power entering a datacentre is actually spent powering the computer chips, whilst 70% of the energy is spent cooling the processors and removing the heat they produce with better than 99.9% of energy entering a microprocessor turned into heat. (Source: Bruno Michel, manager of advanced thermal packaging at IBM Zurich Research Laboratories.)

Hence we must also factor in the physical inefficiencies of the datacentre and the equipment therein to arrive at the 17.4x kettle boil statistic.
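For readers who want to sanity-check that figure, here is a back-of-the-envelope sketch. The per-boil energy figure is a typical estimate of my own, not one given in the post, and the underlying Digital Power Group lifecycle model is not published here, so this only shows what total energy the 17.4x equivalence implies and how the 30% chip-power share multiplies facility-level consumption:

```python
# Rough arithmetic behind the figures quoted above. KETTLE_KWH_PER_BOIL is
# an assumption (a ~3 kW kettle running for roughly two minutes); only the
# 17.4 boils and the 30% chip share come from the article.

KETTLE_KWH_PER_BOIL = 0.11   # assumed energy per boil, kWh
KETTLE_BOILS = 17.4          # equivalence claimed for one 4.7MB attachment

# Implied total lifecycle energy attributed to the attachment:
total_kwh = KETTLE_BOILS * KETTLE_KWH_PER_BOIL
print(f"Implied energy per attachment: {total_kwh:.2f} kWh")  # -> 1.91 kWh

# The datacentre multiplier Yeager describes: if only 30% of the power
# entering a facility reaches the chips, every kWh of useful compute costs
# about 3.3 kWh at the meter (a PUE of roughly 3.3).
CHIP_SHARE = 0.30
facility_kwh_per_chip_kwh = 1 / CHIP_SHARE
print(f"Facility kWh per chip kWh: {facility_kwh_per_chip_kwh:.1f}")  # -> 3.3
```

On these assumptions, one attachment works out to just under 2 kWh over its lifecycle, most of it outside the server itself.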

In my experience of working with hundreds of customers worldwide, it is not uncommon to find datacentres which have been incorrectly laid out [e.g. hot rows, cool rows], as well as ageing datacentre equipment which draws more power, and requires more power and cooling to service, than that which you reference.

I've asked Yeager whether a PDF of the Digital Power Group study is available; I'll post it later if it turns up.

OK commenters, go nuts! (Respectfully nuts, that is, of course.)

Update: Here's an interesting piece from Wired UK (PDF) on the Internet's energy use. (Credit: Wired UK, July 2009, page 41)

Got a burning eco-quandary? Submit it to econundrums@motherjones.com. Get all your green questions answered by signing up for our weekly Econundrums newsletter here.


Science of the Spill: 13 August 2010

| Fri Aug. 13, 2010 1:27 PM PDT

The news from science is trickling out more slowly now. Many researchers have collected a ton of new data, which they're currently in the process of analyzing. Expect a scientific baby boom about nine months from now. But what is getting out is really interesting:

  • A growing concern that BP's use of dispersants facilitated the release of toxic oil components—including benzene, a carcinogen, and toluene, which can cause neurological damage—and that these still remain in the water. (In my article, BP's Deep Secrets, I wrote about the alarming volatility of the oil washing ashore in Louisiana.)
  • Concerns about the long-term effects of even weathered oil. A series of papers (here and here) highlight the ongoing problems with pink salmon and Pacific herring eggs and larvae in the wake of the Exxon Valdez disaster, from the kind of weathered oil that NOAA and BP would like us to believe is now harmless.
  • Perhaps as a beneficial side effect of BP's cataclysm, a Canadian court handed down a temporary injunction on behalf of a group representing Inuit interests to halt plans for seismic studies of the Earth's crust in eastern Canadian Arctic waters—a thinly veiled effort to locate Arctic oil and gas reserves. The 8 August decision by the Nunavut Court of Justice bars the RV Polarstern from beginning its research activities over fears the cruise could impact wildlife and raise the prospects for oil drilling in the region. Sonar sounding for fossil fuels imperils bowhead whales, their calves, and other marine life.

Eco-News Roundup: Friday August 13

| Fri Aug. 13, 2010 4:16 AM PDT

Blue Marble-ish news from our other blogs

Love and Marriage: Kevin Drum argues marriage has enough room for gays and straights.

Birthright Economy: As immigrants have more babies, they fuel a younger, healthier workforce.

BP Cleans Ship: BP is letting go nearly 10,000 clean-up workers despite persistent oil.

Involuntary Labor: New documentary shows anti-abortion forces' persuasive tactics.

New Low: A child soldier kept at GITMO describes his torture and interrogations.

Gold Mined: Wall Street cuts ties with a mountain-top ruining mining company.


How Much Do You Spend on Transport? New Web App Aims To Show You

| Thu Aug. 12, 2010 3:30 PM PDT

Two questions: How much do you spend each month on rent or a mortgage payment? And how much do you spend each month on transportation?

Chances are you can answer the first question instantly. But the second? Hmm ... Transportation costs are trickier, since you don't pay them all at once or all to the same place, and they can vary month to month. But they're a significant expense for most households, and a seriously overlooked one.

Chicago's Center for Neighborhood Technology offers an easy way to get a rough estimate with Abogo, a new website that measures what an average household in a neighborhood spends on getting around—including car ownership, car use, and transit use. Plug in an address and the site brings up a color-coded map. For example, Barack and Michelle Obama's old block in Chicago's Kenwood neighborhood has an average transportation cost of $706 a month, compared with $821 for the metro region (abogo.cnt.org).

The site draws on nine data sets, mostly from the US census, that include density, average commute time, commuters per household, and a transit connectivity index. It also calculates greenhouse-gas emissions from transportation—Kenwood residents average 0.36 metric tons per month, about half the metro average.
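To make that comparison concrete, here is a trivial sketch using only the two figures quoted above. The `monthly_savings` helper and the yearly extrapolation are my own; Abogo's actual model is census-derived and far richer:

```python
# Toy comparison of a neighborhood's transport cost against its metro
# average, using the Kenwood figures from the article.

def monthly_savings(neighborhood_cost: float, metro_cost: float) -> float:
    """Dollars per month a household saves relative to the metro average."""
    return metro_cost - neighborhood_cost

kenwood = 706.0  # Obamas' old Kenwood block, $/month (from Abogo)
metro = 821.0    # Chicago metro-region average, $/month (from Abogo)

diff = monthly_savings(kenwood, metro)
print(f"${diff:.0f}/month, ${diff * 12:.0f}/year below the metro average")
# -> $115/month, $1380/year below the metro average
```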

‘Environmentalism’ Can Never Address Climate Change

| Thu Aug. 12, 2010 3:13 PM PDT

In my opening statement at the debate the other night, I had 60 seconds to reach a half-drunk, half-interested crowd. In those circumstances, you realize pretty quickly that you have to cut straight to the core of things. I hadn't really thought it out in advance, but I realized just before I went on stage that the first thing I wanted to say is simple: I'm not an environmentalist and these aren't environmental challenges.

There's been a lot of talk lately about what went wrong on the climate bill, but it's always struck me that the original wrong turn was the introduction of climate change to American politics as an "environmental issue." It is the mother of all framing errors—the one from which all others flow.

Environmentalism has a well-defined socioeconomic niche in American life. There are distinct cultural markers; familiar tropes and debates; particular groups designated to lobby for change and economic interests accustomed to fighting it; conventional methods of litigation, regulation, and legislation. Environmental issues take a very specific shape.

My Impromptu Oil Spill Lecture

| Wed Aug. 11, 2010 12:59 PM PDT

At the latest stop on my book tour, in response to a bunch of questions about the Gulf catastrophe, I delivered an impromptu oil spill lecture. It's a decent distillation of my latest article, BP's Deep Secrets. Thanks to Tuc Radio for recording, editing, and posting this video.


Follow the (Dirty Energy) Money

| Wed Aug. 11, 2010 4:00 AM PDT

How much influence is dirty energy wielding in Congress? There's now a convenient one-stop shop for finding that information. Oil Change International, along with a long list of partners, launched DirtyEnergyMoney.org on Tuesday, a new hub that allows users to follow the fossil fuel money.

On the site, we learn that the 110th Congress (2007-2008) was the "dirtiest," with fossil fuel interests spending a record $22,713,081 to influence policy. Sen. John Cornyn (R-Texas) takes the award for "Dirtiest Politician," accepting more than $1.8 million from fossil fuel interests since 1999. Oil giant Koch Industries, which Blue Marble readers will recognize as a major funder of climate change denial work, has spent the most of any individual company on buying Congress since 1999, at $4,382,491. The site also uses graphics to illustrate the ties between the energy industry and our representatives.

Some other interesting findings that they've compiled from the statistics:

  • Overall, the coal industry has been friendlier to Democrats than to Republicans thus far in the 111th Congress, with over $3.7 million going to Democratic members of the House and Senate, compared with about $2.8 million to Republicans.
  • Republicans continue to take more oil and gas money, with the oil and gas industry contributing over $5.1 million to Republicans and $3.1 million to Democrats.

They also note that the senators who voted in favor of a measure to take away the Environmental Protection Agency's authority to regulate planet-warming gases in June "took on average two and a half times as much Dirty Energy Money as those who voted against it."

Appalachian Voices, Chesapeake Climate Action Network, Energy Action Coalition, Earthworks, Friends of the Earth, Greenpeace, MoveOn, Public Citizen, True Majority, 1Sky, and 350.org are partners in the site.

Climate Regulation's Newest Foe: Enviros?

| Wed Aug. 11, 2010 3:10 AM PDT

The Environmental Protection Agency's plan to roll out regulations on planet-warming gases may hit a snag in the coming months—opposition from environmentalists. While industry groups are mounting major legal challenges to the agency's ability to regulate greenhouse gases, the EPA's decision to scale back the number of pollution sources subject to these new rules has raised the ire of greens who worry the agency isn't being aggressive enough.

Bill Snape, senior counsel at the Center for Biological Diversity, says the EPA is guilty of "extreme backtracking" on greenhouse gas regulations. The group has announced its intention to sue the EPA over the coming climate regulations, which it deems too weak.

The EPA already faces a host of challenges to the coming greenhouse gas regulations, with a number of states, industry groups, and trade associations challenging even the underlying finding that greenhouse gases pose risks to human health. Groups like Peabody Energy Corp., the National Mining Association, and the US Chamber of Commerce, as well as a number of states, have challenged other aspects of the rules, too, arguing that EPA regulations will be too stringent and would cripple the economy.

The back story: In 2007, the Supreme Court determined that the EPA could regulate greenhouse gases under the Clean Air Act if the agency determined that the emissions were a threat to human health. The Bush administration refused to make a determination one way or the other. But in April 2009, the Obama administration EPA reached that determination, triggering the regulatory process.

But the Clean Air Act was written to deal with pollutants like lead, nitrogen dioxide, and sulfur dioxide, which are emitted in much lower amounts than carbon dioxide, for example. The limits for those pollutants under the law range from 100 to 250 tons per year. (The average coal-fired power plant, by comparison, emits 4 million US tons of CO2 per year.) A threshold that low for carbon dioxide and other greenhouse gases would be difficult to employ, the EPA says, and would require regulating a number of small businesses that are not the main culprit when it comes to climate change. A threshold of just 250 tons would cover many more emission sources than would be reasonable, or necessary, to regulate.

With this in mind, the EPA decided to issue a rule that would guide its process for regulating greenhouse gas emissions from non-mobile sources (i.e., not cars and trucks), in order to focus on much larger generators of greenhouse gases like coal-fired power plants and heavy manufacturers—those that make things like cement and steel, which require quite a bit of energy. The draft version of the rule, issued in September, would have required all sources that increase emissions by more than 25,000 tons of greenhouse gases per year to obtain special operating permits. But when the agency issued the final rule in May, it had raised that threshold to 75,000 tons per year for existing facilities and 100,000 tons for new or modified facilities. According to the timeline EPA Administrator Lisa Jackson has laid out, the rules would begin phasing in next January.
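The shifting thresholds can be summarized schematically. This sketch only encodes the tonnage figures reported above; `needs_permit` is a hypothetical helper, not a statement of how EPA permitting actually works:

```python
# Schematic of the draft vs. final "tailoring rule" thresholds described
# in the text. Figures are from the article; the function is illustrative.

def needs_permit(annual_tons: float, rule: str, new_or_modified: bool = False) -> bool:
    """True if a stationary source's greenhouse gas emissions exceed the rule's threshold."""
    if rule == "draft":   # September draft: a single 25,000-ton threshold
        return annual_tons > 25_000
    if rule == "final":   # May final rule: higher, split thresholds
        threshold = 100_000 if new_or_modified else 75_000
        return annual_tons > threshold
    raise ValueError(f"unknown rule: {rule!r}")

# A hypothetical 50,000-ton facility is covered by the draft but not the final rule:
print(needs_permit(50_000, "draft"))   # -> True
print(needs_permit(50_000, "final"))   # -> False
# The average coal plant (~4 million tons) is covered either way:
print(needs_permit(4_000_000, "final", new_or_modified=True))  # -> True
```

The gap between the two outcomes for the 50,000-ton case is exactly the "backtracking" the Center for Biological Diversity objects to.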

The EPA has made the point of this rule, called a "tailoring rule," clear—it wants to focus primarily on the biggest sources of planet-warming gases, like coal-fired power plants, rather than small ones "that clearly were not appropriate at this point to even consider regulating," as EPA assistant administrator for air Gina McCarthy said in May. The EPA indicated that it believed the lower limit might mean some hospitals or even large apartment buildings could end up covered by the rule, which it didn't want—at least not initially. Under the EPA's rule, regulations targeting emitters in the 25,000-ton range wouldn't take effect until 2016.

That's "too far away," says the Center for Biological Diversity's Snape. He worries that the date will only get pushed further back as industry fights the rules in court. And, of course, there's always the possibility that another administration could come into office and never follow through with the rule at all. Snape accused the Obama administration of scaling back the EPA rules in order to curry favor with industry for new climate legislation, which isn't likely to happen in Congress anytime soon, anyway.

Haley Barbour, Oil Expert

| Tue Aug. 10, 2010 12:46 PM PDT

There have been many stupid stories in the past weeks about the Gulf oil spill. But this one might just take the cake: "Mississippi leaders: Spill's environmental impact overhyped."

It's hard to know where to begin on this one. Perhaps the most important question: When did Mississippi Gov. Haley Barbour become an expert on oil toxicity?

Barbour, DMR Director Bill Walker, DEQ Director Trudy Fisher and Lt. Gov. Phil Bryant have all described the oil and tar that has made its way to Mississippi waters and shores as "nontoxic." Barbour said the "first cuts," or volatile chemicals in the crude such as benzene, evaporate quickly near the well site and that the oil that makes it further into the Gulf is "emulsified and weathered."

Wait, it gets more comical: Oil is just like toothpaste, according to Barbour:

Barbour has also said the risk to wildlife from oiling is not as bad as some have been saying.
"Once it gets to this stage, it's not poisonous," Barbour said. "But if a small animal got coated enough with it, it could smother it. But if you got enough toothpaste on you, you couldn’t breathe."

I wonder if it's Barbour's years as a big-time lobbyist for interests like tobacco and big oil that qualify him to render these informed opinions. In fact, Barbour's time lobbying for dirty energy interests appears to be his only qualification for talking about the subject.

At least the others quoted in the article are nominally better prepared to comment on the science. But that doesn't make them any less wrong. Contrary to what they state in the piece, oil is toxic, it's not just "CO2 and water" as one state official implies, and evidence has shown that it is getting into the food chain. But maybe reality is different in Mississippi.

BP's Dispersed Oil Polluting Ocean's Most Mysterious Waters

| Tue Aug. 10, 2010 12:10 PM PDT

The epic 10-year long Census of Marine Life, now winding to a close, reveals we know less about the biodiversity of the midwater ocean than any other region—less than we know about shallow waters, coastal waters, the seafloor, even the deep seafloor. Yet the vast featureless realm of the deep open ocean is home to some of the richest aggregations of life on our planet.

In my new MoJo cover article, BP's Deep Secrets, rushed into early release, I write about cutting-edge science casting light (well, casting sound, actually) on the vast strata of life known as the deep scattering layer (DSL) inhabiting the deep pelagic zone. Virtually all the marine life we know and love from the surface ocean (whales, dolphins, sea turtles, sharks, billfish, predatory fish) dives down to feed on the DSL, which rises every night like a great dumbwaiter from the deep bearing every manner of seafood delicacy on a platter of darkness.

If you've ever wondered what dolphins do at night, then you'll want to read about the amazing work of Kelly Benoit-Bird at Oregon State University and Margaret McManus at the University of Hawaii Manoa. I had the good fortune to spend time at sea with them recently as they deciphered the mysteries of the dark ocean off Hawaii. We happened to be offshore recording the vertical migrations of the deep scattering layer and the dolphins feeding on it at the moment the Deepwater Horizon well exploded. Suddenly my article looking at a better future for the oceans took an unexpected turn.

Sadly, BP's abominable catastrophe is hammering the migratory realm of the DSL harder than any other ocean environment. In pumping massive quantities of chemical dispersants directly to the Deepwater Horizon wellhead 5,000 feet down, BP transformed oil that would have floated to the surface—where it might have been skimmed—into submerged plumes of dispersed droplets destined to migrate the superhighways of underwater currents long after the oil at the surface has weathered away.

Now a new paper in the open access PLoS ONE analyzes our understanding of the distribution of marine biodiversity in the global ocean—as a crucial first step towards more effective and sustainable ocean management in the future. From the paper:

Recent efforts to collate location records from marine surveys enable us to assemble a global picture of recorded marine biodiversity. They also effectively highlight gaps in our knowledge of particular marine regions. In particular, the deep pelagic ocean—the largest biome on Earth—is chronically under-represented in global databases of marine biodiversity. The deep pelagic ocean remains biodiversity's big wet secret. Given both its value in the provision of a range of ecosystem services, and its vulnerability to threats including overfishing and climate change, there is a pressing need to increase our knowledge of Earth's largest ecosystem.

Clearly the deep pelagic ocean is also severely threatened by our oil addiction and by the corporations recklessly drilling for it in the darkness beyond our sight.