Unprecedented Arctic ozone hole of 2011. NASA image by Eric Nash and Robert Simmon with data from the Aura Microwave Limb Sounder team.

An unexpected ozone hole—the first of its kind—opened above the Arctic this past spring, with a whopping ozone loss of more than 80 percent 11 to 12 miles (18 to 20 kilometers) above Earth. That thinning rivals the worst of Antarctica's ozone woes.

We've become used to a persistent hole above Antarctica, which, despite our phase-out of CFCs, refuses to heal... thanks to a positive feedback loop between ever-increasing greenhouse gas emissions and a paradoxical cooling in the stratosphere. (I first wrote about this in MoJo's The Thirteenth Tipping Point.)

Only nine years ago the World Meteorological Organization surmised the Arctic would never produce an ozone hole, since it lacked a polar vortex and really cold temperatures. But as the authors of a new paper in Nature report this week, the Arctic got the vortex this year, and: "Our results show that Arctic ozone holes are possible even with temperatures much milder than those in the Antarctic."

We know that decreased ozone leads to increased UV-B radiation, skin cancers, and reduced yields of two-thirds of the 300 most important crop varieties. Europe's winter wheat crop likely took a noticeable hit from this year's Arctic ozone hole.

As to why the stratosphere gets colder in a warming world, Jeff Masters at WunderBlog sums it up: "If the surface atmosphere warms, there must be compensating cooling elsewhere in the atmosphere in order to keep the amount of heat given off by the planet the same and balanced. As emissions of greenhouse gases continue to rise, their cooling effect on the stratosphere will increase. This will make recovery of the stratospheric ozone layer much slower."

US Army Sgt. Andrew Wall, Security Forces team leader attached to Provincial Reconstruction Team Zabul, provides security for his civil affairs teammates during a visit to the district center in Shinkai, Afghanistan, Oct. 4, 2011. Sergeant Wall is deployed from Charlie Company, 182nd Infantry Division, Massachusetts National Guard. (US Air Force photo/Senior Airman Grovert Fuentes-Contreras)


There's been a boomlet this year in books and articles suggesting that innovation in recent decades has slowed to a trickle and economic productivity is flattening out for the foreseeable future. Peter Thiel has been pushing this meme for a while, Tyler Cowen made a splash in January with his e-book, The Great Stagnation, and Neal Stephenson nearly took the World Policy Institute offline last week with his essay, "Innovation Starvation." Talking about our innovation drought is suddenly all the rage. But is it really true? Or is it mostly just a product of discouragement borne of several years of lousy economic performance?

Honestly, I'm not sure. But maybe it's worth thinking out loud about this a little. The complaints mostly take two basic forms. The first I call "Where's my jetpack?!?" and it's pretty easily disposed of. The argument here is that back in the 1950s we thought the future would bring us flying cars, electricity too cheap to meter, and vacations on the moon. But none of that has happened. What gives?

The answer is prosaic: Forecasters in the '50s were wrong. It's not that the future never arrived—it's that the future brought us different stuff than we thought we were going to get. Our lack of flying cars simply doesn't tell us anything about the pace of innovation.

The second form of the innovation argument is more substantive. I call it the Great-Grandma Argument, and it compares innovation in the first half of the 20th century to innovation since then. Our Great-Grandma from 1900, we're told, would be totally flabbergasted if she were whisked to the year 1950. So much new stuff! But our mothers and fathers from 1950? If they were magically transported to 2011, they'd recognize almost everything they saw. Yawn.

There's obviously something to this. The end of the 19th century and the first half of the 20th century was an astonishingly fertile period: lightbulbs, radios, autos, airplanes, refrigerators, penicillin, TVs, air conditioners, the telephone, and much more. The period since then has seen the digital computer and....that's about it. Things like cell phones and flat screen TVs are mere technological improvements, not genuinely new inventions.

Which is true enough. But although I've often thought about innovation this way too, the more I've chewed it over the more I've decided that it misses something. Most of the best known inventions of the early 20th century were actually offshoots of two really big inventions: electrification and the internal combustion engine. By contrast, the late 20th century had one really big invention: digital computers. Obviously two is more than one, but still, looked at that way, the difference between the two periods becomes a bit more modest. The difference between the offshoots of those big inventions is probably more modest than we think too. Just as we once made better and better use of electrification, we're now making better and better use of digital computing. And to call all these computing-inspired inventions mere "improvements" is like calling TV a mere improvement of radio. These are bigger deals than we often think. We have computers themselves, of course, plus smartphones, the internet, CAT scans, vastly improved supply chain management, fast gene sequencing, GPS, Lasik surgery, e-readers, ATMs and debit cards, video games, and much more.

Wait a second. Video games? Am I joking? No indeed. Give some thought to just what innovation and productivity gains are for.

In his fight for smaller government, Florida Rep. Ritch Workman wants to do something for the little people: He wants to let 'em fly. The Melbourne Republican has decided that the state's 22-year-old ban on dwarf-tossing in bars is keeping height-challenged residents from realizing their full career potential in a recession. "To me it's an archaic kind of Big Brother law that says, 'We don't like that activity,'" Workman told Florida Current reporter Bruce Ritchie. "Well, there is nothing immoral or illegal about that activity. All we really did by passing that law was take away some employment from some little people."

Once a staple of spring-break barrooms from Key West to Pensacola, dwarf-tossing (once incorrectly and more offensively referred to as midget-tossing) involves seeing which PBR-pickled frat brother can throw a Velcro-encased dwarf higher up a fabric-lined wall. State lawmakers banned the practice in 1989, finding it not only demeaning but physically taxing on the small subjects. But Workman's introduced legislation that would repeal the ban, taking his lead from such high-minded libertarian thinkers as TV newsman John Stossel. (That pundit's reaction to a dwarf-toss ban: "Give me a break!")

A couple of years ago, in a bellwether for how hard it's going to be to ever seriously rein in healthcare costs, there was an instant and thunderous backlash against a new recommendation that women with no risk factors put off routine mammograms until age 50. A small number of famous breast cancer survivors who had been diagnosed at a young age took immediately to the airwaves, and that was all she wrote. Within 48 hours, HHS Secretary Kathleen Sebelius had disowned her own task force and assured the nation that absolutely nothing would change.

Now the same group that made the mammogram recommendation is back:

The U.S. Preventive Services Task Force, which triggered a firestorm of controversy in 2009 when it raised questions about routine mammography for breast cancer, will propose downgrading its recommendations for prostate-specific antigen (PSA) for prostate cancer on Tuesday, wading into what is perhaps the most contentious and important issue in men's health.

....“The harms studies showed that significant numbers of men — on the order of 20 to 30 percent — have very significant harms,” Moyer, a professor of pediatrics at Baylor College of Medicine, said in a telephone interview Thursday.

There are never any perfect answers to these questions. We could start routinely testing everyone at age 20, and it's almost certain that at least a few treatable cancers would be caught. At every cutoff point, whether it's age related or condition related, you have to decide if the cost of tightening the testing criteria outweighs the benefit. What you can't do is simply decide that cutoffs should never be tightened because, inevitably, there will be a cost. It might be small, but it's always there. And then the USPSTF becomes a death panel because that's a handy thing for demagogues to call it.

So we'll see how this one goes. My previous brush with prostate screening is here.

Young Westboro Baptist Church protesters.

If you're ever at a loss for what the Westboro Baptist Church in Topeka, Kansas, is all about, take a gander at its website, tastefully titled GodHatesFags.com. There, you'll learn that a "modern militant homosexual movement" poses "a clear and present danger to the survival of America." And that to combat this menace, the church has conducted 46,635 demonstrations since June 1991 "at homosexual parades and other events, including funerals of impenitent sodomites (like Matthew Shepard) and over 400 military funerals of troops whom God has killed in Iraq/Afghanistan in righteous judgment against an evil nation." At these protests, church members parade around with signs declaring "FAGS BURN IN HELL" and "THANK GOD FOR AIDS."

Margie Phelps, daughter of Westboro Baptist patriarch Fred Phelps Sr., announced the church's latest picket target last night on Twitter:

Predictably, bloggers are having a field day with the delicious irony that the tweet condemning the iPhone's progenitor came…via iPhone.

Reached by phone Thursday, Fred Phelps Jr.—Fred Sr.'s son and one of the roughly 100 members of the church—expanded on his sister's rationale for the planned picket. "The main thing in my mind," Phelps said, "is that [Jobs] operated in a company that was recognized around the world as being gay-friendly." Phelps wasn't sure where this recognition came from, but he insisted, "I've read that several places. I don't think there's any dispute about that." (In 2008, a Prime Access/PlanetOut poll determined Apple to be the second most gay-friendly American brand, behind only Bravo.)

Octopi Wall Street!

A few days ago, photographer and idea blogger David Friedman tweeted, "Octopi Wall Street. You can have that." Beyond the Occupy Wall Street-inspired wordplay, he was on to something. There's a long American tradition of mixing economic populism with cephalopods.

Rolling Stone's Matt Taibbi famously described Goldman Sachs as "a great vampire squid wrapped around the face of humanity, relentlessly jamming its blood funnel into anything that smells like money." More recently, Mother Jones cartoonist Zina Saunders drew the Koch brothers as the twin heads of an oil- and money-spewing "Kochtopus." But the first comparisons of moneyed interests to voracious tentacled creatures date back to the Gilded Age. Here, a quick review of the metaphor's greatest hits.

In this 1882 illustration, a grinning 10-tentacled octopus (decapus?) headed by California railroad tycoons ensnares everything in its path, from farmers and miners to an entire sailing ship.

The Curse of California (Wikimedia Commons)


The arms of the Traction Monster, drawn by George Luks in 1899, include a variety of monopolistic entities, from the Steel Trust to John D. Rockefeller's Standard Oil, a favorite target of antitrust foes.

The Traction Monster (Wikimedia Commons)


In this 1904 illustration by Udo Keppler, Standard Oil wraps its tentacles around the Capitol and average Americans, while eyeing the White House.

Standard Oil vs. America (Library of Congress)


Standard Oil again, this time drawn as "A Horrible Monster, whose tentacles spread poverty, disease and death."

A Horrible Monster (International Team of Comics Historians)


In this 1899 cartoon, the Devil Fish of California Politics (a San Francisco Democratic party boss) emerges from a Sea of Corruption, his "rapacious maw" agape.

The Devil Fish of California Politics (Library of Congress)


Octopus-mania also extended to other causes. Here, the Liquor Octopus taunts the entire world in a 1919 prohibitionist cartoon.

The Liquor Octopus (Anti-Saloon League)

Note the octopus lounging in the Fountain of Taxation, in which the Burden of Taxation trickles down to the Laboring Class. "Eventually the bottom basin gets it."

The Fountain of Taxation (Library of Congress)


Which brings us to this Occupy Wall Street stencil by artist Molly Crabapple, in which Taibbi's vampire squid tips its top hat to its turn-of-the-century forebears.

Fight the Vampire Squid (Molly Crabapple)


For many more examples of classic octopus propaganda, check out Vulgar Army's well-stocked collection.

The Best American Science Writing 2011 (HarperCollins/Ecco Press) is out.

I'm honored to say that my Mother Jones article, BP's Deep Secrets, is one of ten pieces chosen for the volume. Of it, one reviewer wrote:

"'Deep Secrets' by Julia Whitty: An examination of the recent oil spill near Louisiana. This was my personal favorite. What we didn’t know about the gulf environment before the oil spill we may never know if the environment’s destroyed."

This year's edition is edited by Rebecca Skloot (The Immortal Life of Henrietta Lacks), her father, the poet and essayist Floyd Skloot, and Jesse Cohen.

Support your local indie bookstore and a science writer or two and buy a copy. Or two. Thanks!

In the Wall Street Journal today, Robert Bryce offers up five "obvious truths" about climate change. His first four are mostly practical observations, and it so happens that I actually agree with most of them. We carbon taxers have lost the war for now, we are going to need more energy in the future, greenhouse gas control is a global issue, and we do need to get more efficient at generating energy. I might not like it, but these things are all mostly true.

But then there's his fifth "obvious truth":

5) The science is not settled, not by a long shot. Last month, scientists at CERN, the prestigious high-energy physics lab in Switzerland, reported that neutrinos might—repeat, might—travel faster than the speed of light. If serious scientists can question Einstein's theory of relativity, then there must be room for debate about the workings and complexities of the Earth's atmosphere.

Well, there you go. The fact that a neutrino might — unconfirmed but still possibly might! — travel faster than light means that climate change models are crap. This is what passes for serious scientific thinking on the right.

The practical issues surrounding climate change are gargantuan. I myself am pessimistic that the human race will collectively decide to address them, which is why I support research into geoengineering as a possible last resort and, in the meantime, hope and pray that we figure out a way to generate lots of clean energy in the fairly near future.

But that has nothing to do with whether or not climate change is real. It is. Our current models might turn out to be off by 10% or 20% or 50% — in either direction, mind you — but they're not wrong. When you dump greenhouse gases into the atmosphere, more heat is trapped and the planet warms up very quickly (on geological scales). And when the planet warms up, lots of very bad stuff happens. It's just head-in-the-sand foolish to pretend otherwise. Even after Einstein came along, you'd still kill yourself just as badly if you jumped out of an airplane. Newton wasn't very wrong, after all.

In my post this morning about why Apple lost the personal computing battle, I noted that a big part of the reason was the much lower cost of PCs vs. Macs. Matt Yglesias tweets back:

Actually, they did in a way. The original version of Windows was designed to work with the first CGA color adapter, and in order to keep costs down that adapter only supported 16 colors. Later adapters supported more colors, but Windows retained a considerable amount of backward compatibility with old hardware for a very long time. Thus, even as late as the early '90s, versions of Windows were still using logos that rendered properly on ancient hardware.

If everyone will indulge me in a bit of nostalgia, I want to make a broader point here. To understand why PCs beat Macs, you have to understand the era in which the battle was fought. And in that era, the 80s and early 90s, the personal computer world was controlled almost hegemonically by business customers. It's hard to overstate just how overwhelming this dominance was: corporate customers probably outnumbered home users by three or four to one, and even at that, a lot of the home users bought PCs mainly because they wanted to bring in work from the office. It was this corporate domination of the market that drove its early evolution. Here are a few of the ways this played out:

  • The IBM imprimatur. This was absolutely key to legitimizing the business market. Corporate IT managers just flatly weren't going to buy a million dollars worth of personal computers from their corner Radio Shack or from some shaggy-haired guy in Cupertino. They had lots of IBM gear they needed to interoperate with, they had IBM networks they needed to plug into, and they had IBM service contracts already in place to cover their maintenance needs. Initially, the only way they were going to buy PCs was if they came from IBM, and later on only if they were compatible with all their existing IBM PCs.
  • Backward compatibility. Home users get annoyed when new software isn't backward compatible. When I upgraded to Windows 7, I lost the ability to play my favorite computer Yahtzee game. Boo hoo. But corporate IT managers are absolutely rabid about backward compatibility. This isn't because they want to play Yahtzee. It's because they have huge fleets of hardware, some of it quite elderly, and they want new software to work on it. What's more, tucked away in various corners of the company there are people running ancient custom applications that are mission critical and absolutely can't break when the OS or the network software is upgraded. Companies like IBM and Microsoft take this very seriously, and it drives a lot of their design decisions. This is why you end up with bloated operating systems and oddities like ugly Windows logos.
  • Portability. As I said earlier, laugh all you want at the original Compaq portable that was the size of a sewing machine, but in 1983 it was a big deal. Ditto for the clamshells that debuted later in the decade, weighing in at a svelte 8-10 pounds. But big or not by today's standards, they were portable. And since business people travel a lot, having a portable machine you could take out to a client's site was a godsend. Apple just didn't have anything to compete here.
  • Business applications. Unless you were there, it's hard to explain just how thoroughly Lotus 1-2-3 was the killer app of the early 80s. Everyone used it, and it was available only on PCs. Likewise, lots of serious business apps ran on top of databases like dBase or R:Base, and those were available only on PCs.
  • Networks. Printers were expensive, so IT managers needed PCs to be on a corporate network. Novell and Banyan networks were designed with PCs in mind, and only the very courageous tried to make a Macintosh work on a PC network. It wasn't impossible or anything, but believe me, you were a lot better off sticking to PCs on the networks available back then.
  • Flexibility/Expandability. For a few years in the 80s I was the product manager for a very sophisticated communications board for IBM PCs. It allowed corporations to build specialized apps that used X.25 or SDLC or things like that, and it wasn't available for Macs. Why? Because Apple didn't allow you to plug boards into a Mac. PCs did. So if you needed a specialized piece of hardware, you could get it. Or if you just wanted more serial ports or more memory, you could pop in an AST 6-Pack and you were good to go. This was a big deal for corporate IT guys. Apple didn't offer it.
  • Low cost. Nuff said about that. Thanks to intense competition, PCs were just way less expensive than Macintoshes.

Back in the 80s and early 90s, if you wanted to buy a bunch of personal computers for your company, you'd ask around. And your financial analysts would tell you they used 1-2-3, so you'd better buy PCs. Your IT guy would tell you the corporate network was all built around PCs, so you'd better buy PCs. The CFO would tell you your project budget was a million bucks, so you'd better buy PCs. So guess what? You bought PCs. That's why Apple lost the market share war.

Why this walk down memory lane? Just to explain the environment that got us where we are today, an environment that's largely fading away. But it existed 30 years ago. Business buyers didn't buy PCs because they were mindless drones, they bought them because they really, truly had excellent reasons for preferring them to Macintoshes. And when you bought a PC for your home, it made sense to get one that was compatible with the PC in your office.

This created a virtuous1 circle: corporate customers preferred PCs, and as a result, when Microsoft set their development priorities, they listened to their big corporate customers. And given a choice between a clunky but functional mail merge function or a snazzier user interface, those customers voted unanimously for the mail merge function. And they voted for low cost, backward compatibility, network functionality, and portability. So that's what they got.

Today, software has so much functionality that it makes sense even for business users to start thinking more about ease of use and design esthetics. But back then it really didn't. So, quite rationally, you got frenetic development of new features even if it sometimes came at the cost of reliability and ease of use. That environment may be long gone, but it explains a lot about why Windows PCs look and feel clunkier than Macs but still rule the roost nonetheless.

1Well, a circle, anyway. You can decide for yourself if it was virtuous.