Kevin Drum - October 2011

Keep Asking Yourself One Question: Whose Side Am I On?

| Fri Oct. 7, 2011 1:28 PM EDT

Paul Krugman writes today about the Occupy Wall Street protesters:

Now, it’s true that some of the protesters are oddly dressed or have silly-sounding slogans, which is inevitable given the open character of the events. But so what? I, at least, am a lot more offended by the sight of exquisitely tailored plutocrats, who owe their continued wealth to government guarantees, whining that President Obama has said mean things about them than I am by the sight of ragtag young people denouncing consumerism.

This is a really important point, and it's especially important for sober, mainstream, analytical liberal folks. Like me. So consider this post a warning to myself.

If you go to any tea party event, you'll hear some crackpot stuff and see some people dressed up in crackpot costumes (tricorner hats etc.). By "crackpot," I mean stuff so outré that even movement conservatives know it's crazy and want nothing to do with it. Of course, it gets reported in the media occasionally, and when it does, snarky liberals have a field day with it.

But does this scare off anyone on the right? It does not. They ignore it, or dismiss it, or try to explain it away, and then continue praising the overall movement. The fact that liberals have found some hook to deliver a blast of well-timed mockery just doesn't faze them. They know whose side they're on.

So Krugman is right: liberals need to take the same attitude. Are there some crackpots at the Occupy Wall Street protests who will be gleefully quoted by Fox News? Sure. Are some of the organizers anarchists or socialists or whatnot? Sure. Is it sometimes hard to discern a real set of grievances from the protesters? Sure.

But so what. Ignore it. Dismiss it. Explain it away. Do whatever strikes your fancy. But don't let any of this scare you off. We can put up with a bit of mockery if we keep the chart above firmly in front of our faces. Just keep reminding yourself: a mere three years after the financial industry nearly destroyed the planet, Wall Street is bigger and more profitable than ever while a tenth of the rest of us remain mired in unemployment. Even after nearly destroying the planet, virtually nothing has changed. That's the outrage, not a few folks with funny costumes or wacky slogans. Always keep in mind whose side you're on.


Yes, Government is Responsible For Our Sluggish Economy

| Fri Oct. 7, 2011 11:57 AM EDT

As you've probably heard by now, employment was up by 103,000 last month. Since the economy needs to generate 100-150,000 jobs each month just to keep up with population growth, this means that in real terms we're either treading water or actually moving backward. In any case, it's a lousy report.
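For anyone who wants the arithmetic spelled out, here's a quick back-of-the-envelope version in Python. The 103,000 figure is the payroll gain cited above; the 100,000-150,000 breakeven range is the rough rule of thumb from the previous sentence, not a precise BLS number.

```python
# Rough arithmetic behind the "treading water" claim. The breakeven range is a
# rule-of-thumb estimate of monthly job growth needed to absorb population growth.
jobs_added = 103_000                      # September 2011 payroll gain
breakeven_low, breakeven_high = 100_000, 150_000

best_case = jobs_added - breakeven_low    # if only 100k/month is needed
worst_case = jobs_added - breakeven_high  # if 150k/month is needed

print(f"Net gain vs. population growth: {worst_case:+,} to {best_case:+,} jobs")
# -> Net gain vs. population growth: -47,000 to +3,000 jobs
```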

But rather than run my usual chart showing this, here's a peek at the details from the BLS report instead. As you can see, the private sector is adding jobs — though slowly — while the public sector is shrinking. That's been the story for a long time now, though you might not know it given the ceaseless clamor from the right about big government, overbearing regulations, and out-of-control spending. The evening news isn't likely to point this out, but the truth is that government employment is down across the board, and that's a big reason our economy has remained so sluggish. Not only are conservatives in Congress grimly determined to prevent any substantive action to improve the economy, but conservatives around the country are actively making things worse. And we wonder why people have such a dim view of our elected officials.

Harry Reid Goes Sub-Nuclear

| Fri Oct. 7, 2011 11:28 AM EDT

Harry Reid, in a fit of spinefulness, killed off a Senate rule last night. There are really only two things you need to know about this:

  1. The rule itself was an obscure and trivial delaying tactic that, until now, neither party had used for decades. It does not directly affect either cloture or the filibuster, so stop drooling.
  2. The rule was eliminated by a majority vote that overturned a ruling of the parliamentarian.

#1 doesn't matter. (Though details are here if you're a masochist.) #2 might be a big deal. For starters, if you can change the Senate rules by simply overruling the parliamentarian on a majority vote, you can change pretty much any Senate rule by a majority vote. For seconders, Harry Reid actually got the entire Democratic caucus to go along with this. That's.....sort of amazing.

No one knows how this is going to play out in the future. One possibility is that it's a nothingburger. Overturning an obscure rule doesn't set much of a precedent, and likewise, uniting the Democratic caucus over something so arcane doesn't mean much either. Mitch McConnell and his friends will squawk, and then life will go back to normal. What's more, the proposition that a parliamentarian's ruling can be overturned on a majority vote isn't really anything new. It hasn't been used much, but it's a precedent that's been in place for decades.

Still, there's at least the possibility that it's very much a somethingburger. It might be something Republicans take advantage of if they win a Senate majority in the next election. In the nearer future, it might mean Democrats are finally figuring out that if they don't hang together, they will assuredly all hang separately. If I had to guess, I'd vote that this is a nothingburger, but it's worth keeping an eye on.

(It will, of course, also inspire Fox/Drudge/Tea Party shrieks about totalitarianism and Democratic thuggery, but that can be safely ignored. The real action will all be behind the scenes.)

China is Losing Its Edge

| Fri Oct. 7, 2011 10:46 AM EDT

This is one of the reasons that I'm a little less preoccupied by China than some people:

Rising Chinese labour costs are changing the economics of global manufacturing and could contribute to the creation of 3m jobs in the US by 2020, according to a study being released on Friday.

....The Boston Consulting Group estimates that the trend could cut the US’s merchandise trade deficit with the rest of the world, excluding oil, from $360bn in 2010 to about $260bn by the end of the decade. The shift would also reduce its soaring deficit with China, which reached $273bn in 2010 and has triggered an intense political controversy over China’s exchange rate policies.

This has been inevitable for a long time. As China grows and gets richer, its workers will get paid more and it will make less and less sense to move U.S. production there. It's a natural brake on offshoring. Add to it China's demographic trends and you have a country that still has a bright future but is almost certainly not going to be able to keep up the torrid growth rates of the past few decades. Once it hits per capita GDP of $10-15 thousand or so, continued progress is going to come ever more slowly.

At the same time, this isn't automatically great news for American manufacturing, which, in the short term, is just likely to migrate to India and Malaysia and other countries with even lower labor costs than China. And as for our current account deficit, the key phrase in the article above is "excluding oil." Obviously China is a significant factor in our trade deficit, but oil is both a bigger and more persistent one. If we want to tackle that — and we do! — we need both macroeconomic action (a weaker dollar) and policy action (ways to reduce our use of OPEC oil). Both a weaker dollar and a carbon tax are our friends right now.

Why the Future Is Brighter Than You Think

| Fri Oct. 7, 2011 5:55 AM EDT

There's been a boomlet this year in books and articles suggesting that innovation in recent decades has slowed to a trickle and economic productivity is flattening out for the foreseeable future. Peter Thiel has been pushing this meme for a while, Tyler Cowen made a splash in January with his e-book, The Great Stagnation, and Neal Stephenson nearly took the World Policy Institute offline last week with his essay, "Innovation Starvation." Talking about our innovation drought is suddenly all the rage. But is it really true? Or is it mostly just a product of discouragement born of several years of lousy economic performance?

Honestly, I'm not sure. But maybe it's worth thinking out loud about this a little. The complaints mostly take two basic forms. The first I call "Where's my jetpack?!?" and it's pretty easily disposed of. The argument here is that back in the 1950s we thought the future would bring us flying cars, electricity too cheap to meter, and vacations on the moon. But none of that has happened. What gives?

The answer is prosaic: Forecasters in the '50s were wrong. It's not that the future never arrived—it's that the future brought us different stuff than we thought we were going to get. Our lack of flying cars simply doesn't tell us anything about the pace of innovation.

The second form of the innovation argument is more substantive. I call it the Great-Grandma Argument, and it compares innovation in the first half of the 20th century to innovation since then. Our Great-Grandma from 1900, we're told, would be totally flabbergasted if she were whisked to the year 1950. So much new stuff! But our mothers and fathers from 1950? If they were magically transported to 2011, they'd recognize almost everything they saw. Yawn.

There's obviously something to this. The end of the 19th century and the first half of the 20th century was an astonishingly fertile period: lightbulbs, radios, autos, airplanes, refrigerators, penicillin, TVs, air conditioners, the telephone, and much more. The period since then has seen the digital computer and....that's about it. Things like cell phones and flat screen TVs are mere technological improvements, not genuinely new inventions.

Which is true enough. But although I've often thought about innovation this way too, the more I've chewed it over the more I've decided that it misses something. Most of the best known inventions of the early 20th century were actually offshoots of two really big inventions: electrification and the internal combustion engine. By contrast, the late 20th century had one really big invention: digital computers. Obviously two is more than one, but still, looked at that way, the difference between the two periods becomes a bit more modest. The difference between the offshoots of those big inventions is probably more modest than we think too. Just as we once made better and better use of electrification, we're now making better and better use of digital computing. And to call all these computing-inspired inventions mere "improvements" is like calling TV a mere improvement of radio. These are bigger deals than we often think. We have computers themselves, of course, plus smartphones, the internet, CAT scans, vastly improved supply chain management, fast gene sequencing, GPS, Lasik surgery, e-readers, ATMs and debit cards, video games, and much more.

Wait a second. Video games? Am I joking? No indeed. Give some thought to just what innovation and productivity gains are for.

Time for Another Death Panel Uproar

| Fri Oct. 7, 2011 1:17 AM EDT

A couple of years ago, in a bellwether for how hard it's going to be to ever seriously rein in healthcare costs, there was an instant and thunderous backlash against a new recommendation that women with no risk factors put off routine mammograms until age 50. A small number of famous breast cancer survivors who had been diagnosed at a young age took immediately to the airwaves, and that was all she wrote. Within 48 hours, HHS Secretary Kathleen Sebelius had disowned her own task force and assured the nation that absolutely nothing would change.

Now the same group that made the mammogram recommendation is back:

The U.S. Preventive Services Task Force, which triggered a firestorm of controversy in 2009 when it raised questions about routine mammography for breast cancer, will propose downgrading its recommendations for prostate-specific antigen (PSA) for prostate cancer on Tuesday, wading into what is perhaps the most contentious and important issue in men’s health.

....“The harms studies showed that significant numbers of men — on the order of 20 to 30 percent — have very significant harms,” Moyer, a professor of pediatrics at Baylor College of Medicine, said in a telephone interview Thursday.

There are never any perfect answers to these questions. We could start routinely testing everyone at age 20, and it's almost certain that we'd catch at least a few treatable cancers. At every cutoff point, whether it's age-related or condition-related, you have to decide if the cost of tightening the testing criteria outweighs the benefit. What you can't do is simply decide that cutoffs should never be tightened because, inevitably, there will be a cost. It might be small, but it's always there. And then the USPSTF becomes a death panel because that's a handy thing for demagogues to call it.
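To make the cutoff logic concrete, here's a toy expected-value calculation. Every number in it is invented for illustration — none of it comes from the USPSTF report — but it shows why the same test can be worth it for a high-risk group and a net negative for a low-risk one.

```python
# Hypothetical numbers only -- the point is the structure of the trade-off,
# not the actual epidemiology.

def net_benefit(cancers_caught, benefit_per_catch, people_screened, harm_rate, harm_per_case):
    """Expected gain from screening a cohort: the health benefit of early
    detection, minus the aggregate harm from false positives and overtreatment."""
    return cancers_caught * benefit_per_catch - people_screened * harm_rate * harm_per_case

# Low-risk cohort (say, routine testing at age 20): very few cancers to find.
print(net_benefit(cancers_caught=5, benefit_per_catch=10.0,
                  people_screened=10_000, harm_rate=0.25, harm_per_case=0.1))   # -200.0

# Higher-risk cohort: the same harms buy far more detections.
print(net_benefit(cancers_caught=400, benefit_per_catch=10.0,
                  people_screened=10_000, harm_rate=0.25, harm_per_case=0.1))   # 3750.0
```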

So we'll see how this one goes. My previous brush with prostate screening is here.


The Right Can't Handle the Truth, Climate Edition

| Thu Oct. 6, 2011 6:06 PM EDT

In the Wall Street Journal today, Robert Bryce offers up five "obvious truths" about climate change. His first four are mostly practical observations, and it so happens that I actually agree with most of them. We carbon taxers have lost the war for now, we are going to need more energy in the future, greenhouse gas control is a global issue, and we do need to get more efficient at generating energy. I might not like it, but these things are all mostly true.

But then there's his fifth "obvious truth":

5) The science is not settled, not by a long shot. Last month, scientists at CERN, the prestigious high-energy physics lab in Switzerland, reported that neutrinos might—repeat, might—travel faster than the speed of light. If serious scientists can question Einstein's theory of relativity, then there must be room for debate about the workings and complexities of the Earth's atmosphere.

Well, there you go. The fact that a neutrino might — unconfirmed but still possibly might! — travel faster than light means that climate change models are crap. This is what passes for serious scientific thinking on the right.

The practical issues surrounding climate change are gargantuan. I myself am pessimistic that the human race will collectively decide to address them, which is why I support research into geoengineering as a possible last resort and, in the meantime, hope and pray that we figure out a way to generate lots of clean energy in the fairly near future.

But that has nothing to do with whether or not climate change is real. It is. Our current models might turn out to be off by 10% or 20% or 50% — in either direction, mind you — but they're not wrong. When you dump greenhouse gases into the atmosphere, more heat is trapped and the planet warms up very quickly (on geological scales). And when the planet warms up, lots of very bad stuff happens. It's just head-in-the-sand foolish to pretend otherwise. Even after Einstein came along, you'd still kill yourself just as badly if you jumped out of an airplane. Newton wasn't very wrong, after all.

Yet More on the PC vs. Mac Wars

| Thu Oct. 6, 2011 5:45 PM EDT

In my post this morning about why Apple lost the personal computing battle, I noted that a big part of the reason was the much lower cost of PCs vs. Macs. Matt Yglesias tweets back:

Actually, they did in a way. The original version of Windows was designed to work with the IBM PC's first color adapter, the CGA, and to keep costs down that adapter supported only 16 colors. Later adapters supported more colors, but Windows retained a considerable amount of backward compatibility with old hardware for a very long time. Thus, even as late as the early '90s, versions of Windows were still using logos that rendered properly on ancient hardware.
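If the 16-color constraint sounds abstract, here's a minimal sketch of what it means in practice: every color in a logo or bitmap gets snapped to the nearest entry in the classic 16-color CGA/EGA palette. The palette values below are the commonly cited RGB triples, and this is just an illustration of color quantization, not Microsoft's actual rendering code.

```python
# Snap arbitrary RGB colors to the classic 16-color CGA/EGA palette.
# Illustrative only -- not how Windows actually did its dithering.

CGA_16 = [
    (0, 0, 0),       (0, 0, 170),     (0, 170, 0),     (0, 170, 170),
    (170, 0, 0),     (170, 0, 170),   (170, 85, 0),    (170, 170, 170),
    (85, 85, 85),    (85, 85, 255),   (85, 255, 85),   (85, 255, 255),
    (255, 85, 85),   (255, 85, 255),  (255, 255, 85),  (255, 255, 255),
]

def nearest_cga(rgb):
    """Return the palette entry closest to an (r, g, b) triple."""
    r, g, b = rgb
    return min(CGA_16, key=lambda c: (c[0] - r) ** 2 + (c[1] - g) ** 2 + (c[2] - b) ** 2)

print(nearest_cga((30, 144, 255)))   # a modern "dodger blue" -> (85, 85, 255), bright CGA blue
```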

If everyone will indulge me in a bit of nostalgia, I want to make a broader point here. To understand why PCs beat Macs, you have to understand the era in which the battle was fought. And in that era, the 80s and early 90s, the personal computer world was controlled almost hegemonically by business customers. It's hard to overstate just how overwhelming this dominance was: corporate customers probably outnumbered home users by three or four to one, and even at that, a lot of the home users bought PCs mainly because they wanted to bring in work from the office. It was this corporate domination of the market that drove its early evolution. Here are a few of the ways this played out:

  • The IBM imprimatur. This was absolutely key to legitimizing the business market. Corporate IT managers just flatly weren't going to buy a million dollars worth of personal computers from their corner Radio Shack or from some shaggy-haired guy in Cupertino. They had lots of IBM gear they needed to interoperate with, they had IBM networks they needed to plug into, and they had IBM service contracts already in place to cover their maintenance needs. Initially, the only way they were going to buy PCs was if they came from IBM, and later on only if they were compatible with all their existing IBM PCs.
  • Backward compatibility. Home users get annoyed when new software isn't backward compatible. When I upgraded to Windows 7, I lost the ability to play my favorite computer Yahtzee game. Boo hoo. But corporate IT managers are absolutely rabid about backward compatibility. This isn't because they want to play Yahtzee. It's because they have huge fleets of hardware, some of it quite elderly, and they want new software to work on it. What's more, tucked away in various corners of the company there are people running ancient custom applications that are mission critical and absolutely can't break when the OS or the network software is upgraded. Companies like IBM and Microsoft take this very seriously, and it drives a lot of their design decisions. This is why you end up with bloated operating systems and oddities like ugly Windows logos.
  • Portability. As I said earlier, laugh all you want at the original Compaq portable that was the size of a sewing machine, but in 1983 it was a big deal. Ditto for the clamshells that debuted later in the decade, weighing in at a svelte 8-10 pounds. But big or not by today's standards, they were portable. And since business people travel a lot, having a portable machine you could take out to a client's site was a godsend. Apple just didn't have anything to compete here.
  • Business applications. Unless you were there, it's hard to explain just how thoroughly Lotus 1-2-3 was the killer app of the early 80s. Everyone used it, and it was available only on PCs. Likewise, lots of serious business apps ran on top of databases like dBase or R:Base, and those were available only on PCs.
  • Networks. Printers were expensive, so IT managers needed PCs to be on a corporate network. Novell and Banyan networks were designed with PCs in mind, and only the very courageous tried to make a Macintosh work on a PC network. It wasn't impossible or anything, but believe me, you were a lot better off sticking to PCs on the networks available back then.
  • Flexibility/Expandability. For a few years in the 80s I was the product manager for a very sophisticated communications board for IBM PCs. It allowed corporations to build specialized apps that used X.25 or SDLC or things like that, and it wasn't available for Macs. Why? Because Apple didn't allow you to plug boards into a Mac. PCs did. So if you needed a specialized piece of hardware, you could get it. Or if you just wanted more serial ports or more memory, you could pop in an AST 6-Pack and you were good to go. This was a big deal for corporate IT guys. Apple didn't offer it.
  • Low cost. Nuff said about that. Thanks to intense competition, PCs were just way less expensive than Macintoshes.

Back in the 80s and early 90s, if you wanted to buy a bunch of personal computers for your company, you'd ask around. And your financial analysts would tell you they used 1-2-3, so you'd better buy PCs. Your IT guy would tell you the corporate network was all built around PCs, so you'd better buy PCs. The CFO would tell you your project budget was a million bucks, so you'd better buy PCs. So guess what? You bought PCs. That's why Apple lost the market share war.

Why this walk down memory lane? Just to explain the environment that got us where we are today, an environment that's largely fading away. But it existed 30 years ago. Business buyers didn't buy PCs because they were mindless drones, they bought them because they really, truly had excellent reasons for preferring them to Macintoshes. And when you bought a PC for your home, it made sense to get one that was compatible with the PC in your office.

This created a virtuous1 circle: corporate customers preferred PCs, and as a result, when Microsoft set their development priorities, they listened to their big corporate customers. And given a choice between a clunky but functional mail merge function or a snazzier user interface, those customers voted unanimously for the mail merge function. And they voted for low cost, backward compatibility, network functionality, and portability. So that's what they got.

Today, software has so much functionality that it makes sense even for business users to start thinking more about ease of use and design esthetics. But back then it really didn't. So, quite rationally, you got frenetic development of new features even if it sometimes came at the cost of reliability and ease of use. That environment may be long gone, but it explains a lot about why Windows PCs look and feel clunkier than Macs but still rule the roost nonetheless.

1Well, a circle, anyway. You can decide for yourself if it was virtuous.

Advertising in the Age of Social Media

| Thu Oct. 6, 2011 2:44 PM EDT

Julian Sanchez points out today that chain restaurants are largely an answer to a signaling problem: once cars allowed us to routinely travel to unfamiliar places, we needed a way to avoid truly awful food. Chains may not have offered the best food in a given place, but they guaranteed that you wouldn't get something too horrible.

Branding and marketing in general serve this same signaling purpose, but what happens if consumer rating services like Yelp take over the world?

Imagine [...] what effect it might have if, five or ten years hence, augmented reality using sophisticated image recognition were as ubiquitous as Internet-enabled phones are becoming in the developed world. Imagine that, for nearly any product consumers encountered, some kind of aggregate rating—based on whatever criteria the individual has determined are most important—would simply appear, with minimal effort. Simply looking at an aisle of products—or even passing shops on the street—I might effortlessly learn which were deemed most satisfactory by people with tastes similar to mine. My incentive to take the time to rank products would be provided by my desire to give the system a basis for determining which other user’s rankings were most likely to be relevant for me. (Think here of Netflix recommendations or other type of social filtering, where contributing ratings enables the system to make better predictions about what I am likely to enjoy.)

With such information more directly available, marketing would become far less relevant to the buyer—and a far less worthwhile investment for the producer. Products, of course, would still need to be distinguished in some way, but a seller with a superior product would be far better able to compete without investing in a costly national marketing campaign. Advertising might be initially important in raising awareness about a new product and building an initial pool of reviews, but its salience would rapidly diminish.
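The "social filtering" Sanchez is describing — weighting other people's ratings by how well their past tastes match yours — is simple enough to sketch. The toy data and the cosine-similarity weighting below are my own illustration, not a description of how Netflix or Yelp actually do it.

```python
# A toy version of similarity-weighted ratings: predict my rating for something
# I haven't tried by averaging other users' ratings, weighted by how closely
# their past ratings agree with mine. Data and method are illustrative only.
from math import sqrt

ratings = {   # user -> {product: rating on a 1-5 scale}
    "me":    {"coffee_a": 5, "coffee_b": 2, "diner_x": 4},
    "alice": {"coffee_a": 5, "coffee_b": 1, "diner_x": 5, "diner_y": 4},
    "bob":   {"coffee_a": 1, "coffee_b": 5, "diner_x": 2, "diner_y": 1},
}

def similarity(u, v):
    """Cosine similarity over the products both users have rated."""
    shared = set(ratings[u]) & set(ratings[v])
    if not shared:
        return 0.0
    dot = sum(ratings[u][p] * ratings[v][p] for p in shared)
    norm_u = sqrt(sum(ratings[u][p] ** 2 for p in shared))
    norm_v = sqrt(sum(ratings[v][p] ** 2 for p in shared))
    return dot / (norm_u * norm_v)

def predict(user, product):
    """Similarity-weighted average of everyone else's rating for the product."""
    votes = [(similarity(user, v), ratings[v][product])
             for v in ratings if v != user and product in ratings[v]]
    total = sum(s for s, _ in votes)
    return sum(s * r for s, r in votes) / total if total else None

print(predict("me", "diner_y"))   # ~2.8: pulled toward Alice's 4, since her tastes match mine better
```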

I'd need to think about this some more to decide if I agree. In general, I feel that the power of corporate marketing is routinely underestimated by internet-centric consumers. Remember the Cluetrain Manifesto? Well, it turned out that to a large extent, corporate America adapted just fine to the power of conversation and ended up controlling large swathes of the internet, not the other way around. I suspect that corporate advertisers will adapt just fine too. Marketing is simply too central to human activity to be reined in significantly.

Will marketing change a lot? Probably so. But my gut feel is that it will remain controlled by gigantic, rich, sophisticated players for a long time. They'll just figure out ever better and subtler ways of keeping us from knowing it.

Why Apple Never Conquered the Computing World

| Thu Oct. 6, 2011 1:05 PM EDT

Matt Yglesias, a longtime Apple junkie, wants to know why the rest of the computing industry sucks so bad:

It always seemed to me, as an Apple fan, that the qualities Apple put together were pretty basic — gadgets that work well, which a lot of people do, paired with good design sense. And in fields that aren’t computers and electronics, lots of people seem to do this....In computing, not so much. Even at the height of Microsoft’s power in the late-’90s, Windows 98 was oddly ugly. Surely the richest company on the planet could hire someone to design a better logo than this, right? Why were the default color combinations on Excel charts so wretched? Why didn’t anyone else bother to design power adaptors that look good?

On the power adapter thing, I've long wondered about that too. This is not exactly rocket science. And I can't believe that Apple's version is really that much more expensive than the brick used by everyone else. It's weird.

But on the broader question, there really is an answer. Make no mistake: Apple under Jobs did a great job. But Steve Jobs chose to keep Apple a niche product, aimed at people who could afford to spend a lot of money for a computer that worked precisely the way he wanted it to, and did so with a nice design aesthetic. There are plenty of people who like this vision, but "plenty" still means about 10% of the market or so.

The rest of the personal computing world took a different turn. No one company controlled everything, the PC was a wide-open environment, and it was both cheap and aimed at the business market, where green eyeshade accountants simply didn't care if the Windows logo was ugly. Yes, the competition over price, features, flexibility, and bringing new applications to market was so frenetic that there was a price to be paid in reliability. But no matter how much you hate it, lots and lots of people decided this was a superior approach. Sure, the parts didn't work together as well as they did on a Macintosh, but there were a lot more parts available. Sure, the design aesthetic was clunky, but lots of people didn't care and the cost was often half of a comparable Macintosh. Sure, the Mac did a few things better in the page makeup and illustration fields, but PCs did a lot of things better in the business software field, in both the front office and the back office.

This whole argument reminds me of the great VHS vs. Betamax controversy. Consumers are stupid! screamed the Beta fans when their format died. Beta was clearly a superior format. Well, no, it wasn't. There's no single continuum of "quality": every piece of technology ever invented is a series of compromises. Beta provided better picture quality, but with shorter runtimes and relatively high cost. VHS made a different set of compromises: adequate picture quality with longer runtimes and lower cost. That set of compromises turned out to be more popular.

Ditto for PCs. By hook or by crook, PCs and Macintoshes simply represent a different set of compromises. If you're primarily a writer or an artist, aren't too price-sensitive, don't care about setting up an office network, and value good design, then a Macintosh is a great computer. But don't kid yourself: you're accepting a certain set of compromises, not picking an objectively better product. If you're primarily a financial analyst or a product manager, want lots of choices of computing platform and software, work primarily on a corporate network, don't want to spend a lot of money, and don't really care about design aesthetics, then a PC is a better choice for you. This was especially true in the 80s and 90s, when PCs and Macintoshes were initially duking it out for market share. You can laugh at those old Compaq sewing machines all you want, but in 1983 they were revolutionary and it was many years before Apple had anything to compete with them.

A lot of these differences are less pronounced now than they used to be. Although price is still a big difference, Macs are more network friendly and have a broader range of software than they did in the past. Likewise, PCs have a better design sense and work better than they used to. To a large extent, PC market share today is just an artifact of the inertia PCs gained in the 80s and 90s. Still, that inertia happened for a reason, and only part of it was the famous Microsoft marketing juggernaut. The PC market and the Macintosh market evolved as a different set of compromises to address a similar set of problems, and in the end, the PC's compromises attracted more buyers. Neither one was better or worse. They were just — as Steve Jobs might have said — different.